UBC Theses and Dissertations


Performance management framework for small to medium sized water utilities: conceptualization to development… Haider, Husnain (2015)


PERFORMANCE MANAGEMENT FRAMEWORK FOR SMALL TO MEDIUM SIZED WATER UTILITIES: CONCEPTUALIZATION TO DEVELOPMENT AND IMPLEMENTATION

by

Husnain Haider

A THESIS SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF

DOCTOR OF PHILOSOPHY

in

THE COLLEGE OF GRADUATE STUDIES

(Civil Engineering)

THE UNIVERSITY OF BRITISH COLUMBIA

(Okanagan)

May 2015

© Husnain Haider, 2015

Abstract

In striving to ensure a safe and secure water supply, water utilities face the challenges of climate change, socio-economic viability, and a rapid rate of environmental degradation. The core business of a water utility is managing assets and services, which can be divided into functional components such as water resource management and environmental stewardship, operational practices, personnel productivity, physical infrastructure, customer service, public health security, socio-economic issues, and financial viability. To be sustainable, a water utility's major impetus is to enhance the performance efficiency and effectiveness of these functional components and thereby ensure a high level of customer satisfaction.

Owing to limited human and financial resources, small and medium sized water utilities (SMWU) face even greater challenges in enhancing performance. To date, the participation of Canadian SMWU in the National Water and Wastewater Benchmarking Initiative (NWWBI) has been almost negligible. Consequently, such SMWU manage their functional components without knowing whether they are meeting their primary performance objectives. Hence, there is an urgent need for a comprehensive framework for adopting performance management in SMWU.

In this research, an integrated performance management framework consisting of five models has been developed.
The overall framework begins with the identification of performance indicators (PIs) based on a critical review, followed by a model that uses multicriteria decision analysis to select PIs encompassing all the functional components. These PIs are then evaluated through an inter-utility performance benchmarking model (IU-PBM), which efficiently deals with the existing data limitations in SMWU. Based on the IU-PBM results, an intra-utility performance management model (In-UPM) has been developed to home in on the performance of sub-components and of different water supply systems within the utility, supporting decision making under uncertainty. Finally, a risk-based model has been developed to improve customer satisfaction in SMWU.

This research will help utility managers across Canada, and potentially in other parts of the world, to enhance performance management for SMWU. Utility managers can effectively implement this framework with available resources to achieve socio-economic benefits, as they can: i) identify underperforming functional components and take corrective actions rationally; and ii) manage customer satisfaction through efficient inventory management and data analyses.

Preface

I, Husnain Haider, conceived and developed all the contents of this thesis under the supervision of Dr. Rehan Sadiq. Dr. Solomon Tesfamariam, third author of the articles arising from this research, reviewed all the manuscripts and provided critical feedback for improving the manuscripts and the thesis. Most of the contents of this thesis have been published in, are under review with, or have been submitted to scientific journals and conferences.

A version of Chapters 2 and 3 has been published in the NRC Research Press journal Environmental Reviews under the title “Performance Indicators for Small and Medium Sized Water Supply Systems: A Review” (Haider et al. 2014a).
A version of Chapter 4 has been published in Urban Water Journal under the title “Selecting Performance Indicators for Small to Medium Sized Water Utilities: Multi-criteria Analysis using ELECTRE Method” (Haider et al. 2015a).

A portion of Chapter 5 has been published in the proceedings of the Canadian Society for Civil Engineering (CSCE) General Conference (2014) under the title “Performance Assessment Framework for Small to Medium Sized Water Utilities – A Case for Okanagan Basin” (Haider et al. 2014b).

A version of Chapter 5 has been published in the ASCE Journal of Water Resources Planning and Management under the title “Inter-utility Performance Benchmarking Model (IU-PBM) for Small to Medium Sized Water Utilities: Aggregated Performance Indices” (Haider et al. 2015b).

A version of Chapter 6 is under review in the Journal of Cleaner Production under the title “Intra-utility Performance Management Model (In-UPM) for the Sustainability of Small to Medium Sized Water Utilities: Conceptualization to Development” (Haider et al. 2015c).

A version of Chapter 7 is under review in Risk Analysis under the title “Customer Satisfaction Management Framework for Small to Medium Sized Water Utilities: A Risk-based Approach” (Haider et al. 2015d).

A research article presenting the overall integrated framework developed in this thesis is under review in the Canadian Journal of Civil Engineering under the title “Multilevel Performance Management Framework: A Case of Small to Medium Sized Water Utilities in BC, Canada” (Haider et al. 2015e).

I secured the approval of UBC’s Behavioural Research Ethics Board (UBC BREB No. H14-00668) for one of these articles (Haider et al. 2015b). A copy of the approval and the relevant signed proformas are attached in Appendix B.

Table of Contents

Abstract
Preface
Table of Contents
List of Tables
List of Figures
List of Abbreviations
List of Symbols
Acknowledgements
Chapter 1 Introduction
1.1 Background
1.2 Research Motivation
1.2.1 Performance Management
1.2.2 Decision Making at Strategic, Tactical, and Operational Levels
1.3 Research Gap
1.4 Objectives
1.5 Thesis Structure and Organization
1.6 Proposed Framework
1.6.1 Terminology Adopted in Research
1.6.2 Identification of Potential PIs for SMWU
1.6.3 Selection of Performance Indicators
1.6.4 Inter-utility Performance Benchmarking
1.6.5 Intra-utility Performance Management for SMWU
1.6.6 Managing Customer Satisfaction in SMWU
Chapter 2 Literature Review
2.1 Background
2.2 Classification of Water Supply Systems
2.3 Size Based Classification of Water Supply Systems
2.3.1 Difference between L-WSS and SM-WSS
2.3.2 USEPA Drinking Water Infrastructure Needs Surveys – Case Study
2.3.3 Source Water
2.3.4 System Age and Pipe Size
2.3.5 Public Health Risk
2.3.6 Operation and Management Systems
2.3.7 Physical, Water Quality, and Environmental Sustainability
2.4 Literature Review of Performance Indicators Systems for Water Utilities
2.4.1 Terminology and Historical Background of Performance Indicators
2.4.2 IWA Manual of Best Practice
2.4.3 The World Bank
2.4.4 National Water Commission (NWC), Australia
2.4.5 American Water Works Association (AWWA)
2.4.6 Office of the Water Services (OFWAT) – UK and Wales
2.4.7 National Research Council (NRC) – Canada
2.4.8 Asian Development Bank (ADB)
2.4.9 Canadian Standards Association (CSA)
2.5 Evaluation of Performance Indicators Systems
2.6 Performance Assessment of Water Utilities with Limited Resources – Some Case Studies
2.6.1 South Asia – Bangladesh, India, and Pakistan
2.6.2 Eastern Europe, Caucasus and Central Asian (EECCA) Countries – Armenia
2.6.3 Arab Countries
2.6.4 Africa – Malawi and 134 Water Utilities
2.7 Selection of Performance Indicators
2.8 Performance Benchmarking for Water Utilities
2.9 Customer Satisfaction Assessment
Chapter 3 Identification of Suitable Performance Indicators
3.1 Background
3.2 Categorization of Performance Indicators for SMWU
3.2.1 Water Resources and Environmental Indicators
3.2.2 Personnel/Staffing Indicators
3.2.3 Physical Assets Indicators
3.2.4 Operational Indicators
3.2.5 Water Quality and Public Health Indicators
3.2.6 Quality of Service Indicators
3.2.7 Financial and Economic Indicators
3.3 Summary
Chapter 4 Selection of Performance Indicators
4.1 Background
4.2 Modeling Approach
4.2.1 Criteria for Selection of PIs and Ranking System
4.2.2 Multicriteria Decision Analysis (MCDA)
4.2.2.1 Analytic Hierarchy Process
4.2.2.2 Elimination and Choice Translating Reality
4.3 Application of MCDA – An Example of Water Resources and Environmental PIs
4.3.1 Estimation of Criteria Weights using AHP
4.3.2 Development of Outranking Relationships using ELECTRE
4.4 Development of Indicators
4.4.1 Water Resources and Environment (WE) Indicators
4.4.2 Personnel/Staffing (PE) Indicators
4.4.3 Physical (PH) Indicators
4.4.4 Operational (OP) Indicators
4.4.5 Water Quality and Public Health Indicators
4.4.6 Quality of Service (QS) Indicators
4.4.7 Financial and Economic (FE) Indicators
4.5 Final Ranking of Selected Indicators
4.6 Summary
Chapter 5 Inter-utility Performance Benchmarking Model (IU-PBM)
5.1 Background
5.2 Approach and Methodology
5.2.1 Performance Benchmarking Modeling Approach
5.2.2 Benchmarking Transformation Functions
5.2.3 Performance Aggregation Indices
5.2.4 Simos’ Method for Estimating the Weights of PIs
5.2.5 Aggregating Performance Indicators using TOPSIS Method
5.3 Development of Benchmarking Transformation Functions
5.3.1 Water Resources and Environmental Sustainability
5.3.2 Personnel Adequacy
5.3.3 Physical Assets Efficacy
5.3.4 Operational Integrity
5.3.5 Water Quality and Public Health Safety
5.3.6 Quality of Service Reliability
5.3.7 Economic and Financial Stability
5.4 A Case Study of the Okanagan Basin
5.5 Summary
Chapter 6 Intra-utility Performance Management Model (In-UPM)
6.1 Background
6.2 Establishing Performance Assessment Criteria
6.2.1 Water Resources and Environmental Sustainability
6.2.2 Personnel Productivity
6.2.3 Physical Assets Efficacy
6.2.4 Operational Integrity
6.2.5 Provision of Safe Drinking Water
6.2.6 Quality of Service
6.2.7 Economic and Financial Viability
6.3 Modeling Approach
6.4 A Case Study of the Okanagan Basin
6.4.1 Okanagan Basin
6.4.2 Analysis and Results
6.4.3 Sensitivity Analysis
6.4.4 Performance Management using In-UPM
6.5 In-UPM for Complex Decision Making
6.6 Summary
Chapter 7 A Risk-based Customer Satisfaction Management Model
7.1 Background
7.2 Risk Assessment
7.2.1 Root Cause Analysis (RCA)
7.2.2 Failure Mode Effect Analysis (FMEA)
7.2.3 Fuzzy-FMEA
7.3 Modeling Approach
7.4 Okanagan Case Study
7.4.1 Study Area
7.4.2 Baseline Data Collection
7.4.3 Risk Identification
7.4.4 Root Cause Analysis
7.4.5 Fuzzy-FMEA
7.4.6 Risk Prioritization
7.4.7 Customer Satisfaction Management
7.4.8 Discussions
7.5 Summary
Chapter 8 Conclusions and Recommendations
8.1 Summary and Conclusions
8.2 Originality and Contribution
8.3 Limitations and Recommendations
References
Appendices
Appendix A: Summary of Performance Indicator Systems
Appendix A-1: IWA manual of performance indicators for water supply services (IWA 2006)
Appendix A-2: IBNET system of performance indicators (World Bank 2011)
Appendix A-3: Performance indicators developed by NWC, Australia (NWC 2012)
Appendix A-4: AWWA system of performance indicators (AWWA 2004)
Appendix A-5: OFWAT system of performance indicators (OFWAT 2012)
Appendix A-6: Performance indicator system proposed by NRC, Canada (NRC 2010)
Appendix A-7: Performance indicator system proposed by ADB (ADB 2012)
Appendix A-8: Performance indicator system proposed by ISO (CSA 2010)
Appendix B: Ethics Approval Certificate and Forms
Appendix B-1: Ethics approval certificate
Appendix B-2: Signed ranking proforma filled by Utility-I
Appendix B-3: Signed ranking proforma filled by Utility-II
Appendix B-4: Signed ranking proforma filled by Utility-III
Appendix B-5: Signed ranking proforma filled by Expert-I
Appendix B-6: Signed ranking proforma filled by Expert-II
Appendix B-7: Signed ranking proforma filled by Expert-III
Appendix C: Fuzzy Rules for the Intra-utility Performance Management Model (In-UPM)
Appendix C-1: Matrix defining fuzzy rules for ‘water resources and environmental sustainability’
Appendix C-2: Matrix defining fuzzy rules for ‘environmental protection’
Appendix C-3: Matrix defining fuzzy rules for ‘impact of flushing water’
Appendix C-4: Matrix defining fuzzy rules for ‘water resources sustainability’
Appendix C-5: Matrix defining fuzzy rules for ‘water resources management’
Appendix C-6: Matrix defining fuzzy rules for ‘restrictions, conservation and management’
Appendix C-7: Matrix defining fuzzy rules for the ‘catchment and treatment employees’
Appendix C-8: Matrix defining fuzzy rules for the ‘metering and distribution employees’
Appendix C-9: Matrix defining fuzzy rules for the ‘loss due to field accidents’
Appendix C-10: Matrix defining fuzzy rules for the ‘personnel healthiness’
Appendix C-11: Matrix defining fuzzy rules for the ‘overtime culture’
Appendix C-12: Matrix defining fuzzy rules for the ‘productivity ratio’
Appendix C-13: Matrix defining fuzzy rules for the ‘working environment efficacy’
Appendix C-14: Matrix defining fuzzy rules for the ‘personnel adequacy’
Appendix C-15: Matrix defining fuzzy rules for the ‘personnel health and safety’
Appendix C-16: Matrix defining fuzzy rules for the ‘personnel productivity’
Appendix C-17: Matrix defining fuzzy rules for the ‘rehabilitation and replacement of pipes’
Appendix C-18: Matrix defining fuzzy rules for the ‘distribution system maintenance’
Appendix C-19: Matrix defining fuzzy rules for the ‘delivery point maintenance’
Appendix C-20: Matrix defining fuzzy rules for the ‘inspection and cleaning routine’
Appendix C-21: Matrix defining fuzzy rules for the ‘distribution system integrity’
Appendix C-22: Matrix defining fuzzy rules for the ‘distribution system failure’
Appendix C-23: Matrix defining fuzzy rules for the ‘distribution system performance’
Appendix C-24: Matrix defining fuzzy rules for the ‘distribution network productivity’
Appendix C-25: Matrix defining fuzzy rules for the ‘operational integrity’
Appendix C-26: Matrix defining fuzzy rules for the ‘public health safety’
Appendix C-27: Matrix defining fuzzy rules for the ‘distribution systems water quality’
Appendix C-28: Matrix defining fuzzy rules for the ‘treatment systems water quality’
Appendix C-29: Matrix defining fuzzy rules for the ‘water quality compliance’
Appendix C-30: Matrix defining fuzzy rules for the ‘water quality and public health safety’
Appendix C-31: Matrix defining fuzzy rules for the ‘treated water storage capacity’
Appendix C-32: Matrix defining fuzzy rules for the ‘storage capacity’
Appendix C-33: Matrix defining fuzzy rules for the ‘storage and treatment systems capacity’
Appendix C-34: Matrix defining fuzzy rules for the ‘monitoring systems integrity’
Appendix C-35: Matrix defining fuzzy rules for the ‘physical systems efficacy’
Appendix C-36: Matrix defining fuzzy rules for the ‘customers’ information level’
Appendix C-37: Matrix defining fuzzy rules for the ‘water quality adequacy’
Appendix C-38: Matrix defining fuzzy rules for the ‘customer service reliability’
Appendix C-39: Matrix defining fuzzy rules for the ‘response to complaints’
Appendix C-40: Matrix defining fuzzy rules for the ‘complaints related to system integrity’
Appendix C-41: Matrix defining fuzzy rules for the ‘customer satisfaction level’
Appendix C-42: Matrix defining fuzzy rules for the ‘service reliability and customer satisfaction’
Appendix C-43: Matrix defining fuzzy rules for the ‘customer water affordability’
Appendix C-44: Matrix defining fuzzy rules for the ‘operation and maintenance cost sustainability’
Appendix C-45: Matrix defining fuzzy rules for the ‘economic stability’
Appendix C-46: Matrix defining fuzzy rules for the ‘revenue collection efficacy’
Appendix C-47: Matrix defining fuzzy rules for the ‘operational cost sustainability’
Appendix C-48: Matrix defining fuzzy rules for the ‘economic and financial viability’

List of Tables

Table 2.1 Various bases of size-based classification of water utilities
Table 2.2 Number of water supply performance indicators under different categories by various agencies
Table 2.3 Evaluation of different performance assessment systems for their applicability to SMWU
Table 2.4 Checklist of key performance indicators used in developing countries
Table 3.1 Proposed water resources and environmental indicators
Table 3.2 Proposed personnel/staffing indicators
Table 3.3 Proposed physical/asset indicators
Table 3.4 Proposed operational indicators
Table 3.5 Proposed water quality and public health indicators
Table 3.6 Proposed quality of service indicators
Table 3.7 Proposed financial/economic indicators
Table 4.1 Selected PIs through initial screening
Table 4.2 Scoring system and definition of criteria
Table 4.3 Evaluation scale used in pairwise comparison
Table 4.4 Random indices established by Saaty (1980)
Table 4.5 Pairwise comparison matrix for weight estimation using AHP
Table 4.6 Normalized comparison matrix for weight estimation using AHP
Table 4.7 The scoring matrix along with criteria weights
Table 4.8 The normalized weighted matrix
85 Table 4.9  Concordance and discordance interval sets for performance indicators in   WE category .................................................................................................................... 85 Table 4.10  Net Outranking of selected indicators .............................................................................. 87 Table 4.11  Final ranking of selected PIs under DMB for SMWU ................................................... 101 Table 5.1  Benchmarking transformation functions developed for performance   benchmarking for SMWU .............................................................................................. 113 Table 5.2  Weight estimation using Simos’ Method ....................................................................... 125 Table 5.3 Description of performance levels with proposed actions ............................................. 130 Table 6.1  Performance objectives, performance measures (PMs),   performance indicators (PIs), and data variables ........................................................... 135 Table 6.2  Universe of discourse (UOD) of performance indicators .............................................. 152  xii  Table 7.1  Linguistically defined probability of occurrence (P),   consequence (C), and detectability (D) .......................................................................... 180 Table 7.2  Results of fuzzy-FMEA ................................................................................................. 181 Table 7.3  Priority levels and RPN range ........................................................................................ 183 Table 7.4  Proposed mitigation actions based on risk assessment results ....................................... 185     xiii List of Figures  Figure 1.1  Scope of Asset Management (modified from IIMM 2006) ............................................. 
Figure 1.2  Thesis structure and organization
Figure 1.3  Proposed research framework for performance management of SMWU
Figure 2.1  Framework showing the approach adopted for critical review of performance indicator systems and performance assessment frameworks
Figure 2.2  Schematic schemes of water supply systems based on different types of sources: a) fresh surface water source; b) fresh groundwater sources; c) saline groundwater source; d) saline surface water source
Figure 2.3  Total 20-year financial need by system size and project/component type (in billions of January 2007 dollars) (developed using USEPA 2009 data)
Figure 2.4  Variation in 20-year distribution system investment needs by system size from 1995-2007, according to the USEPA infrastructure needs survey and assessment for the United States (developed using USEPA 1997, 2001, 2005, 2009 data)
Figure 2.5  Percent of pipe system by age class in different system sizes (developed using AWWA 2007 data)
Figure 2.6  Average miles of pipe per system by diameter in different system sizes (developed using AWWA 2007 data)
Figure 2.7  The concept of the Faria and Alegre (1996) level of service framework
Figure 2.8  Concept of the layer pyramid used by Alegre et al. (2000) for calculating performance indicators
Figure 2.9  An example of interaction between various indicators and sub-indicators in the NWC (2012) performance assessment framework (a partial reproduction of Fig. 1, Interrelationship between NFP indicators, p. 26 of National Performance Framework: 2010-11 urban performance reporting indicators and definitions handbook)
Figure 2.10 ADB (2012) Project Design and Monitoring Framework (DMF)
Figure 2.11 Content and application of the ISO (2007)
Figure 3.1  Proposed system of PIs to start, proceed and improve the performance evaluation mechanism in SMWU
Figure 3.2  Components of water balance for calculation of water losses in water distribution, as defined by Farley and Trow (2003)
Figure 3.3  The four basic methods of managing real losses (source: Lambert et al. 1999)
Figure 4.1  Distribution of PIs in different categories by various agencies, a graphical representation of Table 2.2
Figure 4.2  Modeling approach for selection of PIs for SMWU
Figure 4.3  Outranking relations of water resources and environmental PIs showing DMB
Figure 4.4  Outranking relations of personnel PIs showing DMB
Figure 4.5  Outranking relations of physical PIs showing DMB
Figure 4.6  Outranking relations of operational PIs showing DMB
Figure 4.7  Outranking relations of water quality and public health PIs showing DMB
Figure 4.8  Outranking relations of quality of service PIs showing DMB
Figure 4.9  Outranking relations of financial and economic PIs showing DMB
Figure 4.10 Net concordance (C) and discordance (D) indexes for all seven categories of PIs: (a) water resources and environment; (b) personnel; (c) physical; (d) operational; (e) water quality and public health; (f) quality of service; (g) financial and economic
Figure 4.11 An example of a cognitive map for estimation of the water resources and environmental sustainability index
Figure 5.1  Graphical description of Equation [5.1] showing misleading calculation of performance score
Figure 5.2  Relative performance of water utilities in terms of the performance gap between calculated PI values and benchmarks using performance score
Figure 5.3  Modeling approach of the performance benchmarking model for SMWU
Figure 5.4  Examples of performance benchmarking relationships: (a) per capita water consumption of residential consumers (WE2), a water resources indicator; (b) percentage of service connection repairs in a year (OP5), an operational indicator
Figure 5.5  Aggregated performance indices for all the functional components: (a) performance indices of Utility A; (b) performance indices of Utility B
Figure 6.1  Methodology for performance management of SMWU
Figure 6.2  A conceptual hierarchical structure for performance assessment of SMWU - an example of the functional component of water resources and environmental sustainability
Figure 6.3  Standard trapezoidal membership functions used in this study; e.g., b1, b2, b3 and b4 are used to define the ranges of fuzzy numbers for 'Medium'
Figure 6.4  Reported pressure, water quality and service connection complaints in FY 2012 for different WSSs in the utility under study
Figure 6.5  In-UPM results for the utility for assessment year 2012
Figure 6.6  Results of primary and secondary level PMs for the 'quality of service' component
Figure 6.7  Sensitivity analysis results for all the functional components
Figure 6.8  Secondary level performance measures for service reliability and customer satisfaction in three systems within the water utility
Figure 6.9  In-UPM results showing overall performance of the utility for year 2014 after the implementation of improvement actions
Figure 7.1  Standard trapezoidal membership functions used in this study
Figure 7.2  Chen (1985) defuzzification method for trapezoidal fuzzy numbers
Figure 7.3  Risk-based modeling approach for assessment of customer satisfaction
Figure 7.4  A vignette of customers' complaints with respect to their causes
Figure 7.5  Root cause analysis (RCA) for customer complaints in SMWU
Figure 7.6  Number of failure modes with risk priority levels
Figure 7.7  Risk clustering for customers' satisfaction assessment for the existing situation to take Actions 1 & 2
Figure 7.8  Results of risk mitigation; A-1: automation of booster stations, A-2: source water change, A-3: implementation of a routine service connection inspection program, A-4: increasing level of water treatment
Figure 7.9  Cumulative risk reduction vs. cumulative cost of mitigation actions
Figure 8.1  Integrated framework for performance management of SMWU

List of Abbreviations

ACWUA  Arab Countries Water Utilities Association
ADB  Asian Development Bank
AHP  Analytical Hierarchical Process
APRH  Portuguese Association of Water Resources
AWWA  American Water Works Association
AWWARF  American Water Works Association Research Foundation
BC  British Columbia
BCMOH  British Columbia Ministry of Health
MoE  Ministry of Environment
BTFs  Benchmarking Transformation Functions
C  Consequence
CAN  Canadian
CARL  Current Annual Real Losses
CCW  Consumer Council of Water
CI  Consistency Index
CR  Consistency Ratio
CSA  Canadian Standards Association
CS  Customer Satisfaction
CWW  Columbus Water Works
CWWA  Canadian Water and Wastewater Association
D  Detectability
DBPs  Disinfection By-Products
DM  Decision Maker
DMB  Decision Makers Boundary
DMF  Design and Monitoring Framework
ELECTRE  Elimination and Choice Translating Reality
EPA  Environmental Protection Agency
FCM  Federation of Canadian Municipalities
FCs  Fecal coliforms
FDS  Financial Debt Service
FE  Financial and Economic
FMEA  Failure Mode Effect Analysis
FMs  Failure Modes
FRBM  Fuzzy Rule Based Modeling
FTEs  Full Time Employees
GHG  Green House Gases
GIS  Geographic Information System
HO  Home Owner
IBNET  International Benchmarking Network for Water and Sanitation Utilities
IIMM  International Infrastructure Management Manual
ILI  Infrastructure Leakage Index
In-UPM  Intra-utility Performance Management Model
IU-PBM  Inter-utility Performance Benchmarking Model
IWA  International Water Association
IWSA  International Water Services Association
KPIs  Key Performance Indicators
KWA  Kiwa Water Research
KWH  Kilowatt Hour
LOS  Level of Service
L-WSS  Large Water Supply Systems
LWU  Large Water Utilities
MCDA  Multicriteria Decision Analysis
MISO  Multi Input Single Output
MHLS  Ministry of Healthy Living and Sport
MGD  Million Gallons per Day
MRR  Maintenance, Rehabilitation and Renewal
NCEL  National Civil Engineering Laboratory
NRC  National Research Council
NGO  Non-governmental Organization
NIS  Negative-ideal Solution
NRW  Non-revenue Water
NTU  Nephelometric Turbidity Units
NWC  National Water Commission
NWWBI  National Water and Wastewater Benchmarking Initiative
NWWBI-PR  National Water and Wastewater Benchmarking Initiative Public Report
OFWAT  Office of Water Services
OP  Operational
OPM  Office of Public Management New South Wales
O&M  Operation and Maintenance
P  Probability of Occurrence
PE  Personnel
PH  Physical
PIs  Performance Indicators
PIS  Positive-ideal Solution
PMs  Performance Measures
PPMS  Project Performance Management System
PRD  Preservation, Renewal, and Decommissioning
PRV  Pressure Release Valve
PSAB  Public Sector Accounting Board
QS  Quality of Service
QUU  Queensland Urban Utilities
RCA  Root Cause Analysis
RI  Random Index
RPN  Risk Priority Number
SA-WRC  South Africa Water Research Council
SC  Service Connection
SIM  Service Intensive Mechanism
SoSI  Security of Water Supply Index
SISO  Single Input Single Output
S-WSS  Small Water Supply Systems
SM-WSS  Small to Medium Sized Water Supply Systems
SMWU  Small to Medium Sized Water Utilities
THMs  Trihalomethanes
UARL  Unavoidable Annual Real Losses
UFW  Unaccounted for Water
UOD  Universe of Discourse
USEPA  United States Environmental Protection Agency
WB  World Bank
WDS  Water Distribution System
WE  Water Resources and Environment
WHO  World Health Organization
WOP  Water Operators Partnership Program
WP  Water Quality and Public Health
WQ  Water Quality
WSP  Water and Sanitation Program
WSS  Water Supply System
WSSC  Washington Suburban Sanitary Commission
WSSs  Water Supply Systems

List of Symbols

A  Pairwise comparison matrix (ELECTRE)
A  Linguistic constant (FRBM)
Ap  First alternative in a pair of alternatives
Aq  Second alternative in a pair of alternatives
B  Fuzzy output
B'  Crisp output of B
Bj  Fuzzy subset
C  Quantitative composite measure
C(p,q)  Concordance set
D(p,q)  Discordance set
C  Average values of all Cpq
CL  Composite measure minimum
CH  Composite measure maximum
D  Average values of all Dpq
HS  Satisfaction maximum
i  Component
j  Child
J1  Set of benefit attributes
J2  Set of cost attributes
k  Parent
l  Generation
Lmin  Minimum possible RPN value
Lmax  Maximum possible RPN value
LS  Satisfaction minimum
m  Previous generation
n  Number of criteria (ELECTRE)
P  Performance of a factor
Pi*  Performance index of each functional component
rij  Normalized criteria values
R  Fuzzy rule set
Ri  Rule number
Rij  Normalization matrix
S  Satisfaction score achieved from a qualitative customer survey
UT(x)  Defuzzified value of risk factor
vpj  Weighting normalized rating
vij  Weighted value of each performance score (TOPSIS)
Vij  Normalized weighted matrix
WS  Survey weighting
w  Criteria weights
wij  Corresponding weight of the indicator (TOPSIS)
WC  Quantitative composite measure weighting
xij  Assigned value to the degree of alternative Ai with respect to the criteria Xj
xij  Performance score (TOPSIS)
X  Criteria
X  Linguistic input (antecedent) variable (FRBM)
X*  Positive-ideal solution
X-  Negative-ideal solution
Y  Output (consequent) variable (FRBM)
Yi*  Distance from X* for each performance indicator
Yi-  Distance from X- for each performance indicator
µa(x)  Membership function
µBk(y)  Output fuzzy set
λ  Eigenvalue

Acknowledgements

All praise be to ALLAH Almighty, who gave me the strength to materialize this research in its present form. I would like to express my sincere appreciation and thanks to my advisor, Professor Dr. Rehan Sadiq - you have been a tremendous mentor for me during this journey. Dr. Sadiq is a very patient person, a high-quality researcher, and a dedicated teacher. I thank him for the encouragement that allowed me to grow as a researcher. Your advice throughout my studies, both on research and towards my career, was priceless.

I would like to thank Dr. Solomon Tesfamariam for his valuable guidance and critical comments throughout my research. I also thank Dr. Cigdem Eskicioglu for her suggestions for the improvement of my work. I would also like to thank the staff of the various water utilities in the Okanagan Basin who helped me collect baseline data and shared their experiences with me.

This research would have been impossible without the financial support of the Natural Sciences and Engineering Research Council Collaborative Research and Development (NSERC CRD) program. I also acknowledge the University of British Columbia for providing me with the International Partial Tuition Scholarship and the University Graduate Fellowship to cover part of my tuition fees.

A special thanks to my family. Words cannot express how grateful I am to my mother, father, mother-in-law, father-in-law, my brothers (Ali and Hassan), my sister (Ayesha), and my nephews (Ahmed, Fatima, Ismail, and Umer) for the sacrifices you have made on my behalf. Your prayers helped me get this far. I would also like to thank all of my friends who supported and encouraged me to strive towards my goal.
Finally, I want to express my appreciation to my beloved wife, Nazish Haider, who always supported me in the moments when there was no one to answer my queries. Nazish, without your love and support, it would have been impossible to complete this work.

My appreciation is also due to the faculty and staff at the University of British Columbia who supported me on several occasions during the study period. I would also like to thank my friends and colleagues Golam Kabir, Hassan Iqbal, Gizachew Demmissi, and Gyan Kumar Sharma, who were always there for me. I would also like to thank Amanda Brobbel, who affectionately guided me in thesis writing.

Dedication

Lovingly dedicated to my mother, Mrs. Saeeda Ziai.

Chapter 1  Introduction

1.1 Background

Access to safe drinking water in sufficient quantity and at an affordable cost is a basic human right, irrespective of a community's geographical location and size (WHO 2012). Like all other infrastructure systems, water supply systems face a number of unique challenges in the 21st century, such as rapid population growth, uncertain climate, socio-environmental issues, limited water resources, and ongoing economic crises (Berg and Danilenko 2011). Water utilities need to perform proficiently in order to face these challenges, manage their assets, and increase customer satisfaction. These utilities consist of different functional components, such as water resource management and environmental stewardship, operational practices, personnel productivity, physical infrastructure, customer service, water quality and public health safety, socio-economic issues, and financial viability. Each of these components may consist of several sub-components; e.g., the component of personnel productivity may include staff adequacy, health and safety, and working environment.
Moreover, a water utility may consist of one or more water supply systems (WSSs), each with its own geographical characteristics. A utility will only attain its overall sustainability objectives when all of its WSSs, functional components, and sub-components are performing efficiently.

Water supply systems can be divided into vertical and linear components. The vertical components consist of treatment plants, pumping stations, and storage facilities, whereas the linear components are transmission mains and distribution system pipelines. Generally, the linear components are more expensive, and their value can be 60 to 80% of the overall cost of a WSS (Stone et al. 2002). All these components may face a number of problems associated with their continuous aging, including low pressure, water loss, and water quality deterioration (Alegre 1999, Alegre et al. 2006). As per the Canadian Infrastructure Report Card (2012), 15.4% of water mains are in fair to very poor condition; moreover, 14.4% of vertical assets were found to be in fair to very poor condition. The estimated replacement cost of this infrastructure in Canada is CAN$25.9 billion (i.e., CAN$2,082 per household) (CIRC 2012). In view of the existing infrastructure condition and investment needs, the Civil Infrastructure Systems Technology Road Map 2003–2013 called on the federal, provincial, territorial and municipal governments and industry partners to allocate funds to infrastructure research and development (CCPPP 2003).

Alegre and Coelho (2012) define asset management for urban water utilities as "the set of processes that utilities need to have in place in order to ensure the performance of the asset in line with the service targets over time, that risks are adequately managed, and that the corresponding costs, in a lifetime cost perspective, are as low as possible".
The first step towards effective asset management is assessing the performance of the above-stated components of a water utility. Subsequently, based on the performance assessment results, utility management can establish a desirable level of service (LOS) with acceptable risk and develop future financial plans. Even smaller utilities can adopt sustainable asset management strategies to enhance the effective service life of their assets (Brown 2004).

About 95% of the water supply systems in Canada are operated by small and medium sized water utilities (SMWU) serving populations of less than 50,000 (Statistics Canada 2009). Canada's National Water and Wastewater Benchmarking Initiative (NWWBI) was established in 1997. According to the most recent public report, published in 2013 (covering the performance of water, wastewater and stormwater utilities for FY 2011), wastewater and water utilities have been participating in the NWWBI since 2003 and 2005, respectively. However, most of the participants are large water utilities (LWU) serving populations of more than 50,000, i.e., the 50% of Canadian utilities covering more than 60% of the population. So far, the participation of SMWU in the NWWBI has been almost negligible. The possible reasons appear to be: i) no well-structured performance assessment framework is available for such utilities that can be implemented simply (though comprehensively) under their technical and financial constraints; and ii) owing to lower economies of scale, SMWU may avoid participating alongside large utilities, as such comparisons may highlight performance deficiencies. Therefore, SMWU in Canada are managing their assets without knowing whether or not they are meeting their primary performance objectives.
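The kind of indicator-by-indicator comparison performed in benchmarking reports such as the NWWBI can be illustrated with a small sketch. All PI values below are invented for illustration (they are not NWWBI data), and for both example indicators a lower value is assumed to be better:

```python
# Hypothetical sketch of simple inter-utility benchmarking: a utility's
# performance indicator (PI) values are placed against the minimum,
# average, and maximum of a peer group. Lower is assumed better here.

peer_values = {  # PI -> values reported by (invented) peer utilities
    "water_loss_pct": [8.0, 12.5, 15.0, 22.0, 30.0],
    "complaints_per_1000": [2.0, 4.5, 5.0, 9.0],
}
utility = {"water_loss_pct": 18.0, "complaints_per_1000": 3.5}

for pi, values in peer_values.items():
    lo, hi = min(values), max(values)
    avg = sum(values) / len(values)
    own = utility[pi]
    position = "better than average" if own < avg else "worse than average"
    print(f"{pi}: own={own}, min={lo}, avg={avg:.1f}, max={hi} -> {position}")
```

As the thesis argues, this one-PI-at-a-time view says nothing about the utility's overall performance, which is what motivates the aggregated benchmarking and management models developed in later chapters.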
1.2 Research Motivation

Over the last several years, water utilities have been encouraged to manage their assets effectively in response to several emerging factors, such as an increasing number of customers and their expectations, greater awareness of water resources conservation, emerging environmental impacts, climate change issues, a lack of trained personnel, increasing energy and regulatory requirements, and increasing financial stresses. Based on the Public Sector Accounting Board (PSAB) Canada guidelines, applicable from 1 January 2009, all municipalities must account for their tangible capital assets and amortize them regularly in their financial statements (Zoratti 2009). Subsequently, based on the remaining service life and existing condition of assets, maintenance, rehabilitation and renewal (MRR) strategies for linear and vertical assets can be planned. Several studies have been conducted in the past on risk-based MRR for physical assets of WSSs (Francisque et al. 2014, Liu et al. 2010, Lounis et al. 2010, Loganathan et al. 2001). However, the primary objective of a water utility is much broader than just meeting its physical infrastructure needs.

Comprehensive performance management can help a utility achieve its overall sustainability objectives, such as: i) optimization of human and financial resources; ii) conservation of water resources; iii) protection of the environment; iv) provision of a safe and productive working environment for personnel; v) protection of public health; vi) provision of safe drinking water for the community; and vii) earning customers' confidence through efficient operations and responsiveness to their complaints. Therefore, the overall goal of this research is to develop a comprehensive and practical performance management framework for SMWU.
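The idea of rolling sub-component performance up into component-level and utility-level scores, which underlies the framework developed later, can be sketched as a simple weighted aggregation. The scores and weights below are invented for illustration, and the thesis itself uses fuzzy rule-based aggregation rather than this crisp weighted average:

```python
# Minimal sketch of hierarchical performance aggregation (hypothetical
# weights and 0-100 scores; the actual framework is fuzzy rule based).

def aggregate(children):
    """Weighted average of (score, weight) pairs; weights are normalized."""
    total_w = sum(w for _, w in children)
    return sum(s * w for s, w in children) / total_w

# Each functional component is first aggregated from its sub-components.
components = {
    "water_resources":    aggregate([(70, 0.5), (80, 0.5)]),
    "personnel":          aggregate([(60, 0.3), (75, 0.7)]),
    "quality_of_service": aggregate([(85, 1.0)]),
}
weights = {"water_resources": 0.4, "personnel": 0.3, "quality_of_service": 0.3}

overall = aggregate([(components[c], weights[c]) for c in components])
print(f"overall performance index: {overall:.1f}")
```

A component with a low aggregated score flags where tactical-level attention is needed, which is exactly the drill-down behaviour the proposed framework is designed to support.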
1.2.1 Performance Management

The performance improvement process in any water utility begins with effective performance assessment, i.e., comparing the utility's performance with that of other similar utilities (in size and geographical location) and with the standards established by various regulatory agencies (Marques and Witte 2010; Alegre et al. 2006). Various agencies around the world have developed systems for inter-utility performance assessment (benchmarking) based on performance indicators (PIs) (Coelho 1997, Alegre et al. 2006, Berg and Danilenko 2011, NWC 2012, AWWA 2004, OFWAT 2012, NRC 2010, ADB 2012). In general, larger utilities are older than SMWU and contain much larger and more expensive physical infrastructure, e.g., water mains, treatment plants, etc. They also have to satisfy a large number of concerned and responsive customers. The performance-related issues of larger utilities (e.g., extensive energy requirements, widespread environmental impacts, large pipe bursts, and loss of amenities during vandalism) were recognized decades ago (Stone et al. 2002). Consequently, most of these performance assessment systems were primarily developed for LWU.

SMWU differ from large utilities in several ways, including but not limited to: less efficient management information systems; fewer people served through smaller pipelines; less involvement in management activities; and less efficient working and utilization of technical, human and financial resources (Ford et al. 2005, Braden and Mankin 2004, Brown 2004, Hamilton et al. 2004). Consequently, SMWU face several technical, socio-economic, and environmental challenges in meeting regulatory guidelines (Dziegielewski and Bik 2004). For instance, according to Water Canada (2013), water utilities in British Columbia have experienced the highest number of boil water advisories among the Canadian provinces, and most of these utilities are SMWU serving populations of less than 50,000.
Interior Health Canada (2013) has reported various reasons for these advisories, such as source water contamination, flushing of hydrants, construction, repair and maintenance works, equipment failure, and inadequate treatment.

SMWU have some advantages over LWU; for example, they have smaller, relatively less complex and newer physical structures, and their simple organizational structures provide more opportunity for change management. SMWU also have fewer impacts on natural systems due to smaller withdrawals, and they produce lower carbon emissions. Nevertheless, SMWU cannot adopt the above-mentioned systems of PIs as such, owing to data limitations and the intricacies of the existing systems. According to the European project COST Action C18 (Performance assessment of urban infrastructure services), there is an urgent need for comprehensive research to improve performance management in SMWU (Alegre 2010).

1.2.2 Decision Making at Strategic, Tactical, and Operational Levels

The scope of asset management shown in Figure 1.1 consists of three levels: strategic, tactical, and operational (Alegre and Coelho 2012). Strategic level planning involves a review of the functional environment to ensure that all elements, such as corporate, community, environmental, financial, legislative, institutional, and regulatory factors, are appropriately considered in asset management. It further includes a clear statement of strategic objectives, policies, anticipated outcomes, and risk management plans (Haider 2007). Tactical planning is the application of detailed asset management procedures and standards in a cost-effective way to achieve strategic goals by meeting desired service levels. Moreover, the objectives related to technical and customer service standards and financial requirements are also set at this level (Foster et al. 2000).
Asset management information systems, including condition and performance evaluation, capacity availability, and lifecycle costs, are also managed at this level (Karababas and Cather 1994). Operational plans at level 3 include controls to confirm the delivery of the asset management policies, strategies, legal requirements, objectives, and plans developed at levels 1 and 2. Activities at this level include condition monitoring, process control, staff training, communication with stakeholders, information and data control, and emergency response (IIMM 2006). In SMWU, senior management is involved in both strategic and tactical level decision making simultaneously; similarly, technical management is responsible for operational level decisions along with its primary responsibilities (i.e., tactical level decision making). Practically, from an engineering management point of view, the tactical level is the most critical one in overall planning, and it is thus primarily addressed in this research.

Figure 1.1 Scope of Asset Management (modified from IIMM 2006)

1.3 Research Gap

In the existing NWWBI public report, the calculated values of different PIs are simply compared with the minimum, average, and maximum values of the participating utilities (i.e., LWU). Such a simple comparison of individual PIs does not provide information about the overall performance of a water utility. Secondly, these benchmarks are available only for larger utilities; due to the inherently lower economies of scale of SMWU, applying them for inter-utility benchmarking of SMWU requires extensive effort. The benchmarking process needs to be practical (based on measurable PIs), besides being comprehensive enough to cover all the functional components. When one or more functional components are underperforming, tactical level decision making can be improved by honing in on the sub-components.
Such analyses need to be performed at the intra-utility level to evaluate the performance of the different WSSs operating within a utility.

Moreover, research gaps exist in addressing specific performance-related issues (at the component level) in SMWU. For instance, customer satisfaction is a primary objective of a water utility in providing reliable services. Existing methods based on customer interviews might not be practically possible for smaller utilities with limited resources; therefore, operational personnel strive hard to respond to complaints without any management strategy. As a result, no structured mechanism is available to evaluate the risk of customer dissatisfaction.

A comprehensive assessment of SMWU over their entire lifecycles (i.e., continuous benchmarking) that addresses the above-stated research gaps, followed by effective asset management plans, can help these utilities attain sustainability. Several models, guidelines and decision support tools have been proposed and developed by various agencies and organizations around the world to serve this purpose. Most of these tools are based on extensive, long-term, and expensive (requiring large human and financial resources) databases, which are presently not available for SMWU.
Therefore, a substantial knowledge gap still exists on performance management, particularly for SMWU (NRC 2010, Alegre 2010).

1.4 Objectives

The primary goal of this research is to develop a comprehensive and practically applicable decision support framework for performance management of SMWU operating in Canada and other parts of the world. The specific objectives of this research are to:

1. develop potential performance indicators for SMWU through a screening process and a state-of-the-art literature review,
2. develop a model for the selection of the most suitable PIs for SMWU based on their applicability, measurability, understandability, and comparability using multicriteria decision analyses,
3. develop an inter-utility performance benchmarking model for SMWU,
4. develop an intra-utility performance management model for sustainability of SMWU,
5. develop a comprehensive risk-based customer satisfaction management model for SMWU, and
6. apply all the models developed in this research to actual case studies for proof-of-concept.

1.5 Thesis Structure and Organization

Figure 1.2 illustrates the organization of the thesis, which contains eight chapters to achieve the above mentioned objectives. Chapter 2 reviews the literature on existing frameworks for PIs and performance assessment of water utilities. In Chapter 3, potential PIs are identified, which are subsequently evaluated on the basis of different criteria using multicriteria analysis for the final selection of PIs in Chapter 4. An inter-utility performance benchmarking model is developed in Chapter 5. In Chapter 6, a detailed performance management model is developed for SMWU under uncertainty. The final model is developed in Chapter 7 for managing the risk of customer dissatisfaction in SMWU. Finally, Chapter 8 contains a summary of the research outcomes and recommendations for future research.
Figure 1.2 Thesis structure and organization

1.6 Proposed Framework

In order to attain the objectives outlined above, a framework is presented in Figure 1.3. As described earlier, a water utility consists of different functional components and sub-components, which need to operate efficiently to attain overall sustainability. If one or more of these components and/or sub-components are underperforming (i.e., not meeting desired benchmarks, criteria, and standards), the utility managers need to improve the performance of the respective component. Moreover, all the WSSs operating within the utility should also perform efficiently. The main purpose of this framework is to facilitate the management of SMWU in short-term and long-term decision making to achieve: i) the sustainability performance objectives for all the functional components of the utility as a whole and for its WSSs, and ii) customer satisfaction by providing a reliable quality of service. A brief description of the models developed in this research is given below, and the details are provided in the following chapters.

1.6.1 Terminology Adopted in Research

Different models, techniques, and methods are used in this research.
It is important to understand the terminology developed for this research in order to appreciate the integrated concept of performance management for SMWU. The term 'framework' is used for the holistic approach presented in Figure 1.3. The term 'model' is used for the components of this framework, where detailed modeling approaches are used to address the research gaps in performance assessment or management of SMWU, e.g., the model for selection of PIs, the intra-utility performance management model, etc. The term 'method' is used for the compensatory and non-compensatory MCDA methods used in this research. The term 'technique' is used for applied mathematical procedures, such as fuzzy set theory, failure mode and effect analysis, sensitivity analysis, etc.

1.6.2 Identification of Potential PIs for SMWU

Although a breadth of literature is available on performance indicators, performance benchmarking, and performance assessment for water utilities, a gap still exists with respect to the specific interests of SMWU. A comprehensive review of the literature has therefore been carried out to rationally assess the suitability of reported performance evaluation systems for SMWU in terms of their simplicity (easy and simple data requirements) and comprehensiveness (i.e., coverage of all the components of a WSS) using expert judgment. The review also evaluates the individual PIs with respect to their understandability, measurability, and comparability (i.e., within and across utility comparisons).

Figure 1.3 Proposed research framework for performance management of SMWU
On the basis of this detailed review, a conceptual performance evaluation system, consisting of a list of PIs grouped into their respective categories, has been developed in Chapter 3. The system provides a stepwise approach, starting the performance evaluation process with the most significant and easy-to-measure PIs, and moving to a relatively complex set of PIs depending on the availability of resources and specific operating conditions. However, this list of initially identified PIs needs to be further investigated for the final selection of PIs for SMWU.

1.6.3 Selection of Performance Indicators

In Chapters 2 and 3, potential PIs were identified for SMWU in the water resources and environment, personnel, operational, physical, water quality, quality of service, and financial categories through literature review and initial screening. In Chapter 4, these PIs are evaluated against applicability, understandability, measurability, and comparability criteria using multicriteria decision analysis (MCDA) based on the opinions and experienced judgment of experts involved in water utility management. The MCDA method adopted in this research provides the utility management an opportunity to select the most suitable PIs based on data availability and the specific needs of their utility.
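To make the selection step concrete, the sketch below applies a simple additive weighting scheme, one common compensatory MCDA form, to score candidate PIs against the four criteria. The criterion weights, the candidate PIs, and the expert scores are all invented for illustration; they are not the values elicited in Chapter 4.

```python
# Simple additive weighting (SAW) sketch for PI selection.
# All criterion weights and expert scores below are hypothetical.

CRITERIA = ["applicability", "understandability", "measurability", "comparability"]
WEIGHTS = {"applicability": 0.3, "understandability": 0.2,
           "measurability": 0.3, "comparability": 0.2}

# Hypothetical 1-5 expert scores for three candidate PIs
scores = {
    "water losses (%)": {"applicability": 5, "understandability": 4,
                         "measurability": 3, "comparability": 4},
    "complaints per 1000 connections": {"applicability": 4, "understandability": 5,
                                        "measurability": 5, "comparability": 4},
    "unit running cost ($/m3)": {"applicability": 4, "understandability": 3,
                                 "measurability": 4, "comparability": 5},
}

def saw_score(pi_scores):
    """Weighted sum of the four criterion scores for one PI."""
    return sum(WEIGHTS[c] * pi_scores[c] for c in CRITERIA)

# Rank candidate PIs from most to least suitable
ranked = sorted(scores, key=lambda pi: saw_score(scores[pi]), reverse=True)
for pi in ranked:
    print(f"{pi}: {saw_score(scores[pi]):.2f}")
```

A compensatory scheme like this lets a high score on one criterion (e.g., measurability) offset a low score on another; the non-compensatory methods mentioned in Section 1.6.1 would instead screen out PIs failing any single criterion.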
1.6.4 Inter-utility Performance Benchmarking

In Chapter 5, the PIs finally selected in Chapter 4 are used to develop an inter-utility performance benchmarking model (IU-PBM) for SMWU. Different (linear, exponential, logarithmic, and polynomial) transformation functions have been established to translate the calculated PIs into performance levels, based on the literature, NWWBI reports, and expert judgment. The weights are estimated through group decision making, using rankings of PIs by different water utilities in the Okanagan Basin, British Columbia, Canada, and the opinions of experts working in water infrastructure management. Finally, performance indices have been established by aggregating the transformed performance levels. The IU-PBM results, presented in the form of a web diagram, demonstrate the utility's performance to top-level management for pragmatic decision making. The proposed model has also been implemented for two SMWU operating in the Okanagan Basin to demonstrate its practicality.

1.6.5 Intra-utility Performance Management for SMWU

If the results of the IU-PBM show that one or more of the functional components are not meeting the desired level of service (LOS), there is a need for detailed investigations at the utility level. In Chapter 6, an intra-utility performance management model (In-UPM) is conceptualized and developed for effective decision making. A hierarchical top-down approach is used, starting from the overall sustainability performance objectives of the functional components at the top, followed by primary and secondary performance measures of the sub-components, and indicators (basic building blocks) receiving inputs from data/decision variables at the bottom. The issues related to data scarcity are addressed by utilizing benchmarking data from larger utilities, peer-reviewed literature, and expert elicitation from local municipalities.
The In-UPM is robust enough to deal with temporal and spatial variations, i.e., it can assess the performance of a water utility as a whole and/or of the different water supply systems operating within the utility for a given assessment period. System-level assessment is required when one or more functional components or sub-components are performing either 'medium' or 'low'. Sensitivity analyses are performed to rank the performance indicators based on their percent contribution to each functional component. The In-UPM is implemented for a medium sized water utility containing three sub-systems in the Okanagan Basin (British Columbia, Canada).

1.6.6 Managing Customer Satisfaction in SMWU

The literature review reveals that conventional customer satisfaction (CS) assessment methods based on performance benchmarking and customer interviews might not be technically and financially sustainable for SMWU. A risk-based model is developed in Chapter 7 for managing CS in SMWU, based primarily on the evaluation of customer complaints. The proposed model also incorporates the experience of the operational staff to support decision making for effective improvement actions. Customer dissatisfaction is evaluated in terms of the risk of CS, which starts when a customer reports a complaint to the utility; however, a complete evaluation of CS depends on the duration between the time of the report and the response, up to the complete resolution of the complaint.

Different categories of complaints are identified from an exhaustive record of customer complaints obtained from a medium sized utility in the Okanagan Basin, British Columbia, Canada. All possible modes of failure are identified using root cause analysis for detailed risk assessment. The inherent uncertainties associated with data limitations and experts' opinion have also been addressed.
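The IU-PBM mechanics described in Section 1.6.4, translating each calculated PI into a performance level via a transformation function and then aggregating the levels with weights into a performance index, can be sketched as follows. The function shapes, thresholds, PI values, and weights here are hypothetical placeholders, not the calibrated functions and elicited weights developed in Chapter 5.

```python
import math

# Sketch of the IU-PBM idea: transformation functions map each raw PI value
# to a performance level in [0, 1]; weighted aggregation yields an index.
# Function shapes, thresholds, and weights below are hypothetical.

def linear_level(value, best, worst):
    """Linear transformation: level 1.0 at 'best', 0.0 at 'worst', clipped."""
    level = (worst - value) / (worst - best)
    return max(0.0, min(1.0, level))

def exponential_level(value, scale):
    """Exponential transformation: performance decays as the PI value grows."""
    return math.exp(-value / scale)

# Hypothetical calculated PIs for one utility
levels = {
    "water losses (%)": linear_level(18.0, best=5.0, worst=40.0),
    "main breaks (/100 km/yr)": exponential_level(12.0, scale=30.0),
}
weights = {"water losses (%)": 0.6, "main breaks (/100 km/yr)": 0.4}

# Aggregate transformed levels into a single performance index
index = sum(weights[pi] * levels[pi] for pi in levels)
print(f"performance index = {index:.3f}")
```

The choice of transformation shape matters: a linear function treats every unit of change in a PI equally, while an exponential (or logarithmic/polynomial) function penalizes degradation more sharply near one end of the PI's range.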
Chapter 2 Literature Review

A part of this chapter has been published in Environmental Reviews, an NRC Research Press journal, as a review article titled "Performance Indicators for Small and Medium Sized Water Supply Systems: A Review" (Haider et al. 2014a).

This chapter contains two main sections. With regard to the objectives defined in Chapter 1, the first section includes a state-of-the-art literature review of systems of PIs, selection of indicators, performance assessment models, and methods of assessing customer satisfaction for water utilities. Based on this review, a framework for the overall research is presented in the last section of this chapter to practically achieve the research objectives. A water utility may contain one or more WSSs; however, in this chapter the terms small to medium sized water utilities (SMWU) and small to medium sized water supply systems (SM-WSS) are used interchangeably depending on the context of discussion.

2.1 Background

A systematic approach is adopted to comprehensively review the reported PI systems, keeping in view the specific requirements of SMWU. The main components of the overall review process are shown in Figure 2.1. Discussion of the major differences between LWU and SMWU is followed by a review of the various PI systems proposed by different organizations.

A PI system usually consists of general asset information, baseline data variables, the categorization of given PIs, the linkage between data variables and PIs, the formulae or equations to calculate the PIs, and the comparison with benchmarks. Appropriate grouping of selected PIs provides an effective mechanism to assess the category-wise performance of the utility, i.e., water resources and environment, personnel, operational, physical assets, water quality, quality of service, and financial. This categorization further aids decision makers in identifying and then rectifying the weaknesses of the utility in a more systematic manner.
For example, a relatively new system may be performing worse than an old system of the same size and socio-economic conditions due to staff and management inefficiencies. In this case, the utility may receive a large number of customer complaints, even though all the other categories might be working efficiently. An extremely complex or detailed PI system might not be feasible for SMWU. In this respect, the PI systems proposed by various agencies have been evaluated based on their simplicity, comprehensiveness (i.e., whether all aspects of a WSS are covered), and their potential applicability to SM-WSS. Moreover, the data required to estimate the PIs are often either unavailable or incomplete.

Suitability of PIs for SMWU is assessed on the basis of their understandability, the availability of existing data, and the data that can feasibly be collected in future. It is also important for the utility operators and managers to establish the important PIs for which data are available or attainable with limited resources.

This chapter is sub-divided into six sections. The following section consists of the size-based classification of water supply utilities and the main differences between SM-WSS and large WSSs (L-WSS). A state-of-the-art review is then carried out in Section 2.3 of the available PI systems being utilized worldwide by different agencies. The review also evaluates each system in terms of the measurability and understandability of the PIs, and the PIs are grouped into various categories. The review further explains the specific objectives and conditions in which these systems were developed, and thereby describes their limitations. In the next section, case studies from different countries having limited data resources for performance evaluation are presented to identify the simplest PIs (though not necessarily the most important ones). The following sections review the methods for selection of PIs and performance assessment of water utilities.
In the last section, the overall framework of this research is presented.

2.2 Classification of Water Supply Systems

In order to evaluate the existing PI systems, it is important to understand the difference between SM-WSS and L-WSS, and how and on what basis these systems have been distinguished from each other in various countries. The components of a WSS greatly depend on the water source (i.e., fresh surface water, ground water, sea water, or saline ground water), as shown in Figure 2.2 (a-d). Figure 2.2 shows that the overall water supply scheme could be as simple as Figure 2.2b (the case of a fresh groundwater source) and as complex as Figure 2.2d (the case of a sea water source). To some extent, the selection of PIs (particularly for water quality) depends on the type and quality of the source water. A more detailed framework, including a larger number of PIs, might be required for a WSS of the same size with a different water source.

Although the situation in the case of private SM-WSS (e.g., England and Wales) could be different, the review covers the performance evaluation systems adopted for private WSSs as well; however, the main focus is on public sector SM-WSS.

Figure 2.1 Framework showing the approach adopted for the critical review of performance indicator systems and performance assessment frameworks

2.3 Size Based Classification of Water Supply Systems

The WSSs are categorized on the basis of their sizes to efficiently perform their organizational, financial, and operational activities. The criterion for system size classification varies around the world (Ford et al. 2005) (Table 2.1). In most parts of the world, including Central and North America, utilities are commonly classified as small, medium, and large based on the volume of supplied water, the number of connections, and the population served (Corton and Berg 2009). In New Zealand's and South Africa's Water
Research Councils, the basis of size classification is the number of connections (Lambert and Taylor 2010; Mckenzie and Lambert 2002).

Table 2.1 Various bases of size-based classification of water utilities

System size | USEPA1 (2002), population served | USEPA (2009), population served | WB2 (2002), population served | New Zealand (2010), service connections | SA-WRC3 (2002), service connections | AWWA4 (2004), flow (MGD)
Large  | >50,000       | >100,000       | >500,000        | >10,000      | >50,000       | >50
Medium | 3,300-50,000  | 3,300-100,000  | 125,000-500,000 | 2,500-10,000 | 10,000-50,000 | 5-50
Small  | <3,300        | <3,300         | <125,000        | <2,500       | <10,000       | <5

1United States Environmental Protection Agency; 2World Bank; 3South Africa Water Research Council; 4American Water Works Association

According to the Irish EPA, a small system is one that serves fewer than 5,000 people (Ford et al. 2005). The Province of British Columbia, Canada, has a tiered classification for small water systems (WS) based on the number of connections, ranging from 1 connection for a restaurant or a resort (i.e., WS4) to more than 20,000 connections (i.e., WS1c) (MHLS 2010). Sometimes one utility operates two or more WSSs.
In this case, the performance of each WSS should be evaluated separately if the water sources, geographical characteristics, and land uses are different.

Each classification system presented in Table 2.1 has its own constraints. For example, the ratio of population served to number of connections differs greatly between developing countries and the developed world, due to higher population densities and a larger number of persons per connection, particularly in urban areas (WSP 2009). Secondly, variations in per capita water consumption, driven by the type of supply and the standard of living, are too large to reliably relate community size to flow requirements. Every country or region should characterize the size of a municipality keeping in view all the relevant factors discussed above. In general, utilities having a population greater than 50,000, a number of connections greater than 10,000, and a demand higher than 50 million gallons per day (MGD) have been considered large.

Figure 2.2 Schematic schemes of water supply systems based on different types of sources: a) fresh surface water source; b) fresh groundwater source; c) saline groundwater source; d) saline surface water source

2.3.1 Difference between L-WSS and SM-WSS

There are several factors that make the SM-WSS different from the L-WSS, some of which are directly related to the size of the system.
Factors specific to the SM-WSS are as follows:

- relatively new inclusions to developments in the proximity of large cities;
- small (thus less costly) pipe sizes due to lower water demand;
- less impact on natural resources due to relatively small water withdrawals; and
- lower greenhouse gas (GHG) emissions from energy consumption.

In other cases, factors are site specific and might not be applicable to all SM-WSS, particularly in the case of privately operated, organized water supplies in developed countries. These factors may include:

- lack of financial resources;
- low capital and operation costs (if compared on the basis of the same type of water source);
- lack of technical staff, equipment, and vehicles;
- lack of awareness of and access to recent technologies (true for both public and private water suppliers); and
- less intention to manage, and more to replace and/or renew, the system components.

To qualitatively and quantitatively justify these differences, a detailed inventory of the different types and sizes of utilities operating around the globe is required, which unfortunately is not readily available in the literature and/or might not have been reported due to the lack of attention given to performance evaluation of SM-WSS to date. A notable exception is the periodic reporting of drinking water infrastructure future needs in the United States by the USEPA. Therefore, a brief review of these reports is presented below as a useful case study to understand the differences between L-WSS and SM-WSS.

2.3.2 USEPA Drinking Water Infrastructure Needs Surveys – Case Study

The United States Environmental Protection Agency (USEPA) has been publishing "Drinking water infrastructure needs survey and assessment for next 20 years" reports periodically for the years 1995, 1999, 2003, and 2007, which were published in 1997, 2001, 2005, and 2009, respectively.
Figure 2.3 shows the findings of the most recent report, published in 2009 (presenting financial needs for the year 2007) for all types of systems. According to this report, the percentages of the population living in small WSS (S-WSS) and L-WSS are 9% and 45%, respectively. The figure also reveals that medium WSS (M-WSS) have proportionate (i.e., 45%) total financial needs for 45% of the total population (USEPA 2009), whereas the S-WSS account for almost double (18%) the community water system financial needs per percentage of population (i.e., 9%) due to low economies of scale. Figure 2.3 also shows that the maximum (more than 60%) financial needs have been estimated for improvements of distribution systems, and the remaining 40% is for treatment (filtration, disinfection, and corrosion control) and source (surface water intake structures, drilled wells, and spring collectors). A shift in the trend of distribution system investment needs for all sizes of WSS can be seen in Figure 2.4. One reason for this shift could be the change in population limits from 50,000 to 100,000 people; it is possible that some L-WSS may have moved into the M-WSS category since 2007.

According to the USEPA evaluation study on S-WSS conducted from June 2005 to December 2005, these systems are facing many challenges (as described above), along with regulatory/compliance challenges. Massachusetts officials stated that if the required amount for a system improvement is less than $100,000, it is not cost-efficient for a small system to furnish a loan application (USEPA 2006). Moreover, S-WSS lack economies of scale (lower capital cost but high O&M cost per household) as compared to large systems, and thus are likely to be lacking in technical, financial, and management capacities (USEPA 2001, Dziegielewski and Bik 2004, Brown 2004, Maras 2004, Braden and Mankin 2004).
2.3.3 Source Water

The selection and application of different PIs related to the various components of a WSS depend on the type of source water, regardless of the size of the system. A small WSS relying on a surface water source (Figure 2.2a) has to consider PIs related to water storage and treatment facilities; on the other hand, the performance of a large WSS with a fresh groundwater source (Figure 2.2b) can be evaluated without such PIs.

2.3.4 System Age and Pipe Size

The American Water Works Association (AWWA 2007) conducted a Community Water System Survey in 2000 on 1,806 systems of all population sizes. A summary of the percentage of pipe per system, by age, for each size of system (population served) is shown in Figure 2.5. It can be observed that most of the pipes (more than 80%) in SM-WSS are less than 40 years old. Therefore, the operational difficulties related to pipe age might be less than with L-WSS.

Figure 2.3 Total 20-year financial need by system size and project/component type (in billions of January 2007 dollars) (developed using USEPA 2009 data)

Figure 2.4 Variation in 20-year distribution system investment needs by system size from 1995-2007, according to the USEPA infrastructure needs survey and assessment for the United States (developed using USEPA 1997, 2001, 2005, 2009 data)

Figure 2.5 Percent of pipe system by age class in different system sizes (developed using AWWA 2007 data)

The AWWA's 2007 report also included results for the average length (miles) of each size of pipe, from less than 6 in.
to greater than 10 in., in different system sizes; these results are shown in Figure 2.6. The figure shows that water mains with diameters greater than 10 in. exist mainly in L-WSS. Most of the pipe diameters in SM-WSS are less than 6 in.; therefore, conventional and expensive condition-based assessment techniques applicable to L-WSS cannot be applied to SM-WSS. In order to make on-time and optimized decisions, rational PIs integrated with risk-based methods can play an effective role for such systems.

2.3.5 Public Health Risk

According to Ford et al. (2005), the public health problems faced by S-WSS are different than those experienced by L-WSS. The following reasons for this have been reported in the literature (USEPA 2006, Hamilton et al. 2004):

- higher exposure to pathogens due to operational issues in chlorination practices;
- inadequacy of treatment systems to deal with outbreaks;
- limited financial resources for monitoring and mitigation measures;
- lack of trained laboratory staff;
- higher concentrations of nitrates and pesticides in source water due to more farming activities in rural proximity; and
- overall mismanagement, such as lack of preventive maintenance, poorly designed and constructed facilities, and inefficient operation and maintenance.

Meeting drinking water quality standards is most difficult for smaller systems. USEPA (2006) statistics over the years have shown that noncompliance with drinking water regulations increases as the size of the system decreases, due to inadequate resources in terms of both equipment maintenance and qualified operators.
Figure 2.6 Average miles of pipe per system by diameter in different system sizes (developed using AWWA 2007 data)

2.3.6 Operation and Management Systems

Management capacity is one of the most important components of the successful operation of a WSS. The main problem, particularly with S-WSS, is a lack of fully qualified technical individuals (i.e., in many cases, drinking water operations are not their sole occupation). According to Braden and Mankin (2004), many small communities are struggling to evaluate required improvements, generate funds, and manage the more advanced systems required to meet drinking water standards. They also reported that most of the small systems cannot attract and/or retain officials with the required knowledge due to poor pay scales and low job recognition (Ford et al. 2005). Consequently, they are operating with part-time technical personnel and few staff members to plan, oversee, and manage infrastructure improvements. Moreover, most of the systems do not have effective leadership.

2.3.7 Physical, Water Quality, and Environmental Sustainability

In a distribution system, the common problems are associated with loss of water or pressure due to leakage and pipe breaks. There is always a trade-off between the cost of increasing water production and the cost
Detailed condition assessments of small mains having lower priority risks, using statistical analysis of raw data on certain parameters such as break trends segmented by pipe material and soil type to evaluate overall replacement needs, would not cost as much as other forms of actual condition assessment of specific lines (Sadiq et al. 2010). On the other hand, non-destructive testing (e.g., closed-circuit television CCTV cameras) are limited to larger diameter pipe (i.e., typically 24 in. diameter or greater in the US) due to high costs involved in testing procedures (Liu et al. 2012). Most condition assessment methods are expensive for SM-WSS, therefore only reliable and effective PA can help with decisions regarding their practical application.  Performance indicators related to environmental and socio-economic sustainability of SM-WSS should also be considered in the overall PA framework. In any WSS, energy is consumed in several operation and maintenance activities. Moreover, higher water usage not only impacts water resources directly but also generates large wastewater volumes, which eventually leads to huge energy requirements for treatment, reuse, or final disposal. GHG generated by all these activities are responsible for climate and hydrological changes (Parfitt et al. 2012).   The components of a WSS shown in Figure 2.2 are inter-dependent; inefficient working of any of these components leads to the loss of either water or energy, or both. Sometimes SM-WSS are located close to each other and rely on the same water source. In this case, sustainability indicators including climate change impact on water availability in source water should also be included in the PA framework, as a larger population can be affected in case of drought. Other environmental concerns such as impacts of chlorinated water on aquatic life due to leakage in a water mains and flushing of pipes passing through or nearby natural water bodies, need to be addressed. 
In general, the concentration of residual chlorine in water distribution systems ranges between 0.5 and 1 mg/L, while much smaller concentrations can significantly affect fish and other aquatic micro-organisms (Donald et al. 1977).

PIs can be used to describe the overall LOS for various physical components (storage, treatment, distribution, etc.) as well as functional components (physical, customer services, environmental, financial, etc.) of a water utility. In the following sections, a state-of-the-art review is carried out on existing PI systems developed by various agencies around the globe.

2.4 Literature Review of Performance Indicator Systems for Water Utilities

Performance is the degree to which infrastructure provides the services to meet community expectations, and is a measure of effectiveness, reliability, and cost (NRC 1995). The performance of a water utility depends on the efficient and reliable working of all of its functional components, including water resources, physical assets, operational, customer service, personnel, public health, environmental, and financial components. The performance of a WSS is evaluated to indirectly estimate its condition and rehabilitation needs in order to ensure continuous and reliable working of all of these components during their entire service life, before the occurrence of a failure. Once a failure has occurred, the cost of corrective action is much higher than the planned preventive action would have been. This difference between the cost of planning and management and the cost of corrective actions justifies the need for performance assessment.

There are several methods of performance evaluation for water utilities given in the literature, e.g., PIs, total factor productivity indexes, production frontiers, etc. (Coelli et al. 2003).
However, methods other than PIs require more sophisticated data, which are difficult to acquire or sometimes even missing for SM-WSS due to ongoing restructuring and expansions and inadequate information technology. The review of literature in the following sections starts with a brief overview of commonly used terminology and the history of PIs, followed by a comprehensive review of recent developments.

2.4.1 Terminology and Historical Background of Performance Indicators

The Canadian Water and Wastewater Association (CWWA) (2009) briefly defined the terms PIs, variables, benchmarks, and targets as follows:

Performance Indicator - A performance indicator is a parameter, or a value derived from other parameters, which provides information about the achievements of an activity, a process, or an organization, with a significance extending beyond that directly associated with the calculated value of the parameter itself. For example, the average number of litres of water supplied per person per day. Indicators are typically expressed as commensurate or non-commensurate ratios between the variables.

Variables - Performance indicators involve the measurement of data variables generated by analysis of the service performed. A selected variable should be easy to understand; accurately measurable with available equipment, staff, and funds; easily reproducible or comparable; should refer to the geographical area and reference time of the study area; and be relevant to the indicator to be developed. Variables are essentially the baseline data required to determine the value of a performance indicator, e.g., number of service connections, population served, total water main length, and annual costs.

Benchmark - This is a numerical point of reference, generally for the past or present. For example, in 2008, the average supply of water to residential customers was 350 L/p/d. Benchmark values established for the future should be considered as targets.
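To make the PI-variable-benchmark relationship concrete, the 350 L/p/d example above can be reproduced from two underlying variables. This is an illustrative sketch only; the function name and variable values are hypothetical and not part of the CWWA (2009) definitions:

```python
# Sketch: a PI (average supply in L/p/d) computed as a ratio of two
# baseline variables, per the CWWA (2009) definitions quoted above.
# Values are hypothetical, chosen to reproduce the 350 L/p/d example.

def avg_supply_l_per_person_per_day(annual_volume_l, population_served):
    """PI: average water supplied per person per day (L/p/d)."""
    return annual_volume_l / (population_served * 365)

annual_volume_l = 1_277_500_000   # variable: total water supplied in a year (litres)
population_served = 10_000        # variable: population served

pi = avg_supply_l_per_person_per_day(annual_volume_l, population_served)
print(round(pi))  # 350 L/p/d, matching the benchmark example above
```

The computed PI value would then be judged against a benchmark (e.g., 350 L/p/d in 2008) and a target (e.g., reducing demand to 300 L/p/d).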
Target - A target is a predetermined value for the PI, which is to be achieved over time (in the future) through the conduct of a program. For example, a target for average water supply would be to reduce average demand to 300 L/p/d by the year 2012.

The International Water Supply Association (IWSA) selected the topic "Performance Indicators" for one of its world congresses during the early 1990s, but the concept could not garner much interest. However, three to four years later, the concept was highlighted by a number of senior members of water utilities. A good PI system is one that contains PIs which are few in number, clearly defined, non-overlapping, useful for global application (i.e., wider applicability), easily understandable, refer to a certain time period (i.e., preferably one year), address a well-defined geographical area, and represent all the relevant aspects of water utility performance (Alegre 1999).

The National Civil Engineering Laboratory (NCEL) in Portugal proposed a hierarchical structure of PIs, comprising four main groups, for the first time in 1993. Each group consists of a number of PIs and a sub-set of indicators in the upper layer (Matos et al. 1993). Later, in 1994 and 1995, the Portuguese Association of Water Resources (APRH) and NCEL jointly proposed a hierarchical structure of LOS in four categories (i.e., organization, engineering, environmental, and capital) (Defaria and Alegre 1996) (Figure 2.7). This structure was more efficient as it provided the basis for PA of the organization's staff and also incorporated consideration of environmental impacts on water resources.

Alegre (1999) carried out a comprehensive review of the development of conceptual frameworks and PIs for WSSs.
That review covered the work done by various agencies and researchers on PA of WSSs before the year 2000 (National Civil Engineering Laboratory – NCEL, Portugal, Matos et al. 1993; American Water Works Association Research Foundation – AWWARF, Alegre 1999; Portuguese Association of Water Resources – APRH, Portugal, Defaria and Alegre 1996; Malaysian Water Association 1996; Water and Sanitation Division, World Bank – WB, Yepes and Dinderas 1996; Asian Development Bank – ADB, McIntosh and Ynoguez 1997; Dutch Contact Club for Water Companies and the 6-cities group of the Nordic Countries, van der Willigan 1997; and International Water Services Association – IWSA, Alegre 1999). The findings of this review reveal that water resources, water quality, and environmental indicators were not given sufficient importance until 1997. One of the reasons could be that environmental issues were fewer due to relatively lower populations and water demands; moreover, awareness of the need to address environmental problems has accelerated during the 21st century. State-of-the-art developments in PIs for WSSs by different international agencies in various parts of the world are presented in the following sections.

Figure 2.7 The concept of the Faria and Alegre (1996) level of service framework

2.4.2 IWA Manual of Best Practice

According to Alegre (1999), PIs are used to assess the performance of a WSS in terms of efficiency and effectiveness.
Efficiency is a measure of the extent to which the resources of a WSS are utilized optimally to provide the service (i.e., the ratio between input consumed and output achieved), while effectiveness is a measure of the extent to which the targeted objectives (specifically and realistically defined) of the utility as a whole and of its management units are achieved. The first edition of the International Water Association (IWA) Manual of Best Practice was published in July 2000. The PI system was developed through close collaboration with experienced managers, practitioners, and researchers (Nurnberg 2001). This manual was applied for benchmarking water utilities particularly in Europe, including Austria and Germany (Theuretzbacher-Fritz et al. 2005). An overview of performance benchmarking studies (about 30 large-scale companies, 270 medium and small-scale utilities, and about 20 bulk supply companies), including the legal framework and technical standards, was presented at the IWA World Water Congress 2001 in Berlin, Germany. It was observed that all of them were using different approaches for their PA and thus there was a lack of a standardized assessment system. The use of the IWA Manual of Best Practice (Performance Indicators for Water Supply Systems) for standardized and comparable PA was recommended by the congress (Nurnberg 2001).

The general concept of this manual is a layered pyramid structure, starting from raw data at the bottom, feeding the PIs on the layers above (Figure 2.8). This structure consists of a theme of indicators, sub-indicators, and variables. Alegre et al.
(2000) structured more than 150 PIs in six categories (i.e., water resources, personnel, physical, operational, quality of service, and economic and financial). These groups help the user to identify the purpose of a specific indicator and are given a two-letter code (e.g., 'Pe' represents the "Personnel" group). Each group is sub-divided into sub-groups for further understanding (e.g., total personnel, personnel per main function, personnel qualification, etc.). These sub-groups consist of a number of PIs calculated from variables, where variables are the baseline data elements consisting of measured or recorded values in specific units. All these measured values are required to be used with certain accuracy and reliability bands to indicate the quality of the observed data (Alegre et al. 2000).

Figure 2.8 Concept of the layered pyramid used by Alegre et al. (2000) for calculating performance indicators

These PIs were further prioritized on the basis of their relative importance (level of priority) as Levels 1, 2, and 3. Level 1 is the first layer of indicators, which provides a synthetic global overview of the efficiency and effectiveness of the water undertaking. Level 2 consists of additional indicators, which provide a more detailed insight than the Level 1 indicators for users who need to go further in depth. Level 3 indicators provide a comprehensive global assessment of the undertaking (Alegre et al. 2000, 2006). This level-based structure can provide a basis to start a performance evaluation process for SM-WSS with Level 1 indicators; in the case of SM-WSS, it is possible that Level 3 indicators are not important.

The IWA published the 2nd edition of its manual in 2006. In this edition, the total number of PIs increased to 170; however, no changes were made to the categories of PIs (Alegre et al. 2006).
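As a sketch only, the layered pyramid described above can be expressed as a minimal data model in which variables (carrying data-quality bands) feed coded indicators grouped into categories. The class definitions and the 'Pe1' example are hypothetical illustrations; only the two-letter group coding follows Alegre et al. (2000):

```python
# Hypothetical sketch of the IWA layered structure: variables (with
# accuracy/reliability bands) at the bottom feed indicators carrying
# two-letter group codes such as 'Pe' (Personnel). Illustrative only.
from dataclasses import dataclass

@dataclass
class Variable:
    name: str
    value: float
    band: str  # accuracy/reliability band indicating data quality

@dataclass
class Indicator:
    code: str            # e.g., 'Pe1' (hypothetical indicator code)
    group: str           # two-letter group code, e.g., 'Pe' = Personnel
    numerator: Variable
    denominator: Variable

    def compute(self) -> float:
        return self.numerator.value / self.denominator.value

staff = Variable("total personnel", 12, "A")
conns = Variable("service connections (thousands)", 4.0, "B")
pe1 = Indicator("Pe1", "Pe", staff, conns)
print(pe1.compute())  # employees per 1,000 service connections
```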
In this review, this reference is cited as IWA (2006) for the purpose of comparison between different agencies. A summary of the PIs along with the number of sub-indicators is presented in Appendix A-1. For details of the sub-indicators and allocated levels, interested readers are referred to Alegre et al. (2006). It can be seen in Appendix A-1 that the system is well balanced and covers all aspects of a WSS. Indicator sub-groups (in each category) provide a way to effectively interpret an indicator; for example, the sub-group of 'pressure and continuity' is placed under the category of quality of service. This sub-group further contains 8 PIs related to pressure and continuity of supply, such as adequacy of supply pressure at delivery points in terms of percentage, and number of water interruptions per thousand connections, respectively. The proposed system is well structured, has wide applicability, and can accommodate new indicators as well. Water quality has also been given sufficient importance by addressing aesthetic, physical-chemical, and biological indicators separately. However, more recently identified environmental indicators, such as greenhouse gas emissions, are missing. SigmaLite (2.0) software was developed on the basis of the framework proposed in the 2nd edition of the IWA manual; details can be seen at www.sigmalite.com. The software also generates charts showing comparisons between the calculated PIs and the benchmarks.

2.4.3 The World Bank

The International Benchmarking Network for Water and Sanitation Utilities (IBNET) was launched in 1996 under the water and sanitation program of the World Bank (WB). The IBNET was developed by the Energy and Water Department of the WB with the aim of providing access to comparative information and promoting best practices among water supply and sanitation providers worldwide.
The IBNET provides standardized measurements (i.e., a set of tools) for water suppliers to assess their own operational and financial performance against the performance of similar utilities at national, regional, and global levels. Through the IBNET platform, utilities around the world, including in South Asia and Africa, found an opportunity to compile and share their operational performance and costs (Berg and Danilenko 2011).

This reference has been used as WB (2011) for the purpose of comparison between different agencies. The IBNET is a toolkit comprising a set of financial, technical, and process indicators, which offers a gradual approach for utilities having little or no variables data with which to calculate the indicators, by providing a "Start-up kit" to move slowly towards a more advanced benchmarking system. In this way, the kit could be a useful tool for SM-WSS and utilities in developing countries with data limitations. However, the toolkit does not cover the large set of indicators that has been used by water supply agencies in developed countries, such as IWA (2006).

The IBNET categorizes the 80 selected PIs into 12 groups, as presented in Appendix A-2 (Berg and Danilenko 2011). Along with the categories mentioned in Appendix A-2, the manual also addresses 8 normalizing factors, including operating cost, staff, revenue, system failure indicators, population, number of connections, volume of water used, and network length. Service coverage is taken as a separate category in IBNET, whereas in IWA (2006) the coverage indicators are considered in the quality of service category. Water losses are taken up as non-revenue water. The IBNET system assesses network performance in terms of the number of breaks per km per year; this indicator, along with other water loss indicators, is considered under the operational category in the IWA manual.
In IBNET, only metering is considered amongst all the physical components of the WDS, whereas other important indicators, including pumping, valves, hydrants, treatment, and storage, have not been included. Other categories mentioned in Appendix A-2 (i.e., operating costs and staff, billing and collections, financial performance, and assets) are mainly financial and/or economic indicators and could be addressed under one category as sub-components to simplify the overall structure of the system. However, water affordability is an important indicator for future investment planning in the water sector, particularly in developing countries and SMWU. Among water quality parameters, only residual chlorine is considered. Overall, the IBNET system appears to be suitable for cross-utility and cross-country performance comparisons in developing countries.

2.4.4 National Water Commission (NWC), Australia

The Australian Government's National Water Commission (NWC) has developed a national performance framework (NPF) for performance evaluation of both urban and rural water utilities in Australia. The commission used only 32 selected PIs, grouped into four categories (i.e., characteristics, customer service level, environmental and water management, and financial), for comparative analysis of rural water utilities. For urban water utilities with complex WSSs and large administrative structures, 116 PIs have been grouped into seven categories (i.e., water resources, asset data, customers, environment, public health, finance, and pricing). The commission has been publishing annual reports on performance reporting indicators and definitions since 2005 (NWC 2012). Unlike the IWA manual, here not all the indicators are calculated as ratios, divisions, and percentages. The reason could be that this framework is used to compare the performance of utilities operating specifically in Australia under similar environmental conditions.
A summary of the PIs is presented in Appendix A-3. Dry conditions resulting in water shortages have led Australia to partially rely on 'rainfall independent supplies', such as desalinated sea water and recycled water, to enhance water security in the country (NWC 2012; Chartres and Williams 2006; Bari et al. 2005). Moreover, per capita water consumption in Australia is one of the highest in the world (Stoeckel and Abrahams 2007). Due to these issues, water resource indicators have been given primary importance in Appendix A-3. Twenty-three PIs of water supply provide detailed insight into all possible water resources (e.g., groundwater, surface water, desalinated marine water, desalinated groundwater); the types of supplies (e.g., residential, commercial, industrial, etc.); and the water quality (potable and non-potable).

An example of the indicator "Total sourced water" is presented in Figure 2.9. The indicators selected in this system are not defined as data elements or variables, sub-indicators, or process indicators. For example, all the indicators given in Figure 2.9 are essentially data values (i.e., volume of water sourced) (see comments in Appendix A-3). These values cannot provide the basis for cross-comparison with other similar WSSs at an international level. These PIs need to be calculated in terms of per thousand of population served or similar units. Indicators like the percentage of recycled water can be compared with other WSSs practicing the same concept of reuse of supplied water (after applying the desired level of wastewater treatment).

The way environmental indicators are addressed in the NWC (2012) framework, in terms of GHG emissions, is worth mentioning. Most development activities generate various types of GHGs, such as CO2, CH4, and N2O, which eventually lead to global warming.
For example, the use of vehicles for water quality monitoring, bill delivery and collection, and transportation of construction materials and operational equipment, as well as the operation of pumps and motors for distribution of water at desirable pressures, produces significant greenhouse gas emissions. Moreover, most of the supplied water is converted into wastewater after use, and thus requires a certain degree of wastewater treatment either for reuse (to meet standards for irrigation or landscaping) or final disposal (depending on the assimilative capacity of the receiving water body). In this regard, different wastewater treatment plant operations also utilize energy and thus produce GHG emissions.

The Commonwealth of Australia (CWA) (2011) expresses different emission factors in terms of the quantity of a given GHG emitted per unit of energy or fuel (e.g., kg-CO2-e/GJ for energy; tCH4/t coal for fuel; etc.). These emission factors are multiplied by activity data (e.g., kilolitres of petrol used multiplied by its energy density) to calculate GHG emissions. The overall GHG emissions from each activity of a WSS have been calculated as CO2-equivalents. The CO2-equivalents are the amounts of carbon dioxide that would have the same relative warming effect as the greenhouse gases actually emitted. It is desirable to consider indicators concerning global climate change for sustainable development in the water sector.

The indicator system proposed by NWC (2012) could be very useful in areas facing water shortages or droughts. The framework shown in Figure 2.9 is indeed very comprehensive and provides deep insight into the interrelationships between different water resource indicators. Similar interactions have been developed between indicators in other groups; details can be seen in NWC (2012). However, certain very important indicators, such as personnel and operational indicators (i.e., pump, storage, network inspection, leakage inspection, etc.), have not been included.
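The emission-factor approach described above amounts to multiplying activity data by a gas-specific factor and converting the result to CO2-equivalent with a global warming potential. A minimal sketch follows; the emission factors, activity data, and GWP values used are placeholders for illustration, not figures taken from CWA (2011):

```python
# Sketch of the emission-factor approach described above:
# emissions = activity data x emission factor, summed as CO2-equivalent.
# Factor values, activity data, and GWPs below are illustrative only.

GWP = {"CO2": 1, "CH4": 25, "N2O": 298}  # illustrative 100-year GWPs

def co2_equivalent(activity, factor, gas="CO2"):
    """activity (e.g., GJ of energy) x factor (e.g., kg-gas/GJ) -> kg CO2-e."""
    return activity * factor * GWP[gas]

# Hypothetical pumping-station example: 500 GJ of electricity at an
# assumed factor of 90 kg-CO2/GJ, plus a small vehicle-fleet CH4 term
pumping = co2_equivalent(500, 90, "CO2")      # 45,000 kg CO2-e
fleet_ch4 = co2_equivalent(120, 0.01, "CH4")  # CH4 scaled by its GWP
total_kg_co2e = pumping + fleet_ch4
print(total_kg_co2e)
```

Summing such CO2-equivalent terms over all utility activities gives the overall GHG indicator described in the text.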
In the case of a very detailed organizational and administrative structure for water utilities, like the one in Australia, indicators related to personnel and operational efficiency might not be significantly important. However, for SMWU around the world, where the availability of technical staff to solve day-to-day operational problems varies from system to system, these indicators are indispensable for the overall PA.

2.4.5 American Water Works Association (AWWA)

The American Water Works Association (AWWA), established in 1881, is the first scientific research organization for drinking water in North America. AWWA started the performance evaluation of water utilities in 1995 under its utility quality service program. In one of its early documents, titled "Distribution System Performance Evaluation" and published in 1995, criteria consisting of adequacy (quantity and quality), dependability (interruptions), and efficiency (utilization of resources) were used to evaluate the performance of water distribution systems. The main focus of discussion was related to distribution system components, and not the environmental and water resource components.

Figure 2.9 An example of interaction between various indicators and sub-indicators in the NWC (2012) performance assessment framework. A partial reproduction of Fig. 1 "Interrelationship between NFP indicators", p. 26 of National Performance Framework: 2010-11 urban performance reporting indicators and definitions handbook.

Later, in 2004, the AWWA launched the QualServe Benchmarking Program (originally named the utility quality service program), which proposed a set of high-level (most important) PIs with the goal of facilitating inter-utility comparisons and intra-utility trend analysis. In this benchmarking program, 22 key performance indicators were suggested for water and wastewater utilities. Amongst these, 17 indicators are applicable to water supply.
Some of these indicators comprise a set of sub-indicators, which makes a total count of 35 indicators (Lafferty and Lauer 2005). The PIs have been grouped into four categories: organizational development, customer relations, business operations, and water operations. The indicator system showing the groups of PIs along with the individual PIs is presented in Appendix A-4. Using these PIs, the AWWA published a survey data and analysis report in 2005 benchmarking 202 water utilities across North America, including Canada. Amongst all the utilities participating in the survey, only 21% were SMWU with a population of less than 100,000. It is also reported that SMWU operated at a higher residential cost (customer relations indicator) than LWU. Conversely, water distribution pipeline renewal and replacement rates (business operations indicator) were much lower in SMWU than in the larger utilities. Environmental and water resources indicators have also not been addressed.

2.4.6 Office of the Water Services (OFWAT) – UK and Wales

The technical evolution of the water industry gives an insight into the history of privatization of water utilities in the United Kingdom (UK). In 1970, the British government decided to merge hundreds of medium and/or small sized water companies into 10 water authorities to solve the problems associated with a lack of human resources and equipment. Later, the water authorities found that the existing systems had become very old and required massive investments for improvements; therefore, it was decided in 1990 to privatize all water and wastewater services in England and Wales.
Moreover, consumers had higher expectations due to increasing water rates; thus, the Office of the Water Services (OFWAT) monitors and compares the performance of these water companies to ensure that consumers are getting what they pay for, and also to check the companies' compliance with their legal obligations (OFWAT 2010).

Recently, OFWAT has reported a minimum set of 14 key PIs (grouped into 4 categories) to review prices and check the regulatory compliance of water companies in the UK and Wales (OFWAT 2012, 2010, and 2010a). Information regarding these indicators is given in Appendix A-5, along with descriptions of some specific indicators such as the Service Incentive Mechanism (SIM) and the Security of Supply Index (SoSI). SIM is used to control water rates, whereas SoSI indicates the guarantee a supplier can give to ensure the level of service. Moreover, sufficient importance has been given to environmental indicators.

The OFWAT indicator system appears suitable for a regulatory authority dealing with a number of utilities, irrespective of their size. At an individual level, companies might use a much wider set of indicators (e.g., indicators covering the quality of supplied drinking water). The probable reason for considering such a small number of indicators is the overall performance monitoring of different companies, for which financial and customer service indicators are more important to ensure customer satisfaction. Some of these PIs include several variables and provide information regarding the performance of more than one indicator, qualitatively or quantitatively. For instance, SIM is a financial mechanism to incentivize optimum levels of customer service through the price control process.
It comprises a quantitative indicator that measures complaints and unwanted contacts, and a qualitative indicator that measures how satisfied customers are with the quality of service they receive, based on a survey of consumers who have had direct contact with their water company.

SIM can be calculated to assess the performance of the company for customer satisfaction using the following formula (OFWAT 2012 and 2010):

SIM = {[(S – LS) / (HS – LS)] x WS} + {[1 – ((C – CL) / (CH – CL))] x WC}   [2.1]

where the first term is the qualitative (survey) indicator and the second term is the quantitative (complaints) indicator; S is the satisfaction score achieved (qualitative survey); LS is the satisfaction minimum (1); HS is the satisfaction maximum (5); WS is the survey weighting (50); CL is the composite measure minimum (0); CH is the composite measure maximum (600); WC is the quantitative composite measure weighting (50); and C is the quantitative composite measure, which can be calculated using the following equation:

C = (all lines busy + calls abandoned + unwanted telephone contacts + (written complaints x 5) + (escalated written complaints x 100) + (Consumer Council for Water (CCW) investigated complaints x 1,000)) / (connected properties / 1,000)   [2.2]

Each of the elements in Equations (2.1) and (2.2) has been given a weight (i.e., the value multiplied with the element; for example, all lines busy weighs "1" and CCW-investigated complaints weigh "1,000") to reflect the increasing impact on consumers and the resulting cost to the supplier. In Equation (2.2), the weights appear to be established based on the relative importance of the elements and their frequency of occurrence. For instance, the frequency of escalated written complaints is usually lower than that of written complaints, but their significance is higher; as a result, their corresponding weight of 100 is 20 times the weight assigned to written complaints (i.e., 5).
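Equations 2.1 and 2.2 can be implemented directly. The sketch below uses the constants quoted above (LS = 1, HS = 5, WS = 50, CL = 0, CH = 600, WC = 50); the complaint counts and survey score in the usage example are hypothetical inputs, not OFWAT data:

```python
# Direct implementation of Equations 2.1 and 2.2 with the constants
# quoted in the text. Example inputs below are hypothetical.

LS, HS, WS = 1, 5, 50      # satisfaction min/max and survey weighting
CL, CH, WC = 0, 600, 50    # composite min/max and composite weighting

def composite(all_lines_busy, abandoned, unwanted, written,
              escalated, ccw, connected_properties):
    """Equation 2.2: weighted complaint measure per 1,000 properties."""
    raw = (all_lines_busy + abandoned + unwanted
           + written * 5 + escalated * 100 + ccw * 1000)
    return raw / (connected_properties / 1000)

def sim(survey_score, c):
    """Equation 2.1: qualitative + quantitative parts, equally weighted."""
    qualitative = (survey_score - LS) / (HS - LS) * WS
    quantitative = (1 - (c - CL) / (CH - CL)) * WC
    return qualitative + quantitative

# Hypothetical utility: 50,000 connected properties
c = composite(200, 150, 300, 40, 2, 0, connected_properties=50_000)
score = sim(survey_score=4.2, c=c)
print(score)  # a score above 50 would fall in the 'green' tolerance band
```

Note how a single CCW-investigated complaint (weight 1,000) moves the composite measure as much as 200 written complaints, reflecting the weighting rationale discussed above.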
The SIM is calculated annually by combining the two indicators (quantitative and qualitative) with equal weightings (i.e., 50). The value of SIM is assessed against "tolerances" designated with different colours: green (SIM > 50), amber (SIM = 40-50), and red (SIM < 40). A higher value of SIM indicates better performance (OFWAT 2012, 2010). According to OFWAT (2012), new small companies are entering the OFWAT system and the indicator set proposed in Appendix A-5 has been well established.

2.4.7 National Research Council (NRC) – Canada

NRC (2010) proposed a model framework composed of three building blocks: objectives, assessment criteria, and PIs. As per NRC, the six key objectives (i.e., public safety, public health, economy, environmental quality, social equity, and public security) can only be satisfied through a feasible and durable initial design, an efficient and continuous maintenance system, preservation activities to ensure acceptable physical condition, and acceptable levels of functionality during the entire life cycle of these important public assets (NRC 2010).

In the framework proposed by NRC, the PIs are evaluated against eleven assessment criteria to provide the linkage between the PIs and the objectives of the WSS. These assessment criteria are also known as 'indices' or 'indicator domains'. They are the statements or principles used to determine whether the above-stated objectives have been met or not (either qualitatively or quantitatively). Under these objectives, 31 PIs along with their associated groups are presented in Appendix A-6. The allied assessment criteria for each group of indicators are defined as follows (NRC 2010):

- Safety Impacts: How water supply services support the reduction of incidents or accidents that result in death/injury and/or property loss.
- Health Impacts: The health impacts (both direct and indirect) which are beneficial or detrimental to consumers as well as to the general public.
- Security Impacts: The performance of the water supply service in terms of protecting the security of the users, operators, and public at large.
- Economic Impacts: The direct and indirect impacts (beneficial or detrimental) of the WSS on local, regional, and national economies.
- Environmental Impacts: The direct and indirect impacts of the water supply service on the natural environment (air, water, soil, fauna, and flora) and on climate change.
- Quality of Service: An assessment of how well the service meets established levels of service, regulatory requirements, industry standards, and customer satisfaction.
- Access to Service: The geographical coverage and affordability of infrastructure services, and provision of access to people with disabilities.
- Adaptability: The capacity of the service to adapt to short and long term changes and pressures.
- Asset Preservation, Renewal and Decommissioning (Asset PRD): The management of water supply assets to keep the service operational at its intended level of service through inspection, routine maintenance, repair, rehabilitation, renewal, and ultimately, decommissioning.
- Reliability of Service: The ability of the WSS to perform its required function under stated conditions for specified periods of time.
- Capacity to Meet Demand: The capacity of the service to meet demand under current and future conditions, extreme events, and in emergency situations.

The indicators proposed by NRC (2010) in Appendix A-6 provide detailed information regarding environmental, public health, social, security, and economic performance. On the other hand, PIs related to personnel, physical, and operational aspects of a water utility have not been given the required importance.
Of the PIs listed in Appendix A-6, 18 are essentially service indicators; the remainder relate to both the services and the assets. For example, climate change impacts can be reduced by maintaining and improving the condition of pumps and vehicles (i.e., lowering the emissions from fuel burning and electricity use), which will also improve the overall service of the WSS. Overall, the PIs seem to be more suitable (directly) in the context of asset management at the strategic level and need to be supported with additional PIs for practical performance assessment of SMWU at the tactical level.

2.4.8 Asian Development Bank (ADB)

The Asian Development Bank (ADB) has developed the project performance management system (PPMS), a results-based management approach focusing on service targets and outcomes. The concept emerged during the 1980s, when results-based management was gradually applied to public sector management in the course of public sector reforms. Results-based management needs a series of tools to carry out strategic planning, performance monitoring and assessment, and reporting. However, this management system has three prerequisites: i) support from the leadership, ii) a results-based organizational culture, and iii) improved support systems. Moreover, the approach covers the whole project life cycle, encompassing project identification, preparation, appraisal, loan negotiations and approval, implementation, and finally project evaluation (ADB 2012).

As ADB is a funding agency, it is primarily concerned with the allocation and utilization of resources in a water supply project. After the completion of a project, ADB's independent evaluation department appraises the project performance, and also shares the experiences and lessons learned for the planning and design of new projects.
The PPMS developed by the ADB is based on a comprehensive project design and monitoring framework (DMF), shown in Figure 2.10. In the DMF, the cause-effect relationship between inputs, activities, outputs, outcomes, and impacts is established to determine the targets at the results level, and to select the PIs for gauging these targets. The selection of the relevant indicators is carried out with the participation of all the stakeholders in the process of problem identification, target analysis, solution selection, formulation of assumptions, and risk analysis. The first column in Figure 2.10 is a design summary, which outlines the elements of the project (i.e., inputs, outputs, outcome, and impact). The other three columns provide a framework for project performance and monitoring. Details can be seen in ADB (2012).

To understand the PA mechanism proposed by ADB (2012), the core components of the framework are impacts, outcomes, outputs, activities, and inputs. Impacts are the goals or longer-term targets, referred to as the sectorial, sub-sectorial, or in some cases national targets, or the social, economic, environmental, and policy changes brought about by the project. Outcomes are the targets expected to be realized upon project completion, and they should explicitly describe the specific development issues to be addressed by the project. Outputs are the physical assets, tangible goods, and/or services delivered by the project, as well as the descriptions of the project scope. Activities are the series of tasks carried out to realize the outputs, and Inputs are the main resources necessary for undertaking activities and generating outputs, including staff, equipment, materials, consulting services, and operating funds.
Design Summary | Performance Targets/Indicators | Data Sources/Reporting | Assumptions (As)/Risks (Rs)
Impact | | | As/Rs
Outcome | | | As/Rs
Outputs | | |
Activities with Milestones | | |
Inputs | | |

Figure 2.10 ADB (2012) project design and monitoring framework (DMF)

In the DMF system, targets (outcomes) can be gauged both quantitatively and qualitatively. In situations where quantitative analyses are not possible, qualitative indicators are determined and then converted to quantifiable data using normalization methods to realize their gauging functionality. According to this system, a PI should be clear, relevant, economical, adequate, and monitorable (CREAM), where Clear means precise and unambiguous; Relevant means appropriate and timely; Economical means available at reasonable cost; Adequate means sufficient to assess performance; and Monitorable means that the indicator can be independently verified.

Overall, 15 PIs at the impact level and 39 at the outcome level for urban water supply systems are presented in Appendix A-7. The indicators in Appendix A-7 are grouped based on specific targets established by the ADB. At the same time, all these indicators also belong to various groups of indicators as described by other organizations (refer to the comments column of Appendix A-7) (IWA 2006, AWWA 2004, NWC 2012). The framework proposed by ADB seems to be relatively complex for practical application to SM-WSS. However, it could be useful in order to identify the commonly used important PIs.

2.4.9 Canadian Standards Association (CSA)

The Canadian Standards Association (CSA) is a non-profit organization chartered in 1919. It was later accredited by the Standards Council of Canada in 1973. In 2007, the CSA Technical Committee reviewed and recommended the International Organization for Standardization (ISO) standards guidelines (i.e., CAN/CSA-Z24510, CAN/CSA-Z24511, CAN/CSA-Z24512) for improvement of service to users for Canadian water utilities.
Amongst these standards, CAN/CSA-Z24510 is service-oriented, whereas both CAN/CSA-Z24511 and CAN/CSA-Z24512 are management-oriented. These guidelines are applicable to both publicly operated and privately owned water utilities. All these standards propose a step-by-step, loop-back approach to establish PIs (CSA 2010).

According to CSA (2010), the main objective is to provide guidelines (consistent with the goals defined by the relevant authorities) to stakeholders for the management, assessment, and improvement of water utilities to ensure desirable service to the users. The contents of the international standard ISO 24512: 2007 (CAN/CSA-Z24512-10) are shown in Figure 2.11. The management components of a utility defined by CSA (2010) in CAN/CSA-Z24512 are the activities and processes to achieve the principal objectives, including protection of public health, meeting users' needs and expectations, provision of services under normal and emergency situations, sustainability, promotion of sustainable development of the community, and protection of the environment.

In order to meet these objectives, the CSA (2010) framework provides guidelines at the organization, planning and construction, and O&M levels for efficient performance of all the management components. Each management component has its own hierarchical structure and specific requirements at all levels. For example, the objective associated with storage structures is the provision of emergency services; therefore, the guidelines applicable to this asset would be related to watershed characteristics.

The next step in the CSA (2010) framework is defining the assessment criteria. One service criterion can be related to more than one objective. For example, the assessment criteria for source protection are associated with protection of public health, provision of services under normal and emergency situations, sustainability, and protection of the environment at the same time.
The calculation of the PIs proposed in this framework is similar to IWA (2006), based on context information and variables. Details of each component of the framework can be seen in CSA (2010).

A practical application of the CSA (2010) system is the Canadian National Water & Wastewater Benchmarking Initiative (NWWBI). The 2012 public report prepared by AECOM (2012) summarizes the performance evaluation results of 41 water utilities for the year 2010. In this report, the performance of a utility is addressed for its three major physical components (i.e., utility, water distribution, and water treatment). A total of 62 PIs are reported against all the objectives/goals (Appendix A-8). CSA (2010) includes no specific discussion of the performance assessment of SMWU. Although the framework seems to be well structured and comprehensive, its direct application to SMWU needs to be further investigated by involving stakeholders in the benchmarking process (CSA 2010).

Figure 2.11 Content and application of the ISO (2007) (flowchart boxes: Identification of Physical and Management Components; Definition of Service Objectives; Application of Management Guidelines for water utility's O&M; Definition of Assessment Criteria; Definition of Performance Indicators; Performance Assessment vs. Project Objectives)

2.5 Evaluation of Performance Indicators Systems

The PIs are grouped in different categories by various agencies (CSA 2010, NWC 2012, ADB 2012, OFWAT 2012, WB 2011, NRC 2010, IWA 2006, AWWA 2004), as stated above. Different agencies have used different terminologies as per their specific organizational setups and operational requirements. For example, Alegre et al. (2006) included a water interruption indicator in the operational category, whereas the same PI was grouped into the customer relations category by AWWA (2004). Moreover, various PIs associated with the economic performance of a WSS are inter-related (e.g., finance, economic, and pricing).
Some agencies, like NWC (2012), have included PIs for pricing and finance in separate categories, whereas others have grouped them into a single economic and finance category (AWWA 2004, ADB 2012). Generally, in the case of SMWU, which have a smaller financial structure than the LWU for which the various categories were developed, the relevant financial PIs can be included in one category.

A comparison of the different categories in Table 2.2 presents the way different agencies grouped the PIs in each category; most of the PIs are related to the finance, customer service, and operational categories. Data availability might not be consistent among different utilities, even within the same geographical region. For example, sufficient data for performance evaluation may be available in a small or medium-sized utility operating in the near vicinity of a large urban center, while within the same region a similar sized utility working under the same or different operating conditions away from large cities (with less interaction with LWU) might not have similar data. Therefore, a set of important PIs should be established for cross-comparison by country or region, and then each utility can include additional PIs according to availability of data, water source, and administrative setup for intra-utility performance management.

The agencies that consider environment as a separate category are mostly the ones responsible for combined PA of water supply and wastewater systems, because most of the indicators are related to discharge of wastewater and sludge disposal issues (NWC 2012, AWWA 2004). Furthermore, most of the agencies included impacts on water resources in the category of water resources rather than environment, and therefore only one parameter is left (i.e., GHG emissions) in the environment category, which is relatively new as well (CSA 2010).
Therefore, it is recommended that water resources be included in this category and that the category be renamed "Water Resources and Environment". In addition, indicators related to the impact of residual chlorine on aquatic life (not addressed so far) should be included in this category.

The PIs related to water quality and public health are considered either under the category of customer services or under operational. For example, IWA (2006) considered water quality in the operational category, whereas the WB (2011) considered residual chlorine as the only water quality indicator under the category of service. These are important parameters to ensure the supply of safe drinking water to consumers. It is well established that water quality regulation compliance decreases as the size of the utility decreases (USEPA 2006a), meaning more water quality problems and resulting public health issues in SM-WSS as compared to L-WSS. Therefore, water quality and public health indicators are grouped into a separate category in Table 2.2 to emphasize their importance.
Table 2.2 Number of water supply performance indicators under different categories by various agencies

PI category | WB (2011) | OFWAT (2012) | ADB (2012) | NWC (2011) | NRC (2010) | IWA (2006) | AWWA (2004) | CSA (2010)
Water resources/Environmental | 11 [1] | 2 | 15 | 23+3 [12] | 3 [15] +2 | 4 | - | 5
Physical assets | 1 | - | 2 | 2 [13] | - | 15 | - | 7
Personnel/Staffing | (6+5) [3] | - | 1 | - | - | 26 | 11 [20] | 17
Water quality/Public health | 2 [4] | - | 13 | 7 | 3 | 5 | 1 [21] | 7
Operational | (3+1) [5] | 4 [8] | 10 | 5 | 7+3 [16] | 39 [19] | 8 [22] | 6
Quality of service/Customer satisfaction | (9+3+5) [6] | 3+1 [9] | 2 | 12 | (4+1+3) [17] | 34 | 2 | 4
Economic/Financial/Pricing | 35 [7] | 4 [10] | 11 | 3 [14] +18 | 7 [18] | 47 | 9 [23] | 16
Total | 81 | 14 | 54 [11] | 73 | 33 | 170 | 31 | 62

[1] IBNET includes these indicators under the water consumption category (Table 4 for details)
[2] IBNET considers only metering level (Table 4 for details)
[3] 6 under the process category and 5 under the operating cost and staff category (Table 4 for details)
[4] These indicators are considered under quality of service (Table 4 for details)
[5] 3 non-revenue water, and 1 pipe breaks (Table 4 for details)
[6] 9 indicators from the process category, 3 from service coverage, and 5 from quality of service (Table 4 for details)
[7] 4 under process indicators, 6 operating costs & staff, 20 billing, 2 financial, 2 assets, and 1 affordability (Table 4 for details)
[8] OFWAT used the term reliability, availability and security for this category (Table 7 for details)
[9] SIM includes additional indicators related to customer complaints and 1 hosepipe restrictions from the reliability category (Table 7 for details)
[10] OFWAT indicators are based on the financial performance of companies through more indicators (Table 7 for details)
[11] Personnel and customer complaints have not been addressed in detail; due to the different structure, see Table 9 for details
[12] Environmental indicators were considered separately by NWC (2012)
[13] Indicators of water loss and pipe breaks (5 indicators) were included under the asset category (see Table 5 for details), now added to operational
[14] 3 pricing indicators and 18 finance (Table 5 for details)
[15] These water resources indicators were considered under environmental quality by NRC
[16] 6 indicators considered in public health and 3 under economy (see Table 8 for details)
[17] 4 public safety; 1 social equity; and 3 public security (see Table 8 for details)
[18] 3 under social equity and 4 under the economy category by NRC (2010)
[19] Water quality monitoring was considered under the operational category by IWA (2006) (see Table 3 for details)
[20] AWWA (2004) used the term organizational development (Table 6 for details)
[21] Drinking water compliance under the water operations category by AWWA (2004)
[22] Water disruptions under customer relations; system renewal rate under business operations; water loss and structural integrity under water operations
[23] Indicators were distributed amongst the customer relations, business operations, and water operations categories

Only the IWA (2006) has given the desirable importance to the numbers, skills, training, and qualifications of personnel (staff) by developing 26 PIs under a specific category allocated to personnel. Some of these PIs are specific to different components of SM-WSS. For instance, consider a small WSS having saline groundwater as the only available water source; the only possible option would be a tertiary-level, highly technical water treatment facility, such as reverse osmosis. In this case, the PIs related to the hiring of skilled operators, their regular training, and their salaries will be extremely important. Overall, the IWA (2006) system seems to be the most balanced, with the maximum number (170) of total PIs, followed by WB (2011) with 81 indicators relatively well distributed amongst all categories as compared to the rest of the PI systems. The PIs have also been distributed rationally to cover all physical and management components of a WSS, with a total of 62 PIs, by CSA (2010).
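Whatever the grouping, the counted PIs ultimately reduce to the same arithmetic: a PI is a ratio of raw data variables (volumes, counts, costs) recorded over an assessment period, in the style of IWA (2006). A minimal sketch of two widely used indicators, with hypothetical variable names and illustrative values (not taken from any cited PI system):

```python
# Hypothetical raw variables for one assessment year of a small utility.
variables = {
    "system_input_volume_m3": 1_200_000,   # water supplied to the network
    "billed_authorized_m3": 840_000,       # water billed to customers
    "service_connections": 4_500,
    "full_time_staff": 12,
}

def non_revenue_water_pct(v):
    """NRW (%) = (system input - billed authorized consumption) / system input x 100."""
    return 100.0 * (v["system_input_volume_m3"] - v["billed_authorized_m3"]) \
        / v["system_input_volume_m3"]

def staff_per_1000_connections(v):
    """Common personnel PI: full-time staff per 1000 service connections."""
    return 1000.0 * v["full_time_staff"] / v["service_connections"]

print(f"NRW: {non_revenue_water_pct(variables):.1f} %")
print(f"Staff per 1000 connections: {staff_per_1000_connections(variables):.2f}")
```

Keeping the raw variables separate from the derived ratios, as IWA (2006) does, is what makes cross-comparison between utilities of different sizes possible.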
It is well recognized that PIs should be clearly defined, easy and economical to measure and verify, easily understandable, and relevant to a specific WSS. Moreover, the overall framework for PA should be simple, well defined, comprehensive (i.e., covering all components of a water utility), and comparable with similar utilities at regional, national, and international levels. Various systems and PIs are reviewed in the above sections along with their strengths and limitations. Here an effort is made to evaluate these systems for their general application (not limited to a specific region) to SMWU. The findings of the literature review conducted above are summarized in Table 2.3. The PIs defined by each PA system are evaluated in the context of SMWU on the basis of the following criteria:

- Understandability – the indicator should be easily understandable to both the utility operators and the public;
- Measurability – the data required to calculate an indicator should be easy to measure, and the indicator's calculation should also be as simple as possible; and
- Comparability – the indicator should be comparable across similar utilities in the same region as well as for international comparisons.

The following criteria have been used for the evaluation of the PI frameworks proposed by these agencies, as mentioned in Table 2.3:

- Simplicity – How simple (i.e., in terms of the interrelationships between data variables, indicator groups, and PIs) is the framework to implement for SMWU?
- Comprehensiveness – How, and to what level of detail, does the framework consider the most important aspects (i.e., personnel, customer service, financial, and environmental) of the WSS?
- Overall Applicability – Is the framework applicable to SMWU in its original form with minimum modifications (i.e., by just selecting the relevant suitable PIs with respect to a specific WSS)?

In Table 2.3, the IWA (2006) seems to be the most suitable system for SMWU.
This system provides a wide range of PIs with a comprehensive classification system. Moreover, the categorization of PIs into different levels may also allow the managers of SMWU to start with Level-1 indicators and then include higher levels depending on data availability. The way the data variables (as m3, numbers, and cost) and the PIs (in terms of percentages and ratios) are distinguished from each other also provides an opportunity for the regulatory agencies and utility managers to perform cross-comparisons with utilities of similar size and type. The PI systems developed by ADB (2012), NWC (2012), and CSA (2010) also seem to be suitable for SMWU with appropriate modifications.

It is important to mention here that the review of the distribution of PIs by the various agencies in the above sections is intended to evaluate how the important PIs have been grouped in each category associated with the various components of the overall PA framework for SMWU. The purpose of the comparison between different PI systems is primarily to meet this objective rather than to identify the strengths and/or limitations of these systems. It is understandable that each system of PIs has been developed to meet the specific objectives of a certain project within its defined geographical boundaries.

2.6 Performance Assessment of Water Utilities with Limited Resources – Some Case Studies

2.6.1 South Asia – Bangladesh, India, and Pakistan

Bangladesh is one of the most densely populated countries in the world, with over 146 million residents (WSP 2009). According to the Government of Bangladesh, the urban utilities in Bangladesh are not performing well due to a lack of effective management.
Under the Bangladesh benchmarking and performance improvement for water utilities project, facilitated by the Water and Sanitation Program – South Asia (WSP-SA), the concept of performance benchmarking was introduced in 2005-06 for 11 utilities of all sizes (i.e., serving populations ranging from 21,000 to 10,000,000). The Government of Bangladesh took the initiative to introduce benchmarking and performance improvement programming (BM&PIP) tools (i.e., IB-Net) along with other stakeholders.

In a similar study conducted by the World Bank, the performance of over 30 urban water utilities across Bangladesh, India, and Pakistan under WSP-SA was compared to evaluate the effectiveness of the program (WB 2010). Almost all of the utilities in these countries provide intermittent supply, with an average duration of 5 hours a day. The selected PIs used in this study are given in Table 2.4. As a result of the benchmarking process in Rajkot, India, a 48% increase in billing and a 31% increase in collection were achieved in the 3-year period between 2006 and 2009. Moreover, 20,000 unauthorized connections were also regularized in the same time frame (WB 2010). It can be observed in Table 2.4 that the benchmarking process is currently focusing on meeting water demands and revenue collection to meet the financial and operational requirements in these South Asian countries. It is expected that with time, water quality, personnel, environmental, and water resource indicators will also be included in the benchmarking process.

Table 2.3 Evaluation of different performance assessment systems for their applicability to SMWU (each system — WB 2011, OFWAT 2012, ADB 2012, NWC 2012, NRC 2012, IWA 2006, AWWA 2004, CSA 2010 — is rated high, medium, or low against understandability, measurability, comparability, simplicity, comprehensiveness, and overall applicability to SMWU; the rating symbols are not reproduced here)

2.6.2 Eastern Europe, Caucasus and Central Asian (EECCA) Countries – Armenia

In the past, under the administration of the former Soviet Union (FSU), water was supplied to consumers by public sector suppliers at very low prices. The gap between service revenues and the cost of provision used to be filled by the government budget (Mitrich 1999).
Armenia is one of the 12 countries of Eastern Europe, Caucasus, and Central Asia (EECCA). After the collapse of the old administrative system and the subsequent financial crises in the FSU, the WSSs completely deteriorated and were unable to meet the demand of residential, commercial, industrial, and institutional consumers in Armenia (Mkhitaryan 2009).

In 2001, the water sector of the FSU was decentralized and privatized to improve the situation, as part of institutional, legislative, and regulatory reforms in the country. In this system, priority was given to customer satisfaction due to the higher water rates. Therefore, it was decided that the performance of the water suppliers would be assessed and that customer feedback would be sought to gauge their satisfaction. After reviewing previous studies and data availability, the PIs were finalized by the Public Service Regulatory Commission (PSRC), Armenia, in 2005 and 2008 (PAGS 2008). These selected PIs are listed in Table 2.4. After implementing these PIs, substantial improvements were observed: an almost 50% reduction in energy use; a massive drop in per capita consumption, from 250 to 87 litres/day, due to an improved metering system; and significant improvements in user fee collection efficiency, from 21% before privatization to 90% after (Mkhitaryan 2009).

2.6.3 Arab Countries

In July 2010, the first training course to establish key performance indicators (KPIs) and benchmarks for water utilities in the MENA/Arab region was held in Alexandria, Egypt.
The course was organized by the Arab Countries Water Utilities Association (ACWUA); InWEnt Capacity Building International, Cairo and Germany; and the Alexandria Water Company, Egypt. Representatives from six Arab countries, including Egypt, Jordan, Syria, Yemen, Palestine, and Morocco, participated in the course. One of the main objectives was to promote the use of common PIs within the MENA/Arab region. Four categories of PIs, comprising personnel, quality of service, O&M, and finance and economics, were proposed (ACWUA 2010a).

These proposed PIs were further discussed in the 1st Arab Water Week held in Amman, Jordan, during December 2010. This time 65 participants from 13 countries (in addition to the above stated countries: Algeria, Tunisia, Lebanon, Bahrain, UAE, Kenya, and Albania) participated in the course and proposed a set of PIs, given in Table 2.4, to start the benchmarking process in the region (ACWUA 2010b).

2.6.4 Africa – Malawi and 134 Water Utilities

Kalulu and Hoko (2010) carried out the PA of a public water utility in the city of Blantyre, Malawi. Blantyre is a commercial and industrial city with a population of 661,000 as per 2008 estimates. The baseline data (variables) were collected from legal and policy documents, and from the Blantyre water utility. The PIs used in their study are given in Table 2.4. For the calculation of water loss, the unaccounted-for-water (UFW) indicator was used instead of the non-revenue water (NRW) indicator, which has been used in other case studies of developing countries. The PIs were then compared to the best practice targets proposed by the WB in 2002 in the form of "A Water Scoreboard" (Tynan and Kingdom 2002). This document is a WB note in which Tynan and Kingdom (2002) used data from 246 water utilities in 51 developed countries and proposed KPIs to establish best practice targets for developing countries (Table 2.4).
Table 2.4 Checklist of key performance indicators used in developing countries

Indicator category | Performance indicators (check marks for Arab countries 2010, Africa 2009, South Asia 2009, Armenia 2009, and Scoreboard 2002 not reproduced)
Water resources/Environmental | Water sourced; water supplied; water balance; water consumption
Personnel | Number of staff per 1000 connections; personnel training; health and safety
Quality of service/Customer service | Supply coverage; customer complaints; response to complaints; water availability/supply duration
Operational | Non-revenue water/UFW; metering (new connections and maintenance); water treatment plant operation; water main breaks
Water quality/Public health | Water quality compliance
Financial/Economic | Revenue/financial inputs (bills); O&M cost; energy cost; water charges; billing efficiency; collection period; working ratio; tariff structure; connection charges

The Water Operators Partnership Program for Africa (WOP-Africa) was initiated in December 2006 with the Nairobi workshop by the WB Water and Sanitation Program (WSP) to endorse the idea of involving a number of African utilities (WOP-Africa 2009). Under this program, a self-assessment process for 134 African utilities from 35 countries was started with a comprehensive utility self-assessment questionnaire (USAQ) adopted from the IB-NET assessment tool. The PIs selected from the standard IB-NET assessment tool are listed in Table 2.4.

Table 2.4 indicates that the PIs related to water supplied, water sourced, and the overall water balance are the most commonly used indicators in the water resources category. The number of personnel per 1000 connections is also a very common indicator. Customer complaints were recorded in almost all the cases in Table 2.4. The NRW is the most commonly used indicator in the operational category; and the overall costs of O&M, energy, and water charges are the only financial PIs. The results of these case studies suggest that even in developing countries a set of easily measurable PIs can improve the performance of water utilities. Thus, these PIs also provide a guideline for SMWU with limited available data. A framework consisting of suitable PIs is devised to start, implement, and improve the performance evaluation process for SMWU in Chapter 3.

2.7 Selection of Performance Indicators

PIs should be selected against suitable multiple criteria, such as adequacy, applicability, usefulness, measurability, attainability, understandability, relevancy, and comparability (ADB 2012, Lee 2010, Giff and Crompvoets 2008, Artley and Stroh 2001). In the reported PA studies for water utilities, the PIs have either been selected primarily on the basis of data availability or to assess the performance of a specific component (e.g., water quality) (Zhang et al. 2012, Sadiq et al. 2010, Coulibaly and Rodriguez 2004). In a recent study by Shinde et al.
(2012), PIs were revised for small water utilities in Japan using principal component analysis based on data obtained from 177 utilities. Such an approach, selecting PIs primarily on the basis of available data, may overlook several important organizational components of a water utility.

Toor and Ogunlana (2010) selected PIs for large-scale public sector projects by conducting a survey in which the ranking of the suitability of PIs was based on the professional judgment of the stakeholders. Wong et al. (2008) developed key indicators for intelligent building systems through a survey based on the suitability of each proposed indicator. Ugwu and Haupt (2007) evaluated PIs for the sustainability of infrastructure projects, adopting a similar ranking approach involving different types of respondents (i.e., contractors, architects, engineers, consultants, public and private clients, etc.). This approach does not cover other important selection criteria for an indicator, such as measurability, comparability, and understandability. Besides, participants in such surveys have conventionally been asked to score the attributes of an indicator on a five-point Likert scale (1 = not suitable to 5 = most suitable). These ordinal (qualitative) scales with such a small rating range generate significantly small differences in the final scores. Selecting a set of the most suitable PIs based on such scores might not be easy for the decision maker (DM). Secondly, defining a cut-off (for example, the top 40%, or values greater than 3.5) for the list of ranked PIs limits the applicability of the selected PIs to the specific case and also does not provide a planned opportunity to include additional PIs in the future as data management practices improve.

A system of initially identified PIs is devised to start, implement, and improve the performance evaluation process for SMWU in Chapter 3.
The proposed system mainly consists of a list of the most simple and relevant PIs based on the review carried out above. For the final selection of PIs, a detailed model based on multicriteria analysis is developed in Chapter 4.

2.8 Performance Benchmarking for Water Utilities

Benchmarking has become an essential and continuous activity in several organizations, and has gained strategic importance for improving their performance in today's competitive environment (Sun 2010). Conventionally, linear regression equations of PIs have also been used for the metric performance benchmarking process (Shinde et al. 2013, AWWA 1996). This approach does not appropriately address the relative performance of average performing utilities; therefore, PA results based on such linear relationships might be misleading. A detailed argument on this is presented in Chapter 5.

Lambert et al. (2014) reported 14 years of experience with the best practices for water balance developed by the International Water Association (IWA) in 71 water utilities spread over 12 high-income European countries. Singh et al. (2014) used 4 PIs to assess the performance of 12 water utilities in India. Shinde et al. (2013) developed linear regression equations (obtaining data from 199 utilities) for performance benchmarking of small utilities in Japan. Rouxel et al. (2008) used contractual and commercial PIs for customer service management of three privately owned water utilities in Italy. Theuretzbacher-Fritz et al. (2008) discussed the use of different types of denominators for PIs, and their use in performance benchmarking of Austrian water utilities. Palme and Tillman (2008) studied the application of sustainable development indicators; in addition to financial PIs, they also included environmental, operational, and social PIs for Swedish water utilities.
Earlier, Marques and Monteiro (2001) used 50 indicators divided into 5 groups (structural, operational, quality of service, personnel, and economic) for performance management, developing regression equations using data obtained from 25 water utilities in Portugal.

The most commonly used non-parametric methods for performance benchmarking of water utilities include data envelopment analysis, the Malmquist productivity index, and total factor productivity (Berg and Marques 2011). Marques et al. (2011) applied data envelopment analysis using more than 500 observations encompassing 1144 water utilities in Japan. Carvalho and Marques (2011) included exogenous variables in the efficiency assessment of 66 Portuguese water utilities using non-parametric data envelopment analysis. Corton and Berg (2007) used the total factor productivity index for benchmarking Central American water utilities. This method focuses on productivity changes over time and takes several inputs and outputs into account in the analyses. Stochastic frontier analysis has also been used for benchmarking economic indicators of water utilities (Antoniolli and Filippini 2001). Correia et al. (2008) applied stochastic frontier analysis to estimate cost functions, using data from 66 water utilities representing 60% of the Portuguese population. Alsharif et al. (2008) also used data envelopment analysis to evaluate the efficiency of water supply systems in Palestine, primarily in terms of water loss. Singh et al. (2014) compared the PI system and data envelopment analysis for performance benchmarking of 12 water utilities in India, and obtained similar results from both methods.

Low economies of scale (e.g., highest operating expenses, low debt ratios, low cost recovery, etc.) and data scarcity are the major challenges for SMWU (Worthington and Higgs 2014, Rahill and Lall 2010).
The number of PIs should be kept to a minimum, although balanced to cover all the functional components, to minimize the cost of data collection and validation of the performance assessment model (Alegre et al. 2009). The Inter-American Development Bank recently developed a universal rating system known as AquaRating for water and wastewater utilities (IDB 2014). The system normalizes the calculated values of 113 assessment elements without addressing the above stated challenges for SMWU. Moreover, existing systems aggregate the normalized PI scores with the simple weighted average method, without considering to what extent the aggregate value is close to, or away from, the desirable performance (Figure 2). Performance evaluation results that ignore these issues may not rationally accommodate the sensitivities in the calculated values of the PIs for SMWU.

Most of the above mentioned studies were conducted for inter-utility performance assessment, based on comparison among utilities operating in the same geographical region. Such assessment methods depend on similar water utilities participating over several years, which is not the case for Canadian SMWU. An inter-utility performance benchmarking model is developed in Chapter 5 to assess the performance of all the functional components of SMWU.

A water utility may consist of more than one WSS, and each functional component (e.g., personnel, operational, etc.) may include several sub-components. Moreover, aggregating all the PIs to estimate the overall performance of a functional component (i.e., inter-utility benchmarking) can eclipse the underlying processes (sub-components). For an underperforming utility, it is important to identify the lacking processes (within a functional component) for effective decision making. Studies on intra-utility performance management have not been frequently reported in the literature. A comprehensive model addressing all these issues is developed in Chapter 6.
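The aggregation concern raised above can be illustrated with a small sketch: normalized PI scores combined by a simple weighted average can hide a severely lacking sub-component. The scores and equal weights below are hypothetical.

```python
# Hypothetical sketch of simple weighted-average aggregation of normalized
# PI scores (0 = worst, 1 = desirable); scores and weights are invented.

def aggregate(scores, weights):
    """Weighted average of normalized PI scores."""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(s * w for s, w in zip(scores, weights))

weights = [0.25, 0.25, 0.25, 0.25]      # equal importance, for illustration
utility_a = [0.70, 0.70, 0.70, 0.70]    # uniformly average sub-components
utility_b = [0.95, 0.95, 0.75, 0.15]    # one severely lacking sub-component

# Both aggregate to the same 0.70: the average conceals utility B's
# failing process, which intra-utility assessment is meant to expose.
print(aggregate(utility_a, weights), aggregate(utility_b, weights))
```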
2.9 Customer Satisfaction Assessment

Traditionally, CS for water utilities has been assessed in two ways: i) performance benchmarking, and ii) interviews and surveys. The first type is conducted with the help of basic performance indicators, such as the number of customer complaints, response to reported complaints, unplanned service interruptions, water rates, and billing mechanisms (USAID 2008; Marques and Monteiro 2001). Generally, these indicators are calculated as the number of reported complaints per specified number of customers (say, 1000) over a specific assessment period, and are then compared with other water utilities to establish performance benchmarks. In the second type, the utility investigates customer preferences and acceptance level (willingness to bear the performance gap). Preference means that the option selected by the customers can only be compared with other available options through surveys on willingness to pay (KWR 2008).

Most performance assessment studies carried out in the past focused on the operational, personnel, water quality, and financial components of water utilities (e.g., Berg and Danilenko 2011; Sadiq et al. 2010; Corton and Berg 2009; El-Baroudy and Simonovic 2006). The benchmarking schemes of the National Water Commission Australia (NWC), Canadian Standards Association (CSA), American Water Works Association (AWWA), and International Water Association (IWA) include complaints about water quality, water continuity, pressure, billing and service, the duration and frequency of unplanned interruptions, an average targeted time of response to these complaints, and the cost of customer communication (NWC 2012; CSA 2010; AWWA 2008; Alegre 2006). The actual root causes of the complaints and the time taken to resolve them have not been addressed in these benchmarking processes.
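The basic complaint-rate indicator described above can be computed directly from a complaints register; the category counts and connection total below are invented for illustration.

```python
# Illustrative calculation of the basic customer-service PI: reported
# complaints per 1000 connections over the assessment period (one year here).
# All figures are hypothetical.

def complaints_per_1000(complaints: int, connections: int) -> float:
    """Reported complaints per 1000 connections over the assessment period."""
    return complaints / (connections / 1000.0)

connections = 4200                                             # small utility
register = {"water quality": 18, "pressure": 9, "billing": 25} # one-year counts

total = sum(register.values())
rate = complaints_per_1000(total, connections)
print(f"{rate:.1f} complaints per 1000 connections per year")
```

Note that, as the text observes, this rate says nothing about the root causes behind the categories or the time taken to resolve each complaint.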
Several agencies and peer-reviewed studies have used customer interviews for the assessment of CS (Hanson and Murrill 2013; CDM 2011; KWR 2008; USEPA 2003). Recently in the United States, the effective utility management initiative's strategic plan for FY 2013-2017 proposed promoting and enhancing community participation, environmental stewardship communications for awareness, collection of customers' feedback, and analysis of this feedback to prioritize customers' communication needs (CWW 2013). In the UK and Europe, for privately owned water utilities, CS is assessed using the water rates and the service incentive mechanism index, which evaluates CS both qualitatively and quantitatively. In the qualitative analysis, interviews are conducted, whereas the quantitative analysis uses records of telephonic and written complaints, complaints responded to or abandoned, and the time of response to complaints (WUG 2014; OFWAT 2012). Such extensive approaches might not be viable on a periodic basis for SMWU, where limited personnel expertise and financial constraints are the main challenges.

A detailed risk based model is developed in Chapter 7 for the assessment and management of customer satisfaction for SMWU, using a sustainable approach based on the record of customer complaints and the experience of field personnel.

Chapter 3     Identification of Suitable Performance Indicators

A part of this chapter has been published in Environmental Reviews, an NRC Research Press journal, as a review article titled "Performance Indicators for Small and Medium Sized Water Supply Systems: A Review" (Haider et al. 2014a).

In this chapter, a system is devised based on the initially identified PIs to start, implement, and improve the performance evaluation process for SMWU. The proposed system primarily identifies the relevant PIs for each functional component based on the review carried out in Chapter 2; detailed selection is carried out in Chapter 4.
3.1 Background

The proposed system of PIs shown in Figure 3.1 provides a stepwise approach based on three levels of indicators (start-up, additional, and advanced PIs), depending on the availability of resources and the site specific requirements of SMWU. Required data variables are presented for the calculation of start-up PIs, whereas additional and advanced PIs are only listed, because their detailed data requirements cannot be covered here. In this regard, data sources for using the advanced PIs are provided in Appendix-A for consultation. Moreover, the details of the selected PIs used for IU-PBM and In-UPM are given in Chapters 4, 5 and 6.

The utility can evaluate its existing data availability, and then select the level from which to start its PA process. A small water utility facing issues related to lack of funding and availability of trained staff in a developing country may start the performance evaluation process with a few of the most relevant start-up indicators. The start-up indicators might be different in the case of a small utility located near a larger city in a developed country, where, for example, it is easier for the municipality to hire and retain trained personnel. The number and types of PIs in the latter case would be much higher than in the former. Moreover, special care is required in the case of a medium sized water utility whose population is only slightly more than that of a small sized water utility. For example, if the maximum population limit for a small sized utility is 3300 persons, a medium sized utility with 5000 persons might face all the same difficulties as smaller utilities.

Figure 3.1 Proposed system of PIs to start, proceed and improve the performance evaluation mechanism in SMWU

3.2 Categorization of Performance Indicators for SMWU

In the following sub-sections, the proposed PIs, along with their units and required data variables, are listed under each category.
The selected PI categories include water resources and environmental, personnel/staff, physical assets, operational, water quality and public health, quality of service, and financial/economic indicators. The users of the selected PIs under each category are also identified in the respective tables. The users within a water utility are classified as technical personnel (T), managers (M), and policy/decision makers (P), based on the nature and relative significance of the PI in the decision making process. More than one user may be associated with a single PI.

3.2.1 Water Resources and Environmental Indicators

Proposed water resources and environmental indicators for SMWU are presented in Table 3.1. Some of the water resource indicators strongly interact with environmental ones. For the sustainable utilization of natural water resources, the amount of water sourced should not affect the existing designated use of the source (i.e., downstream use in the case of surface waters, and lowering of the water table in the case of groundwater). For groundwater, the source yield of a pump is usually given with the pump design, and/or in the findings of the hydrogeological investigations carried out during selection of the source.
On the other hand, hydrological analysis of low flow (drought) conditions in the case of a surface water source can give information about the maximum allowable draw during low flow periods. It is better to base assessments on a yearly average, but assessment periods of less than a year can also be adopted by using proportionate time periods (i.e., 365/assessment period) (IWA 2006). Availability of water resources in a sustainable way is an important water resource indicator.

Table 3.1 Proposed water resources and environmental indicators

1. Availability of water resources (%) [Users(1): T/M/P]
   Calculation: [(volume of supplied water in a year)/(annual yield of water resources)] x 100
   Data variables: water supplied (metered volume); assessment of possible annual yield of the source (from surface or ground or both) as per regulations
   Additional indicators(3): efficiency of reused water supply; water license capacity
   Advanced (long-term) indicators(4): sector-wise availability of water resources (i.e., domestic, industrial, commercial); sector-wise availability of water resources for both potable and non-potable water (if applicable)

2. Greenhouse gas emissions(2) (tonnes CO2-equivalent per 1000 connected water properties) [Users(1): T/P]
   Calculation: GHG emissions = [quantity of electricity purchased (kWh) x (emission factor/1000 connections)]
   Data variables: number of connections; electricity consumption records; emission factors established at state level
   Additional indicators(3): GHG emissions from routine transport fuel use; disposal of backwash water (aquatic life should not be affected)
   Advanced (long-term) indicators(4): GHG emissions from fuel consumption by stand-by pumps, construction equipment working during maintenance works, etc.

3. Days with restrictions to water service (hosepipe or sprinklers) (%) [Users(1): T/M/P]
   Calculation: [(total number of days with water service restrictions during the year)/(days in a year)] x 100
   Data variables: record of the number of days with service restrictions during the whole year

4. Impact of residual chlorine on aquatic life due to leakage in mains passing near natural water bodies [Users(1): T/M; advanced (long-term) indicator(4)]

5. Impact of residual chlorine on aquatic life due to flushing of mains [Users(1): T/M; advanced (long-term) indicator(4)]

Notes: 1 T – Technical Personnel; M – Management Personnel; P – Policy/Decision Makers. 2 Might be applicable to medium sized utilities only, in countries where emission factors for consumption of purchased electricity from the grid have been established (CWA 2011). 3 To be added within a year. 4 Might be suitable for medium sized utilities in developed countries.

Every water source needs to be sustainably utilized by ensuring the continuity of its intended water uses. Integrated water (quality and quantity) management plans require estimation of the optimum degree of treatment of the received wastewater (also known as total maximum daily loads), based on the allowable threshold concentrations of the aquatic ecosystem, to avoid water quality problems. In this connection, a minimum amount of water in the water body is always required to provide a certain dilution.
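As a worked illustration of two start-up PIs from Table 3.1, the sketch below computes the availability of water resources and the GHG-emissions indicator. All volumes, the connection count, and the emission factor (taken here as 0.67 kg CO2-e/kWh, an assumed state-level value in the spirit of CWA 2011) are hypothetical, and the unit conversion shown is one reasonable reading of the table's formula.

```python
# Hedged sketch of two start-up PIs from Table 3.1; all inputs are invented.

# PI 1: availability of water resources (%)
annual_supply_m3 = 450_000      # metered volume of water supplied in a year
annual_yield_m3 = 700_000       # assessed annual yield of the source
availability = annual_supply_m3 / annual_yield_m3 * 100

# PI 2: GHG emissions (t CO2-e per 1000 connected properties)
electricity_kwh = 210_000       # purchased electricity, from billing records
emission_factor = 0.67          # kg CO2-e/kWh, assumed state-level factor
connections = 4200

emissions_t = electricity_kwh * emission_factor / 1000.0   # kg -> tonnes
ghg_pi = emissions_t / (connections / 1000.0)

print(f"availability = {availability:.1f}%, "
      f"GHG = {ghg_pi:.1f} t CO2-e per 1000 connections")
```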
Environmental protection agencies, in collaboration with water supply agencies, issue water licenses to water utilities based on their management plans. The utility is supposed to draw water within the allocated license capacity. To verify this, the 'water license capacity' should be compared with the amount of water supplied as an indicator. This indicator also predicts future needs for water conservation, water reuse, and investigation of a new water source.

Reuse of wastewater is commonly practiced in water scarce areas and countries (e.g., for agriculture, with treatment through waste stabilization ponds). If the amount of water reused cannot be calculated initially, these estimates can be made in the following years. The concept of water reuse can also be useful in situations where the main water use is agriculture. In this regard, there is a need for careful assessment using a water balance approach, based on the amount of available reuse water, precipitation, and crop water requirements. This approach is environmentally sustainable with additional socio-economic benefits. Further detailed PIs can be added in later years (if relevant) for the type of water (potable, non-potable) and the type of service area (domestic, industrial, commercial, etc.), as mentioned in NWC (2010).

GHG emissions are responsible for climate change, and may lead to drought conditions in the future. These emissions can be considered from the start, since it is not difficult to collect data on electricity bills and the number of connections. However, not all countries have established emission factors for reporting the CO2, CH4, and N2O emissions from consumption of purchased electricity as CO2 equivalents. The values of combined emission factors range between 0.3 and 1.21 across various Australian states and territories.
A value of 0.67 was recommended for those territories in Australia for which emission factors have not been established (CWA 2011). Water restrictions indirectly describe the availability of water resources in the supply area, and the effectiveness of the water conservation plan. Given the present scarcity of freshwater around the globe, sprinkler water regulations should be implemented throughout the year.

It is also recommended to develop sustainable strategies to control water consumption even in areas where abundant water resources are available. Acceptance of higher water losses on the basis of financial analysis (i.e., a lower cost of water, in the case of plentiful ground or surface waters, compared to the repair cost of leaking water mains or service connections) could be highly misleading for long-term sustainability. Consider a WSS with very high water consumption: more than 80% of the water is converted into wastewater, which not only leads to higher wastewater treatment costs (and GHG emissions) but also negatively impacts the receiving water body (e.g., low dissolved oxygen, eutrophication, etc.). Therefore, sufficient attention should be given to the PIs associated with water resources and the environment.

The impact of residual chlorine on aquatic life has not been addressed in the literature so far. In this regard, the following two indicators are proposed here (PIs No.
4 and 5, Table 3.1):

WE 3: Impact of residual chlorine on aquatic life due to mains leakage and breaks. This PI can be estimated in terms of the distance of the broken water main from the receiving water body and the ground slope.

WE 4: Impact of residual chlorine on aquatic life due to flushing of water mains = (flow rate of water in the receiving water body)/(flow rate of water drained from flushing of the main), OR the distance between the point of flushing and the receiving water body in terms of the length of the surface drain.

For both indicators, the impact can be considered negligible if a high (i.e., 1:10) dilution is available in the receiving water body.

3.2.2 Personnel/Staffing Indicators

Personnel and staffing indicators are important and strongly aligned with the performance of the other functional components. However, the review conducted in Chapter 2 shows that most agencies have either not given this category due importance or have ignored it completely. As in other organizations, human resources is an important department in a water utility; therefore, skills, qualifications, experience, training, health and safety, and overtime culture should always be included in the performance evaluation process, irrespective of the size of the utility. The personnel indicators recommended by IWA (2006), arranged within the proposed staged framework for SMWU, are presented in Table 3.2. Most of these indicators are simple to calculate with the help of data obtained from the human resources department of the utility under study.

The number of O&M staff may increase with the age of the physical assets and their associated problems. If the hiring of additional staff is planned accordingly, utility performance will be maintained through prompt response to customer complaints.
On the other hand, in the case of new and small utilities where customer complaints and operational failures are low, a higher number of O&M staff may result in low productivity. Additional PIs related to specific components, such as water resources, catchment, treatment, transmission, and distribution, can also be calculated for SMWU, as shown in Table 3.2. Personnel training on state-of-the-art equipment, operational methods, and new software is also important. According to Brown (2004), one of the most important indicators of successful SMWU in the United States is the training of operators and decision makers. Major accidents during operations in which workers have been hospitalized should always be recorded, to indirectly measure the effectiveness of the health and safety procedures adopted by the utility.

Table 3.2 Proposed personnel/staffing indicators

1. Employees per connection (No/1000 connections)(2) [Users: T/M]
   Calculation: [(number of full time employees)/(number of service connections/1000)]
   Data variables: total personnel; total number of service connections
   Additional indicators(1): employees per volume of water supplied

2. Management personnel (%); Technical personnel (%) [Users: M]
   Calculation: [(number of full time management (i.e., finance, human resources, marketing, customer services) employees)/(total number of full time employees)] x 100; [(number of full time technical (i.e., planning and construction, operation and maintenance) employees)/(total number of full time employees)] x 100
   Data variables: number of connections; number of total management personnel; number of total technical personnel
   Additional indicators(1): water resources and catchment management employees; abstraction and treatment employees; transmission, storage and distribution employees
   Advanced (long-term) indicators(4): general management personnel; human resources management personnel; financial and commercial personnel; customer services personnel; planning and construction personnel; operation and maintenance personnel

3. Qualified personnel (%) [Users: M/P]
   Calculation: [(number of full time employees with university degree and basic education)/(total number of full time employees)] x 100
   Data variables: number of personnel with university degree and high school education; total number of employees
   Additional indicators(1): university degree personnel; basic education personnel; other qualifications

4. Personnel training (hours/employee/year) [Users: M/P]
   Calculation: [(number of training hours during a year)/(total number of employees)]
   Data variables: total number of training hours; total number of employees
   Additional indicators(1): internal trainings; external trainings

5. Working accidents (No/100 employees/year)(3) [Users: T/M]
   Calculation: [(number of major working accidents in a year)/(total number of employees/100)]
   Data variables: number of major accidents in a year; total number of employees
   Additional indicators(1): absenteeism; overtime

6. Water quality monitoring personnel (No/100 tests/year) [advanced (long-term) indicator(4)]

7. Metering management personnel (No/100 meters) [advanced (long-term) indicator(4)]

Notes: 1 Could be added during the next year. 2 For smaller utilities, units of employees/100 connections can be used. 3 For smaller utilities, units of No/10 employees/year can be used. 4 Might be suitable for medium sized utilities in developed countries.

3.2.3 Physical Assets Indicators

Physical indicators are related to the performance and efficiency of various components (assets), such as storage, pumping, treatment, transmission, and distribution mains. Most of these components are designed to meet the demand until the end of their design period (i.e., optimally equal to their structural life). Considering both the remaining capacity and the total structural life together, technically feasible and economically viable future planning can be done. The outcome of these PIs is useful for asset management. Proposed indicators in this category are listed in Table 3.3.

It can be seen in Table 3.3 that treatment plant utilization and level of metering are the most important PIs in this category. Treatment plant capacity utilization indicates the need for additional treatment units in the future, and the metering level is important for estimating water losses. It is very common in SMWU to provide water at flat rates, particularly when the source water is ample. Such utilities may face several operational complications, including wastage of large volumes of water and difficulties in the estimation of non-revenue water. Estimates of water loss are more accurate when based on the billing data of metered connections. It has also been reported that a poor metering system is a reason for higher water loss (Corton and Berg 2009). Detailed PIs related to valves, hydrants, and automation and control could be important for MWU in developed countries. Some source water bodies, particularly rivers and streams, are subject to large flow variations.
Storage of water during high flows is necessary, with application of rain water harvesting methods or diversion structures. Therefore, the indicator of raw water storage capacity is also included for SMWU in Table 3.3.

Table 3.3 Proposed physical/asset indicators

1. Treatment plant capacity(1) (%) [Users: M/P]
   Calculation: [(maximum volume of water treated per day)/(maximum daily designed capacity)] x 100
   Data variables: treated water supplied from the water treatment plant; design capacity of the treatment plant
   Advanced (long-term) indicators(3): remaining capacity of the treatment plant

2. Raw water storage capacity (days) [Users: M/P]
   Calculation: [(net capacity of raw water reservoir)/(volume of supplied water during the period of assessment)]
   Data variables: volume of the reservoir (can be monitored by keeping a record of the reservoir level); volume of supplied water (metered volume supplied); an assessment period of one year will give a reliable estimate
   Additional indicators(2): treated water storage (days)
   Advanced (long-term) indicators(3): remaining capacity of storage

3. Metering level (%) [Users: T/M]
   Calculation: [(number of connections with meters installed)/(total number of connections)] x 100
   Data variables: number of metered connections; total number of connections
   Advanced (long-term) indicators(3): metering density (No/1000 service connections)

4. Additional indicators(2): pumping utilization (%); energy recovery (%); standardized energy consumption (see the IWA manual for details). Advanced (long-term) indicators(3): reactive energy; standardized energy consumption (see the IWA manual for details)

5. Advanced (long-term) indicators(3): valve density (No/km of main); hydrant density (No/km of main)

6. Advanced (long-term) indicators(3): degree of automation and remote control units

Notes: 1 Applicable to utilities relying on surface water sources, or where saline groundwater treatment is done with reverse osmosis or marine water treatment with thermal desalination. 2 Can be added during the next year. 3 Might be suitable for medium sized utilities in developed countries.

3.2.4 Operational Indicators

Operational indicators are essentially related to the inspection and maintenance of the physical assets discussed above. IWA (2006) and others have included water quality monitoring indicators in the same category, but in this research water quality indicators are proposed under a separate category. The recommended operational indicators are listed in Table 3.4. Periodic cleaning of storage tanks is mandatory to reduce water quality issues resulting from the formation of algae, particularly for surface water sources. For smaller diameter mains, it might not be economical to conduct condition assessment. Therefore, the percentage of mains subject to leakage during the year needs to be identified during the assessment period for assessing the structural integrity of the WDS. The length of rehabilitated mains also provides information regarding the operational efficacy of the utility, as well as the condition of the mains.

Table 3.4 Proposed operational indicators

1. Cleaning of storage tanks per year [Users: T]
   Calculation: [(total volume of storage tanks cleaned during the assessment period)/(total volume of all storage tanks)]
   Data variables: sum of volumes of storage tanks cleaned; total storage volume of tanks

2. Leakage (%/year) [Users: T/M]
   Calculation: [(length of mains detected to leak in a year)/(total mains length)] x 100
   Data variables: number of breaks in a pipe (the total length of that pipe is known from the design); pipe lengths
   Additional indicators(1): leakage detection and repairs

3. Rehabilitation, renewal or replacement of mains (%/year) [Users: M/P]
   Calculation: [(length of mains rehabilitated, renewed or replaced during a year)/(total mains length)] x 100
   Data variables: number of repairs; pipe lengths
   Additional indicators(1): rehabilitated, renewed and replaced mains individually

4. Unaccounted for water (UFW) [Users: T/M]
   Calculation: [(system input volume) - (billed and unbilled authorized consumption)]
   Data variables: system input volume; data of billed consumption; data of unbilled authorized consumption
   Additional indicators(1): apparent losses; real losses
   Advanced (long-term) indicators(2): infrastructure leakage index (ILI)

5. Main failures (No/100 km/year) [Users: T/M]
   Calculation: [(number of main failures during the year, including valves and fittings)/(total mains length/100)]
   Data variables: data of main failures irrespective of type; total mains length of the distribution system
   Additional indicators(1): pump failures; hydrant failures; power failures

6. Operational meters (%) [Users: T/M]
   Calculation: [(number of direct customer meters installed that are operational)/(total number of meters installed)] x 100
   Data variables: total number of installed meters; complaints received for out-of-service meters, or meters found out of service during meter reading
   Additional indicators(1): customer meter reading efficiency

7. Additional indicators(1): refurbishment or replacement of pumps (%/year); frequency of inspection of pumps

8. Additional indicators(1): inspection of mains (valves, fittings and hydrants); valves replaced; service connections replaced

9. Advanced (long-term) indicators(2): inspection and calibration of instruments (see IWA 2006 for details)

10. Advanced (long-term) indicators(2): degree of automation and remote control units

Notes: 1 Can be added within a year. 2 Might be suitable for medium sized utilities in developed countries.

The "best practice" proposed by the IWA Task Force, a water balance to determine losses in a water distribution network, is shown in Figure 3.2 (Alegre et al. 2000; Hirner and Lambert 2000). This is the only comprehensive water loss calculation framework that can be efficiently used for cross-utility comparisons at an international level (Lambert 2003).
All the components of the water balance given in Figure 3.2 need to be determined in terms of volume of water (preferably for one year). The main components of apparent losses are illegal use (theft) and the errors associated with billing, data handling, and metering, as shown in Figure 3.2. Experience shows that apparent losses may range between 1 and 9% of the total system input volume (Lambert 2002). It has also been reported that the main component of apparent losses is meter inaccuracy (Mutikanga et al. 2009; Criminisi et al. 2009). The other component of water losses is real losses (also known as physical losses), due to leakage from different components of a WSS. Recent studies have found that one third of the total water lost in urban areas is due to leaks and breaks of water mains (Kanakoudis and Tsitsifli 2010). The following real loss indicators have been reported in the literature (Sharma 2008; Radivojević et al. 2008; Hamilton et al. 2006):

- percentage of system input volume;
- per property per day;
- per length (km) of mains per day;
- per service connection per day;
- per service connection per day per meter of pressure;
- per length (km) of mains per day per meter of pressure; and
- per length (length of mains + length of service connections up to meter locations) of system per day.

Hamilton et al. (2006) developed a matrix to identify the limitations of the above mentioned real loss indicators in consideration of the key factors that affect real losses. According to them, none of the indicators takes all the key factors affecting real losses into account. Detailed discussions on the application of the above stated PIs of real losses can be found in the literature (Kanakoudis and Tsitsifli 2010; Radivojević et al. 2008; Hamilton et al. 2006; Lambert and Hirner 2000; Lambert and Morrison 1996; Arscott and Grimshaw 1996; Butler and West 1987).
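The IWA water balance just described can be sketched as a simple calculation; the annual volumes below are invented, and the split between the apparent-loss components is illustrative only.

```python
# Minimal water-balance sketch following the IWA structure (Figure 3.2).
# All volumes in m3/year; every figure here is hypothetical.
system_input = 500_000

billed_metered = 320_000
billed_unmetered = 20_000
unbilled_authorized = 15_000        # e.g., mains flushing, firefighting
unauthorized_consumption = 8_000    # theft / illegal connections
metering_and_data_errors = 12_000   # meter inaccuracies, billing errors

billed_authorized = billed_metered + billed_unmetered
authorized_consumption = billed_authorized + unbilled_authorized

water_losses = system_input - authorized_consumption     # apparent + real
apparent_losses = unauthorized_consumption + metering_and_data_errors
real_losses = water_losses - apparent_losses             # physical losses

non_revenue_water = system_input - billed_authorized     # NRW
print(water_losses, apparent_losses, real_losses, non_revenue_water)
```

Note that real losses fall out as a residual here, which is why the accuracy of the apparent-loss estimates (especially metering inaccuracies) matters so much in practice.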
It is well recognized that real losses cannot be completely avoided economically, due to continuous, unavoidable deterioration of the WDS (Radivojević et al. 2008). The IWA Task Force recommended a comparison between the Current Annual Real Losses (CARL) and the Unavoidable Annual Real Losses (UARL) (Hamilton et al. 2006). Lambert et al. (1999) developed the following empirical relationship to calculate UARL:

UARL (liters/day) = (18 x Lm + 0.8 x Nc + 25 x Lp) x P    [3.1]

where Lm is the length of mains (km); Nc is the number of service connections; Lp is the length of private service pipes from the property boundary to the meter (m); and P is the average pressure (m). A value of zero for Lp can be used when the meter is installed at the boundary line.

Figure 3.2 Components of water balance for calculation of water losses in water distribution, defined by Farley and Trow (2003)

The ILI, previously known as the International Leakage Index, is well recognized as the most appropriate indicator for the calculation of real (physical) losses. It reflects the leakage management efficiency at a certain operating pressure. Liemberger (2002) proposed the following formula to calculate the ILI as the ratio of CARL and UARL:

ILI = CARL/UARL    [3.2]

The ILI calculated from Equation [3.2] is unit-less, and is therefore more suitable for international cross-utility comparison.
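Equations [3.1] and [3.2] can be applied directly, as in the sketch below. The network figures are hypothetical, and (as discussed in the text) the applicability of the UARL formula is questionable for very small systems.

```python
def uarl_liters_per_day(lm_km: float, nc: int, lp_m: float, p_m: float) -> float:
    """Unavoidable Annual Real Losses, Equation [3.1] (Lambert et al. 1999)."""
    return (18 * lm_km + 0.8 * nc + 25 * lp_m) * p_m

# Hypothetical network: 120 km of mains, 4200 connections, meters at the
# property boundary (Lp = 0), average operating pressure 45 m.
uarl = uarl_liters_per_day(120.0, 4200, 0.0, 45.0)

carl = 650_000.0            # current annual real losses, liters/day (assumed)
ili = carl / uarl           # Equation [3.2]; unit-less
print(f"UARL = {uarl:.0f} L/day, ILI = {ili:.2f}")
```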
The relationship between ILI, CARL, and UARL is best presented in Figure 3.3.

[Figure 3.2 summarizes the water balance: system input volume = authorized consumption (billed metered, billed unmetered, unbilled metered, and unbilled unmetered consumption) + water losses. Apparent losses comprise unauthorized consumption and metering inaccuracies and data handling errors; real losses comprise leakage on transmission and distribution mains, leakage and overflow at storage tanks, and leakage on service connections up to the point of consumer metering. Billed authorized consumption is revenue water; the remainder of the system input volume is non-revenue water (NRW)/unaccounted for water (UFW).]

The outer rectangle in Figure 3.3 shows that CARL increases with aging of the WDS, and hence so does the ILI value. To reduce the volume of CARL, management methods (shown as arrows in Figure 3.3) to control real losses need to be applied. Asset management, pushing the CARL rectangle from the bottom, includes the selection, installation, maintenance, renewal, and replacement of deteriorating assets of a water supply system (Lambert and McKenzie 2002).

Figure 3.3 The four basic methods of managing real losses (Source: Lambert et al. 1999)

Liemberger (2002) presented a graphical visualization of ILI values from 1 to 100, where a value of "1" is ideal but need not be set as the target value. Liemberger and McKenzie (2005) found that the ILI formula had limited applicability in developing countries, where data are often unavailable and/or inaccurate due to limited resources. According to them, the efficiency of the ILI depends to some extent on the accuracy of the UARL formula, but mainly on the annual volume of real losses (i.e., CARL), the average pressure, and the data related to the distribution network.

It is uneconomical to completely control leakage from all reservoirs and mains; NWC/DoE (1980) stated that there is always an economic level of leakage (ELL).
According to OFWAT (2003), the ELL is the level at which further reduction in leakage becomes more expensive than producing water from another source. The optimum or economic level can be determined by adding the cost of distributing treated water and the cost of reducing leakage.

The water production cost will vary with the type of network and level of treatment. However, questions have been raised about applying the UARL formula and the ILI approach to systems with fewer than 3000 service connections, a service connection density of less than 20 per km of main length, and an average pressure of less than 25 m, which could be the case for SM-WSS. Due to the significant amount of unauthorized consumption from illegal connections, the concept of non-revenue water (NRW) is not a reliable estimate of real losses. However, NRW is still the most commonly used measure in developing countries and SM-WSS, as it is easy to calculate and can be useful as a financial indicator (Kanakoudis and Tsitsifli 2010; Lambert 2003; IWA 2006). It is recommended here as well, for use as a financial indicator, for SM-WSS.

3.2.5 Water Quality and Public Health Indicators

Most of the health problems in S-WSS are caused by pathogenic micro-organisms (MOs), which can be removed by disinfection. The most common MOs in small systems are Escherichia coli (E. coli) and Campylobacter species (Ford et al. 2005). The type of pathogen depends on the water source and the geographical location of the area. However, water quality outbreaks are relatively easy to identify and rectify in small systems due to their smaller network size.
A well-known example, the Washington County Fair outbreak in New York, USA, in 1999, illustrates the public health risk associated with a contaminated shallow well in small communities: 921 diarrhea cases were reported as a result of drinking non-chlorinated water (MMWR 1999).

Chronic and acute chemical risks include arsenic, nitrates, pesticides, disinfection by-products (DBPs), iron, lead, pH, etc. Trihalomethanes (THMs) and haloacetic acids (HAAs) are the main DBPs when the source water contains sufficient organic matter and the disinfection method is chlorination. The most common aesthetic water quality aspects are taste, odour, and colour (WHO 2011). Detailed reviews on the fate and transport of various chemical constituents (fluoride, iron, nitrification, chlorine residual) in water supply systems and their impacts on both human health and system integrity have been frequently reported in the literature (Benson et al. 2011; Fisher et al. 2011; Zhang et al. 2009; Ayoob and Gupta 2007).

Natural waters may contain some chemical elements that are naturally radioactive. According to the United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR 2008), the global average dose from all environmental sources of radiation is about 3.0 mSv/year per person. However, cancer risk through ingestion of drinking water has been reported for doses of 100 mSv over extended exposure times. It is not economically viable to identify individual radionuclides in the WSS, but a screening process is practical for identifying the total radioactivity without regard to specific types of radionuclides; details can be seen in WHO (2011). It is therefore important to conduct radioactivity tests in SM-WSS periodically. Proposed water quality and public health indicators are presented in Table 3.5 (IWA 2006; NWC 2011; NRC 2010).
Table 3.5 Proposed water quality and public health indicators

1. Aesthetic water quality tests carried out (%) – Use: T
   Calculation: [(Number of treated water aesthetic tests carried out during a year)/(Number of treated water aesthetic tests required by applicable standards per year)] x 100
   Data variables: record of aesthetic water quality tests (taste, colour and odour) of treated water; applicable water quality standards or regulations
   Additional indicators(1): tests performed for taste; tests performed for odour; tests performed for colour

2. Microbiological water quality tests carried out (%) – Use: T
   Calculation: [(Number of treated water microbiological tests carried out during a year)/(Number of treated water microbiological tests required by applicable standards per year)] x 100
   Data variables: record of microbiological water quality tests of treated water; applicable water quality standards or regulations
   Additional indicators(1): tests performed for pathogens; tests performed for viruses; tests performed for helminths

3. Physico-chemical water quality tests carried out (%) – Use: T
   Calculation: [(Number of treated water chemical tests carried out during a year)/(Number of treated water chemical tests required by applicable standards per year)] x 100
   Data variables: record of chemical water quality tests of treated water; applicable water quality standards or regulations
   Additional indicators(1): tests performed for residual chlorine; dissolved solids; arsenic or other toxic chemicals; chloramines
   Additional (long-term) indicators(2): tests performed for THMs and HAAs; tests performed for dissolved organics and inorganics

4. Radioactivity water quality tests carried out (%) – Use: T
   Calculation: [(Number of treated water radioactivity tests carried out during a year)/(Number of treated water radioactivity tests required by applicable standards per year)] x 100
   Data variables: record of overall radioactivity water quality tests of treated water; applicable water quality standards or regulations

5. Population days with boil water advisories (%) – Use: T/M/P
   Calculation: [(Number of days with boil water advisory during a year)/(Total days in a year)] x 100
   Data variables: record of the days when boil water advisories were issued

6. Additional (long-term) indicator(2): reduction in number of illnesses, injuries and deaths resulting from the performance improvement

7. Additional (long-term) indicator(2): risk-based drinking water management plan (Yes/No)

8. Additional (long-term) indicator(2): public disclosure of drinking water performance (Yes/No)

(1) Can be added within a year.  (2) Might be suitable for medium sized utilities in developed countries.

Sources of chemical constituents in surface waters include natural rocks, industrial and domestic activities, fertilizers and pesticides used in agriculture, and the specific chemicals (e.g., coagulants, polymers, and chloramines) used in the treatment processes. Therefore, a detailed parameter-wise analysis is recommended at the time of source selection, to select the most suitable water quality parameters and their sampling frequencies. Moreover, extra care and frequent monitoring of the various aesthetic, chemical, and microbiological water quality aspects are required for fresh surface water sources. For fresh groundwater sources, residual chlorine and microbiological water quality aspects are more important, because the source is free from suspended and dissolved organic and inorganic elements and the only possibility of microbiological contamination is from cross-connections in cracked pipelines.
Turbidity and pathogens are the main problems of surface water sources, and are conventionally controlled through water treatment facilities (e.g., coagulation, sedimentation, filtration, disinfection). However, in SMWU such treatment facilities are not installed and the utilities rely primarily on source water quality. Furthermore, for surface water, higher dissolved oxygen (DO) concentrations from saturated water sources (rivers and lakes) may also exacerbate corrosion of metal pipes (WHO 2011).

If the source is marine water, all types of chemical, biological, and aesthetic parameters need to be controlled and monitored. The treatment process in this case could be either “conventional treatment followed by reverse osmosis” or “thermal desalination”. All these treatment facilities may operate at different efficiencies depending on the skills of the operators, the source water quality, and the structural condition and age of plant components. Thus, the implementation of a well-structured water quality monitoring plan is always required to ensure the provision of safe drinking water to the consumer.

3.2.6 Quality of Service Indicators

Customer satisfaction is the most important objective of any utility. Agencies have used different PIs to check the efficiency of the utility in this context. Customers will only be satisfied when they get the best service for the price they pay for water. The cost of water may range from free (e.g., a public stand-post installed by an NGO in a small water utility of a developing country) to very high (e.g., desalinated water in a medium sized water utility of a developed country). Satisfaction can be mainly correlated with maximum coverage, adequate quantity (i.e., continuous supply at the required pressure), acceptable quality (i.e., meeting water quality standards and guidelines), prompt response to customer complaints, and short times to install a new connection or meter.
As discussed earlier, in the case of privately owned WSSs (e.g., England and Wales), customers' expectations are high in proportion to the cost of water, and the indicators used to assess customer satisfaction there might not be practical in general. Depending on the level of response to written consumer complaints, water suppliers in England and Wales whose response efficiency was higher than 99% (within ten working days of receipt of a complaint) were given an incentive to increase their water charges (OFWAT 2009-2010). Therefore, the PIs of customer satisfaction need to be carefully selected for SMWU.

Table 3.6 Proposed quality of service indicators

1. Population coverage (%) – Use: M/P
   Calculation: [(Resident population served by the water undertaking)/(Total population of the study area)] x 100
   Data variables: record of the population served (based on expert opinion, demographic surveys and analysis); total population of the area
   Additional indicators: building supply coverage (%); population coverage by service connections; household and business supply coverage (%)

2. Population coverage by public stand-posts – developing countries (%)(1) – Use: M/P
   Calculation: [(Resident population served by the water undertaking through public stand-posts)/(Total population of the study area)] x 100
   Data variables: record of the population served (based on expert opinion, demographic surveys and analysis); total population of the area
   Additional indicators: population per public stand-post; per capita water consumption at public stand-posts

3. Operational water points and stand-posts – developing countries (%) – Use: M/P
   Calculation: [(Number of water points that are operational)/(Total number of water points in the study area)] x 100
   Data variables: record of the operational stand-posts; total number of stand-posts installed

4. Adequacy of supply pressure (%) – Use: T/M
   Calculation: [(Number of service connections at which pressure is equal to or higher than the target pressure)/(Total number of service connections)] x 100
   Data variables: record of complaints regarding low pressure points; results of pressure monitoring survey; total number of connections

5. Continuity of supply (%) – Use: T/M
   Calculation: [(Number of hours when the system is pressurized during a year)/(Total hours in a year)] x 100
   Data variables: record of the hours when the system was not pressurized

6. Water interruptions (%) – Use: T/M
   Calculation: [(Number of hours when the system is not pressurized during a year)/(Total hours in a year)] x 100
   Data variables: record of the hours when the system was not pressurized

7. Average frequency of unplanned interruptions (No/100 connections)(2) – Use: T/M
   Calculation: [(Total number of unplanned interruptions during the year)/(Number of service connections)] x 100
   Data variables: record of the number of unplanned interruptions during the whole year; number of connections

8. Water quality compliance of supplied water (%) – Use: M/P
   Calculation: [(Total number of treated water samples complying with standards in a year)/(Total number of tests performed in a year)] x 100
   Data variables: record of water samples analyzed
   Additional indicators: microbiological tests compliance; chemical tests compliance; aesthetic tests compliance; radioactivity tests compliance

9. Total complaints per connection (No/100 connections/year)(2) – Use: M
   Calculation: [(Total number of complaints during the year)/(Number of service connections)] x 100
   Data variables: record of the number of complaints during the whole year; number of connections
   Additional (long-term) indicators(3): pressure complaints; continuity complaints; water quality complaints; interruptions complaints; billing and queries

10. Total response to written complaints (%) – Use: M
    Calculation: [(Total number of responses to written complaints)/(Total number of written complaints in a year)] x 100
    Data variables: record of the written complaints in a year; record of complaints responded to
    Additional (long-term) indicators(3): pressure complaints; continuity complaints; water quality complaints; interruptions complaints; billing and queries

11. Additional (long-term) indicators(3): total telephonic complaints; percentage of calls answered within 30 s by an operator

12. Additional (long-term) indicators(3): new connection efficiency; time to install a customer meter; connection repair time

(1) Specific to developing countries only.  (2) For small systems only; otherwise No./1000 connections should be used.  (3) Might be suitable for medium sized utilities in developed countries.

A proposed set of quality of service PIs is listed in Table 3.6. Aspects related to coverage and customer complaints are addressed in most indicator systems. Complaints are easy to record but might not reflect the actual performance of the water utility, because some customers do not complain about the problems they experience. IWA (2006), OFWAT (2012), NWC (2011) and ADB (2012) have proposed other PIs as well, for example water restrictions, call response duration, supply pressure, and the efficiency of connection and meter installations. A lower number of complaints is an indirect measure of an efficient utility. There are two main reasons for the relatively larger number of PIs in this category in Table 3.6. Firstly, these are the most important PIs for SMWU, because these utilities usually have fewer routine maintenance staff and vehicles, and customers' complaints are their most efficient source of problem identification. Secondly, the data required to measure these PIs need only good recordkeeping instead of expensive data collection and analysis exercises. Through simple analyses and comparisons with similar utilities, sound conclusions can be drawn on the overall performance of the utility under study.
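Because the Table 3.6 indicators are simple ratios of routinely recorded quantities, the recordkeeping burden is light. A minimal sketch of three of them follows; the function names and example figures are illustrative assumptions, not values from the thesis.

```python
def complaints_per_100_connections(total_complaints: int, connections: int) -> float:
    """Table 3.6, PI 9 (small systems): complaints per 100 service connections per year."""
    return 100.0 * total_complaints / connections

def written_response_rate_pct(responses: int, written_complaints: int) -> float:
    """Table 3.6, PI 10: share of written complaints that received a response."""
    return 100.0 * responses / written_complaints

def continuity_of_supply_pct(hours_pressurized: float, hours_in_year: float = 8760.0) -> float:
    """Table 3.6, PI 5: share of the year during which the system is pressurized."""
    return 100.0 * hours_pressurized / hours_in_year

# Hypothetical annual records for a 1200-connection system:
print(complaints_per_100_connections(36, 1200))    # 3.0
print(written_response_rate_pct(57, 60))           # 95.0
print(round(continuity_of_supply_pct(8640), 1))    # 98.6
```

For larger systems the first function would divide by connections/1000 instead, per footnote 2 of Table 3.6.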
SMWU frequently deal with agricultural customers. The connections to the agricultural sector are widespread, and sometimes even the customer is unable to identify the problem location. Sometimes the customer is not using the water at all, based on crop water requirements, and there is a pipe break somewhere in the fields. In such cases, the customer will receive a bill much higher than expected, resulting in an indirect complaint regarding bill inaccuracy, which in reality is due to the pipe break. The PA team should relate the actual cause of the complaint to the relevant component of the WSS. This can be done through the use of a well-developed customer complaint work order. The response to the complaint should be indicated on the proforma, along with whether the actual problem was the same as recorded by the customer or something different (e.g., an in-house plumbing issue). Moreover, the work order should record how many visits were required to completely resolve the problem, how much distance was travelled, etc. These data are useful for estimating the cost of each response or repair activity. On the basis of such analyses, the management of a SMWU can then take improvement actions.

3.2.7 Financial and Economic Indicators

Different organizations have split the financial indicators into billing, pricing, asset, operating cost, and other categories (see Table 2.2 for details). For a small municipality it is more important to know whether or not its operating costs are being met by the revenues it is generating, and whether or not it is servicing its debts (WB 2011). IWA (2006) has provided a list of 46 financial indicators by splitting the cost according to type (i.e., main functions of the water utility, technical functions, etc.), which might not be practical for SMWU. An effort is made in Table 3.7 to identify the important financial PIs.
Utilities can also select additional and advanced indicators as per their requirements and data availability. The running costs of a water utility consist of overall O&M costs and the cost of permanent manpower, whereas the capital costs include net interest and depreciation during the assessment period.

Table 3.7 Proposed financial/economic indicators

1. Revenue per unit volume of supplied water ($/m3) – Use: M/P
   Calculation: [(Operating revenues − capitalized costs of the constructed assets)/(Authorized consumption during the year)]
   Data variables: operating revenue during the year; authorized consumption during the year
   Additional (long-term) indicators(1): sales revenues; other (if applicable) revenues

2. Non-revenue water (NRW) – Use: T/M/P
   Calculation: [(Cost of the system input volume) − (Cost of the billed authorized consumption)]
   Data variables: system input volume; data of billed consumption; unit cost of water

3. Unit total costs ($/m3) – Use: M/P
   Calculation: [(Total costs, including running and capital costs)/(Authorized consumption during the year)]
   Data variables: running costs; capital costs; authorized consumption during the year
   Additional (long-term) indicators(1): unit running costs; unit capital costs

4. Unit investment ($/m3) – Use: M/P
   Calculation: [(Cost of investments (expenditures for plant and equipment))/(Authorized consumption during the year)]
   Data variables: cost of total expenditures for plant and equipment; authorized consumption during the year

5. Average water charges ($/m3) – Use: M
   Calculation: [(Water sales revenue from all types of customers)/(Total authorized consumption during the year)]
   Data variables: total revenue from total water sold; authorized consumption during the year

6. Operating cost coverage ratio – Use: M
   Calculation: [(Total annual operational revenues)/(Total annual operating costs)]
   Data variables: total operational revenue from total water sold; total annual operating costs
   Additional (long-term) indicators(1): delays in accounts receivable; investment ratio; average depreciation ratio; late payment ratio

7. Debt service ratio (%) – Use: M/P
   Calculation: [(Cash income)/(Financial debt service, FDS)] x 100
   Data variables: total annual net income; FDS, comprising interest expenses, the cost of loans, and the principal repayment of debt instruments
   Additional (long-term) indicator(1): debt equity ratio

8. Liquidity (current ratio) – Use: M/P
   Calculation: [(Current assets)/(Current liabilities)]
   Data variables: current assets (cash in hand, accounts receivable, inventories and prepaid expenses); current liabilities (accounts payable, current liabilities and the current portion of remaining long-term liabilities)

9. Underground infrastructure renewed or rehabilitated (%) – Use: T/M/P
   Calculation: [(Underground infrastructure renewed or rehabilitated annually)/(Total underground infrastructure)] x 100
   Data variables: lengths of underground water mains renewed or rehabilitated in a year; total lengths of mains
   Additional indicators: value of horizontal components of infrastructure renewed or rehabilitated (%); value of vertical components of infrastructure renewed or rehabilitated (%)

10. Additional (long-term) indicators(1): manpower cost; electrical energy costs

11. Additional (long-term) indicators(1): management functions cost; financial and commercial functions costs; customer service functions costs; technical service functions cost

12. Additional (long-term) indicators(1): water resources and catchment management costs; abstraction and treatment costs; transmission, storage and distribution costs; water quality monitoring cost

(1) Might be suitable for medium sized utilities in developed countries.

IWA (2006) recommended calculating the financial indicators over one year; however, assessment periods of less than a year can also be used with the necessary explanation. Wyatt (2010) developed a financial model to optimally manage NRW in developing countries. He stated that bill collection in developing countries is not as efficient as in developed countries; thus, there is a need to differentiate between the water that is billed and the actual revenue collected. The situation could be worse in the case of SMWU in developing countries.
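The ratio indicators in Table 3.7 reduce to one-line computations. The sketch below uses hypothetical year-end figures for a small utility; the function names are illustrative, not part of any standard.

```python
def operating_cost_coverage(annual_operational_revenue: float,
                            annual_operating_costs: float) -> float:
    """Table 3.7, PI 6: a value above 1.0 means revenues cover the running costs."""
    return annual_operational_revenue / annual_operating_costs

def debt_service_ratio_pct(cash_income: float, financial_debt_service: float) -> float:
    """Table 3.7, PI 7: cash income relative to interest, loan costs, and principal repayments."""
    return 100.0 * cash_income / financial_debt_service

def current_ratio(current_assets: float, current_liabilities: float) -> float:
    """Table 3.7, PI 8 (liquidity)."""
    return current_assets / current_liabilities

# Hypothetical year-end figures (dollars):
print(operating_cost_coverage(480_000, 400_000))  # 1.2
print(debt_service_ratio_pct(90_000, 60_000))     # 150.0
print(current_ratio(250_000, 125_000))            # 2.0
```

Even without benchmarking partners, a SMWU can track these three ratios year over year as a first check on financial viability.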
In such circumstances, an indicator in terms of revenue per unit volume of supplied water can provide a more rational basis for cross-comparison than NRW based on billed authorized consumption (refer to PIs 1 and 2 in Table 3.7). The PIs of unit total cost and unit investment given in Table 3.7 will facilitate life cycle costing and long-term asset management by utility managers and decision makers. The indicator of average water charges provides a rational basis for setting water charges based on the type and number of customers of each use (residential, agricultural, industrial, etc.). Operating cost coverage is an important PI for defining operational efficiency in financial terms. For private organizations, or a water utility developed with loans, the debt service ratio could be an important indicator even for a S-WSS. Indicators of investment ratio, depreciation ratio, and late payment ratio might not be significant for SMWU, except for high value, medium sized utilities in developed countries. Therefore, the additional PIs given in Table 3.7 can be selected on a long-term basis for such MWUs.

3.3 Summary

It is well recognized that PIs will differ between developing and developed countries due to variations and limitations in data availability. In this regard, PIs for SMWU have been identified in three stages (levels), following the detailed review in Chapter 2 of the PIs used by different organizations in both developed and developing countries. Start-up PIs are proposed for both developing and developed countries, requiring limited data to initiate the PA process; additional PIs are proposed for developed countries, and for developing countries if the data can be collected; and advanced PIs are proposed for MWU (having sufficient resources) in developed countries. The PIs identified in this chapter are further evaluated for final selection in Chapter 4.
Chapter 4     Selection of Performance Indicators

A part of this chapter has been published in Urban Water Journal as a research article titled “Selecting Performance Indicators for Small to Medium Sized Water Utilities: Multi-criteria Analysis using ELECTRE Method” (Haider et al. 2015a).

The PIs short-listed in Chapter 3 are further evaluated in this chapter using MCDA, to achieve a concise list of suitable PIs covering all the functional components of SMWU.

4.1 Background

A PI measures the performance of a program in terms of a percentage or an index; it is monitored at defined intervals and can be compared against one or more criteria or standards (OPM 1990). In Chapter 2, the existing systems of PIs in the literature were reviewed in the context of SMWU. A summary of the distribution of PIs, grouped into different categories covering the various organizational components of a water utility, was presented for the above-mentioned agencies in Table 2.2; a graphical representation is shown in Figure 4.1. It can be seen that the major categories are operational, quality of service, and financial PIs; however, the other categories also need to be given relative importance.

Although SMWU have lower participation in benchmarking processes, and often inadequate and inaccurate data for calculating PIs, such utilities in developed countries have the potential to improve their performance assessment process, given their ability to shift to the latest technologies and their eagerness (as well as structure) to improve their operational and monitoring data inventories. This situation stresses the need to identify and select appropriate and simple PIs for SMWU.

The literature review in Section 2.7 (Chapter 2) revealed that selecting suitable PIs using ordinal (qualitative) scales with a small rating range (1 to 5) generates significantly small differences in final scores, which might not be easy for the decision maker (DM) to interpret.
Performance assessment is a continuous process, and defining a cut-off for the ranked PIs may limit the applicability of the selected PIs to the specific case, and also does not allow additional PIs to be included as future needs arise. Therefore, the method for selecting the PIs should adequately address these issues.

Figure 4.1 Distribution of PIs in different categories by various agencies, a graphical representation of Table 2.2

MCDA outranking methods such as Elimination and Choice Translating Reality (ELECTRE), based on pairwise comparisons of alternatives, are suitable for qualitative attributes, and also when the differences between evaluations are small (Kabir et al. 2013; Figueira et al. 2005). In this research, the ELECTRE method is used for three main reasons. Firstly, by accumulating the small scoring differences between alternatives (PIs) under each criterion, distinct outranking relations between different PIs can be established. Secondly, the network diagrams established from the outranking relations between all the PIs included in the evaluation process provide an opportunity to start with the most important PIs when initiating the PA process, and to add more PIs at later stages. In this way, the PIs that might not be important for a specific utility, or in the view of the decision makers, remain available in the network diagrams; this is not possible if a discrete cut-off is used. Thirdly, the final ranking based on the overall dominance structure obtained through the ELECTRE method can be used to allocate importance weights to the PIs during the detailed PA process for developing performance indices.

4.2 Modeling Approach

The modeling approach used for the selection of PIs for SMWU is shown in Figure 4.2. Initially, potential PIs (grouped into the 7 most commonly used categories) were screened from the existing PIs available in the literature using a simple checklist process in Chapter 3.
A set of four criteria was established to evaluate the suitability of an indicator using multicriteria analysis. Weights for each criterion were assigned using the Analytical Hierarchy Process (AHP) through a group decision-making process. A matrix between PIs and evaluation criteria was generated by scoring each indicator against each criterion. The ELECTRE method was used to develop the outranking relationships between the indicators under each category and to establish the final preferences. The details of each step of the framework shown in Figure 4.2 are presented below.

4.2.1 Criteria for Selection of PIs and Ranking System

From the initial screening in Chapter 3, the 114 potential PIs presented in Table 4.1 were identified for further evaluation using MCDA. The following criteria have been used to evaluate the most suitable PIs for SMWU:

C1 – Applicability: how applicable and relevant is the indicator for the performance assessment of SMWU? It is related to the overall technical, environmental, and socio-economic relevance of the indicator.
C2 – Understandability: how understandable is the indicator to both the public and the utility personnel? It is related to the type of data the PI involves and the interpretability of the indicator.
C3 – Measurability: how measurable is the indicator? It is related to the availability, accuracy, and frequency of the monitoring data required to calculate the indicator.
C4 – Comparability: how comparable is the calculated value of the indicator with other similar utilities in the region and/or at the national or international level?

It is important to define the ranking system (as far as possible) to facilitate the scoring process.
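The AHP weighting step for the four criteria can be sketched as follows. The pairwise comparison matrix below is hypothetical (the actual group judgments are developed in this chapter), and the row geometric mean method is used here as a standard approximation to AHP's principal-eigenvector weights.

```python
import math

# Hypothetical pairwise comparison matrix (Saaty 1-9 scale) for
# C1 applicability, C2 understandability, C3 measurability, C4 comparability;
# the judgments are illustrative only, not the thesis values.
A = [
    [1.0,   3.0, 2.0,   5.0],
    [1 / 3, 1.0, 1 / 2, 2.0],
    [1 / 2, 2.0, 1.0,   3.0],
    [1 / 5, 1 / 2, 1 / 3, 1.0],
]

def ahp_weights(matrix):
    """Approximate AHP priority weights using the row geometric mean method."""
    n = len(matrix)
    gmeans = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gmeans)
    return [g / total for g in gmeans]

w = ahp_weights(A)
print([round(x, 3) for x in w])  # with these judgments, applicability gets the largest weight
```

In practice a consistency ratio check would accompany this step before the weights are carried into the ELECTRE scoring matrix.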
In this connection, an attempt has been made in this study to define all the ranks under each criterion in Table 4.2. The selected criteria were defined on ordinal scales and ranked on a 5-point scale, translated as 1 'Very Low', 2 'Low', 3 'Average', 4 'High', and 5 'Very High'. Applicability and measurability span all 5 ranks with an interval of 1, whereas the understandability and comparability criteria were ranked as 1, 3, or 5. Consistent variability is assumed in the ranking process for all the criteria.

For the comparability criterion, higher ranks have been given to the indicators included in the National Water and Wastewater Benchmarking Initiative (NWWBI), which evaluated the performance of 41 water utilities (generally LWU with populations greater than 50,000) for the year 2010. The minimum, average, and maximum values of the PIs given in the public report published in 2012 by AECOM provide an opportunity for utilities to compare the values of their PIs with other utilities in the region.

Figure 4.2 Modeling approach for selection of PIs for SMWU

Table 4.1 Selected PIs through initial screening

WE – WATER RESOURCES AND ENVIRONMENTAL
WE-1 No. of days of water restriction (%)
WE-2 Average daily per capita domestic water consumption
WE-3 Average day demand / existing water license capacity
WE-4 Energy consumption in kWh (D&T)
WE-5 Impact of pipe flushing on aquatic life
WE-6 Disposal of backwash water (% residuals)
WE-7 Sector-wise availability of water resources (domestic, industrial, etc.)
WE-8 GHG emissions from routine transport fuel emissions
WE-9 Per capita water consumption (overall)

PE – PERSONNEL (STAFFING)
PE-1 Number of in-house metering field FTEs(1)/1000 meters
PE-2 Water quality monitoring personnel (No/1000 tests/year)
PE-3 Water resources and catchment management employees (No/10^6 m3/year)
PE-4 Number of field FTEs/100 km length
PE-5 Number of field FTEs/1000 ML treated water
PE-6 No. of lost hours due to field accidents/1000 field labour hours – (D)(2)
PE-7 No. of lost hours due to field accidents/1000 field labour hours – (T)(3)
PE-8 No. of sick days taken per field employee – (D)
PE-9 No. of sick days taken per field employee – (T)
PE-10 Total overtime field hours/total paid field hours – (D)
PE-11 Total overtime field hours/total paid field hours – (T)
PE-12 Personnel training (hours/employee/year)
PE-13 Working accidents (No/100 employees/year)
PE-14 No. of field accidents with lost time/1000 field labour hours – (D)
PE-15 No. of field accidents with lost time/1000 field labour hours – (T)
PE-16 Total available field hours/total paid field hours – (D)
PE-17 Total available field hours/total paid field hours – (T)
PE-18 % of field employees eligible for retirement per year – (D)
PE-19 % of field employees eligible for retirement per year – (T)
PE-20 Average work experience ratio
PE-21 Employees per connection (No/1000 connections)
PE-22 Employees per volume of water supplied

PH – PHYSICAL
PH-1 Metering level (%)
PH-2 Degree of automation (%)
PH-3 Raw water storage capacity (days)
PH-4 Treated water storage capacity at ADD (hrs)
PH-5 Treatment plant capacity
PH-6 Pumping utilization (%)
PH-7 Remote control degree (%)
PH-8 Pump station energy consumed (kWh)/total pump station HP
PH-9 Hydrant density (No/km)
PH-10 Valve density (No/km)
PH-11 Metering density (No/1000 service connections)
PH-12 Treatment plant capacity (%)

OP – OPERATIONAL
OP-1 Service connection rehabilitation (%)
OP-2 Replaced valves (%/year)
OP-3 Mains replaced (%/year)
OP-4 Mains rehabilitation/renovation (%/year)
OP-5 Hydrant inspection (per year)
OP-6 Leakage (%/year)
OP-7 Cleaning of storage tanks (per year)
OP-8 Non-revenue water (L/connection/day)
OP-9 No. of main breaks (No./100 km)
OP-10 Inoperable or leaking hydrants (%)
OP-11 Residential customer reading efficiency
OP-12 Operational meters
OP-13 Infrastructure Leakage Index (ILI)
OP-14 Network inspection (per year)
OP-15 Pump inspection (per year)
OP-16 Apparent losses per connection
OP-17 Apparent losses per system input volume
OP-18 Real losses per connection (L/connection/day w.s.p.)
OP-19 Real losses per main length (L/km/day w.s.p.)
OP-20 % of inoperable or leaking valves
OP-21 Customer reading efficiency
OP-22 Power failure

WP – WATER QUALITY AND PUBLIC HEALTH
WP-1 No. of boil-water advisory days
WP-2 Cumulative length cleaned as % of system length
WP-3 Average value of turbidity in WDS (NTU)
WP-4 No. of total coliform occurrences in WDS
WP-5 THMs in water distribution system (mg/L)
WP-6 Residual chlorine in distribution system (mg/L)
WP-7 Turbidity of treated water (NTU)
WP-8 No. of total coliform occurrences in treated water
WP-9 Concentration of nitrates in treated water (mg/L)
WP-10 Aesthetic water quality tests carried out (%)
WP-11 Microbiological water quality tests carried out (%)
WP-12 Chemical water quality tests carried out (%)

QS – QUALITY OF SERVICE
QS-1 Billing complaints (%)
QS-2 Other complaints and queries (%) – service connection/leakage
QS-3 Number of water pressure complaints/1000 people served
QS-4 Number of water quality complaints/1000 people served
QS-5 Total response to reported complaints (%)
QS-6 Number of unplanned system interruptions/100 km main length
QS-7 Unplanned maintenance hours/total maintenance hours (%)
QS-8 Population coverage (%)
QS-9 Quality of water supplied
QS-10 Number of water quality complaints by reason/1000 served
QS-11 Total complaints per connection (No/1000/year)
QS-12 Continuity of supply (%)
QS-13 Aesthetic test compliance
QS-14 Microbiological test compliance
QS-15 Physico-chemical test compliance
QS-16 Radioactive test compliance

FE – FINANCIAL AND ECONOMIC
FE-1 O&M cost ('000)/km length ($/km)
FE-2 O&M cost of water treatment ($/million liters of treated water)
FE-3 Revenue per unit volume of supplied water ($/m3)
FE-4 Water rate for a typical size residential connection using 250 m3/year
FE-5 Operating cost coverage ratio
FE-6 Debt service ratio (%)
FE-7 NRW by volume
FE-8 Liquidity (current ratio)
FE-9 5-year running average capital reinvestment/replacement value – (D)
FE-10 Cost of O&M of fire hydrants/total number of fire hydrants
FE-11 Metering O&M cost
FE-12 Pump station O&M cost ('000)/total pump station horsepower
FE-13 Cost of customer communication/population served
FE-14 Cost of water quality monitoring/population served ($/person)
FE-15 Chemical cost/ML treated ($/million liters of treated water)
FE-16 Water revenue per employee

(1) Full-time employees.  (2) Distribution system.  (3) Treatment.

Table 4.2 Scoring system and definition of criteria

C1 Applicability/Relevance
1 Very Low – Seems to be irrelevant for SMWU.
2 Low – Has low relevance to SMWU.
3 Average Average applicability for SMWU. 4 High Highly applicable for performance assessment of SMWU. 5 Very High Has to be included, extremely important for SMWU. C2 Understandability  1 Low PI is difficult for everyone to understand and interpret. 3 Average  PI is understandable to utility personnel but might not be understandable to public. 5 High PI is understandable to both the public and the utility personnel. C3 Measurability  1 Very low Both the data variables are measured at very low frequency. 2 Low Both the variables are measured at lower frequencies. 3 Average Some of the variables are absolute and some are monitored at low frequency. 4 High Some of the variables are absolute and some are monitored at high frequency. 5 Very High Values of all the variables are known with their absolute values, e.g., fixed physical assets. C4 Comparability  1 Low PIs have rarely been used by the similar utilities. 3 Average  PIs have been used by the water utilities outside the region. 5 High PIs have been used by the utilities in region.   Multicriteria Decision Analysis (MCDA) 4.2.24.2.2.1 Analytic Hierarchy Process  The analytic hierarchy process (AHP), developed by Saaty (1980), formulize the human intuitive understanding of a complex problem with the help of a hierarchical structure. However, instead of using AHP for a complex decision making problem, in this research it is used to determine the weights of above mentioned criteria for selection of PIs. The step-by-step approach is described below:  Step 1: Pairwise comparison matrix The first step is to set the preferences concerning the four selected evaluation criteria to develop the pairwise comparison matrix ‘A’. The nine point intensity scale established by Satty and Vargas (1991) varies between 1 and 9 (degree of preferences) for equal importance to extreme importance shown in Table 4.3 is used for pairwise comparison. 
In the present research all the criteria have their specific importance for the selection of PIs; therefore, even numbers are also used.

Table 4.3 Evaluation scale used in pairwise comparison
Scale       Degree of preference
1           Equal importance
3           Moderate importance of one factor over another
5           Strong or essential importance
7           Very strong importance
9           Extreme importance
2, 4, 6, 8  If a compromise judgment is required

Step 2: Normalized comparison matrix
In this step, the normalized comparison matrix is formed. Normalization is done by dividing each value of the pairwise comparison matrix by the sum of the corresponding column. The corresponding ratings (weights 'w') are determined by averaging the values in each row of the normalized matrix.

Step 3: Consistency analysis
The next step is to check the consistency of the original preference ratings. In this regard, the maximum eigenvalue λmax first needs to be calculated using the following equation:

\lambda_{max} = \frac{1}{n}\sum_{i=1}^{n}\frac{(Aw)_i}{w_i}   [4.1]

and then the consistency index (CI) can be calculated as:

CI = \frac{\lambda_{max} - n}{n - 1}   [4.2]

The numerator in equation [4.2] is a measure of the deviation of the inconsistent matrix from the consistent comparison matrix, and provides the basis to determine CI, showing the consistency of the expert's estimates (Cabała 2010). Next, the random index (RI) is obtained, which is essentially the CI of a randomly generated pairwise comparison matrix. The RI values proposed by Saaty (1980) are given in Table 4.4. The order of the matrix 'n' is four in this study, and thus RI is 0.9.

Table 4.4 Random indices established by Saaty (1980)
n1   1     2     3     4     5     6     7     8     9     10
RI   0.0   0.0   0.58  0.9   1.12  1.24  1.32  1.41  1.46  1.49
1n = order of matrix

Finally, the consistency ratio (CR) is calculated as:

CR = \frac{CI}{RI}   [4.3]

Generally, a value of CR equal to or less than 0.1 is considered acceptable.
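As an illustration, Steps 1 to 3 can be sketched in a few lines of Python (a minimal sketch assuming NumPy; the pairwise matrix shown follows the form of Table 4.5):

```python
import numpy as np

# Pairwise comparison matrix 'A' (criteria order: relevance, comparability,
# measurability, understandability), as in Table 4.5.
A = np.array([[1.0, 2.0, 4.0, 5.0],
              [1/2, 1.0, 2.0, 3.0],
              [1/3, 1/2, 1.0, 2.0],
              [1/5, 1/3, 1/2, 1.0]])
n = A.shape[0]

# Step 2: normalize each column by its sum, then average across each row.
weights = (A / A.sum(axis=0)).mean(axis=1)

# Step 3: consistency analysis (equations [4.1]-[4.3]).
lam_max = ((A @ weights) / weights).mean()   # equation [4.1]
CI = (lam_max - n) / (n - 1)                 # equation [4.2]
RI = [0.0, 0.0, 0.58, 0.9, 1.12][n - 1]      # Table 4.4 (Saaty 1980)
CR = CI / RI                                 # equation [4.3]

print(weights.round(2))   # approx. [0.5, 0.26, 0.15, 0.09]
print(CR < 0.1)           # the judgments are acceptably consistent
```

The resulting weights closely agree with the rounded values reported in Section 4.3.1 (0.5, 0.25, 0.15, 0.1), and the CR stays below the 0.1 acceptability threshold.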
An example of the application of the above-mentioned method to determine the weights of relevance, comparability, measurability, and understandability for the selection of PIs is enclosed as Appendix-B.

4.2.2.2 Elimination and Choice Translating Reality

Trade-offs among criteria values are not desirable here, given the distinct importance of each criterion in the overall decision-making framework (Figure 4.2); therefore, a non-compensatory method is required to solve this type of decision-making problem. ELECTRE 1, a non-compensatory method based on outranking relation theory, has been used here as a suitable MCDA method. The method was first introduced by Benayoun et al. (1966), after which it was used in several decision-making problems such as water resource management, infrastructure management, material selection, and transportation (Zardari et al. 2010, Coutinho-Rodrigues et al. 2011, Pang et al. 2011, Anton and Grau 2004). Other methods such as ELECTRE III, IV, and IS are based on a fuzzy ranking approach (Figueira et al. 2005); however, in this research average values of the ranks have been used for simplicity. ELECTRE 1 provides the opportunity to develop visual network maps of the outranking relationships between different alternatives (PIs), which is convenient for the decision makers in the final selection of PIs. Moreover, unlike in compensatory methods, in the ELECTRE 1 method the weights are coefficients of importance and not criteria substitution rates (Milani et al. 2006).

Secondly, all the evaluation criteria are defined on ordinal scales in Table 4.2. In such a situation, substantial preferences between various alternatives (PIs in this study) cannot be established on the basis of small differences of evaluations, though the accumulation of numerous small differences may become significant. The preference structure in the ELECTRE 1 method can be generated using discrimination thresholds (indifference and preference) (Figueira et al. 2005).
In this study the outranking relationships between the alternatives (PIs) are distributed at different levels. The DM can encompass the desired (most important) levels with the help of a decision makers' boundary (DMB). Moreover, the PIs can finally be ranked according to their preferences based on net concordance and net discordance indexes.

In the ELECTRE 1 method, concordance and discordance indexes (two types of pairwise comparison indices) are formulated to form outranking relationships between alternatives (Ap and Aq, where p, q = 1, 2, ..., m and p ≠ q). The indexes can be considered as measures of the satisfaction and dissatisfaction of a DM while giving preference to a specific alternative over the others. If X1, X2, ..., Xn are the criteria for evaluation of the alternatives, then xij is the value assigned to alternative Ai with respect to criterion Xj. The following steps were followed after determining the weights (i.e., w1, w2, ..., wn) of the criteria using the AHP method.

Step 1: Normalization of the weighted matrix
Depending on the type of criterion (i.e., benefit or cost; for a benefit criterion higher is better, and vice versa), the normalized matrix Rij is developed. In the case of mixed criteria, the values of the cost criteria are inverted.

R_{ij} = \frac{x_{ij}}{\sqrt{\sum_{i=1}^{m} x_{ij}^2}}, \quad i = 1, 2, \ldots, m \text{ and } j = 1, 2, \ldots, n   [4.4]

Each value in the normalized matrix is multiplied by the corresponding criterion weight, so the normalized weighted matrix attains the following form:

V_{ij} = R_{ij}W = \begin{bmatrix} w_1 r_{11} & \cdots & w_n r_{1n} \\ \vdots & \ddots & \vdots \\ w_1 r_{m1} & \cdots & w_n r_{mn} \end{bmatrix}   [4.5]

where all the values in the Vij matrix range between 0 and 1.

Step 2: Develop concordance and discordance sets
As described earlier, the set of criteria is divided into concordance and discordance sets for the pairwise comparison between any two alternatives (Ap and Aq).
Therefore, the resultant concordance interval set C(p,q) consists of all the attributes for which Ap is preferred over Aq, and can be presented as:

C(p,q) = \{\, j \mid v_{pj} \geq v_{qj} \,\}   [4.6]

where vpj and vqj are the weighted normalized ratings (from the matrix given in equation [4.5]) of the alternatives Ap and Aq respectively with respect to the jth attribute.

The discordance set D(p,q), as the complement of C(p,q), consists of all the attributes for which Ap is worse than Aq, and can be written as:

D(p,q) = \{\, j \mid v_{pj} < v_{qj} \,\}   [4.7]

Step 3: Calculate concordance and discordance indexes
The concordance index Cpq shows the relative power of each concordance set by describing the degree of confidence in the pairwise decision Ap outranks Aq, and can be defined as:

C_{pq} = \sum_{j \in C(p,q)} w_j   [4.8]

Therefore, Cpq is essentially the sum of the weights of the attributes in equation [4.6]. The power of the discordance set is quantified by the discordance index Dpq, which shows the degree of disagreement in the decision that Ap outranks Aq, and can be written as:

D_{pq} = \frac{\max_{j \in D(p,q)} \lvert v_{pj} - v_{qj} \rvert}{\max_{j = 1,\ldots,n} \lvert v_{pj} - v_{qj} \rvert}   [4.9]

The numerator in equation [4.9] contains the attributes given by equation [4.7].

Step 4: Defining the outranking relationships
The dominance of alternative Ap over Aq is stronger for higher Cpq and lower Dpq. The results of the ELECTRE 1 method in this study of selecting PIs for SMWU are summarized as outranking relationships between pairs of indicators. The calculated concordance and discordance indexes are compared with the means of these indexes as:

C_{pq} \geq \bar{C}   [4.10a]

and

D_{pq} \leq \bar{D}   [4.10b]

where C̄ and D̄ are the averages of all the Cpq and Dpq respectively. The outranking relationship between two alternatives only holds true when both relations given in equations [4.10a] and [4.10b] are satisfied. The PIs may have different types of relationships among each other based on the conditions of equations [4.10a] and [4.10b].
If both equations hold true, alternative p is better than (outranks) q; if both hold untrue, alternative p is indifferent to q; and if one holds true and the other does not, p is incomparable to q. Figure 4.3 shows an example of these outranking relationships. The decision makers' boundary has been established based on the existing needs and data availability in the utilities participating in this study. Later, with improvements in the benchmarking process, the SMWU can include more PIs by extending the DMB. Conversely, smaller utilities can initiate the performance assessment process by encompassing fewer (top-level) PIs with the DMB.

Step 5: Overall ranking of performance indicators
The net outranking relationships can be developed by calculating a net concordance index (Cp) and a net discordance index (Dp) for each alternative. Cp measures the degree to which the dominance of an alternative over the other alternatives exceeds the dominance of the other (competing) alternatives over that alternative (i.e., Ap), and can be calculated as:

C_p = \sum_{\substack{k=1 \\ k \neq p}}^{m} C_{pk} - \sum_{\substack{k=1 \\ k \neq p}}^{m} C_{kp}   [4.11]

In the same way, Dp estimates the relative weakness of Ap with respect to the other alternatives and can be written as:

D_p = \sum_{\substack{k=1 \\ k \neq p}}^{m} D_{pk} - \sum_{\substack{k=1 \\ k \neq p}}^{m} D_{kp}   [4.12]

The overall preference (ranks) can be established from the Cp and Dp values of each alternative: higher Cp and lower Dp values receive higher ranks. The final ranking is based on the rankings of the net concordance and net discordance values estimated from equations [4.11] and [4.12].

Figure 4.3 Outranking relations of water resources and environmental PIs showing DMB

4.3 Application of MCDA: An Example of Water Resources and Environmental PIs

4.3.1 Estimation of Criteria Weights using AHP
The weights of the selected criteria were determined with the help of AHP. The pairwise comparison matrix for the estimation of weights is shown in Table 4.5. The rating scheme given in Table 4.3 is used.
The normalized comparison matrix is presented in Table 4.6. The weights of relevance, comparability, measurability, and understandability came out to be 0.5, 0.25, 0.15, and 0.1 respectively. The values of the consistency index (CI) and consistency ratio (CR) were found to be 0.007 and 0.01 respectively. A CR value of less than 10% affirms the consistency check.

Table 4.5 Pairwise comparison matrix for weight estimation using AHP
                       Relevance  Comparability  Measurability  Understandability
Relevance/ Importance  1          2              4              5
Comparability          1/2        1              2              3
Measurability          1/3        1/2            1              2
Understandability      1/5        1/3            1/2            1
Note: Consistent variability assumption; i.e., the values in this table are the averages of the rating values given by the decision makers.

[Figure 4.3 (network diagram): WE-2, WE-1, WE-6, WE-3, and WE-5 lie within the decision maker's boundary (DMB); WE-4, WE-7, WE-8, and WE-9 fall below it. Legend: arrows indicate that one PI is better than, indifferent to, or incomparable to another; PIs are grouped at levels; all the PIs below the DMB arrow are outranked by all the PIs above it. The same legend is applicable to all groups of PIs.]

Table 4.6 Normalized comparison matrix for weight estimation using AHP
                       Relevance  Comparability  Measurability  Understandability
Relevance/ Importance  0.51       0.52           0.53           0.45
Comparability          0.26       0.26           0.27           0.27
Measurability          0.13       0.13           0.13           0.18
Understandability      0.10       0.09           0.07           0.09

4.3.2 Development of Outranking Relationships using ELECTRE
Initially, 9 PIs were identified for the functional component of water resources and environmental (WE). The scoring matrix, developed (using the scoring system given in Table 4.2) based on experienced judgment for all the categories, is presented in Table 4.7. However, as an example, only the scores for the WE category are considered in the following calculations. All the values are higher-the-better. The normalized weighted matrix obtained using equations [4.4] and [4.5] is presented in Table 4.8.
Applying equations [4.6] and [4.7] to the values given in Table 4.8 yields the concordance and discordance interval sets presented in Table 4.9.

Table 4.7 The scoring matrix along with criteria weights
DM ranking scores for each PI under (C1) Relevance/ Importance, (C2) Comparability, (C3) Measurability, and (C4) Understandability; criteria weights: 0.48, 0.25, 0.16, 0.11. Scores are listed in the order C1 C2 C3 C4.

WE  WATER RESOURCES AND ENVIRONMENTAL
WE-1  No. of days of water restriction (%)  5 5 5 5
WE-2  Average daily per capita domestic water consumption  5 5 4 5
WE-3  Average day demand / existing water license capacity  4 3 4 5
WE-4  Availability of water resources (%)  3 3 4 3
WE-5  Impact of pipe flushing on aquatic life  4 3 3 3
WE-6  Disposal of backwash water (% residuals)  4 4 4 5
WE-7  Sector-wise availability of water resources (domestic, industrial, etc.)  3 2 4 2
WE-8  GHG emissions from routine transport fuel emissions  2 4 4 2
WE-9  Per capita water consumption (overall)  2 3 4 3

PE  PERSONNEL
PE-1  Number of in-house metering field FTEs1/ 1000 meters  4 4 4 5
PE-2  Water quality monitoring personnel (No/ 1000 tests/ year)  4 3 4 3
PE-3  Water resources and catchment management employees (No/ 106 m3/ year)  4 3 5 3
PE-4  Number of field FTEs/ 100 km length  4 3 5 5
PE-5  Number of field FTEs/ 1000 ML treated water  4 3 4 5
PE-6  No. of lost hours due to field accidents/ 1000 field labour hours (D)2  4 2 5 5
PE-7  No. of lost hours due to field accidents/ 1000 field labour hours (T)3  4 2 5 5
PE-8  No. of sick days taken per field employee (D)  4 3 5 5
PE-9  No. of sick days taken per field employee (T)  4 3 5 5
PE-10  Total overtime field hours/ total paid field hours (D)  4 3 4 5
PE-11  Total overtime field hours/ total paid field hours (T)  4 3 4 5
PE-12  Personnel training (hours/ employee/ year)  4 3 5 3
PE-13  Working accidents (No/ 100 employees/ year)  2 3 5 3
PE-14  No. of field accidents with lost time/ 1000 field labour hours (D)  2 3 5 5
PE-15  No. of field accidents with lost time/ 1000 field labour hours (T)  2 3 5 5
PE-16  Total available field hours/ total paid field hours (D)  2 3 4 5
PE-17  Total available field hours/ total paid field hours (T)  2 3 4 5
PE-18  % of field employees eligible for retirement per year (D)  2 2 4 5
PE-19  % of field employees eligible for retirement per year (T)  2 2 4 5
PE-20  Average work experience ratio
PE-21  Employees per connection (No/ 1000 connections)  2 3 5 3
PE-22  Employees per volume of water supplied  2 3 5 3

PH  PHYSICAL ASSETS
PH-1  Metering level (%)  5 4 5 4
PH-2  Degree of automation (%)  5 3 5 3
PH-3  Raw water storage capacity (days)  5 4 4 3
PH-4  Treated water storage capacity at ADD (hrs)  4 3 4 5
PH-5  No. of days treatment plant operated greater than 90% of its total capacity  4 3 4 5
PH-6  Pumping utilization (%)  4 4 4 2
PH-7  Remote control degree (%)  4 3 5 3
PH-8  Pump station energy consumed (kWh)/ total pump station HP  2 3 3 5
PH-9  Hydrant density (No/ km)  3 3 5 2
PH-10  Valve density (No/ km)  3 2 5 2
PH-11  Metering density (No/ 1000 service connections)  2 3 5 3
PH-12  Treatment plant capacity (%)  3 5 4 2

OP  OPERATIONAL
OP-1  Service connection rehabilitation (%)  5 5 4 5
OP-2  Replaced valves (%/ year)  5 3 5 3
OP-3  Mains replaced (%/ year)  5 3 5 3
OP-4  Mains rehabilitation/ renovation* (%/ year)  5 3 3 3
OP-5  Hydrant inspection (per year)  5 3 5 3
OP-6  Leakage (%/ year)  5 5 4 3
OP-7  Cleaning of storage tanks (per year)  4 5 4 3
OP-8  Non-revenue water (L/ connection/ day)  5 3 3 5
OP-9  No. of main breaks (No./ 100 km)  5 3 5 5
OP-10  Inoperable or leaking hydrants (%)  2 3 3 5
OP-11  Residential customer reading efficiency  4 5 4 3
OP-12  Operational meters  4 5 4 3
OP-13  Infrastructure Leakage Index (ILI)  5 1 1 3
OP-14  Network inspection (per year)  3 3 3 3
OP-15  Pump inspection (per year)  3 3 3 3
OP-16  Apparent losses per connection  3 3 1 3
OP-17  Apparent losses per system input volume  3 3 1 3
OP-18  Real losses per connection (L/ connection/ day)  3 3 1 3
OP-19  Real losses per main length (L/ km/ day)  3 3 1 3
OP-20  % of inoperable or leaking valves  2 3 3 5
OP-21  Customer reading efficiency  2 3 4 3
OP-22  Power failure  2 3 4 2

WP  WATER QUALITY AND PUBLIC HEALTH
WP-1  No. of boil-water advisory days  5 5 5 4
WP-2  Cumulative length cleaned as % of system length  4 4 4 5
WP-3  Average value of turbidity in WDS (NTU)  5 3 4 4
WP-4  No. of total coliform occurrences in WDS  5 3 4 4
WP-5  THMs in water distribution system (mg/L)  5 3 3 4
WP-6  Residual chlorine in distribution system (mg/L)  5 3 4 4
WP-7  Turbidity of treated water (NTU)  5 3 4 4
WP-8  No. of total coliform occurrences in treated water  5 3 4 4
WP-9  Concentration of nitrates in treated water (mg/L)  5 3 3 4
WP-10  Aesthetic water quality tests carried out (%)  3 3 3 2
WP-11  Microbiological water quality tests carried out (%)  3 3 3 2
WP-12  Chemical water quality tests carried out (%)  3 3 3 2

QS  QUALITY OF SERVICE
QS-1  Billing complaints/ 1000 connections  5 5 4 3
QS-2  Service connection complaints/ 1000 people served  4 3 4 3
QS-3  Number of water pressure complaints/ 1000 people served  4 5 4 5
QS-4  Number of water quality complaints/ 1000 people served  4 5 4 5
QS-5  Total response to reported complaints (%)  4 5 4 3
QS-6  Number of unplanned system interruptions/ 100 km main length  4 3 4 5
QS-7  Unplanned maintenance hours/ total maintenance hours (%)  4 3 3 5
QS-8  Population coverage (%)  4 5 3 4
QS-9  Quality of water supplied  4 5 4 3
QS-10  Number of water quality complaints by reason/ 1000 served  2 2 4 5
QS-11  Total complaints per connection (No/ 1000/ year)**  3 4 4 2
QS-12  Continuity of supply (%)  2 5 4 3
QS-13  Aesthetic test compliance  3 3 3 3
QS-14  Microbiological test compliance  3 3 4 3
QS-15  Physical-chemical test compliance  3 3 3 3
QS-16  Radioactive test compliance  2 2 2 2

FE  FINANCIAL AND ECONOMIC
FE-1  O&M cost ('000)/ km length ($/km)  5 3 4 5
FE-2  O&M cost of water treatment ($/ million litres of treated water)  4 4 4 5
FE-3  Revenue per unit volume of supplied water ($/m3)  5 4 4 3
FE-4  Water rate for a typical size residential connection using 250 m3/year  3 4 5 5
FE-5  Operating cost coverage ratio  4 3 4 4
FE-6  Debt service ratio (%)  4 3 4 4
FE-7  NRW by volume  4 3 4 4
FE-8  Liquidity (current ratio)  2 3 4 4
FE-9  5-year running average capital reinvestment/ replacement value (D)  2 3 4 5
FE-10  Cost of O&M of fire hydrants/ total number of fire hydrants  2 3 4 5
FE-11  Metering O&M cost  2 3 4 5
FE-12  Pump station O&M cost ('000)/ total pump station horsepower  2 3 4 5
FE-13  Cost of customer communication/ population served  2 3 3 5
FE-14  Cost of water quality monitoring/ population served ($/ person)  2 4 4 5
FE-15  Chemical cost/ ML treated ($/ million litres of treated water)  2 4 4 5
FE-16  Water revenue per employee  2 2 4 3

1 Full-time employees; 2 Distribution system; 3 Treatment

Table 4.8 The normalized weighted matrix (WE category; columns C1 C2 C3 C4, weights 0.48, 0.25, 0.16, 0.11)
WE-1  No. of days of water restriction (%)  0.214 0.049 0.068 0.108
WE-2  Average daily per capita domestic water consumption  0.214 0.049 0.054 0.108
WE-3  Average day demand / existing water license capacity  0.171 0.030 0.054 0.108
WE-4  Availability of water resources (%)  0.128 0.030 0.054 0.065
WE-5  Impact of pipe flushing on aquatic life  0.171 0.030 0.041 0.065
WE-6  Disposal of backwash water (% residuals)  0.171 0.040 0.054 0.108
WE-7  Sector-wise availability of water resources (domestic, industrial, etc.)  0.128 0.020 0.054 0.043
WE-8  GHG emissions from routine transport fuel emissions  0.085 0.040 0.054 0.043
WE-9  Per capita water consumption (overall)  0.085 0.030 0.054 0.065

Table 4.9 Concordance and discordance interval sets for performance indicators in the WE category

Concordance interval sets:
C(1,2) = {1,2,3,4}  C(1,3) = {1,2,3,4}  C(1,4) = {1,2,3,4}  C(1,5) = {1,2,3,4}  C(1,6) = {1,2,3,4}  C(1,7) = {1,2,3,4}  C(1,8) = {1,2,3,4}  C(1,9) = {1,2,3,4}
C(2,1) = {1,2,4}  C(2,3) = {1,2,3,4}  C(2,4) = {1,2,3,4}  C(2,5) = {1,2,3,4}  C(2,6) = {1,2,3,4}  C(2,7) = {1,2,3,4}  C(2,8) = {1,2,3,4}  C(2,9) = {1,2,3,4}
C(3,1) = {4}  C(3,2) = {3,4}  C(3,4) = {1,3,4}  C(3,5) = {1,3,4}  C(3,6) = {1,3,4}  C(3,7) = {1,2,3,4}  C(3,8) = {1,3,4}  C(3,9) = {1,2,3,4}
C(4,1) = ∅  C(4,2) = ∅  C(4,3) = {1,2}  C(4,5) = {1,2,4}  C(4,6) = {1,2}  C(4,7) = {1,2,4}  C(4,8) = {1,4}  C(4,9) = {1,2,4}
C(5,1) = ∅  C(5,2) = ∅  C(5,3) = {1,2}  C(5,4) = {1,2,3,4}  C(5,6) = {1,2}  C(5,7) = {1,2,4}  C(5,8) = {1,2,4}  C(5,9) = {1,2,4}
C(6,1) = {4}  C(6,2) = {3,4}  C(6,3) = {1,2,3,4}  C(6,4) = {1,2,3,4}  C(6,5) = {1,2,4}  C(6,7) = {1,2,3,4}  C(6,8) = {1,2,3,4}  C(6,9) = {1,2,3,4}
C(7,1) = ∅  C(7,2) = {3}  C(7,3) = {3}  C(7,4) = {3}  C(7,5) = {3}  C(7,6) = {3}  C(7,8) = {1,3,4}  C(7,9) = {1,3}
C(8,1) = ∅  C(8,2) = {3}  C(8,3) = {2,3}  C(8,4) = {2,3}  C(8,5) = {2,3}  C(8,6) = {2,3}  C(8,7) = {2,3,4}  C(8,9) = {1,2,3}
C(9,1) = ∅  C(9,2) = {3}  C(9,3) = {2,3}  C(9,4) = {3,4}  C(9,5) = {3,4}  C(9,6) = {3}  C(9,7) = {2,3,4}  C(9,8) = {1,3,4}

Discordance interval sets:
D(1,2) = ∅  D(1,3) = ∅  D(1,4) = ∅  D(1,5) = ∅  D(1,6) = ∅  D(1,7) = ∅  D(1,8) = ∅  D(1,9) = ∅
D(2,1) = {3}  D(2,3) = ∅  D(2,4) = ∅  D(2,5) = ∅  D(2,6) = ∅  D(2,7) = ∅  D(2,8) = ∅  D(2,9) = ∅
D(3,1) = {1,2,3}  D(3,2) = {1,2}  D(3,4) = {2}  D(3,5) = {2}  D(3,6) = {2}  D(3,7) = ∅  D(3,8) = {2}  D(3,9) = ∅
D(4,1) = {1,2,3,4}  D(4,2) = {1,2,3,4}  D(4,3) = {3,4}  D(4,5) = {3}  D(4,6) = {3,4}  D(4,7) = {3}  D(4,8) = {2,3}  D(4,9) = {3}
D(5,1) = {1,2,3,4}  D(5,2) = {1,2,3,4}  D(5,3) = {3,4}  D(5,4) = ∅  D(5,6) = {3,4}  D(5,7) = {3}  D(5,8) = {3}  D(5,9) = {3}
D(6,1) = {1,2,3}  D(6,2) = {1,2}  D(6,3) = ∅  D(6,4) = ∅  D(6,5) = {3}  D(6,7) = ∅  D(6,8) = ∅  D(6,9) = ∅
D(7,1) = {1,2,3,4}  D(7,2) = {1,2,4}  D(7,3) = {1,2,4}  D(7,4) = {1,2,4}  D(7,5) = {1,2,4}  D(7,6) = {1,2,4}  D(7,8) = {2}  D(7,9) = {2,4}
D(8,1) = {1,2,3,4}  D(8,2) = {1,2,4}  D(8,3) = {1,4}  D(8,4) = {1,4}  D(8,5) = {1,4}  D(8,6) = {1,4}  D(8,7) = {1}  D(8,9) = {4}
D(9,1) = {1,2,3,4}  D(9,2) = {1,2,4}  D(9,3) = {1,4}  D(9,4) = {1,2}  D(9,5) = {1,2}  D(9,6) = {1,2,4}  D(9,7) = {1}  D(9,8) = {2}

Using equations [4.8] and [4.9], the concordance and discordance indexes are calculated and presented as the following matrices. The calculated concordance and discordance indexes are compared with the means of these indexes, C̄ and D̄. The values of the means were found to be 0.630 and 0.476 for concordance and discordance respectively. The outranking relationships for each alternative (indicator) were found by comparing the Cpq and Dpq values in the matrices with these mean values using equations [4.10a] and [4.10b]. The results are shown in Figure 4.3.
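The WE-category calculation can be reproduced end to end with a short script (a minimal sketch assuming NumPy; it applies equations [4.4] to [4.12] directly to the Table 4.7 scores, so individual index values may differ slightly from the printed tables, while the dominance of WE-1 is preserved):

```python
import numpy as np

# Criteria weights and WE-category scores WE-1..WE-9 from Table 4.7
# (columns: relevance, comparability, measurability, understandability).
w = np.array([0.48, 0.25, 0.16, 0.11])
X = np.array([[5, 5, 5, 5], [5, 5, 4, 5], [4, 3, 4, 5],
              [3, 3, 4, 3], [4, 3, 3, 3], [4, 4, 4, 5],
              [3, 2, 4, 2], [2, 4, 4, 2], [2, 3, 4, 3]], dtype=float)
m = X.shape[0]

# Equations [4.4]-[4.5]: vector normalization per criterion, then weighting.
V = X / np.sqrt((X ** 2).sum(axis=0)) * w

# Equations [4.6]-[4.9]: pairwise concordance and discordance indexes.
C = np.zeros((m, m))
D = np.zeros((m, m))
for p in range(m):
    for q in range(m):
        if p == q:
            continue
        diff = V[p] - V[q]
        C[p, q] = w[diff >= 0].sum()      # sum of weights where A_p >= A_q
        worst = np.abs(diff).max()        # largest separation on any criterion
        neg = diff < 0
        D[p, q] = np.abs(diff[neg]).max() / worst if neg.any() else 0.0

# Equations [4.10a,b]: A_p outranks A_q when C_pq >= mean(C) and D_pq <= mean(D).
off = ~np.eye(m, dtype=bool)
outranks = (C >= C[off].mean()) & (D <= D[off].mean()) & off

# Equations [4.11]-[4.12]: net concordance and net discordance.
net_C = C.sum(axis=1) - C.sum(axis=0)
net_D = D.sum(axis=1) - D.sum(axis=0)

# WE-1 (days of water restriction) dominates every other WE indicator.
assert outranks[0, 1:].all()
```

Because WE-1 scores at least as high as every other indicator on every criterion, its concordance row is all 1.0 and its discordance row all 0.0, so it outranks all eight competitors regardless of the normalization used.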
Concordance index matrix C (rows and columns in the order WE-1 to WE-9; diagonal omitted):

C =
  -     1.000 1.000 1.000 1.000 1.000 1.000 1.000 1.000
  0.837  -    1.000 1.000 1.000 1.000 1.000 1.000 1.000
  0.252 0.415  -    0.891 0.891 0.891 1.000 0.891 1.000
  0.000 0.000 0.585  -    0.837 0.585 0.837 0.272 0.837
  0.000 0.000 0.585 1.000  -    0.585 0.837 0.837 0.837
  0.252 0.415 1.000 1.000 0.837  -    1.000 1.000 1.000
  0.000 0.163 0.163 0.163 0.163 0.163  -    0.891 0.639
  0.000 0.163 0.273 0.273 0.273 0.273 0.524  -    0.748
  0.000 0.163 0.273 0.415 0.415 0.163 0.524 0.891  -

Discordance index matrix D:

D =
  -     0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000
  1.000  -    0.000 0.000 0.000 0.000 0.000 0.000 0.000
  1.000 1.000  -    0.120 0.140 1.000 0.000 0.060 0.000
  1.000 1.000 0.880  -    1.000 1.000 0.260 0.310 0.230
  1.000 1.000 0.860 0.000  -    1.000 0.150 0.120 0.130
  1.000 1.000 0.000 0.000 0.700  -    0.000 0.000 0.000
  1.000 1.000 1.000 0.740 0.850 1.000  -    0.310 0.430
  1.000 1.000 0.940 0.790 0.880 1.000 0.690  -    0.700
  1.000 1.000 1.000 0.770 0.870 1.000 0.570 0.300  -

The net outranking relationships have been developed by calculating the net concordance index (Cp) and net discordance index (Dp) for the selected indicators within the DMB using equations [4.11] and [4.12]. The results are presented in Table 4.10. The final ranking of the alternatives can also be established by plotting the net concordance against the net discordance.

Table 4.10 Net outranking of selected indicators
Indicator  Net concordance  Net discordance  Rank (net concordance)  Rank (net discordance)  Final ranking
WE-1       2.660            -4.000           1                       1                       1
WE-2       2.007            -2.000           2                       2                       2
WE-3       -1.137           2.000            4                       4                       4
WE-5       -2.557           3.212            5                       5                       5
WE-6       -0.973           0.788            3                       3                       3

4.4 Development of Indicators

The results of the MCDA for each category of PIs are described in the following sections. The discussion is limited to the selected PIs confined within the DMB in this study. Details of the remaining PIs outside the DMB are given in Chapters 2 and 3.
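The net indexes in Table 4.10 can be cross-checked from the concordance and discordance matrices printed above, restricted to the five indicators inside the DMB (a sketch assuming NumPy; since the printed discordance matrix is rounded, the net discordance values reproduce Table 4.10 only approximately, while the net concordance values and the final ranking match):

```python
import numpy as np

# Concordance and discordance submatrices for the five WE indicators inside
# the DMB, taken from the 9x9 matrices above
# (row/column order: WE-1, WE-2, WE-3, WE-5, WE-6; diagonal set to 0).
C = np.array([
    [0.000, 1.000, 1.000, 1.000, 1.000],
    [0.837, 0.000, 1.000, 1.000, 1.000],
    [0.252, 0.415, 0.000, 0.891, 0.891],
    [0.000, 0.000, 0.585, 0.000, 0.585],
    [0.252, 0.415, 1.000, 0.837, 0.000],
])
D = np.array([
    [0.00, 0.00, 0.00, 0.00, 0.00],
    [1.00, 0.00, 0.00, 0.00, 0.00],
    [1.00, 1.00, 0.00, 0.14, 1.00],
    [1.00, 1.00, 0.86, 0.00, 1.00],
    [1.00, 1.00, 0.00, 0.70, 0.00],
])

# Equations [4.11]-[4.12]: net concordance (higher is better) and
# net discordance (lower is better).
net_C = C.sum(axis=1) - C.sum(axis=0)
net_D = D.sum(axis=1) - D.sum(axis=0)

labels = ["WE-1", "WE-2", "WE-3", "WE-5", "WE-6"]
order = np.argsort(-net_C)            # rank by descending net concordance
print([labels[i] for i in order])     # ['WE-1', 'WE-2', 'WE-6', 'WE-3', 'WE-5']
```

The computed net concordance values (2.659, 2.007, -1.136, -2.558, -0.972) agree with Table 4.10 to rounding, and the final ranking WE-1, WE-2, WE-6, WE-3, WE-5 is reproduced.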
4.4.1 Water Resources and Environment (WE) Indicators
The results of the MCDA showing the outranking relations amongst the different WE indicators are presented in Figure 4.3. Litres per capita per day (lpcd) is a very important indicator for assessing the present and future water requirements of any WSS, but it needs to be used carefully. The 'lpcd' on the basis of total water requirement would be misleading. For example, in one of the utilities that took part in this study, 95% of all connections belong to single-family residences, whereas 23% of the total water consumption is used to meet the requirements of irrigation users, who account for only 1.4% of the total connections. Similar water consumption patterns are common in utilities operating in the Okanagan basin, where a small percentage of irrigation users consume a substantial portion of the total water consumption. Therefore, "lpcd for domestic users" was found to be the more appropriate indicator to assess per capita water consumption. According to the NWWBI 2010 public report, per capita water consumption for residential consumers varies between 168 and 593 lpcd in Canada (AECOM 2012).

Water restrictions (i.e., sprinkler regulations) ranging from 0 to 365 days in a year (depending on rainfall intensity and availability of water resources) have been promulgated by numerous water utilities to conserve limited freshwater resources all around the world. These restrictions need to be implemented effectively, even in regions with ample water resources, in order to achieve the long-term sustainability of water resources. The water license issued by the concerned regulatory authority underlies another important indicator for ensuring the sustainable use of water resources. This indicator compares the average daily demand with the allocated water license capacity (in terms of % of utilized capacity) to foresee future needs. Canadian water utilities have been using from less than 1% to 75% of their allocated capacity (AECOM 2012).
Utilities with higher values need short-term or long-term planning to improve their source water availability, along with implementation of their water conservation strategies. SMWU may have lower overall water demands and population growth rates in comparison to LWU. Despite this, a utility approaching its maximum allocated license capacity should evaluate all possible source water alternatives to meet its future water requirements. Selecting a new source may increase personnel requirements for catchment management and entail more complex water treatment operations (if the new source's water quality is inferior to the existing one). Consequently, utility management would need to include additional PIs under other categories as well (i.e., personnel, operational, quality of service, water quality).

Two indicators for environmental sustainability have been included in the DMB shown in Figure 4.3. The first is the disposal of backwash water, or the percentage of water treatment plant residuals discharged into the receiving water body. Residuals (waste) generated from water treatment facilities may contain toxic substances such as aluminum and manganese, owing to the use of coagulants, which can cause adverse impacts on aquatic life. Therefore, to avoid such impacts, these residuals should be properly monitored and managed using this indicator (ENR 1987). The second indicator, developed in this study, is the impact of pipe flushing on aquatic life. Periodic flushing is an important activity for keeping the mains healthy and providing safe water to the community by removing biofilm and corrosion tubercles. The flushed water may carry high chlorine concentrations or other pollutants. The indicator is based on a field observation by the operational staff of a participating utility, in which chlorinated water used for flushing a main entered natural surface drains supporting aquatic ecosystems.
Utilities should avoid routine flushing programs during dry seasons, when water volumes are low in receiving freshwater bodies.

According to the 2011 Water Research Foundation report "Energy Efficiency Best Practices for North American Drinking Water Utilities", SMWU might not be aware of their potential to manage their energy budget. In general, 80 to 90% of the energy is used in transmission and distribution of water from the treatment plant or source to the consumer. SMWU should carefully observe the efficiency of their pumping units for energy management. Therefore, to implement an efficient energy management plan, the first step is to keep a record of energy consumption for transmission, distribution, and treatment in terms of kilowatt-hours (kWh). This indicator is also included in the DMB under the WE category in Figure 4.3. However, the data for this PI are presently not available for performance assessment.

4.4.2 Personnel/Staffing (PE) Indicators

As in any other organization, the human resources of a water utility play an important role in meeting performance objectives. Because SMWU have relatively smaller spatial boundaries than LWU, operational staff are sometimes allocated to more than one specific task. For example, the field full-time employees (FTEs) responsible for service connection repairs will also look into issues related to hydrant leakage, main breaks, and catchment management. Therefore, indicators in this category must be selected carefully, avoiding both the inclusion of too many detailed (activity-specific) indicators and, at the other extreme, the lumping of too many activities into one.

Metering is one of the most important operational components of any water utility. Metering is directly associated with customer satisfaction, reduction and control of water loss, and an efficient billing system. FTEs working in the field to ensure an efficient metering system came out to be the top-level indicator in this category (Figure 4.4).
At the subsequent level, "FTEs per 100 km of pipe length" was found to be a useful indicator for comparing total field staffing strength with other utilities. According to the NWWBI 2012 public report, the value of this indicator ranged between 1.5 and 10.4 for the year 2010 in large Canadian utilities. This large variation might be due to higher numbers of customer complaints and operational issues in older systems.

The other indicators at this level, according to the summary of multicriteria analysis results shown in Figure 4.4, are the numbers of working hours lost to field accidents and to sick leave. However, these indicators need to be considered separately for distribution and treatment, as some SMWU operate without a conventional treatment facility, primarily owing to the availability of either freshwater sources with acceptably low turbidities (e.g., WSSs in the Okanagan Basin) or groundwater supplies. Secondly, SMWU also face financial challenges in installing conventional treatment for all of their WSSs, which can increase water rates considerably. In such cases, the utility requires fewer field personnel to maintain only residual chlorine levels. Assessing a utility's performance and/or comparing it with other utilities on a combined basis of such indicators might therefore be misleading.

Figure 4.4 Outranking relations of personnel PIs showing DMB

Indicators of personnel allocated to water quality monitoring, water treatment, and management of water resources and catchment were ranked at the third level in Figure 4.4. In the case of surface water sources (like the water utilities operating in the Okanagan Basin), catchment management is an important activity. The personnel are responsible for maintenance of drainage channels and natural surface slopes to ensure efficient runoff collection.
The number of water quality monitoring personnel also depends on the type and level of treatment facilities, source water quality, and age of the distribution mains. Therefore, performance comparison amongst utilities should be carried out carefully. Indicators of overtime hours and training of personnel were found at level 4 as per the MCDA results shown in Figure 4.4. Training of personnel in SMWU is an extreme necessity, as highly skilled and qualified personnel are difficult to hire and retain in smaller towns. Therefore, local residents (established in the utility area) can be hired and trained for long-term benefit. This indicator can be compared with other SMWU in terms of the number of training hours attended per employee during the assessment year.

4.4.3 Physical (PH) Indicators

This category of indicators describes the performance of the physical assets of the water utility. The outranking relations in this category, distributed at different levels circumscribed by the DMB, are shown in Figure 4.5. The indicators of level of metering and automation were found to be the most important ones. The metering level (i.e., percentage of metering) plays a significant role in assessing NRW, and is also a step towards sustainability through scale-based water pricing defined by water consumption. Automation of various components removes manual labour from different operations in a WSS; a higher degree of automation therefore means less need for operational staff. This indicator is primarily important for developed countries, where even SMWU possess a high degree of automation. For example, both of the Okanagan Basin utilities that participated in this study revealed that almost all of the physical components (i.e., pumps, motors, flow measuring devices, treatment units, etc.) of their water systems are fully automated.
The PIs found at level 2 in Figure 4.5 are associated with the capacities of different components of a WSS, including the capacity of raw water storage reservoirs and the capacity of treated water reservoirs. No outranking relationships were observed between these indicators, meaning they are not mutually comparable and are thus equally important. Use of these indicators for SMWU (relying on clear surface water sources) also requires care in defining the type of reservoirs. For example, where primary chlorination is the only treatment, storage reservoirs downstream of chlorination should be counted as treated water storage reservoirs, even though the water has not passed through conventional treatment units.

The third indicator included at level 2 is the remaining capacity of the water treatment facility. All these indicators are important for long-term planning to enhance the capacities of these physical assets with population growth and the resulting increase in water demands from various sectors, including domestic, industrial, commercial, public, and agricultural. The remaining water treatment plant capacity particularly needs to be assessed. Due to maximum daily demand or peak hourly demand factors higher than those used in the design of the treatment facility, this indicator may reveal the need for an additional unit much earlier than the planned year. In the NWWBI (2010) data, treatment plant capacity is calculated as the number of days the treatment plant operated at more than 90% of its capacity during a year. Values ranged between 0 and 213 days, with a median of around 8; these results show that most Canadian utilities have excess treatment capacity to meet future needs.

Figure 4.5 Outranking relations of physical PIs showing DMB

The indicators found at the third level in Figure 4.5 are the degree of remote control and the utilization of pumps.
Degree of remote control was given importance in the decision-making process because the utility managers expected that new equipment installed in the future would be remote controlled. Low-pressure zones are not uncommon in hilly or rolling terrain such as the Okanagan Basin under study. Therefore, pumping utilization is perceptibly an important indicator for pressure management in low-pressure zones and for meeting demand during days of high consumption. However, the data for these PIs are not currently available for PA.

4.4.4 Operational (OP) Indicators

The indicators in this category are amongst the most important PIs for ensuring satisfactory performance of a water utility. The results of MCDA in the form of outranking relationships for the PIs in the operational category are summarized in Figure 4.6. The number of main breaks per 100 km of main length and the percentage of service connections that have gone through rehabilitation were found to be the top-level indicators. Both are important for controlling water loss and ensuring a desirable quality of service to consumers. In Canadian utilities, the number of breaks ranges from 1 to 20 per 100 km of pipe length, reflecting large variations in pipe condition around the country (AECOM 2012). A major portion of the supplied water is lost through leaking service connections; the participating utilities observed this as well, with large numbers of complaints and consequent repairs of service connections. However, one important finding needs to be mentioned here: it is important to differentiate between an actual service connection repair and a complaint originating from an in-house plumbing problem. Field staff of the participating utilities revealed that most of the complaints were associated with the latter rather than with actual service connection leakage.
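The two top-level operational indicators can be normalized as below; the counts are made up for illustration. Note the distinction flagged above: only confirmed connection repairs, not in-house plumbing complaints, enter the numerator of the rehabilitation indicator.

```python
# Illustrative sketch of the two top-level operational indicators;
# all counts are made up for demonstration.

main_breaks = 12            # breaks recorded in the assessment year
main_length_km = 300        # total length of mains

breaks_per_100km = main_breaks / (main_length_km / 100)

# Service connections rehabilitated, counting only repairs confirmed
# at site (in-house plumbing complaints are excluded).
complaints_logged = 180
confirmed_connection_repairs = 40
total_connections = 8_000

pct_connections_rehabilitated = 100 * confirmed_connection_repairs / total_connections
```

Using the raw complaint count (180) instead of the confirmed repairs (40) would more than quadruple the indicator, which is exactly the distortion the field observation warns about.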
At the subsequent level, the PIs of mains subjected to leakage, inoperable hydrants, and non-revenue water are placed in Figure 4.6. Non-revenue water (NRW) is calculated in units of litres/connection/day, based on the premise that most water loss occurs at service connections (Hamilton et al. 2006). NRW might not be a suitable indicator of water loss where unauthorized water consumption is high, but it can be considered a useful financial indicator (Kanakoudis and Tsitsifli 2010). In this study, all the participants agreed on the usefulness of the Infrastructure Leakage Index (ILI), but expressed reservations about the limitations associated with estimating apparent and real water losses. According to them, no monitoring structure is presently available to estimate accurate values of such detailed water losses, owing to metering inaccuracies, data handling errors, leakage from transmission and distribution mains, leakage and overflow from storage reservoirs, etc. Therefore, the indicators of ILI, apparent losses, and real losses came out at lower levels in the MCDA in Figure 4.6, mainly due to the measurability criterion.

At level 3, the indicators of the percentage of mains replaced, rehabilitated, and renovated during the assessment period were found. This is essentially a stepwise asset management process of prioritizing water mains depending on their condition, age, and the environmental conditions around them. For an individual water main the process starts with renovation, i.e., the pipe is found structurally sound and the limited damage can be repaired with simple methods such as sealing. In SMWU, smaller diameter pipes are mostly in worse condition due to cracked joints or hydrogen sulfide corrosion, and thus need rehabilitation such as slip-lining with a new pipe, interior lining with cement, or installing cured-in-place liners (NRC 2003).
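The NRW unit convention discussed above is worth encoding explicitly, since the same loss volume feeds two different PIs: litres/connection/day for the operational view and percentage of input volume for the financial view. All figures below are made up for demonstration.

```python
# Illustrative computation of NRW in litres/connection/day and as a
# percentage of input volume; all figures are made up.

system_input_l_year = 7_300_000_000     # water supplied during the year (L)
billed_l_year = 6_200_000_000           # water billed to customers (L)
connections = 10_000

nrw_l_per_conn_per_day = (system_input_l_year - billed_l_year) / connections / 365

# The financial-loss view of the same volume (used under the FE category):
nrw_pct_of_input = 100 * (system_input_l_year - billed_l_year) / system_input_l_year
```

The per-connection form follows the premise that most losses occur at service connections, while the percentage form simply reports the share of input volume that earned no revenue.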
Figure 4.6 Outranking relations of operational PIs showing DMB

Finally, pipe replacement is recommended in the case of even worse pipe conditions. Therefore, it is important to consider these indicators separately even in the case of SMWU; however, it was decided to merge the rehabilitation and renovation indicators into one, based on the observation that smaller sized mains mainly undergo renovation. The other two PIs observed at this level are the percentage of valves replaced and hydrant inspection. The hydrant inspection indicator is calculated as the percentage of hydrants inspected during a year; a hydrant inspected more than once is counted as many times as it was inspected, so the value can exceed 100%. A wide range of values, between 6% and 500%, has been reported for Canadian water utilities in the NWWBI (2010) public report.

Indicators found at level 4 in Figure 4.6 are cleaning of storage tanks, meter reading efficiency, and percentage of operational meters. The first is important from a public health security point of view, whereas the second ensures accurate billing and the physical working of meters. The third indicator identifies the structural condition of meters by estimating the out-of-order meters during the assessment period.

4.4.5 Water Quality and Public Health Indicators

A comparatively shorter list of PIs covering major health-related indicators was selected through initial screening. The number of boil water advisories during the assessment period was found to be the top-level indicator, outranking all others in Figure 4.7. This indicator is an indirect measure of the water quality status of a WSS. According to Water Canada (2013), the WSSs in British Columbia have gone through the maximum number of boil water advisories as compared to other provinces.
Moreover, most of the water utilities in British Columbia are SMWU with populations less than 50,000. The reasons for such advisories in the Okanagan Basin (study area), as reported by Interior Health Canada (2013), include source water contamination, flushing of hydrants, construction, repair and maintenance works, equipment failure, and inadequate treatment.

Turbidity and total coliforms in both the treated water and the distribution systems, and residual chlorine in the distribution system, were placed at level 2 (Figure 4.7). All five indicators at this level were found to be indifferent to each other, showing their equal importance in this category. Both turbidity and fecal coliforms (FCs) are defined as pollution indicators by USEPA (2013); their presence in surface water is unavoidable. They are removed using conventional water treatment processes, including coagulation, sedimentation, filtration, and disinfection. Average turbidity ranges between 0.01 NTU and 1.38 NTU in Canada, which shows overall acceptable aesthetic water quality; however, up to 25 days with total coliforms were reported in the NWWBI (2010) public report.

Residual chlorine is added to avoid any possibility of recontamination through cross-connections within the distribution system. Moreover, in the case of higher turbidity levels at the source, the chlorine dose is increased. Higher concentrations of residual chlorine may react with naturally occurring organic matter and consequently increase the possibility of disinfection by-products (DBPs), including trihalomethanes (THMs). Higher concentrations of THMs have negative impacts on human health, including cancer risk. USEPA limits the total THM concentration in treated water to less than 80 parts per billion (ppb) (USEPA 2013). Nitrates are also associated with human health risks, and their maximum allowable concentration in drinking water should be less than 45 mg/L (Health Canada 2013).
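The numeric limits quoted above can be gathered into a simple screening check. The pass/fail logic below is an illustrative sketch, not a regulatory procedure, and the sample concentrations are made up.

```python
# Screening check against the limits cited in the text:
# total THMs < 80 ppb (USEPA 2013), nitrate < 45 mg/L (Health Canada 2013).
LIMITS = {"thm_ppb": 80.0, "nitrate_mg_l": 45.0}

def exceedances(sample):
    """Return the parameters in `sample` at or above their limit."""
    return [p for p, limit in LIMITS.items() if sample.get(p, 0.0) >= limit]

sample = {"thm_ppb": 92.0, "nitrate_mg_l": 12.0}
flags = exceedances(sample)   # only the THM limit is exceeded here
```

A real monitoring program would track many more parameters and frequencies; the point of the sketch is only that the level-2 and level-3 PIs reduce to comparisons against fixed regulatory limits.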
The PIs covering the concentrations of THMs and nitrates, along with an important indicator for protecting public health (i.e., cleaning/flushing of mains), came out at level 3 in the water quality and public health category, as shown in Figure 4.7.

Figure 4.7 Outranking relations of water quality and public health PIs showing DMB

It is important to mention here that none of the indicators in this category can be considered less important. They outrank each other owing to higher or lower monitoring frequency (higher in the case of turbidity, lower in the case of THMs and nitrates), but all are important for reducing or eliminating public health risk from drinking water. The indicators proposed by IWA (2006) in terms of the percentage of aesthetic, microbiological, or chemical tests carried out (level 4 indicators, outside the DMB) and their compliance with standards do not give a clear picture of water quality. For example, under the aesthetic tests category, if pH is monitored at a higher frequency than water colour and always complies with standards, the indicator will show an overall satisfactory picture of water quality even though colour might not always comply. Such indicators may be suitable where there are several sampling locations and higher monitoring frequencies, as in LWU.

4.4.6 Quality of Service (QS) Indicators

The quality of service provided to the public should be efficient and needs to be maintained throughout the service period to ensure customer satisfaction. The outranking relationships between different PIs in this category are presented in Figure 4.8. The top Level-1 indicators in this category were found to be customers' complaints regarding low pressure, deteriorated water quality, and billing-related issues.
A field observation from the water utility participants is worth mentioning: in most cases, pressure and water quality complaints originate from plumbing issues within the building line (beyond the service connection), and not from distribution system inefficiencies. Therefore, special care is needed to include only the complaints generated by distribution system failure in the performance assessment process. In this connection, a comprehensive customer complaints work order considering all possible types of complaints could be extremely helpful. Based on the observations from work orders, a detailed model to manage the risk of customer satisfaction is developed in Chapter 7.

Unplanned interruptions and unplanned maintenance hours were placed at Level-2 in the outranking process by the decision makers. People in developed countries understand the need for maintenance of their water systems; nevertheless, maintenance hours should be planned and customers must be informed well before the start of maintenance operations. A higher number of unplanned maintenance hours may lead to a larger number of complaints, which will affect the performance of the utility in the benchmarking process.

Response to customers' complaints and the quality of water supplied were found at Level-3 in the outranking process by the decision makers. The important parameters in this regard are the type of complaint, the percentage of complaints responded to, the duration to resolve the complaints, the actual reason for the complaint at the site, and the percentage of complaints resolved. Customer satisfaction is directly associated with efficient response to complaints. In privately owned water systems, the cost of water is usually very high, such as in England and Wales (OFWAT 2012). In such systems, an intensive mechanism is usually adopted to ensure the quickest possible response to complaints.
In the case of SMWU, higher response times are common; however, almost 100% of complaints were responded to in the participating utilities to ensure customer satisfaction.

The quality of water supplied is an indicator (QS9 at Level-3) that can be calculated by checking the compliance of water quality (aesthetic, microbiological, and chemical) with the applicable standards. If the data are available, QS9 can be replaced with the detailed PIs (QS10, QS11, and QS12).

Figure 4.8 Outranking relations of quality of service PIs showing DMB

4.4.7 Financial and Economic (FE) Indicators

According to ADB (1997), any project is financially sustainable if sufficient funds are available (from user charges and/or budget sources) to meet all its resource and financial obligations. According to WB (2012), it is more important for a smaller water utility to compare its operating costs with the revenues it is generating, and to check whether or not it is servicing its debts. A list of 46 financial indicators has been provided by IWA (2006), splitting the cost according to type (i.e., main functions of the water utility, technical functions, etc.), which might not be essential for SMWU in assessing their financial sustainability. The DMB encompassing 7 important financial indicators is shown in Figure 4.9.

Operation and maintenance (O&M) cost distributed over total main length (for benchmarking) and water rates were found to be the top Level-1 indicators as per the MCDA results. In SMWU, this denominator might be misleading, owing to the relatively short overall main length on the one hand and higher O&M costs from lower economies of scale on the other. The situation is similar for the next two indicators at Level-2: O&M cost of water treatment per million litres of treated water, and revenue per unit volume of supplied water.
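The cost-based PIs just introduced are simple ratios once the annual figures are recorded. The sketch below computes them with made-up figures; none of the numbers reflect the participating utilities.

```python
# Illustrative computation of the Level-1/Level-2 financial PIs;
# all dollar and volume figures are made up.

om_cost = 1_800_000.0            # total annual O&M cost ($)
main_length_km = 300.0           # total main length (km)
treated_volume_ml = 5_000.0      # megalitres treated/supplied in the year
revenue = 2_400_000.0            # total annual operational revenue ($)

om_per_km = om_cost / main_length_km                 # O&M cost per km of main
om_per_million_l = om_cost / treated_volume_ml       # O&M cost per ML treated
revenue_per_million_l = revenue / treated_volume_ml  # revenue per ML supplied
```

The same warning as in the text applies: a short main length inflates `om_per_km` for an SMWU, so cross-comparison with large utilities needs care.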
For these PIs, comparison with larger utilities should be carried out carefully in performance benchmarking.

Figure 4.9 Outranking relations of financial and economic PIs showing DMB

Indicators at Level-3 are essentially different types of ratios between expenditures, revenues, and debts (Figure 4.9). Operating cost coverage, the ratio between total annual operational revenues and total annual operating cost, provides information about the financial performance of utility operations. The other two PIs included at Level-3 are the debt service ratio and the liquidity ratio. The ability to pay debt service has become critical for any utility, as debt has become an important instrument to capitalize utility operations. The liquidity ratio is the ratio between current assets and current liabilities. Finally, NRW as a percentage of total input volume was also included as an indicator of the economic loss of water. NRW may not present a realistic picture of water losses in the distribution system; however, it provides information on the volume of water that was lost without being charged.

4.5 Final Ranking of Selected Indicators

It can be seen from Figures 4.2 to 4.8 that preferences amongst different PIs within the levels (dashed boundaries) and then the overall DMB were defined for each category. The concept of net outranking relationships using complementary ELECTRE analysis has been explained above in step 5, section 2.4, and in the calculation example of water resources and environmental indicators in the attached annexure. The relative dominance and relative weaknesses of each PI with respect to the other PIs in the same functional component have been established in terms of concordance and discordance indices using equations [4.11] and [4.12], respectively. The final preferences for all the categories of PIs are shown in Figure 4.10 (a-g).
The solid line in the plots shown in Figure 4.10 is drawn for the final ranking of the PIs, i.e., PIs with almost similar concordance and discordance indices can be ranked based on their relative distance from this line.

The final ranks of the selected PIs (within the DMB in Figures 4.2 to 4.8) are presented in Table 4.11. For performance assessment, the ranking of these PIs can be re-evaluated for weight estimation based on the relative importance (given by the decision makers) of the PIs within each functional component. Instead, the rankings shown in Table 4.11 have been developed from the results of the ELECTRE method, where other criteria such as comparability, measurability, and understandability have also been considered for the selection of PIs.

4.6 Utilization of Selected Indicators

By adopting good record-keeping practices (which are not very rare in the water utilities under study) and effectively utilizing these data, the selected PIs in Table 4.3 can be calculated. However, conventional methods of performance benchmarking, such as regression analysis (based on ample historical data for each indicator), cannot be used at the start for SMWU. Therefore, for the indicators for which benchmarking data are available, the performance of each indicator can be compared and assigned a relative rank. For others, the utility can establish its own benchmarks using the literature or experts' knowledge. Using suitable methods, weights of the PIs under each category can be determined.
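Assigning a relative rank against available benchmarking data can be sketched as below. The 168-593 lpcd range is the NWWBI residential range quoted earlier; the five-band ranking scheme itself is an assumption introduced for demonstration, not a method from the thesis.

```python
# Illustrative relative ranking of a PI value within a benchmark range.
# The 168-593 lpcd range is from the NWWBI 2010 figures cited in the text;
# the five-band scheme is an assumed convention for demonstration.

def relative_rank(value, best, worst, bands=5):
    """Rank 1 (best) .. `bands` (worst) by position between best and worst."""
    lo, hi = min(best, worst), max(best, worst)
    value = max(lo, min(hi, value))          # clamp into the benchmark range
    frac = abs(value - best) / abs(worst - best)
    return min(bands, int(frac * bands) + 1)

# For residential lpcd, lower is better: best = 168, worst = 593.
rank = relative_rank(350, best=168, worst=593)
```

Because `best` may be either end of the numeric range, the same function works for indicators where higher values are better (e.g., metering level).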
Table 4.11 Final ranking of selected PIs1 under DMB for SMWU

Rank | WE   | PE    | PH   | OP    | WP   | QS   | FE
1    | WE 1 | PE 1  | PH 1 | OP 9  | WP 1 | QS 1 | FE 4
2    | WE 2 | PE 4  | PH 2 | OP 1  | WP 3 | QS 3 | FE 1
3    | WE 6 | PE 6  | PH 3 | OP 10 | WP 4 | QS 4 | FE 3
4    | WE 3 | PE 8  | PH 5 | OP 8  | WP 6 | QS 6 | FE 2
5    | WE 5 | PE 9  | PH 4 | OP 6  | WP 7 | QS 7 | FE 5
6    | WE 4 | PE 7  | PH 7 | OP 2  | WP 8 | QS 8 | FE 6
7    | -    | PE 5  | PH 6 | OP 3  | WP 5 | QS 5 | FE 7
8    | -    | PE 3  | -    | OP 5  | WP 9 | QS 9 | -
9    | -    | PE 11 | -    | OP 4  | WP 2 | QS 2 | -
10   | -    | PE 10 | -    | OP 7  | -    | -    | -
11   | -    | PE 2  | -    | OP 11 | -    | -    | -
12   | -    | PE 12 | -    | OP 12 | -    | -    | -

Columns: WE = Water Resources & Environmental; PE = Personnel; PH = Physical; OP = Operational; WP = Water Quality & Public Health; QS = Quality of Service; FE = Financial & Economic. 1Description of each indicator can be seen in Table 4.1.

A conceptual cognitive map, as an example for estimating the water resources and environmental sustainability index, is shown in Figure 4.11. The figure shows that an increase in some PIs has a positive impact on the WEI (e.g., WE1, number of days of water restrictions), whereas other PIs need to be reduced or controlled (e.g., WE4, energy consumption) to improve sustainability. The different types of data variables required to calculate the PIs are also shown in Figure 4.11. Limiting the PIs to the most essential ones in this study reduces the data collection effort for SMWU. However, assigning weights requires great care: one important indicator with poor performance can significantly reduce the index value. On the other hand, if a relatively small weight is allocated to an important indicator and the resulting index shows a higher value (acceptable performance), the results could be misleading. This situation should primarily be avoided in the case of water quality indicators, where all the PIs should meet water quality standards.
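The cognitive-map aggregation can be sketched as a weighted average of PI scores, assuming each score is already oriented so that higher is better (an indicator to be reduced, such as energy use, is scored inversely). Weights and scores below are made up; the snippet also shows the caution raised above: one heavily weighted, poorly performing PI pulls the index down sharply.

```python
# Illustrative weighted aggregation of WE indicator scores (0-100 scale).
# Weights and scores are made up; scores are assumed pre-oriented so
# that higher = better for every PI.

def sustainability_index(scores, weights):
    """Weighted average of PI scores; weights need not sum to 1."""
    total_w = sum(weights.values())
    return sum(scores[pi] * w for pi, w in weights.items()) / total_w

weights = {"WE1": 0.30, "WE2": 0.25, "WE3": 0.15, "WE5": 0.15, "WE6": 0.15}
good = {"WE1": 80, "WE2": 75, "WE3": 70, "WE5": 65, "WE6": 70}
wei = sustainability_index(good, weights)

# One important PI performing poorly drags the whole index down:
bad = dict(good, WE1=20)
wei_bad = sustainability_index(bad, weights)
```

Here the index drops from 73.5 to 55.5 when only the heavily weighted WE1 deteriorates, which is why water quality indicators in particular should not be allowed to hide behind a high aggregated value.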
The detailed description of the use of these selected PIs for performance benchmarking of SMWU is given in Chapter 5.

Figure 4.10 Net concordance (C) and discordance (D) indices for all seven categories of PIs: (a) Water resources and environment; (b) Personnel; (c) Physical; (d) Operational; (e) Water quality and public health; (f) Quality of service; (g) Financial and economic

Figure 4.11 An example of a cognitive map for estimation of the water resources and environmental sustainability index

4.7 Summary

Existing performance indicator (PI) systems developed for large water utilities need to be re-evaluated for SMWU. In Chapter 3, 114 potential PIs were identified in the water resources and environment, personnel, operational, physical, water quality, quality of service, and financial categories. These PIs were evaluated against applicability, understandability, measurability, and comparability criteria using the Elimination and Choice Translating Reality (ELECTRE) outranking method for multicriteria decision analysis. The criteria weights and scoring of PIs were done through group decision making.

The results revealed that ELECTRE is a suitable method when preferences between various alternatives cannot be established on the basis of small differences in evaluations.
The network maps based on the outranking results provide an opportunity for utility management to select the most suitable PIs based on data availability and the specific needs of their utility.

Chapter 5     Inter-utility Performance Benchmarking Model (IU-PBM)

A part of this chapter is published in ASCE's Journal of Water Resources Planning and Management as an original research article titled "Inter-utility Performance Benchmarking Model (IU-PBM) for Small to Medium Sized Water Utilities: Aggregated Performance Indices" (Haider et al. 2015b).

5.1 Background

Towards sustainable performance, the first step is to evaluate the existing performance of all the functional components of the water utility using suitable PIs. The general concept of performance benchmarking of a water utility is to compare its performance with benchmarks (or guidelines and standards) established by regulatory agencies, and through cross-comparison with other utilities (Marques and Witte 2008, Alegre et al. 2006). Based on the results of performance benchmarking, rational decisions for effective asset management can be taken. Nevertheless, performance benchmarking of water utilities has always been a daunting task for water utility management.

The literature review conducted in Section 2.8 of Chapter 2 reveals that substantial data are required for most benchmarking methods, which is only possible where several utilities participate over a long time; this is certainly not the case for Canadian SMWU. Conventionally, a comparison of a particular utility's performance has been made with the best and worst performing utilities to calculate a performance score for each PI as (Stahre et al.
2008):

Score(PI) = 90 × (PI actual value − PI worst value)/(PI best value − PI worst value) + 10     [5.1]

Equation [5.1] produces a performance score between 10 and 100, where 10 corresponds to the worst performing utility and 100 to the best. The minimum score of '10' for the worst performing utility reflects the fact that even an underperforming utility is operational and performing its routine functions. However, the equation does not compare the performance of a utility with desirable standards, which could lie outside the range of the performance scores of the participating utilities. When the comparison is being made among only a few utilities, with the possibility that none of them is performing satisfactorily, the score from equation [5.1] could be misleading. Moreover, equation [5.1] follows a straight line, which can yield erroneous PI scores for an average performing utility. For example, in Figure 5.1 the performance score from equation [5.1] for a water utility with average performance (for a particular PI) comes out to be 30 instead of 50.

Firstly, it is important to consider the relative performance of the utilities by calculating the performance gap from the benchmark in terms of a performance level (or score); this concept is explained in Figure 5.2. A utility performing better than another, but with a PI value slightly below the benchmark, could be motivated to further improve its performance in the coming year. Likewise, the best performing participating utility should also be rationally compared with the benchmark, since it is possible that even the best utility is itself only approaching the benchmark. Conversely, a utility exceeding the benchmark will need to maintain its performance at that higher level.
This type of comparison can only be made with a benchmarking approach (workable with limited data) that covers the entire variation of performance shown in Figure 5.2. Secondly, while establishing the BTFs, the data from larger water utilities reported in the NWWBI public reports are used, keeping in view the lower economies of scale in SMWU. Thirdly, the model calculates aggregated performance indices for top level management, which rationally reflect the performance of each functional component in terms of its closeness to the most desirable performance and its remoteness from the least desirable performance. Finally, for proof-of-concept, the proposed model has been validated with a case study of two medium sized water utilities in the Okanagan Basin, British Columbia, Canada.

The primary objective of this chapter is to develop a simple (using the most relevant selected PIs), albeit comprehensive (covering all the performance categories), inter-utility performance benchmarking model (IU-PBM) for SMWU in Canada that adequately addresses the existing research gaps. The proposed model has been applied to a case study of two medium sized (10,000 < population < 50,000) water utilities in the Okanagan Basin, British Columbia, Canada.
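The distortion sketched in Figure 5.1 can be reproduced numerically. The sketch below uses hypothetical worst, average, and best PI values of 2, 4, and 11 (echoing the figure, not measured data) to show equation [5.1] assigning the average performing utility a score near 30 rather than the intuitive 50:

```python
def conventional_score(actual, worst, best):
    """Conventional performance score (Stahre et al. 2008), equation [5.1]:
    a linear rescaling of the PI value onto the 10-100 band."""
    return (actual - worst) / (best - worst) * 90 + 10

# Hypothetical PI values echoing Figure 5.1 (illustrative only):
worst, average, best = 2, 4, 11
print(conventional_score(worst, worst, best))    # 10.0  (worst performer)
print(conventional_score(best, worst, best))     # 100.0 (best performer)
print(conventional_score(average, worst, best))  # ~30, not the intuitive 50
```

Because the score is linear in the PI value, a utility whose PI sits near the cluster of poor performers is scored close to 10 even when it is the typical performer; this is the behaviour the benchmarking transformation functions of Section 5.2.2 are designed to avoid.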
Figure 5.1 Graphical description of Equation [5.1] showing misleading calculation of performance score

Figure 5.2 Relative performance of water utilities in terms of performance gap between the calculated PI values and benchmarks using performance score

5.2 Approach and Methodology

5.2.1 Performance Benchmarking Modeling Approach

The modeling approach for the proposed IU-PBM for SMWU is shown in Figure 5.3. For performance benchmarking, out of the 62 PIs selected in Chapter 4, a set of 47 PIs (based on existing data availability) is grouped into seven performance categories covering all the essential functional components of a water utility: water resources and environment, personnel, physical, operational, water quality and public health, quality of service, and economics and finance. The performance indicators have either commensurate or non-commensurate values (e.g., percentages or ratios), which are calculated from data variables. These calculated PI values are then compared with the performance benchmarks using the benchmarking transformation functions (BTFs) to assess the performance gap in terms of a performance level. These performance levels are then aggregated to develop performance indices.
Finally, the proposed model has been implemented on a case study in the Okanagan Basin, British Columbia (BC), Canada. The details of the components of the modeling approach shown in Figure 5.3 are presented in the following sections.

5.2.2 Benchmarking Transformation Functions

Due to the lack of historical performance data for SMWU in Canada, their performance cannot be assessed using a conventional metric benchmarking method (i.e., one based on cross-comparison among several utilities). Consequently, an effort has been made to develop different BTFs (linear, polynomial, logarithmic, and exponential) based on the NWWBI public report (NWWBI-PR) (AECOM 2012), literature values (for the PIs not included in the NWWBI), and expert opinion. These BTFs transform the calculated values of all 47 PIs into performance levels between 10 and 100, with 10 being very poor and 100 being very good. For some PIs an increasing trend is desirable (benefit criterion), whereas for others a decreasing trend is desirable (cost criterion). This approach accommodates all possibilities for utilities: i) performing much worse than the established benchmarks; ii) performing close to the benchmark (e.g., slightly higher or lower); and iii) performing equal to or better than the benchmarks (Figure 5.2).
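As a minimal sketch of how such BTFs are applied, the snippet below uses two functional forms taken from Table 5.1 (the linear WE2 function and the exponential OP1 function); clipping the result to the 10–100 band for values outside the fitted range is an assumption made here for illustration:

```python
import math

def clip_level(pl):
    # Assumption: levels outside the fitted range are clipped to the 10-100 band.
    return max(10.0, min(100.0, pl))

def pl_we2(lpcd):
    """WE2: per capita domestic water consumption (L/person/day), linear BTF."""
    return clip_level(126 - 0.12 * lpcd)

def pl_op1(breaks_per_100km):
    """OP1: number of main breaks per 100 km of mains, exponential BTF."""
    return clip_level(100 * math.exp(-0.114 * breaks_per_100km))

print(pl_we2(300))  # ~90: moderate consumption scores well
print(pl_we2(900))  # ~18: very high consumption scores poorly
print(pl_op1(0))    # 100.0: no main breaks during the year
```

The benefit-criterion functions (e.g., metering level PH1) increase with the PI value, while cost-criterion functions such as these two decrease, so all 47 PIs end up on the same 10–100 scale regardless of orientation.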
Figure 5.3 Modeling approach of the performance benchmarking model for SMWU

5.2.3 Performance Aggregation Indices

The performance levels of individual PIs obtained from the BTFs might not, on their own, meet the needs of senior managers and decision makers. In general, utility managers prefer composite indices, which save the time and effort required to evaluate each individual PI (Galar et al. 2014). A performance index combines the information obtained by calculating several PIs into one final score; it consists of a weighting process and an aggregation process. The weighting process determines the importance weights of all the PIs under each category, and the aggregation process then combines the performance levels with their respective weights.
5.2.4 Simos' Method for Estimating the Weights of PIs

To develop the aggregated performance indices, the PIs have to be weighted between 0 and 1 according to their relative importance within each category. In this research, Simos' method is used for this purpose because of its simple and easily interpretable procedure (Marzouk et al. 2015). The method begins with the ranking of the PIs by the decision makers (i.e., utility managers and experts). To analyse these ranks, a table containing seven columns (C1 to C7) is constructed. C1 lists the number of PIs in each category, while the PIs themselves are described in C2. C3 lists the average frequency of the importance ranks scored by the decision makers, from the least important to the most important PI in ascending order. Subsequently, the PI with the maximum average frequency is given the highest Simos' rank in C4; PIs with the same average frequency are allocated the same rank. C5 shows the number of PIs at each rank, and the non-normalized weights of the PIs are presented in C6. The normalized weights of all the PIs, derived from the Simos' rank positions listed in C4, are given in C7. The survey responses are attached in Appendix B.

5.2.5 Aggregating Performance Indicators using TOPSIS Method

In this research, the following performance indices are used to express the performance of the functional components of SMWU:
- Water resources and environmental sustainability index (WEI)
- Personnel adequacy index (PEI)
- Physical assets efficacy index (PHI)
- Operational integrity index (OPI)
- Water quality and public health safety index (WPI)
- Quality of service reliability index (QSI)
- Financial and economic stability index (FEI)

To operationalize the concept proposed in Figure 5.2, which is based on the relative closeness of the calculated performance level of each PI to the most desirable performance (100) and its remoteness from the least desirable performance (10), an aggregation approach based on a synthesizing criterion is needed (Figueira et al. 2005). Therefore, the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) is used to aggregate the performance levels of the PIs into the above performance indices for SMWU. The indices developed by TOPSIS are based on the concept of similarity (i.e., relative closeness) to the positive-ideal solution (PIS) and remoteness from the negative-ideal solution (NIS) (Yoon and Hwang 1995, Hwang and Yoon 1981). The method assumes that the performance level of each indicator is either a monotonically increasing or decreasing function, so that a higher index value corresponds to higher performance. A step-by-step procedure of the TOPSIS method is given below.

Step 1: Estimate the weights of the performance indicators under each performance category. This step involves the application of Simos' method, described in the section above.

Step 2: Check the need for normalization. As the performance levels of all the PIs range between 10 and 100 and are unitless, no normalization is needed in the present study.

Step 3: Develop the weighted matrix.
The weighted value of each indicator is calculated as:

v_ij = w_j × x_ij     [5.2]

where v_ij is the weighted performance level of indicator j for utility i, w_j is the corresponding weight of that indicator, and x_ij is the performance level obtained from the benchmarking relationships.

Step 4: Identify the positive-ideal and negative-ideal solutions. X* and X⁻ are defined as the PIS and NIS, respectively, in terms of the weighted performance levels:

X* = {v_1*, v_2*, ..., v_j*, ..., v_n*} = {(max_i v_ij | j ∈ J1), (min_i v_ij | j ∈ J2), i = 1, ..., m}     [5.3]

X⁻ = {v_1⁻, v_2⁻, ..., v_j⁻, ..., v_n⁻} = {(min_i v_ij | j ∈ J1), (max_i v_ij | j ∈ J2), i = 1, ..., m}     [5.4]

where J1 is the set of benefit attributes and J2 is the set of cost attributes.

Step 5: Calculate the distance of each water utility from the PIS and NIS. In this step, the distances of all the performance levels in a performance category (for each participating utility) are measured by the n-dimensional Euclidean distance. The separation of each utility from the PIS is calculated as:

Y_i* = sqrt( Σ_j (v_ij − v_j*)² ),  i = 1, ..., m     [5.5]

and the distance from the NIS is calculated as:

Y_i⁻ = sqrt( Σ_j (v_ij − v_j⁻)² ),  i = 1, ..., m     [5.6]

Step 6: Develop the aggregated performance indices by calculating similarity to the PIS. The overall performance index of each performance category (functional component) is calculated as:

P_i = Y_i⁻ / (Y_i* + Y_i⁻),  i = 1, ..., m     [5.7]

As the result of equation [5.7] is a ratio, it is multiplied by 100 to translate it into a global performance index.

5.3 Development of Benchmarking Transformation Functions

The overall performance benchmarking process commences with calculating the PIs (selected in Chapter 4 under each functional component) using the required data variables. The BTFs developed for each PI are presented in Table 5.1.
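Before turning to the individual BTFs, Steps 3–6 of the TOPSIS procedure above can be sketched in code. The sketch below is a simplified single-utility reading of the model, in which the positive and negative ideals are taken as the weighted levels 100 and 10 (the most and least desirable performance, per Figure 5.2) rather than the max/min over participating utilities; the importance ranks and performance levels are hypothetical:

```python
import math

def topsis_index(levels, weights):
    """Aggregate the BTF performance levels (10-100) of one category into a
    single index via TOPSIS (Steps 3-6). Ideals are the weighted levels 100
    (most desirable) and 10 (least desirable)."""
    v = [w * x for w, x in zip(weights, levels)]      # Step 3: weighted matrix
    v_pos = [w * 100.0 for w in weights]              # Step 4: positive ideal
    v_neg = [w * 10.0 for w in weights]               #         negative ideal
    y_pos = math.sqrt(sum((a - b) ** 2 for a, b in zip(v, v_pos)))  # Step 5
    y_neg = math.sqrt(sum((a - b) ** 2 for a, b in zip(v, v_neg)))
    return 100.0 * y_neg / (y_pos + y_neg)            # Step 6, eq. [5.7] x 100

# Step 1: Simos-style normalized weights from hypothetical importance ranks
ranks = [1, 2, 3, 4]                     # 4 = most important PI of the category
weights = [r / sum(ranks) for r in ranks]

levels = [60.0, 80.0, 40.0, 90.0]        # hypothetical BTF outputs for 4 PIs
print(round(topsis_index(levels, weights), 1))
```

A category in which every PI reaches level 100 yields an index of 100, and one stuck at level 10 for every PI yields 0; mixed profiles land in between, pulled toward the levels of the more heavily weighted PIs.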
The performance levels are established between 10 and 100, since an initial value of '0' does not seem justifiable for an operational water utility. The adjustment of the values reported in the NWWBI-PR and the literature is explained with two examples, per capita water consumption (WE2, a water resources indicator) and percentage of service connection repairs (OP5, an operational indicator), in Figures 5.4a and 5.4b, respectively. In Figure 5.4a, the maximum value of WE2 reported in the NWWBI-PR for large water utilities, '593', could represent merely an average value in SMWU, where WE2 can reach 900 L/capita/day (AECOM 2014). Therefore, the values are extrapolated, keeping in view the relatively lower water rates and higher water consumption in SMWU (due to lower treatment levels, higher water availability, smaller populations, less awareness about water conservation, etc.). In the second example, OP5 shown in Figure 5.4b, the minimum, median, and maximum values reported in the NWWBI 2012 public report were found to be convincing for SMWU as well. The BTFs for the rest of the PIs in Table 5.1 have been developed in the same way.

Some of the selected PIs were not found in the NWWBI 2012 report; nevertheless, these indicators have to be included for performance benchmarking of SMWU. For instance, the service connection rehabilitation rate is an important indicator with respect to customer satisfaction and water loss control, but it has not been considered in the NWWBI-PR. Hence, literature values or expert judgment have been used to develop BTFs for such PIs (refer to the last column of Table 5.1). The rationale behind the development of the BTFs under each performance category is described in the following sub-sections. However, the discussion is limited to the PIs for which either the NWWBI values were adjusted or the relationships were developed on the basis of expert opinion and literature. Details about other PIs can be found in Chapters 3 and 4.
Table 5.1 Benchmarking transformation functions developed for performance benchmarking for SMWU

PI # | Description of PI | Formula/data variables | Units | Function type | Benchmarking Transformation Function (BTF) | R² | Source

1. WATER RESOURCES AND ENVIRONMENT
WE1 | No. of days with water restrictions | [days with water restrictions] | days | Linear | (PL)WE1 = 0.26(WE1) + 10 | 0.99 | NWWBI
WE2 | Per capita domestic water consumption | [(Total volume of water supplied to residential consumers in the year)/(Population served × 365)] | L/person/day | Linear | (PL)WE2 = 126 − 0.12(WE2) | 0.99 | NWWBI <adj> EOS
WE3 | Remaining annual water license capacity (WLC) | [(Annual water demand)/(Existing annual WLC)] × 100 | % | Linear | (PL)WE3 = 100 − 1.13(WE3) | 0.99 | NWWBI <adj> EOS
WE4 | Impact of residual chlorine in flushing water on aquatic life | [Distance between the discharge point and the receiving water body] | m | Polynomial | (PL)WE4 = 4.8 + 0.088(WE4) − 0.00002(WE4)² | 0.99 | EO + Literature

2. PERSONNEL
PE1 | Field full time equivalents (FTEs) | [(Number of FTE employees working in the field for distribution)/(Total main length/100)] | No./100 km | Linear | (PL)PE1 = 30.3(PE1) − 14 if 1 < PE1 < 4; (PL)PE1 = 167 − 16.3(PE1) if PE1 > 4 | 0.97; 0.98 | NWWBI
PE2 | Field FTEs, metering | [(Number of FTE employees working in the field for metering)/(Number of meters/1000)] | No./1000 meters | Linear | (PL)PE2 = 172(PE2) + 10 if 0 < PE2 < 0.05; (PL)PE2 = 193 − 1883(PE2) if PE2 > 0.05 | 0.98; 0.99 | NWWBI
PE3 | Field accidents | [(Number of lost working hours due to field accidents in a year)/(Total number of field labour hours/1000)] | lost hours/1000 field labour hours/year | Polynomial | (PL)PE3 = 0.073(PE3)² − 5(PE3) + 100 | 0.99 | NWWBI <adj> EOS
PE4 | Sick days taken by FTEs | [(Total number of sick leaves in a year)/(Total number of employees)] | days | Linear | (PL)PE4 = 100 − 5(PE4) | 0.99 | NWWBI
PE5 | Water resources and catchment management employees | [(Number of FTEs working in WR
and catchment planning)/(total input volume/10⁶)] | No./million m³/year | Linear | (PL)PE5 = 172(PE5) + 10 if 0 < PE5 < 0.05; (PL)PE5 = 193 − 1883(PE5) if PE5 > 0.05 | 0.98; 0.99 | EO
PE6 | Overtime of FTEs | [(Total overtime field hours in a year)/(Total paid field hours in a year)] × 100 | % | Polynomial | (PL)PE6 = 0.08(PE6)² − 5.3(PE6) + 100 | 0.99 | NWWBI <adj> EOS
PE7 | Personnel training hours | [(Number of training hours during a year)/(Total number of employees)] | hours/employee/year | Linear | (PL)PE7 = PE7 | 1.0 | EO

3. PHYSICAL ASSETS
PH1 | Metering level | [(Number of connections with meters installed)/(total number of connections)] × 100 | % | Linear | (PL)PH1 = PH1 | 1.0 | EO
PH2 | Degree of automation | [(Number of automated control units)/(number of control units)] × 100 | % | Linear | (PL)PH2 = PH2 | 1.0 | EO
PH3 | Raw water storage capacity | [(Net capacity of raw water reservoir)/(volume of supplied water during the year)] × 365 | days | Linear | (PL)PH3 = 0.3(PH3) + 8 | 0.99 | EO
PH4 | Treated water storage capacity | [(Volume of treated water reservoir)/(Average daily demand/24)] | hours | Polynomial | (PL)PH4 = 10 − 0.011(PH4)² + 2(PH4) | 0.98 | NWWBI

4.
OPERATIONAL
OP1 | Number of main breaks | [(Number of main breaks)/(Total length of mains in km/100)] | No./100 km | Exponential | (PL)OP1 = 100 e^(−0.114(OP1)) | 0.99 | NWWBI
OP2 | Mains replacement | [(Length of mains replaced during the year)/(total mains length)] × 100 | % | Polynomial | (PL)OP2 = 11 − 153(OP2)² + 260(OP2) | 0.99 | EO + Literature
OP3 | Mains rehabilitation/renovation | [(Length of transmission and distribution mains rehabilitated/renovated during the year)/(total mains length)] × 100 | % | Polynomial | (PL)OP3 = 11 − 153(OP3)² + 260(OP3) | 0.98 | EO + Literature
OP4 | Non-revenue water | [(System input volume) − (Annual billed consumption)]/(Number of connections) | L/connection/day | Exponential | (PL)OP4 = 100 e^(−0.002(OP4)) | 0.99 | NWWBI
OP5 | Service connection rehabilitation | [(Number of connections replaced or renovated during the year)/(total number of connections)] × 100 | % | Logarithmic | (PL)OP5 = 20 ln(OP5) + 93 | 0.97 | NWWBI
OP6 | Inoperable or leaking hydrants | [(Number of inoperable or leaking hydrants during the year)/(total number of hydrants)] × 100 | % | Exponential | (PL)OP6 = 100 e^(−1.14(OP6)) | 0.99 | NWWBI <adj> EO
OP7 | Valves replacement | [(Number of mains valves replaced during the assessment period × 365/assessment period)/(total number of mains valves)] × 100 | % | Polynomial | (PL)OP7 = 8 − 587(OP7)² + 475(OP7) | 0.99 | EO
OP8 | Hydrant inspection | [(Number of hydrants inspected during the assessment year)/(total number of hydrants)] × 100 | % | Polynomial | (PL)OP8 = 2 − 0.014(OP8)² + 2.3(OP8) | 0.95 | NWWBI <adj> EO
OP9 | Cleaning of treated water storage tanks | [(Total volume of storage tanks cleaned during the year)/(total volume of all storage tanks)] × 100 | % | Linear | (PL)OP9 = 0.97(OP9) + 10 | 0.99 | EO

5.
WATER QUALITY AND PUBLIC HEALTH
WP1 | Days with boil water advisories | [(No. of boil-water advisory days) × (persons affected)]/(Population served) | days | Linear | (PL)WP1 = 100 − 48(WP1) | 0.99 | NWWBI + EO
WP2 | Average turbidity | [Average turbidity in distribution system] | NTU | Polynomial | (PL)WP2 = 2.6(WP2)² − 31(WP2) + 100 | 0.99 | NWWBI <adj> EO
WP3 | Total coliform occurrences | [No. of total coliform occurrences] | No. | Linear | (PL)WP3 = 100 − 2.3(WP3) | 0.99 | NWWBI <adj> EO
WP4 | Residual chlorine | [Average residual chlorine in distribution system] | mg/L | Polynomial | (PL)WP4 = 1.83(WP4)² − 30(WP4) + 102 | 0.99 | EO
WP5 | Average THMs | [Average concentration of THMs in distribution system] | mg/L | Exponential | (PL)WP5 = 100 e^(−6.8(WP5)) | 0.97 | NWWBI <adj> EO
WP6 | Length of mains cleaned | [(Length of mains cleaned during the assessment period × 365/assessment period)/(total mains length)] × 100 | % | Polynomial | (PL)WP6 = −0.011(WP6)² + 1.9(WP6) + 10 | 0.99 | NWWBI <adj> EO

6.
QUALITY OF SERVICE
QS1 | Billing complaints | [(Number of billing complaints during the AP)/(number of registered customers)] | No./1000 connections | Linear | (PL)QS1 = 100 − 20(QS1) | 1.0 | EO
QS2 | Pressure complaints | [(Number of water pressure complaints during the AP)/(total population served/1000)] | No./1000 persons | Polynomial | (PL)QS2 = 5.7(QS2)² − 47(QS2) + 100 | 0.95 | NWWBI <adj> EO
QS3 | Water quality complaints | [(Number of water quality complaints during the AP × 365/AP)/(Population served/1000)] | No./1000 persons | Linear | (PL)QS3 = 100 − 20(QS3) | 1.0 | NWWBI <adj> EO
QS4 | Unplanned interruptions | [(Number of unplanned interruptions during the AP)/(total mains length/100)] | No./100 km | Exponential | (PL)QS4 = 100 e^(−0.053(QS4)) | 0.98 | NWWBI
QS5 | Unplanned maintenance hours | [(Total unplanned maintenance hours during the AP)/(Total maintenance hours in the AP)] × 100 | % | Exponential | (PL)QS5 = 100 e^(−0.023(QS5)) | 0.99 | NWWBI
QS6 | Population coverage | [(Population served by the utility)/(Total population of the area under the utility)] × 100 | % | Linear | (PL)QS6 = 2.5(QS6) − 140 | 1.0 | EO + Literature
QS7 | Total response to complaints | [(Total number of responses to reported complaints)/(total number of complaints)] × 100 | % | Polynomial | (PL)QS7 = 0.056(QS7)² − 6.6(QS7) + 194 | 0.99 | EO
QS8 | Service connection complaints | [(Number of other complaints and queries during the assessment period × 365/assessment period)/(total population served/10000)] | No./1000 persons | Linear | (PL)QS8 = 100 − 10(QS8) | 1.0 | EO
QS9 | Aesthetic tests compliance | [(Number of aesthetic tests complying with the applicable standards during the AP)/(total number of aesthetic tests carried out during the AP)] × 100 | % | Linear | (PL)QS9 = 2.6(QS9) − 161 | 0.99 | EO
QS10 | Microbiological tests compliance | [(Number of microbiological tests complying with the standards during the AP)/(total number of microbiological tests carried out during the AP)] × 100 | % | Linear | (PL)QS10 = 2.6(QS10) − 161 | 0.99 | EO
QS11 | Physico-chemical tests compliance | [(Number of physico-chemical
tests complying with the standards during the AP)/(total number of physico-chemical tests carried out during the AP)] × 100 | % | Linear | (PL)QS11 = 2.6(QS11) − 161 | 0.99 | EO

7. ECONOMICS AND FINANCE
FE1 | Water rates | [Water rates for a typical size residential connection using 250 m³/year] | $ | Polynomial | (PL)FE1 = 134 − 0.17(FE1) | 0.99 | NWWBI <adj> EOS
FE2 | Operation and maintenance cost | [(Total O&M cost/1000)/(Main length in km)] | $('000)/km | Linear | (PL)FE2 = 112 − 3(FE2) | 0.99 | NWWBI <adj> EOS
FE3 | Revenue per unit of supplied water | [(Operating revenues − capitalized costs of the constructed assets)/(Billed consumption during the assessment year)] | $/m³ | Polynomial | (PL)FE3 = 250(FE3) − 50 | 0.99 | EO + Literature
FE4 | Operating cost coverage ratio | [(Total annual operational revenues)/(Total annual operating costs)] | ratio | Polynomial | (PL)FE4 = 97(FE4)² − 43(FE4) + 10 | 0.99 | EO + Literature
FE5 | Debt service ratio | [(Cash income)/(Financial debt service "FDS")] | ratio | Polynomial | (PL)FE5 = 110(FE5)³ − 510(FE5)² + 813(FE5) − 369 | 0.99 | EO + Literature
FE6 | NRW by volume | [(Non-revenue water)/(system input volume during the assessment period)] × 100 | % | Linear | (PL)FE6 = 100 − 2(FE6) | 0.99 | EO + Literature

Figure 5.4 Examples of performance benchmarking relationships: (a) per capita water consumption of residential consumers (WE2), a water resources indicator; (b) percentage of service connection repairs in a year (OP5), an operational indicator
5.3.1 Water Resources and Environmental Sustainability

Except for WE4, "flushing of water mains", all other PIs in this category were included in the NWWBI-PR and are adjusted in Table 5.1 for SMWU. Periodic flushing is an important activity to keep water mains healthy and to provide safe water to the community by removing biofilm and corrosion tubercles. The flushed water may carry higher chlorine concentrations or other pollutants. As per the Canadian water quality guidelines, the sum of all reactive chlorine species concentrations should be less than 0.5 mg/L for the protection of aquatic life in freshwaters (CCME 1999). According to the USEPA (1984) ambient water quality criteria, acute toxicity values for fish species vary from 50 µg/L to 250 µg/L.

Water mains in smaller utilities are usually flushed with 1500 to 2000 gallons/min of flow containing 1.5 to 2.5 mg/L of chlorine residual. Flushing periods vary from 2 to 6 hours, once or twice a year, depending on the location of the water mains, raw water quality, and level of water treatment. The flushing water is usually conveyed through storm water drains to a receiving freshwater body (i.e., a creek or stream), and chlorine decays while moving in surface water (Gang et al. 2003). A practical approach has therefore been adopted to develop the BTF for this indicator (WE4) by considering the acute toxicity limit of 250 µg/L. The relationship for this indicator is mapped over the length of a collection drain from 100 m to 1500 m. This range has been established on the assumption that the chlorine concentration will reduce to half of its initial value (following first-order kinetics) while flowing through a 1000 m collector storm water drain. Thus, field and technical personnel should avoid planned flushing programs during low flow (dry) periods to minimize the impact on aquatic life. However, spot flushing of shorter duration can be carried out at customers' requests.
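The halving-per-1000 m assumption implies a first-order decay coefficient of k = ln 2 / 1000 per metre of drain. A small sketch (all values hypothetical except the 250 µg/L acute limit and the 1000 m half-distance stated above):

```python
import math

HALF_DISTANCE_M = 1000.0                  # assumption from the text: residual halves per 1000 m
K_PER_M = math.log(2) / HALF_DISTANCE_M   # first-order decay coefficient (per metre)

def residual_mg_l(c0, distance_m):
    """Chlorine residual (mg/L) after travelling distance_m along the drain."""
    return c0 * math.exp(-K_PER_M * distance_m)

def drain_length_needed(c0, target):
    """Drain length (m) for the residual to decay from c0 to target (mg/L)."""
    return math.log(c0 / target) / K_PER_M

# Flushing water at 2.0 mg/L needs three halvings to reach the 0.25 mg/L
# (250 ug/L) acute toxicity limit considered for the WE4 BTF:
print(drain_length_needed(2.0, 0.25))   # ~3000 m
```

This is why short collector drains (toward the 100 m end of the BTF range) score poorly on WE4: over such distances the discharged residual remains close to the flushing concentration.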
5.3.2 Personnel Adequacy

In Canadian SMWU, outsourcing is largely negligible; thus, the full-time equivalents (FTEs) are distributed amongst permanent employees. Because of the small number of personnel, some of the operational personnel need to multitask, so special care has to be taken when calculating FTEs for the various indicators in this category. Secondly, in large utilities the personnel indicators are calculated to check staff productivity, i.e., to maintain the minimum number of employees per km of water mains (or per number of connections). In the case of SMWU, the problem is more intricate: a minimum number of employees should be adequate to efficiently perform routine operations on the one hand, while staff productivity should be kept optimal on the other. To capture this non-monotonic (increasing and then decreasing) behaviour, two linear benchmarking relationships have been developed for PE1, PE2, and PE5 to cover the entire range of performance levels between 10 and 100 (refer to Table 5.1).

For these PIs, the non-monotonic function starts by increasing from the minimum to the median value and then decreases until the maximum value reported in the NWWBI-PR. For example, in the case of field FTEs per 100 km of water mains (PE1), linear functions are established between the minimum, median, and maximum reported values of 1.5, 4, and 10, respectively, to calculate the performance level.

It can be seen in Table 5.1 that most of the PIs in this category have been included in the NWWBI-PR. Two very important PIs are additionally included in this study: water resources and catchment management employees (PE5) and personnel training hours (PE7). It was found during personal communication with the participating water utilities that water resources and catchment management is an important activity for them, as most of the WSSs rely on source water quality.
Training of personnel (PE7) is of supreme importance because highly skilled and qualified personnel are difficult to hire and retain in SMWU. Therefore, locals (preferably residing in the utility's service area) can be hired and trained to develop a long-term association with the utility. To obtain the performance level for this indicator, the BTF (based on expert opinion) is mapped over 10 to 100 training hours per employee per year.

5.3.3 Physical Assets Efficacy

This performance category evaluates the efficacy of physical assets, including raw and treated water reservoirs, metering level, and degree of automation. Based on the data obtained from the NWWBI-PR, a second-order polynomial function with a high R² value of 0.98 is established for PH4. The participating utilities (similar to other SMWU in Canada) primarily rely on source water quality, with primary chlorination as the only treatment. In this case, the storage reservoirs receiving this chlorinated water (without complete filtration) should be counted as treated water storage reservoirs.

Other important indicators not included in the NWWBI-PR are metering level (PH1), degree of automation (PH2), and raw water storage capacity (PH3). The significance of metering level is indubitable; it is also included in the IWA (2006) set of PIs and is one of the most important indicators required to estimate non-revenue water. These PIs must be included because, unlike in larger utilities, not all connections are metered nor all control units automated in SMWU. Linear BTFs, presented in Table 5.1, are developed for these indicators using literature values and expert opinion.

In smaller water supply systems, the catchment areas of surface water sources (e.g., creeks in the Okanagan Basin) are sometimes small and have a limited raw water storage capacity, which should be monitored using PH3 to meet the future water requirements of a growing population.
Based on expert opinion, the relationship for PH3 is mapped over 10 to more than 300 days.

5.3.4 Operational Integrity

Operational indicators are extremely important for developing the rehabilitation, renewal, and replacement plans for the different components of a utility's physical infrastructure, and hence for effective asset management. In this category, the PIs not included in the NWWBI are: percentage of mains replaced (OP2), mains rehabilitated (OP3), valves replaced (OP7), and cleaning of storage tanks (OP9). Theuretzbacher-Fritz et al. (2013) reported the annual rehabilitation rates of water mains as a percentage of mains length (i.e., OP3) for around 1300 water utilities of all sizes that participated in the Trans-National Water Supply Benchmarking Project in Austria and Germany. The value of this PI ranged from 0.1% to 2.0% for SMWU (with water supplied from less than 2 Mm³ to 8.0 Mm³). It is worth mentioning that values higher than 1.0% were observed for privately owned smaller systems (generating higher revenues) with water intakes of less than 2 Mm³. Consequently, literature values ranging from 0.1% to 1.0% are mapped against performance levels from 10 to 100, respectively. A similar approach of obtaining guideline values from the literature has been adopted for OP2 and OP7. Cleaning of treated water storage tanks (OP9) is indispensable for maintaining desirable drinking water quality; the total volume of the reservoirs (related to PH4) cleaned during the assessment year is used to calculate this indicator.

For the estimation of water loss or non-revenue water (NRW), units of 'liter/connection/day' have been used, based on the observation that water losses primarily occur at service connections (Hamilton et al. 2006). Recently, Lambert et al. (2014) reported that more than half of system leakage (i.e., 50 to 500 liters/service connection/day) in water utilities occurs at service connections.
Unauthorized water consumption has not been highlighted as a major issue in the participating utilities. At the start of the benchmarking process in Canada (primarily for larger utilities) in 2007, the NRW ranged between 88 and 663 liter/connection/day; this range had improved to 32 to 383 liter/connection/day by the year 2011 (NWWBI 2013). Given the more limited monitoring of water loss and other O&M issues in Canadian SMWU, the BTF for water loss varies from 25 to 800 liter/connection/day (OP4, Table 5.1).

5.3.5 Water Quality and Public Health Safety

A moderately concise list of PIs has been included for SMWU in the prior phase of this research based on availability of data, health related significance, and comparability with other utilities. Most of the PIs have been included in NWWBI-PR, except residual chlorine in the distribution system (WP4). Residual chlorine is added to avoid any possibility of recontamination as a result of cross-connection within the distribution system. In addition, due to lower treatment levels in SMWU, the turbidity of water distributed to the community is sometimes higher than 1 NTU. Therefore, to improve microbiological water quality, higher chlorine doses are required, which result in taste and odor problems. A range from 0.1 to 4 mg/L of residual chlorine concentration in the distribution system is used to map the BTF for this indicator (WP4, Table 5.1).

The population affected by boil water advisories (WP1) is an indirect measure of the water quality status of any water utility. The main causes of a higher number of boil water advisories in SMWU are source water contamination, flushing of hydrants, construction, repair and maintenance works, equipment failure, and inadequate treatment. Unlike SMWU, as per the NWWBI 2013 public report, all of the participating utilities reported zero (nil) days with boil water notices (AECOM 2013).
Consequently, based on expert opinion, the BTF mapped over values from 0 to 2 days has been developed (WP1, Table 5.1).

According to the Health Canada Guidelines for Drinking Water Quality, surface waters should be treated to achieve turbidity less than 0.1 NTU at all times. Where this is not possible, for chemically assisted filtration, turbidity of 0.3 NTU should be maintained 95% of the time and should remain below 1 NTU at all times. The minimum requirement in the case of slow sand filtration is 1 NTU 95% of the time, and turbidity should never exceed 3.0 NTU (Health Canada 2012). The British Columbia Ministry of Environment established 5 NTU as the upper limit for drinking water at the consumer's tap; however, for unfiltered water supplies, a boil water notice should be issued when the turbidity exceeds 1 NTU (BCMoE 1997). As per the WHO (2011) drinking water quality guidelines, in small water supplies with limited resources, turbidity should not exceed 5 NTU and should, if at all possible, remain below 1 NTU. Considering the above mentioned guidelines and limits reported in literature, the BTF has been established for a turbidity range from 0.1 to 5 NTU mapped over 10 to 100 performance levels (WP2, Table 5.1).

Health Canada (2013) established a maximum allowable concentration of nitrates of 45 mg/L for drinking water; higher concentrations can cause blue baby disease. To assess the performance level of this indicator (WP8, Table 5.1) from 10 to 100, a range of nitrates between 0.1 and 45 mg/L has been used for the BTF.

Health Canada (2012) has established a maximum allowable concentration of 0.1 mg/L (100 μg/L) for THMs in drinking water supplies. The reported minimum, median, and maximum values of 0, 0.025, and 0.245, respectively, in the NWWBI public report have been used to develop an exponential relationship for calculating the performance level of this indicator (WP7, Table 5.1).
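BTFs of this kind can be prototyped as simple mapping functions. The sketch below is illustrative only: it assumes a linear BTF for the turbidity indicator WP2 (0.1 NTU maps to level 100 and 5 NTU to level 10, since lower turbidity is better) and an exponential BTF of the generic form PL = 100·e^(−k·PI), with the coefficient k = 0.053 borrowed from the QS4 example later in this chapter; the function names are assumptions, not the thesis's implementation.

```python
import math

def linear_btf(x, x_best, x_worst, pl_best=100.0, pl_worst=10.0):
    """Linearly map an indicator value onto a 10-100 performance level.

    x_best / x_worst are the indicator values assigned the best and
    worst performance levels; values outside that range are clipped.
    """
    # Fraction of the way from the worst value to the best value
    t = (x - x_worst) / (x_best - x_worst)
    t = min(max(t, 0.0), 1.0)  # clip to the applicable range
    return pl_worst + t * (pl_best - pl_worst)

def exponential_btf(x, k=0.053, pl_max=100.0, pl_min=10.0):
    """Exponentially decaying BTF, PL = 100*exp(-k*x), floored at 10."""
    return max(pl_min, pl_max * math.exp(-k * x))

# WP2 (turbidity): 0.1 NTU is best (level 100), 5 NTU is worst (level 10)
print(round(linear_btf(0.1, x_best=0.1, x_worst=5.0)))  # 100
print(round(linear_btf(5.0, x_best=0.1, x_worst=5.0)))  # 10
# QS4 (unplanned interruptions per 100 km), exponential form
print(round(exponential_btf(0.36)))  # 98
```

Either form can be clipped or floored so the resulting performance level always stays within the 10 to 100 band used throughout the IU-PBM.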
5.3.6 Quality of Service Reliability

The PIs included in the IU-PBM, in addition to the PIs in NWWBI-PR, are billing complaints (QS1), population coverage (QS6), total response to complaints (QS7), service connection complaints (QS8), and aesthetic, microbiological, and physico-chemical test compliance (QS9, QS10, and QS11). The BTF for billing complaints covers 0 to 4.5 billing complaints per 1000 connections mapped over the 10 to 100 performance level. The performance level for pressure complaints is plotted for 0 to 10 complaints per 1000 people served. An efficient response to reported complaints (QS7) is extremely important to ensure customer satisfaction. This indicator includes the complaints which were resolved with an acceptable customer satisfaction level. For now, the percentage of total complaint responses, mapped over 70 to 100%, has been included in the BTF for this indicator (QS7, Table 5.1).

Overall compliance of the tested water samples with the promulgated water quality standards is a simple measure for public reporting. These PIs (QS9, QS10, and QS11, also recommended in the IWA Manual of Best Practice 2006) cover aesthetic, microbiological, and physico-chemical water quality parameters (Alegre 2006), and are plotted from 65 to 100% compliance against the 10 to 100 performance level.

5.3.7 Economic and Financial Stability

In this category, the BTFs for revenue per unit of supplied water (FE3), operating cost coverage ratio (FE5), debt service ratio (FE6), and NRW by volume (FE7) have been established based on reported literature and expert opinion. Revenue per unit volume supplied (FE3) has been mapped over 0.3 to 0.75 $/m3 to calculate the performance score of FE3 (Lange and Hassan 2006). The reported range of O&M cost per km of water mains (FE2) is adjusted for the smaller economies of scale (i.e., smaller denominator) in SMWU; therefore, a linear BTF for this indicator has been developed for a range from 4 to 35 $/km.
Similarly, water rates (FE1) have also been adjusted between 200 and 700 $/year to establish a second order polynomial BTF for a typical residential connection using 250 m3/year of water (as included in NWWBI-PR).

Operating cost coverage (FE5) is the ratio of total annual operating revenues to total annual operating cost, and describes the financial performance of a water utility's operations. The minimum desirable value of 1.0 indicates that the utility has enough operating revenues to comfortably cover its operating expenses (UNC 2013). A second order polynomial BTF has been established for this indicator for a range between 0.5 and 1.2, versus a performance level varying from 10 to 100. The ability to pay debt service (FE6) has become critical for any utility, as debt has become an important instrument to capitalize utility operations. The debt service ratio (DSR) should not be less than 1.0, with a recommended ratio of between 1.25 and 2 to cover the possible risks due to changes in input costs (UNC 2013, WSP 2012). Based on these recommendations, a BTF has been established for a DSR range from 0.8 to 2.

Finally, NRW (FE7), as a percentage of total input volume, was also included as an indicator of the economic loss of water. NRW may not depict a realistic picture of water losses in the transmission and distribution systems of a water utility, but it does indicate the volume of water that was lost without being charged. NRW can therefore be considered a useful financial indicator (Kanakoudis & Tsitsifli 2010). The BTF for this indicator has been developed for a range between 0 and 50% of the system input volume.

5.4 A Case Study of the Okanagan Basin

To evaluate the practicality of the IU-PBM, the framework shown in Figure 5.3 has been implemented for two medium sized water utilities in the Okanagan Basin, British Columbia, Canada.
Land use is diverse yet broadly similar in both participating utilities, including residential, agricultural, commercial, public, and industrial uses, with little site specific variation in the area-wise distribution of each type of land use. The topography of the Okanagan area is rolling and hilly with medium to steep grades.

Utility-A serves 31,000 persons through 10,900 connections, primarily residential. The total length of water mains is 276 km. Five water supply systems (distinguished by raw water source) operate under Utility-A, with both domestic and agricultural customers ranging between 150 and 5000. All of these systems draw water from surface water sources, including lakes and creeks. Only one of the water supply systems (i.e., the relatively larger one) has a conventional water treatment plant with a full filtration system; in the remaining four systems, only chlorinated water is supplied to the consumers.

Utility-B is comparatively smaller, with 16,000 residents, 6400 domestic and agricultural connections, three water supply systems, and 152 km of water mains. All three water supply systems (each based on a separate raw water source) supply chlorinated water to the community without conventional surface water treatment. Both utilities have linear assets of different pipe materials, including steel, plastic, and cementitious materials. However, the water mains in Utility-A are, on average, older than those in Utility-B. Details of the data variables have not been provided in this research.

Data variables for calculating PIs were collected from the utilities' technical and financial management through personal communication, hydraulic models of water distribution systems, GIS maps, customer complaint forms and responses, water quality monitoring stations, financial inventories, and the utilities' master plans. The step-by-step implementation of the proposed benchmarking methodology is described below.
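Step 1 below converts the decision makers' rankings into weights using the Simos' method. A minimal sketch of that computation follows, assuming the simplified variant evident in Table 5.2 (average the ranks per PI, dense-rank the averages in ascending order so that ties share a position, and normalize the positions); the function name and data layout are illustrative.

```python
from statistics import mean

def simos_weights(rankings):
    """Simplified Simos' procedure.

    rankings: one list per decision maker, giving each PI a rank from
    1 (least important) to n (most important). Returns normalized weights.
    """
    # Average rank received by each PI across all decision makers
    averages = [mean(col) for col in zip(*rankings)]
    # Dense rank of the averages: PIs with equal averages share a position
    positions = {avg: i + 1 for i, avg in enumerate(sorted(set(averages)))}
    raw = [positions[avg] for avg in averages]  # non-normalized weights
    total = sum(raw)
    return [w / total for w in raw]

# Rankings of WE1..WE4 by three utilities and three experts (Table 5.2)
rankings = [
    [2, 3, 4, 1],  # Utility A
    [2, 4, 3, 1],  # Utility B
    [1, 3, 2, 4],  # Utility C
    [3, 4, 2, 1],  # Expert A
    [3, 4, 3, 2],  # Expert B
    [4, 3, 2, 1],  # Expert C
]
print(simos_weights(rankings))  # → [0.2, 0.4, 0.3, 0.1], as in Table 5.2
```

Running this on the water resources and environment rankings reproduces the normalized weights reported in Table 5.2 for WE1 to WE4.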
Step 1: Evaluation of weights of performance indicators under each performance category
Weights of the PIs in each performance category are calculated using the Simos' method, as described in Section 5.2. The list of selected PIs was sent to three participating utilities in the Okanagan Basin, including Utility-A and Utility-B. Utility managers and technicians ranked the PIs according to their importance from 1 to n, where n is the number of PIs in each category. Likewise, professionals working in asset management of water utilities also ranked the PIs. Results of the Simos' application are presented in Table 5.2. The first column of Table 5.2 gives the number of each PI within its performance category, while the indicators are described in the second column. The next columns list the importance ranks scored by the decision makers, followed by the average of these ranks, ordered from the least important to the most important indicator. Subsequently, the indicator with the maximum average rank is given the highest Simos' rank; PIs with the same average are allocated the same rank. Non-normalized weights of the indicators are presented in the following column.

The weights of all the PIs, which are essentially the positions of the Simos' ranks, are estimated for all the performance categories and listed in the last column of Table 5.2. These final weights are used to calculate aggregated performance indices for both participating utilities using the TOPSIS method described in Section 5.2. Due to space limitations, only the details of the quality of service index are described as an example in the following steps; nevertheless, final results for the other indices are included.

Step 2: Checking the need for normalization
In the present study, the BTFs have already been established as the benefit criteria.
The performance levels of all the PIs range from 10 to 100; thus, there is no need for normalization. The non-normalized performance scores of the PIs in the 'quality of service' category are calculated using the BTFs presented in Appendix B. For example, the indicator "QS4 – number of unplanned interruptions per 100 km" due to water main breaks, pump failure, hydrant failure, or valve failure is calculated using the BTF developed in Table 5.1 as:

(PL)QS4 = 100 e^(−0.053 × QS4)

The values of QS4 were found to be 0.36 and 3.49 for Utility A and Utility B, respectively. Substituting these values in the above equation, the performance levels for QS4 come out to be:

(PL)QS4 for Utility A = 98
(PL)QS4 for Utility B = 87

Likewise, the performance levels for all the remaining PIs in the quality of service category are calculated and arranged in the form of the following matrix:

           QS1   QS2   QS3   QS4    QS5    QS6    QS7    QS8   QS9    QS10  QS11
Utility A  54.0  59.0  73.0  98.0   100.0  69.0   100.0  92.0  100.0  93.0  92.0
Utility B  57.0  91.0  68.0  87.0   70.6   100.0  100.0  49.0  10.0   28.0  99.0

Table 5.2 Weight estimation using Simos' Method

PI No. | Performance Indicator (PI) | Ranks (U-A, U-B, U-C, E-A, E-B, E-C) | Avg. | Simos' rank | Non-norm. wt. | Norm. wt.

WE – WATER RESOURCES AND ENVIRONMENT (4 PIs; sum of non-normalized weights = 10; sum of normalized weights = 1)
WE-1 | No. of days of water restriction | 2, 2, 1, 3, 3, 4 | 2.5 | 2 | 2 | 0.20
WE-2 | Per capita water consumption – Domestic users | 3, 4, 3, 4, 4, 3 | 3.5 | 4 | 4 | 0.4
WE-3 | Existing annual water license capacity – Utilized | 4, 3, 2, 2, 3, 2 | 2.7 | 3 | 3 | 0.3
WE-4 | Impact of residual chlorine in flushing water on aquatic life | 1, 1, 4, 1, 2, 1 | 1.7 | 1 | 1 | 0.1

PE – PERSONNEL (7 PIs; sum of non-normalized weights = 28; sum of normalized weights = 1)
PE-1 | Field FTEs – Distribution | 3, 7, 7, 7, 6, 7 | 6.2 | 7 | 7 | 0.25
PE-2 | Field FTEs – Metering | 2, 1, 4, 5, 3, 1 | 2.7 | 2 | 2 | 0.07
PE-3 | Field accidents | 7, 5, 2, 4, 5, 5 | 4.8 | 5 | 6 | 0.21
PE-4 | Sick days taken by FTEs | 6, 4, 5, 6, 7, 6 | 5.5 | 6 | 5 | 0.18
PE-8 | Water resources and catchment management employees | 1, 2, 1, 2, 2, 3 | 1.8 | 1 | 1 | 0.04
PE-9 | Overtime of FTEs | 4, 6, 6, 1, 4, 2 | 3.8 | 4 | 4 | 0.14
PE-11 | Personnel training hours | 5, 3, 3, 3, 1, 4 | 3.2 | 3 | 3 | 0.11

PH – PHYSICAL ASSETS (5 PIs; sum of non-normalized weights = 15; sum of normalized weights = 1)
PH-1 | Metering level | 2, 2, 2, 3, 2, 2 | 2.0 | 2 | 2 | 0.20
PH-2 | Degree of automation | 1, 1, 3, 2, 1, 1 | 1.0 | 1 | 1 | 0.10
PH-3 | Raw water storage capacity | 4, 3, 1, 1, 3, 3 | 2.75 | 3 | 3 | 0.30
PH-4 | Treated water storage capacity | 3, 4, 4, 4, 4, 4 | 3.75 | 4 | 4 | 0.40

OP – OPERATIONAL (9 PIs; sum of non-normalized weights = 37; sum of normalized weights = 1)
OP-1 | No. of main breaks | 9, 9, 9, 9, 9, 8 | 8.8 | 8 | 8 | 0.216
OP-2 | Mains replaced | 5, 8, 1, 5, 1, 9 | 4.8 | 5 | 5 | 0.135
OP-3 | Mains rehabilitation/renovation | 4, 6, 3, 2, 6, 7 | 4.7 | 4 | 4 | 0.108
OP-4 | Non-revenue water | 7, 7, 2, 7, 7, 6 | 6.0 | 6 | 6 | 0.162
OP-5 | Service connection rehabilitation | 2, 3, 5, 6, 3, 4 | 3.8 | 2 | 2 | 0.054
OP-6 | Inoperable or leaking hydrants | 8, 4, 8, 8, 8, 5 | 6.8 | 7 | 7 | 0.189
OP-7 | Replaced valves | 1, 5, 4, 1, 5, 2 | 3.0 | 1 | 1 | 0.027
OP-8 | Hydrant inspection | 6, 2, 6, 3, 4, 3 | 4.0 | 3 | 3 | 0.081
OP-9 | Cleaning of treated water storage tanks | 3, 1, 7, 4, 2, 1 | 3.0 | 1 | 1 | 0.027

WP – WATER QUALITY AND PUBLIC HEALTH (6 PIs; sum of non-normalized weights = 21; sum of normalized weights = 1)
WP-1 | Days with boil-water advisory | 5, 6, 3, 4, 6, 5 | 4.8 | 5 | 5 | 0.24
WP-2 | Turbidity – Distribution | 3, 4, 4, 5, 5, 4 | 4.2 | 4 | 4 | 0.19
WP-3 | Total coliforms – Distribution | 6, 5, 6, 6, 4, 6 | 5.5 | 6 | 6 | 0.29
WP-4 | Residual chlorine – Distribution | 4, 2, 5, 3, 3, 3 | 3.3 | 3 | 3 | 0.14
WP-5 | THMs – Distribution | 2, 3, 1, 1, 2, 1 | 1.8 | 2 | 2 | 0.10
WP-6 | Cumulative length of mains cleaned | 1, 1, 2, 2, 1, 2 | 1.3 | 1 | 1 | 0.05

QS – QUALITY OF SERVICE (11 PIs; sum of non-normalized weights = 62; sum of normalized weights = 1)
QS-1 | Billing complaints | 3, 7, 3, 2, 1, 1 | 2.8 | 2 | 2 | 0.03
QS-2 | Pressure complaints | 2, 6, 2, 5, 4, 10 | 4.8 | 4 | 4 | 0.06
QS-3 | Water quality complaints | 5, 9, 6, 9, 8, 8 | 7.5 | 7 | 7 | 0.13
QS-4 | Unplanned interruptions | 9, 8, 9, 11, 10, 11 | 9.7 | 10 | 10 | 0.16
QS-5 | Unplanned maintenance hours | 8, 11, 8, 8, 9, 7 | 8.5 | 9 | 9 | 0.15
QS-6 | Population coverage | 1, 5, 1, 1, 2, 2 | 2.0 | 1 | 1 | 0.02
QS-7 | Total response to reported complaints | 7, 2, 7, 6, 3, 6 | 5.2 | 5 | 5 | 0.08
QS-8 | Service connection complaints | 4, 10, 5, 4, 7, 4 | 5.7 | 6 | 6 | 0.10
QS-9 | Aesthetic test compliance – Distribution | 6, 4, 4, 3, 5, 3 | 4.2 | 3 | 3 | 0.05
QS-10 | Microbiological test compliance – Distribution | 11, 3, 11, 10, 11, 5 | 8.5 | 8 | 8 | 0.14
QS-11 | Physical-chemical test compliance – Distribution | 10, 1, 10, 7, 6, 6 | 7.2 | 7 | 7 | 0.11

FE – FINANCE AND ECONOMIC (6 PIs; sum of non-normalized weights = 21; sum of normalized weights = 1)
FE-1 | Water rates | 3, 5, 6, 6, 4, 6 | 5.0 | 6 | 6 | 0.29
FE-2 | O&M cost per km of water mains | 5, 3, 3, 5, 5, 4 | 4.2 | 4 | 4 | 0.19
FE-3 | Revenue per unit volume of supplied water | 2, 6, 4, 4, 6, 2 | 4.0 | 3 | 3 | 0.14
FE-4 | Operating cost coverage ratio | 1, 2, 2, 2, 1, 3 | 1.8 | 2 | 2 | 0.10
FE-5 | Debt service ratio | 6, 4, 5, 3, 3, 5 | 4.3 | 5 | 5 | 0.24
FE-6 | NRW by volume | 4, 1, 1, 1, 2, 1 | 1.7 | 1 | 1 | 0.05

U = Utility; E = Expert

Step 3: Development of the weighted matrix
From the weights of the PIs listed in Appendix C, the weighted matrix is developed using equation [5.2] by multiplying the performance levels calculated in Step 2 with their corresponding weights:

           QS1   QS2   QS3   QS4    QS5    QS6   QS7   QS8   QS9   QS10   QS11
Weights    0.03  0.06  0.11  0.16   0.15   0.02  0.08  0.10  0.05  0.13   0.11
Utility A  1.62  3.54  8.03  15.68  15.00  1.38  8.00  9.20  5.00  12.09  10.12
Utility B  1.71  5.46  7.48  13.92  10.59  2.00  8.00  4.90  0.50  3.64   10.89

Step 4: Identify positive-ideal and negative-ideal solutions
The X* and X− are defined as the PIS (100) and NIS (10)
in terms of weighted performance levels using equations [5.3] and [5.4]. For example, the PIS and NIS for QS1 are calculated as:

PIS(QS1) = 100 × 0.03 = 3.0
NIS(QS1) = 10 × 0.03 = 0.3

In the same way, the weighted PIS and NIS are calculated for all the PIs in the quality of service category:

      QS1  QS2  QS3  QS4  QS5  QS6  QS7  QS8  QS9  QS10  QS11
PIS   3.0  6.0  11   16   15   2    8    10   5    13    11
NIS   0.3  0.6  1.1  1.6  1.5  0.2  0.8  1.0  0.5  1.3   1.1

Step 5: Calculating the distance of each water utility from the PIS and NIS
The distances of the performance scores in the quality of service category for each water utility are measured by the n-dimensional Euclidean distance. The combined distances of all the PIs from the weighted PIS values shown in Step 4 are calculated using equation [5.5] for each utility. For instance, YA* is estimated as:

YA* = √((1.62 − 3)² + (3.54 − 6)² + ⋯ + (10.12 − 11)²) = 4.4

Similarly, the distance from the PIS for Utility B is calculated, and the final results are:

YA* = 4.4
YB* = 13.1

The distances of each water utility from the weighted NIS are calculated from equation [5.6]:

YA− = 27.9
YB− = 21.7

Step 6: Develop the aggregate performance index by calculating similarity to the PIS
The final performance indices of the quality of service category are calculated for both utilities using equation [5.7] and multiplying the result by 100. For example, the quality of service index (QSI) for Utility A is calculated as:

QSIA = (YA− / (YA* + YA−)) × 100 = (27.9 / (4.4 + 27.9)) × 100 = 86.4

QSIB is calculated in the same way, and the final indices are:

QSIA = 86.4
QSIB = 62.4

Similarly, the remaining performance indices WEI, PEI, PHI, OPI, WPI, and FEI have been calculated for both water utilities for the assessment year 2012, and the results are shown in Figure 5.5a&b.
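Steps 2 to 6 can be reproduced end to end for the quality of service category. The sketch below uses the performance levels and Step-3 weights from the worked example; the function and variable names are illustrative. Computed at full precision, the closeness indices come out to roughly 86.3 and 62.4 for the two utilities.

```python
import math

# Performance levels (Step 2) and weights (Step 3) for the quality of
# service category, taken from the worked example in this chapter.
weights = [0.03, 0.06, 0.11, 0.16, 0.15, 0.02, 0.08, 0.10, 0.05, 0.13, 0.11]
levels = {
    "Utility A": [54, 59, 73, 98, 100, 69, 100, 92, 100, 93, 92],
    "Utility B": [57, 91, 68, 87, 70.6, 100, 100, 49, 10, 28, 99],
}

def topsis_index(pl, w, best=100.0, worst=10.0):
    """Aggregate weighted performance levels into a 0-100 TOPSIS index."""
    weighted = [p * wi for p, wi in zip(pl, w)]
    pis = [best * wi for wi in w]     # positive-ideal solution (Step 4)
    nis = [worst * wi for wi in w]    # negative-ideal solution (Step 4)
    d_pis = math.dist(weighted, pis)  # Euclidean distances (Step 5)
    d_nis = math.dist(weighted, nis)
    return 100 * d_nis / (d_pis + d_nis)  # closeness to the PIS (Step 6)

for name, pl in levels.items():
    print(name, round(topsis_index(pl, weights), 1))
# → Utility A 86.3, Utility B 62.4
```

Because the performance levels are bounded by the BTFs, the ideal and anti-ideal points are fixed at 100 and 10 times the weights, which keeps the index comparable across categories and assessment years.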
The proposed management actions against an estimated performance index are presented in Table 5.3. Figure 5.5 shows that the overall performance of both utilities is satisfactory, with performance indices lying in the 'Medium' or 'High' ranges (i.e., index values higher than 50), except for the water quality and public health safety category in the case of Utility-B (Figure 5.5b). This is due to relatively objectionable source water quality in one of its water supply systems. The utility changed the source in FY 2014, which has improved its WPI. The same issue has also affected the QSI of Utility-B due to lower performance levels for QS9 and QS10.

[Figure 5.5: web (radar) diagrams of the aggregated performance indices over the seven functional components — WR & environmental sustainability, personnel adequacy, physical assets efficacy, operational integrity, WQ & PH safety, quality of service, and financial stability — on a 10–100 scale with the Very Low / Low / Medium / High bands at 30, 50, and 80.]

Figure 5.5 Aggregated performance indices for all the functional components: (a) performance indices of Utility A; (b) performance indices of Utility B

Table 5.3 Description of performance levels with proposed actions

Index   Performance  Proposed action
10–30   Very low     Urgent and detailed improvement required for several PIs in this category
30–50   Low          Detailed investigation required for underperforming PIs at the intra-utility level
50–80   Medium       Careful investigations required at the inter-/intra-utility level to identify lacking sub-components
>80     High         Satisfactory performance needs to be maintained at the inter-utility level

Utility-A, however, is showing relatively better performance overall.
However, apart from the quality of service and personnel adequacy categories, the management of Utility-A needs to investigate the underperforming PIs to further improve the performance of the utility (Figure 5.5a). Conversely, Utility-B needs to do the same exercise for all of the performance categories. Furthermore, indices just approaching or slightly exceeding the good performance zone (i.e., around 60) also need detailed investigation for future performance management.

Performance indices estimated using the above approach describe the condition and efficiency of each functional component of an SMWU. Based on the existing efficiency, utility managers can make decisions about essential improvements, e.g., hiring additional personnel, increasing the main replacement rate, increasing coverage, improving the metering level, or implementing a water conservation plan. Performance lower than 'Medium' certainly needs intra-utility performance assessment of the respective functional component, which is described in Chapter 6.

The IU-PBM in this study has been developed through a comprehensive review of reported performance assessment studies of utilities around the world. Therefore, the model can be effectively used for performance benchmarking of SMWU with populations ranging from fewer than 10,000 up to 50,000 (using a minimum number of PIs) in any region. However, the IU-PBM is a data driven model based on piecewise continuous functions, and it works most efficiently for the applicable ranges of PIs given in Appendix A. Managers can adopt the methodology and, if required, adjust the BTFs for site specific socio-economic and geographical conditions and for actual benchmarking data from their utilities. Likewise, the weights of the PIs can be re-evaluated following the approach used in the present study. Uncertainties always exist in available data; this issue can be more significant in the case of SMWU due to limited resources and inefficient data management practices.
Presently, the IU-PBM cannot handle these uncertainties and needs to be enhanced in the future.

5.5 Summary

In this chapter, an inter-utility performance benchmarking model (IU-PBM) is developed for SMWU. This entails consideration of 47 performance indicators (PIs) across different performance categories: water resources and environment, personnel, physical assets, operational, quality of service, water quality and public health, and finance. Calculating performance levels by simply comparing the calculated value of a PI against the best and worst performing utilities in a benchmarking exercise might be misleading, because this approach does not consider the average performing utilities in the evaluation process. The non-linear approach used in the IU-PBM sufficiently addresses this issue. Therefore, 47 (linear, exponential, logarithmic, and polynomial) benchmarking transformation functions have been established, based on literature, NWWBI reports, and expert judgment, to translate the calculated PIs into performance levels between 10 and 100. The weights are estimated using the Simos' method from the rankings of the PIs by different water utilities in the Okanagan Basin, British Columbia, Canada, and from the opinions of experts working in water infrastructure management. The proposed approach accommodates wide variations in the calculated values of PIs, being mindful of the smaller economies of scale in SMWU as compared to larger water utilities.

Finally, performance indices have been established by aggregating the transformed performance levels using the TOPSIS method (i.e., based on the concept of relative closeness to the most desirable solution and remoteness from the least desirable solution). The IU-PBM results, presented in the form of a web diagram, demonstrate the utility's performance to top level management for pragmatic decision making. The proposed model has also been implemented for two SMWU operating in the Okanagan Basin to demonstrate its practicality.
Chapter 6     Intra-utility Performance Management Model (In-UPM)

A part of this chapter is under review in the Journal of Cleaner Production as an original research article titled "Intra-utility Performance Management Model (In-UPM) for the Sustainability of Small to Medium Sized Water Utilities: Conceptualization to Development" (Haider et al. 2015c).

If one or more functional components are not meeting the desired LOS (i.e., not performing 'High') according to the IU-PBM results, utility managers can use the model developed in this chapter for performance management of SMWU.

6.1 Background

A water utility consists of different functional components (or processes): water resources and environment, personnel, physical assets, operations, quality of service, water quality and public health, and economics and finance. Each of these components consists of sub-components; for example, 'personnel' may comprise staff health and safety, overtime culture, training hours, etc. For a sustainable water utility, all the functional components and their sub-components need to meet desired performance objectives. The Federation of Canadian Municipalities (FCM) and the National Research Council (NRC), Canada, described benchmarking as the mapping of one's own process and the subsequent comparison of that process with those of other companies showing exemplary performance in a similar process (FCM/NRC 2005). Based on inter-utility performance benchmarking results, the utility can hone in on the performance of different sub-components within a functional component in order to identify the key areas for improvement; this process can be defined as intra-utility performance assessment.

Furthermore, a water utility might sometimes operate more than one water supply system (WSS) at a time because of geographical limitations and the availability of source water. A WSS may have a separate water source, transmission, treatment, and distribution network.
It is also useful for the utility to evaluate the performance of each WSS individually for prioritizing its short- and long-term investments. In this regard, a less addressed issue for intra-utility performance assessment, so far, is the identification of underperforming WSSs within a utility.

Performance indicators (PIs) are typically used to measure the performance of a program in terms of percentages or an index (score), which can be monitored at pre-defined intervals and compared to one or more criteria or standards (Office of Public Management, New South Wales [OPM] 1990). A comprehensive literature review of PIs for water utilities has been conducted in Chapter 2. Most of the existing performance assessment methods are based on involving similar water utilities over several years, which is not the case for Canadian SMWU. Moreover, aggregating all the PIs to estimate the overall performance of a functional component can eclipse the underlying processes (sub-components). In the absence of benchmarking data, an inter-utility performance benchmarking model for SMWU was developed in Chapter 5. However, at this point, there is no model or study available for intra-utility performance assessment of SMWU which can:

• provide a systematic approach to identify underperforming functional components, giving the utility's managers an opportunity to make rational and timely decisions;
• handle the uncertainties in data variables/inputs and in the knowledge base used to evaluate PIs and performance measures at the sub-component and component levels; and
• differentiate under- and over-performing WSSs within a water utility to plan short-term and long-term investments for overall performance improvement.

The overall objective of this chapter is to develop an intra-utility performance management model (In-UPM) for the sustainability of SMWU which can address the above issues.
6.2 Establishing Performance Assessment Criteria

In this research, the sustainability criterion for SMWU is defined as: "all the functional components of a water utility are desired to meet their respective performance objectives". In order to evaluate the performance of each functional component, a hierarchical top-down approach, consisting of different performance factors, is proposed. These factors include the performance objective of the functional component at the top, followed by primary and secondary performance measures (PMs) assessing the performance of the sub-components. The performance measures are derived from performance indicators (PIs), and the PIs are estimated from the data/decision variables at the bottom level. All of these performance factors are listed in Table 6.1. The first four columns of Table 6.1 contain the PMs and PIs for the seven functional components, while the last two columns list the data variables and decision actions, respectively. The objective and the primary and secondary performance factors are designated in italics, whereas the PIs (when first described in the text) are given with their corresponding numbers presented in the fourth column of Table 6.1. Details of the performance assessment criteria for each functional component are described in the following sections.

6.2.1 Water Resources and Environmental Sustainability

The functional component of 'water resources and environmental sustainability' is evaluated with the help of two primary level PMs, 'source water conservation' and 'environmental protection', and a PI of water license capacity (WE3), which adds input directly to the top level of the hierarchy (Table 6.1). Based on increasing water requirements and the implementation level of water conservation measures, utility managers should renew the water licenses of their WSSs.
The first primary level PM is 'source water conservation', which can be evaluated by comparing the existing 'water resources management' practices and the implementation level of a water conservation plan (WCP). The existing 'water resources management' practices are estimated from the PIs of water consumption (WE2), water restrictions (WE1), watershed management employees (PE8), and the water loss indicator (FE7) (refer to Table 6.1). In general, per capita water consumption decreases as the water price increases, particularly in the case of consumption based billing (Whitcomb 2005). This type of billing can conserve limited water resources and rationally recover operation and maintenance costs as well. Usually, people living in expensive homes use more water for several reasons, e.g., landscape irrigation, swimming pools, fixtures with higher flow rates, etc.

SMWU do not fully implement a well-structured WCP, which generally contains: i) a planned water loss and leakage control program; ii) consumption based metering and billing; iii) reducing water wastage by eliminating single-pass cooling; iv) reuse of non-contact cooling water and low-flow toilets; v) building codes that mandate minimum water efficiency requirements for fixtures; and vi) public education and awareness programs. As per some recent studies, a significant amount of water can be conserved with high efficiency appliances for domestic use (Gurung et al. 2015). However, in some SMWU, the WCPs are at some stage of implementation (i.e., developed but not yet implemented, or at the initial stages of implementation).
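The hierarchical top-down structure described above (objective → primary PMs → secondary PMs → PIs → data variables) can be represented as a simple tree. The sketch below is illustrative only: the node names follow the water resources fragment of Table 6.1, but the class design and the leaf-collecting helper are assumptions, not the thesis's implementation.

```python
from dataclasses import dataclass, field

@dataclass
class Factor:
    """A node in the performance hierarchy: objective, PM, PI, or variable."""
    name: str
    children: list = field(default_factory=list)

    def leaves(self):
        """Collect the bottom-level factors (e.g., data variables) under this node."""
        if not self.children:
            return [self.name]
        return [leaf for child in self.children for leaf in child.leaves()]

# Fragment of the 'water resources and environmental sustainability' hierarchy
objective = Factor("Water resources and environmental sustainability", [
    Factor("Source water conservation", [
        Factor("Water resources management", [
            Factor("WE1: Water restrictions",
                   [Factor("A6: Days with sprinkler regulations")]),
            Factor("WE2: Residential water consumption", [
                Factor("E1: Total resident population"),
                Factor("A1: Average annual demand"),
            ]),
        ]),
    ]),
    # WE3 feeds the top level of the hierarchy directly
    Factor("WE3: Existing water license capacity", [
        Factor("A1: Average annual demand"),
        Factor("A2: Existing annual water license capacity"),
    ]),
])

print(objective.leaves())  # the data variables feeding this objective
```

A structure like this makes the intra-utility drill-down mechanical: an underperforming objective can be traced through its PMs and PIs down to the data or decision variables that drive it.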
Table 6.1 Performance objectives, performance measures (PMs), performance indicators (PIs), and data variables
(Subscripts identify the factor; the superscript pair after the caret identifies the generation, e.g., P1,1,1^2,1 is a Generation-2 primary PM.)

Component 1: P1,1,0^1,0 – Water resources and environmental sustainability (performance objective)

Primary PMs:
P1,1,1^2,1 – Source water protection
P1,2,1^2,1 – Environmental protection

Secondary PMs:
P1,1,1^3,2 – Water resources management
P1,1,1^4,3 – Restrictions, consumption, and management
P1,2,2^3,2 – Impact of flushing water

Performance indicators:
P1,1,1^5,4 – WE1: Water restrictions (A6)
P1,2,1^5,4 – WE2: Residential water consumption (E1, A1)
P1,4,1^5,1 – WE3: Existing water license capacity (A1, A2)
P1,5,2^5,2 – WE4: Discharge of WTP residuals (A4, A5)
P1,7,2^5,3 – WE5: Effect of flushing water on aquatic life (A7, A8)
P1,3,1^5,2 – WE6: Implementation of water conservation plan
P1,6,2^5,3 – WE7: Distance between flushing point and natural drain – length of storm-water drain (A12)
P1,1,2^6,5 – FE1: Water rates (L2) (G7)
P1,2,1^6,4 – PE8: Water resources and catchment management personnel (L2) (A1, B2)
P1,3,1^6,3 – FE7: Non-revenue water (NRW) by volume (L2) (A1, A10)

Data variables:
A1: Average annual demand; A2: Existing annual water license capacity (WLC); A4: Amount of WTP residuals discharged into the natural environment; A5: Total residuals from WTP; A6: Days with sprinkler regulations; A7: Water volume in water body; A8: Amount of flushing water; A10: Revenue water; A12: Distance between the flushing point and water body; B2: Water resources and catchment management personnel; E1: Total resident population; G7: Water rate for typical residential connection

Decision actions:
A6: Increase days of water restrictions
A2: Apply for new water licenses
A4: Reduce or eliminate discharge of water treatment plant residuals
A7: Perform flushing when flows in natural water bodies are high
A8: Reduce the amount of flushing water by optimizing flushing durations
A12: Select flushing points located away from the natural water bodies as much as reasonably possible
G7: Rationally increase water rates keeping affordability in consideration
A13: Increase implementation level of WCP

Component 2: P2,1,0^1,0 – Personnel productivity (performance objective)

Primary PMs:
P2,1,1^2,1 – Personnel adequacy
P2,2,1^2,1 – Personnel health and safety
P2,3,1^2,1 – Working environment efficacy

Secondary PMs:
P2,1,1^3,2 – Catchment and treatment employees
P2,2,1^3,2 – Productivity ratio
P2,3,1^3,2 – Metering and distribution employees
P2,4,2^3,2 – Loss due to field accidents
P2,5,2^3,2 – Personnel healthiness
P2,6,3^3,2 – Overtime culture

Performance indicators:
P2,4,3^4,3 – PE1: Field FTEs – Distribution (D) (C1, B3)
P2,5,3^4,3 – PE2: Field FTEs – Metering (M) (B5, C2)
P2,6,4^4,3 – PE3: Lost hours due to field accidents (D) (B8, B9)
P2,9,5^4,3 – PE4: Sick days per employee (D) (B1, B3)
P2,1,1^4,3 – PE5: Field FTEs – Treatment (T) (A9, B4)
P2,7,4^4,3 – PE6: Lost hours due to field accidents (T) (B1, B10)
P2,10,5^4,3 – PE7: Sick days per employee (T) (B1, B4)
P2,2,1^4,3 – PE8: Field FTEs – Water resources and catchment management (A1, B2)
P2,11,6^4,3 – PE9: Overtime hours (D) (B1, B8)
P2,12,6^4,3 – PE10: Overtime hours (T) (B1, B10)
P2,13,3^4,2 – PE11: Personnel training (B1, B7)
P2,3,2^4,3 – PE12: Staff productivity
P2,8,2^4,3 – PE13: Implementation of health and safety plan
P2,1,2^6,3 – PH2: Degree of automation (L2) (C9, C10)

Data variables:
A1: Average annual demand; A9: Treated water supplied; B1: Total personnel; B2: Water resources and catchment personnel; B3: Field FTEs (D); B4: Field FTEs – Treatment (T); B5: Field FTEs – Metering (M); B7: Total training hours (D); B8: Field labour hours (T); B9: Lost hours due to accidents (D); B10: Field labour hours (T); B11: Lost hours due to accidents (T); B12: Sick leaves (D); B13: Sick leaves (T); B14: Overtime hours (D); B15: Overtime hours (T); C1: Pipes' length; C2: Meters installed; C9: Total control units; C10: Automated control units

Decision actions:
B2: Optimize field FTEs for water resources and catchment management
B3: Optimize field FTEs for distribution system operations
B4: Optimize field FTEs for treatment plant operations
B5: Optimize field FTEs for metering operations
B7: Increase personnel training hours in a year
B9: Reduce or eliminate time lost due to field accidents during distribution system operations
B11: Reduce or eliminate time lost due to field accidents during treatment plant operations
B14: Reduce overtime hours for distribution system operations by optimizing the staff
B15: Reduce or eliminate overtime hours for treatment plant operations by optimizing the staff
C10: Convert or replace the un-automated control units with automated ones
Note: Optimization of FTEs means increasing or decreasing the number of personnel to improve staff productivity (PE12)

Component 3: P3,1,0^1,0 – Physical systems efficacy (performance objective)

Primary PMs:
P3,1,1^2,1 – Storage and treatment systems capacity
P3,2,1^2,1 – Monitoring system integrity

Secondary PMs:
P3,1,1^3,2 – Storage capacity

Performance indicators:
P3,4,2^4,2 – PH1: Metering level (C2, C6)
P3,5,2^4,2 – PH2: Degree of automation (C9, C10)
P3,1,1^4,3 – PH3: Raw water storage capacity (A1, C4)
P3,3,1^4,2 – PH4: Treatment plant capacity (D1)
P3,2,1^4,3 – PH5: Treated water storage capacity (A1, C5)
P3,1,2^5,4 – WE2: Residential water consumption (L2) (A1, E1)
P3,2,2^5,4 – WE6: Implementation level of WCP (L2)
P3,1,1^6,5 – FE1: Water rates (L3) (G7)

Data variables:
A1: Average annual demand; A3: Average daily demand; C2: Meters installed; C4: Capacity of raw water reservoirs; C5: Capacity of treated water storage reservoirs; C6: Total number of service connections; C9: Total number of control units; C10: Automated control units; D1: Days WTP operated at greater than 90% capacity; G7: Water rate for typical residential connection

Decision actions:
C2: Increase metering
C10: Convert or replace the un-automated control units with automated ones
C4: Increase capacity of raw water storage
D1: Increase treatment plant capacity by adding additional units; careful judgment is required to evaluate maximum demand
G7: Rationally increase water rates keeping affordability in consideration to reduce water
consumption  A13: Increase implementation level of WCP to improve the remaining storage capacity for future needs     136 Table 6.1 (Cont’d) Performance objectives, performance measures (PMs), performance indicators (PIs), and data variables Generation 1 – Performance objective Generation 2 - Primary PMs  Generation 3&4 - Secondary PMs  Generation 5&6 - Performance Indicators  Data Variables / Decision Variables Decision Actions P4,1,01,0 - Operational Integrity  P4,1,12,1 - Distribution system integrity P4,2,12,1 - Distribution system performance P4,3,12,1 - Distribution network productivity P4,1,13,2 - Distribu