UBC Theses and Dissertations
Development and Implementation of a Priority Setting and Resource Allocation Evaluation Tool for Achieving High Performance

by William Hall

B.Sc., Simon Fraser University, 2011

A THESIS SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF MASTER OF SCIENCE in The Faculty of Graduate and Postdoctoral Studies (Health Care and Epidemiology)

THE UNIVERSITY OF BRITISH COLUMBIA (Vancouver)

October 2013

© William Hall, 2013

Abstract

Objective: Canadian healthcare decision makers are facing greater pressure in setting priorities and allocating resources. However, recent studies suggest that only about 50% of healthcare organizations follow a formal priority setting process, and that even fewer evaluate their processes to achieve ongoing improvement. This research developed an evaluation tool to help organizations identify the strengths and weaknesses of their process, and performed a meta-evaluation of the tool itself to inform future refinements.

Methods: A high performance framework for priority setting and resource allocation formed the foundation for this research. The framework was operationalized into an evaluation tool that took the form of a semi-structured interview. The tool was then implemented in a test organization. Data from this application were analyzed using template and content analysis, and organizational strengths and weaknesses were identified. At the end of each evaluation interview, debriefs with participants were used to inform refinements for future applications of the tool.

Results: The evaluation tool was successfully developed from the high performance framework, and was implemented through interviews with 27 members of the test organization. Strengths of the organization's process included the involvement of a strong leadership team and the use of a proposal assessment tool. Weaknesses included a lack of training and the presence of proposals that circumvented the formal process.
Refinements to the tool involved the formatting of interview questions as well as the addition of a new element and a new sub-element.

Conclusion: This research represents the first attempt at creating an evaluation tool using the high performance framework, and is novel in its application at a macro level within the test healthcare organization. Based on feedback from participants and the ability of the tool to capture relevant strengths and weaknesses of the organization's process, further application is warranted. Future implementation will also serve to further refine the tool itself.

Preface

This study to develop an evaluation tool was part of a larger Canadian Institutes of Health Research (CIHR) funded initiative to examine high performance in priority setting and resource allocation, led by Dr. Craig Mitton (University of British Columbia). The initiative was conducted over the course of three years and was divided into five phases: a literature review, a national survey, case studies, development of a high performance framework, and creation and implementation of an evaluation tool based on the framework. This thesis work focused on the final phases: to develop, implement, and refine the evaluation tool. The research team included: Dr. Stirling Bryan (University of British Columbia), Dr. Cam Donaldson (Glasgow Caledonian University), Dr. Jennifer Gibson (University of Toronto), Dr. Stuart Peacock (BC Cancer Agency), Mr. Neale Smith (Centre for Clinical Epidemiology and Evaluation), Dr. Stuart MacLeod (University of British Columbia) and Ms. Bonnie Urquhart (Northern Health).

In conjunction with the broader research team, Mr. Hall conducted the following research activities:

1. Development of the evaluation tool: Hall led the research in operationalizing the framework for high performance to create an evaluation tool that could be used to identify the strengths and weaknesses of a healthcare organization's priority setting process.
2. Evaluation tool implementation: Hall determined the sample selection strategy, and liaised with the test organization to conduct 27 semi-structured interviews using the evaluation tool by teleconference, video conference, and in person.
3. Data analyses and reporting: Hall analyzed the data from the interviews, and determined the strengths and weaknesses of the test organization's priority setting and resource allocation process using the evaluation tool. As part of a report to the senior management of the organization, Hall described the strengths and weaknesses of the organization's process and delivered recommendations for improvement.
4. Tool refinement: Using observations from interviews and data from debriefs with participants following interviews, Hall recommended refinements to the evaluation tool for future application.

Ethics approval for this study was secured from the UBC Behavioural Research Ethics Board (Certificate # H10-01541).

Table of Contents

Abstract
Preface
Table of Contents
List of Tables
List of Figures
List of Abbreviations
Acknowledgements
Dedication
Chapter 1: Introduction
  1.1 Background
  1.2 Normative Approaches
  1.3 Current State
  1.4 Evaluation Frameworks
  1.5 Evaluation Theory
    1.5.1 Methods Approach
    1.5.2 Use Focused Approach
    1.5.3 Values Focused Approach
  1.6 Evaluation of Priority Setting and Resource Allocation in Practice
Chapter 2: Study Objectives and Research Questions
Chapter 3: Methods
  3.1 Research Design
  3.2 Development of an Evaluation Tool
    3.2.1 The Balanced Scorecard Precedent
    3.2.2 Development Theoretical Framework
    3.2.3 Methods for Development of Tool
  3.3 Implementation of the Tool
    3.3.1 Methodological Tradition for Implementation
    3.3.2 Description of Test Organization's Process
    3.3.3 Sampling
    3.3.4 Interviews
  3.4 Refinement of the Tool
  3.5 Summary
Chapter 4: Results
  4.1 Development of the Tool
  4.2 Implementation of the Tool
    4.2.1 Strengths
      4.2.1.1 Public Engagement
      4.2.1.2 Criteria and Assessment Tool
      4.2.1.3 Frontline Staff Involvement
      4.2.1.4 Evidence Based
      4.2.1.5 Leadership Team
      4.2.1.6 Culture of Improvement
      4.2.1.7 Ability and Authority to Move Resources
      4.2.1.8 Summary of Strengths
    4.2.2 Weaknesses
      4.2.2.1 Lack of Formal Training and Education
      4.2.2.2 Communication
        4.2.2.2.1 Initial Messaging
        4.2.2.2.2 Process Transparency
        4.2.2.2.3 Feedback
      4.2.2.3 Process Timeline and Deadlines
      4.2.2.4 Lack of Monitoring and Oversight
      4.2.2.5 Coordination of the Process Across the Organization
      4.2.2.6 Program Budgeting
        4.2.2.6.1 Example of Successful Program Budgeting
    4.2.3 Recommendations
  4.3 Refinement of the Tool
    4.3.1 Observations During Interviews
    4.3.2 Results of Data Analysis
    4.3.3 Interview Debriefs
    4.3.4 Summary of Changes
Chapter 5: Discussion
  5.1 Development of the Tool
    5.1.1 Change in Structure
  5.2 Implementation of the Tool
  5.3 Refinement of the Tool
  5.4 Broader Context
    5.4.1 Limitations
    5.4.2 Future Research
Chapter 6: Conclusion
Bibliography
Appendices
  Appendix A Elements of High Performance in Organization-wide Resource Allocation
  Appendix B Conceptual Criteria from Sibbald et al.
  Appendix C Original Evaluation Tool - Senior Manager Version
  Appendix D Original Evaluation Tool - Middle Manager Version
  Appendix E Recommendations From Test Organization Evaluation Report
  Appendix F Adapted Evaluation Tool - Senior Manager Version
  Appendix G Adapted Evaluation Tool - Middle Manager Version

List of Tables

Table 3.1 Main Evaluation Theories
Table 3.2 Evaluation Tool Questions
Table 3.3 Comparison of PBMA in Theory to PBMA in Test Organization
Table 3.4 Sampling Matrix
Table 3.5 Evidence Used to Make Strength and Weakness Determinations
Table 3.6 Follow-up Questionnaire
Table 4.1 Pilot Organization Dashboard
Table 4.2 Strengths of the Test Organization's Priority Setting and Resource Allocation Process
Table 4.3 Weaknesses of the Test Organization's Priority Setting Process
Table 4.4 Recommendations for Improvement
Table 4.4 Version 1 and 2 of Outcomes Element #4
Table 4.5 Summary of Changes Made to Elements in Evaluation Tool

List of Figures

Figure 1.1 Total Health Expenditure, Canada, 1975 to 2012
Figure 1.2 Total Health Expenditure Per Capita, Selected Use of Funds, Canada, 1975-2012
Figure 1.3 Stages in a PBMA Priority Setting Process

List of Abbreviations

A4R - Accountability for reasonableness
PSRA - Priority setting and resource allocation
PBMA - Program budgeting and marginal analysis
SMT - Senior Management Team
VWG - Validation Working Group

Acknowledgements

This thesis work would have been a lot less colourful without my cubicle mates at C2E2, and a lot tougher without the kindness of my cohort at SPPH. Thank you for the cards, teas, and macaroons.
I would like to thank the members of my thesis committee, Craig Mitton, Jennifer Gibson and Stirling Bryan, for providing guidance and feedback throughout this project; your insight, enthusiasm and comments were invaluable. I would also like to extend my thanks to Neale Smith, without whom I would never have developed the deep appreciation for qualitative methods and data collection that I now hold. To everyone in the test organization who participated in this research: for your support and feedback, many thanks.

Finally, I would like to express my gratitude to Craig Mitton, my thesis supervisor, for guiding me through this research, for his support this past year, and for providing countless opportunities to grow as a professional. Thank you.

Thank you to my family: Mum and Pange, and Sarita too.

Dedication

For my dad

Chapter 1: Introduction

1.1 Background

Setting priorities and allocating resources across a range of possible activities are essential public sector functions (1). This is especially true in healthcare, where people's lives and well-being are concerned. In most countries, meeting the health needs of a population is the responsibility of health organizations that manage and administer health resources (2). As healthcare resources are scarce, some mechanism is required within health organizations to decide which services to fund and which not to fund (3). For example, should additional hospital resources be put into hip surgeries or diabetes prevention programs? Should more funding be allocated to rare diseases or be used to hire additional nurses for an emergency department? In essence, when there are more claims on resources than there are resources available, some form of priority setting must occur (4).
Regardless of fiscal climate, healthcare organizations face the need to set priorities and make resource allocation decisions based on strategic direction, accountability to the public, and their responsibility to meet the needs of the communities they serve. When new money is continually available to fund new treatments or programs, healthcare decision makers may face fewer limits on the initiatives they are able to invest in. Arguably, this may relieve some of the pressure to set priorities during times of resource abundance. In Canada, total healthcare expenditure more than doubled from $100 billion to just over $200 billion (in current dollars) between the late 1990s and 2010, as seen in Figure 1.1 (5).

Figure 1.1 Total Health Expenditure, Canada, 1975 to 2012 (Source: 2010 National Health Expenditure Database, Canadian Institute for Health Information)

As can be seen in Figure 1.1, a plateau in spending that mirrored the economic recession during the 1990s directly preceded the increase from 1996 to 2011. A similar observation can be noted between 2011 and 2012, albeit less clearly, as the impact of the most recent recession is still playing out. Nevertheless, in 2012 the Government of Canada announced that it would tie the Canada Health Transfer (CHT) to the expansion of the economy (6). While noting that the CHT represents only a portion of overall spending on public health care in Canada, the Canadian Parliamentary Budget Office (PBO) predicts that this change in policy will "ultimately bring the level of federal cash support to historical lows observed under the 1996-1997 period" (7). This shift from dramatically increased budgets to a stagnation of resource availability (with equal or greater demand) may be putting additional pressure on Canadian healthcare decision makers that was not felt during the time of sustained growth referred to above.
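As a rough back-of-envelope check of the growth figures cited above (illustrative only; the 13-year span is an assumption, since the text says only "the late 1990s" to 2010), a doubling of expenditure implies a compound annual growth rate of roughly 5.5%:

```python
# Illustrative check: total Canadian health expenditure roughly doubled,
# from about $100B to just over $200B, between the late 1990s and 2010.
# Assuming a 13-year span (1997-2010, an assumption not stated in the
# text), the implied compound annual growth rate (CAGR) is:

def cagr(start_value, end_value, years):
    """Compound annual growth rate over `years` periods."""
    return (end_value / start_value) ** (1.0 / years) - 1.0

rate = cagr(100e9, 200e9, 13)  # doubling over an assumed 13 years
print(f"implied annual growth: {rate:.1%}")  # about 5.5% per year
```

A shorter assumed span would push the implied rate higher, but under any reasonable reading the growth comfortably outpaced general inflation over the period.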
Even more importantly, since the majority of funding for healthcare services is provincial, decision makers in provinces that are facing deficits may feel additional pressure as well. Recent economic challenges in Alberta and Ontario illustrate how provinces can be forced to shift from a mentality of healthcare resource abundance to a mindset of greater restraint over a relatively short period of time (8)(9). This is also the case in British Columbia, with the 2012 provincial budget announcing decreases to the rate of growth of funding from 4.2% between 2009 and 2012 to 2.4% between 2012 and 2014 (10). While the pressure to set priorities experienced by individual healthcare decision makers at baseline may vary, the fact that resources are not as abundant as they were during the period from the mid-1990s to 2011 cannot be disputed. Given the necessity of resources to fund current and new services, it is conceivable that during these times of more limited funding healthcare decision makers will experience even greater pressure to set priorities when allocating resources.

In addition to constrained funding, healthcare decision makers are also facing increased costs of providing care. All three of the largest spending drivers, hospitals (29.1%), drugs (15.9%), and physicians (14.2%) (percentage shares of total expenditure in 2010), have shown per capita spending increases since the late 1990s (5). Figure 1.2 illustrates the actual and forecasted spending in each of these categories.

Figure 1.2 Total Health Expenditure Per Capita, Selected Use of Funds, Canada, 1975-2012 (Source: 2010 National Health Expenditure Database, Canadian Institute for Health Information)
With increasing costs, economic uncertainty, federal funding returning to historical lows, and ever-growing demands for new services and expensive technologies, the need to set priorities must remain top of mind if the healthcare system is to live within its means and make effective and sustainable choices going forward (11). This need to set priorities and allocate resources is a complex and difficult problem faced by healthcare decision makers, and a task that may not succeed if the proper tools are not available (2)(4). A study by Mitton and Donaldson (2002) suggests that decision makers have a general appreciation for what a formal priority setting process should be able to achieve. The following quote was taken from that study, and belongs to an administrator who was interviewed as part of the research.

"When we get another dollar [...], we should be able to say [...] where do I put this dollar [...] to get the maximum health gain." (12)

Despite this recognition by healthcare administrators, traditional approaches to resource allocation decisions have been heavily influenced by historical allocations and political factors. In the same 2002 study by Mitton and Donaldson, another administrator described internal political influence on their organization's priority setting and resource allocation process in the following way.

"It's a squeaky wheel process. Whoever is able to more clearly articulate their problem, or lobby for their group, or through some other form of power and influence impact whatever process is in place that year, [they] will come out with [their preferred] outcome." (12)

While it may be plausible to realize the important aims of maximizing health gain and having a "fair process" using an approach that largely rests on historical and political influences, several studies have shown that decision makers working within the healthcare system favour a more formal approach to priority setting and resource allocation (13)(14).
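The administrator's question above ("where do I put this dollar to get the maximum health gain") is, in economic terms, a marginal analysis problem: fund whichever option returns the most benefit per additional dollar. The sketch below is purely illustrative; the programs, costs, and benefit scores are hypothetical and do not come from this research or any real organization.

```python
# Minimal sketch of ranking growth options by marginal benefit per dollar.
# All programs, costs, and benefit scores below are hypothetical.

def rank_growth_options(options, budget):
    """Greedily fund options with the highest benefit per dollar.

    options: list of (name, cost, benefit_score) tuples (hypothetical).
    budget:  resources released elsewhere, available for reallocation.
    Returns (funded option names, total expected benefit score).
    """
    ranked = sorted(options, key=lambda o: o[2] / o[1], reverse=True)
    funded, total_benefit = [], 0.0
    for name, cost, benefit in ranked:
        if cost <= budget:       # fund it only if it fits the remaining pool
            budget -= cost
            funded.append(name)
            total_benefit += benefit
    return funded, total_benefit

# Hypothetical growth proposals: (name, cost in $000s, benefit score)
proposals = [
    ("hip surgery capacity", 400, 120),
    ("diabetes prevention", 250, 110),
    ("ED nursing hours", 300, 90),
]
funded, benefit = rank_growth_options(proposals, budget=600)
```

In real PBMA exercises the "benefit" side is far richer than a single score (locally weighted criteria, advisory panel judgment, evidence review), but the underlying logic of comparing options at the margin is the same.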
Understandably, making decisions based on "what was done last year" or "who is the best lobbyist" is a long way from a formal process and is more likely to lead to less benefit being realized and to less fair processes.

1.2 Normative Approaches

Decision makers have recognized the potential benefits of shifting from historically and politically dominated processes to more formal approaches to priority setting and resource allocation, and have expressed a desire for a more explicit, evidence-based framework to address priority setting (12)(15).

"No, I don't think [the system] works well. I think it works as well as it can without some more overarching framework in which to make those decisions." - Healthcare Administrator (12)

In response to this need, frameworks have been proposed to aid decision makers, including "accountability for reasonableness" (A4R) and "program budgeting and marginal analysis" (PBMA) (16)(17). A4R is an ethical approach to priority setting that focuses on the "fairness" of how resource allocation decisions are made (18)(19).

Daniels & Sabin proposed the four original conditions of A4R (20), and in 2005 Gibson et al. introduced a fifth condition, empowerment (18).

1. Relevance: Decisions should rest on evidence, reasons, and principles that "fair-minded" stakeholders can agree are relevant under the circumstances
2. Publicity: The rationale for decisions and their outcomes should be made publicly accessible
3. Appeals/Revision: There should be opportunities to revisit and revise decisions if further evidence or arguments become available, and a mechanism should exist to challenge and dispute resolutions
4. Empowerment: Efforts should be made to optimize opportunities for participation, and to minimize power differences among those included in the process
5. Enforcement: There should be a voluntary or regulatory mechanism that ensures the first four conditions are met

While A4R emphasizes fair processes, PBMA focuses on allocating resources to optimize health and non-health benefits through a practical application of the economic principles of opportunity cost (the forgone benefit of the next best available option) and marginal analysis (the benefit gained or lost from adding or subtracting the next unit of resource for a given program) (3). This approach highlights that unless these core economic principles are considered, benefit will not be maximized for the given resources (14). The general stages of PBMA are shown in Figure 1.3 below.

Figure 1.3 Stages in a PBMA Priority Setting Process (Source: Health care priority setting: principles, practice and challenges (2))

1) Determine the aim and scope of the priority setting exercise
2) Compile a program budget (i.e. a map of current activity and expenditure)
3) Form a marginal analysis advisory panel
4) Determine locally relevant decision making criteria, with:
   a. Decision maker input
   b. Board of Directors input
   c. Public input
5) Advisory panel to identify options in terms of:
   a. Areas for service growth
   b. Areas for resource release through producing the same level of output (or outcomes) with fewer resources

Through the application of these frameworks over many years, studies have described factors relevant to success in priority setting and resource allocation, including barriers and facilitators. The presence of a high-level champion, adequate resources, external expert involvement, stakeholder engagement, and training for decision makers represent a few of the identified factors (13)(21)(22). By implementing these approaches and incorporating additional facilitators, some Canadian organizations have been able to create more transparent methods for setting priorities in ways that are fairer and that aspire to maximize the benefit of the resources allocated. Achievements from these efforts have included the re-allocation of millions of dollars from low to high priority services, deficit reductions in the millions of dollars, and feedback from decision makers reporting a much more rational and fair process (23)(24)(25). Although we cannot empirically claim that more formal processes in these organizations necessarily resulted in a higher quality of service for the populations they serve as compared to less formal processes, the outcomes achieved and the reports from the decision makers involved certainly suggest that a formal approach has allowed for more explicit decision making and more sustainable decisions.

1.3 Current State

In order to explore the current state of priority setting and resource allocation in Canadian healthcare organizations, a survey was conducted as part of the second phase of the larger CIHR project that encompassed this thesis work. Ninety-two senior decision makers from healthcare organizations across every province and territory in Canada responded with information about their organization's process for priority setting and resource allocation (26). Despite the advent of these frameworks and the achievements of various organizations, the survey found that 50% of participants reported conducting organizational priority setting primarily on the basis of historical patterns or political influences rather than using formal approaches (26). As part of the survey, respondents were asked to characterize their organization's priority setting and resource allocation process by choosing from a set of "process descriptions", each corresponding to a particular approach.
Formal process description options included the following (percentages represent the portion of respondents who felt the description most closely matched their organization):

- Our entire budget is reassessed each year, and all department and program spending needs to be justified in terms of whether or not it meets the organization's priorities - 14.4%
- We have a formal process that we use to set priorities and allocate resources. Everyone knows what the rules are, and how and why decisions are made. For the most part, strong evidence is needed to justify all spending decisions - 36.7%

Descriptions of historically and politically based processes included the following:

- Each department and program expects to receive about the same amount as in past years. Much of our money is tied up in things that were historically important services, but the organization is slow to adjust its spending to meet changing needs and times - 24.4%
- The squeaky wheel gets the grease. It seems like additional money goes to those departments and programs that complain the most loudly, and they are also the best at avoiding any cuts. Their arguments aren't necessarily always evidence based - 2.2%
- Our spending pattern is almost entirely determined by provincial or federal government requirements and expectations. We have very little real freedom to decide which programs or services will be funded, and to what degree - 22.2%

A notable finding was that those who reported having a formal priority setting approach were more likely to report their organization's priority setting processes as fair, and to be generally more satisfied with the priority setting process, than those who reported having a historical or political approach (26).
This suggests that there may be advantages to having a more systematic, explicit, and transparent priority setting process if fairness or decision-maker satisfaction are outcomes or indicators of a successful priority setting process. Therefore, a significant portion of respondents characterizing their organization's process as "formal" is arguably a step in the right direction, but the remaining organizations' approaches suggest that we still have a ways to go in creating more transparent processes that optimize available resources. Indeed, the achievements and reports from organizations with formal processes (e.g., re-allocation of millions of dollars from low to high priority services, feedback from decision makers that reported a much more rational and fair process) and the weaknesses that have been reported among organizations without formal processes (e.g., lack of engagement, lack of transparency) (12)(15)(23)(24)(25) suggest that resources (which, as argued above, are becoming increasingly limited) could be allocated in a more optimal fashion across many healthcare organizations in Canada by adopting more formal approaches.

This Canadian survey also identified wide variability between aspects of the respondents' organizational priority setting processes, including levels of technical support, presence of communication plans, clearly defined criteria for assessment, use of an appeal mechanism, and others (26). This apparent heterogeneity of approaches (formal, historical, and political) and of the aforementioned process aspects raises the following questions: Given these differences between organizations, which organizations have high performing priority setting processes? Can we assess the processes of organizations and offer recommendations for improvement?
Following on these arguments, creating an evaluation tool* that examines elements of a priority setting process that can enable greater transparency and optimal allocation of resources could be a step towards improving priority setting practices in healthcare organizations.

1.4 Evaluation Frameworks

Evaluation frameworks have been designed for the purpose of improvement in other aspects of health organizations, including safety culture (27), use of information systems (28), and application of clinical guidelines (29), to name a few. Examples of similar evaluation frameworks also exist in other public sectors. For example, in an urban planning study at Rutgers University, Glickman et al. (2010) created a framework to evaluate the capacity of community development corporations (CDCs) to build housing, engage in economic development projects, and provide social services. In order to develop a more comprehensive framework of capacity, the investigators expanded a pre-existing definition (which focused solely on the number of houses built in a community) to include five components: resource, organizational, programmatic, network, and political capacity. These components were operationalized into an evaluation tool, and a discussion of difficulties encountered during evaluation, along with recommendations, was included in the study. One of these recommendations was for evaluation frameworks to consider all contextual factors that affect an organizational activity, not simply its outcomes. The investigators also found that measuring intangible aspects of an organization (e.g., culture) proved to be challenging (30).

*"Tool" has been defined as an instrument to carry out a particular function - in this case, evaluation.

Frameworks for evaluation and improvement have also been developed in private industry. One example of such a tool is the balanced scorecard.
By transforming a company's strategic goals into an integrated collection of objectives and performance indicators, the balanced scorecard provides a framework for strategic management and improvement. Like the framework created to evaluate capacity in CDCs, the balanced scorecard broadened the criteria traditionally used to evaluate organizations to include customers, internal business processes, and growth (rather than simply focusing on financial measurements). In order to implement the scorecard, objectives are developed within each domain, and measures are chosen to create a scorecard that promotes and reflects the company's strategy (31). Methods of identifying these measures can include focus groups, one-on-one discussions, and surveys (32)(33). Within the health care context, the balanced scorecard template developed by Kaplan and Norton (1992) has been adapted to create a scorecard for Canadian hospitals (34)(35). In this research, Baker and Pink suggest that balanced scorecards can bring seemingly disparate elements of a hospital's agenda into a single management report. In doing so, the balanced scorecard for a hospital can serve as a clear means to translate organizational goals by defining and communicating priorities to internal and external stakeholders. Baker and Pink also note that collecting data for the selected measures can prove to be difficult and at times prohibitively expensive for certain hospitals (35).

The aforementioned studies both detail how an evaluation framework can be developed and refined. In this way, their development could serve as an example for the development of a priority setting and resource allocation evaluation tool.
1.5 Evaluation Theory

In order to develop a greater understanding of how priority setting and resource allocation processes could be evaluated in healthcare, analyses of evaluation theory have been carried out to determine the most appropriate evaluation approaches for priority setting and resource allocation. The most in-depth exploration to date of the application of evaluation theory to health care priority setting was conducted by Smith et al. (1). In this examination, the authors analyzed three theoretical approaches to evaluation from Alkin and Christie's "evaluation theory tree" that could be applied to priority setting and resource allocation (36). The original "Evaluation Theory Tree" was constructed as a means to categorize the approaches to evaluation taken by many different investigators. At the "root" of the tree are accountability and social inquiry, and the branches include use-focused evaluation, methods-focused evaluation, and value-focused evaluation.

1.5.1 Methods Approach

The methods approach includes evaluation that is guided by quantitative research methods and is often preferred by quantitative researchers (37). A methods-focused approach emphasizes outcomes and generally includes a summative judgment of worth. For priority setting and resource allocation processes, a methods-focused approach would pose the question "does a particular type of priority setting and resource allocation process make an organization or population better off than another process?" (1). In other words, "is a more formal process better than a continued reliance on historical or political patterns of resource allocation?" (1). The role of the evaluator in this case is to develop a clear set of standards against which a new process's success or failure can be judged (1). In this way, a causal relationship is created between the evaluand (the subject of evaluation) and the outcomes used for evaluation.
However, this task of establishing and measuring direct outcomes of priority setting and resource allocation processes is challenging, since such processes are complex interventions that take place in dynamic organizational contexts influenced by many internal and external factors (1).

1.5.2 Use Focused Approach

Utilization focused evaluation was developed to address the lack of use and uptake of recommendations from evaluations by placing the intended user's perspective at the center of the evaluation (38). According to this evaluative approach, evaluators must orient themselves to the "intended user" of the evaluation by considering how their findings will be useful in affecting the evaluand. From a priority setting and resource allocation standpoint, evaluators should consider how their work will be useful to decision makers in adjusting and adapting their organization's process. In this way, acceptability, usability, and fit of the priority setting and resource allocation process for the decision makers would all be key considerations under a use-focused approach (1). Engaging those decision makers who are directly involved in the priority setting and resource allocation process as sources of data (through interviews, surveys, or focus groups) would also be central to a use-focused evaluation process. Qualitative methods would be appropriate for such organizational research, which may lead to recommendations for managers (24)(39).

1.5.3 Values Focused Approach

This approach is strongly based upon social constructivist ontologies and suggests that organizational processes should not be evaluated in isolation; rather, the broader social impacts of the program in question should always be included (40). Accordingly, the evaluator must consider all relevant perspectives in the discussion (41), and not simply those of the immediate decision makers.
Some proponents of this approach argue that the evaluator must go further and act as an advocate for disadvantaged groups or interests (1). As a result, many participatory and community-based action research models use this approach. Dialogue among stakeholders and greater ongoing engagement are essential to a values-focused approach (1). Ultimately, local stakeholders' input plays a major role in determining the course and content of the evaluation. Questions that would be typical of a values-focused approach when applied to priority setting and resource allocation include: How are resource re-allocations received by provider and patient/client groups? Who wins and who loses in each instance? How is the relative social or economic standing of health professional groups affected? (1).

1.6 Evaluation of Priority Setting and Resource Allocation in Practice

Despite its importance in healthcare, only a limited number of studies have addressed the evaluation of priority setting and resource allocation processes (39)(42)(43). This dearth of research leaves healthcare decision makers in Canada without clear approaches to evaluating their priority setting and resource allocation processes, and may contribute to how rarely such evaluation is conducted. In fact, a survey of senior decision makers revealed that only 20% reported that their organization conducts some form of priority setting and resource allocation evaluation. The absence of evaluation may do more than impede improvement: respondents who indicated that their organization did not perform such evaluation also reported a lack of planning, participation, and fairness in their process (26). The most recent and comprehensive attempt to address this gap has been carried out by Sibbald et al. (2010).
In their research, a Delphi approach, involving scholars and decision makers from five different countries who were chosen for their experience or interest in priority setting, was used to reach consensus on five process criteria and five outcome criteria defining "successful priority setting" that could be used to evaluate a healthcare organization's priority setting and resource allocation process (11). This conceptual framework, which included the 10 criteria of stakeholder engagement, explicit process, transparent information management, consideration of values and context, appeals mechanism, stakeholder understanding, shifted resources, decision making quality, stakeholder acceptance, and positive externalities, was then piloted in a community hospital in Ontario (39). A table of these criteria can be found in Appendix B. The pilot was successful in eliciting strengths and weaknesses of the organization's process, including involvement of a senior manager, limited consultation with external stakeholders, and lack of a formal revision process. In a post-evaluation debrief, participants believed that the report developed as a result of the evaluation captured the essence of their process, and that recommendations would be more useful if they included priority setting and resource allocation experience from other hospitals (39). The investigators described this research as an "initial attempt to evaluate priority setting decisions in a specific context." They suggested that "future research is required to determine the best combination of components" as well as the best method of implementation (39). In particular, they encouraged the cultivation of lessons and problems faced by healthcare organizations across Canada that could be shared as a "set of industry best practices". While this would require "constant updating", "much work [has] to be done to refine the evaluation process" and "ultimately improve the quality of priority setting in specific contexts" (39).
In order to further this research and address these aims, a group of investigators from Canada and the United Kingdom performed case studies of six Canadian healthcare organizations with high performing priority setting and resource allocation processes and developed a framework of key elements based on their findings (44). Based on the notion of high performance put forth in Baker et al.'s 2008 book on high performing healthcare systems, these case studies aimed to assess environment, mid-course corrections, and strategic leadership (45). In this way, successes and challenges were examined in a range of organizational settings in order to provide a more refined definition of high performing priority setting and resource allocation processes in healthcare. Organizations were selected using reputational paradigmatic sampling (46) and were vetted by an expert panel of Canadian healthcare CEOs and researchers. The selection of organizations was balanced to include variability of characteristics including geography, size (large vs. small budgets), and density (urban or rural) (44). Semi-structured interviews were used during data collection, and inductive analysis was conducted using open, axial, and selective coding (44). Sixty-two individual and group interviews were carried out with senior executive members as well as middle managers and board members. Document reviews supplemented the interviews. The overall goal was to develop a conceptual framework defining what constituted high performance in priority setting practice (44). The study resulted in such a framework, comprising four domains and 19 indicators. For the full framework including all of the elements, please refer to Appendix A.

In summary, economic uncertainty, reductions in funding, growing demands, and increasing costs have all placed pressure on Canadian decision makers to set priorities when allocating resources.
The appetite for a shift from historically and politically dominated processes to more formal approaches to priority setting and resource allocation is clear, and frameworks including A4R and PBMA have been proposed to aid decision makers. Despite the advent of these frameworks, and successful implementation by certain organizations, a Canadian survey of senior healthcare decision makers found that 50% of participants reported conducting organizational priority setting without a formal approach. The same survey revealed that even fewer respondents (20%) conduct some form of evaluation to improve their processes.

To date, there have been initiatives to develop and refine evaluation tools to measure aspects of organizations in both the public sector and private industry. Despite the importance of priority setting and resource allocation in healthcare, only a few studies have attempted to address its evaluation. The most recent attempt was carried out by Sibbald et al. To further their research, a group of investigators from Canada and the United Kingdom developed a framework for high performance in priority setting and resource allocation. From this framework, an evaluation tool was developed to be used to improve priority setting practice in health care organizations. The following chapter discusses the objectives of the current study in relation to the development and application of this evaluation tool.

Chapter 2: Study Objectives and Research Questions

In order to further the existing research on evaluation of priority setting and resource allocation, this thesis operationalized a comprehensive framework for high performance (see Appendix A) into a tool that can be used for evaluation and improvement of priority setting and resource allocation practices. To realize this goal, the high performance framework created by Smith et al. (44) was converted into a tool, and its functionality was tested through application in a healthcare organization.
To facilitate future implementation, lessons from the test application were used to refine the tool. Building on these objectives, the current research project had three inter-related research questions:

1. Can an evaluation tool for achieving high performance in priority setting and resource allocation be developed from a conceptual framework describing elements of high performance?

2. Can this evaluation tool capture the strengths and weaknesses of a healthcare organization's priority setting and resource allocation process?

3. What refinements are needed to improve the tool for future application?

In this way, the strengths and weaknesses that emerge from the evaluation of the test organization's priority setting and resource allocation process (to help the organization improve its performance) will serve as data for a meta-evaluation, i.e., an evaluation of the evaluation tool itself. The following methods chapter describes the protocol followed to answer each of the research questions. The results chapter then describes key findings related to each question, while the discussion chapter provides key interpretations and insights. This work is of critical importance, since healthcare organizations across Canada are facing significant pressure to set priorities and make difficult funding choices in times of fiscal constraint. Having a tool to improve this practice is thus crucial to encouraging transparency and maximizing the limited resources available.

Chapter 3: Methods

3.1 Research Design

In order to address the research objectives mentioned above, the research team implemented a design incorporating methodological traditions and precedents from the literature that were most appropriate for the evaluation of the test organization (using the evaluation tool), and for the meta-evaluation of the tool itself.
To develop the evaluation tool from the high performance framework, a protocol similar to the steps an organization would follow to create a balanced scorecard was used. While methods-focused and values-focused approaches to evaluation were incorporated in the content of the tool, use-focused evaluation formed the foundation of the theoretical framework used to develop the process for evaluation tool implementation. In this way, both the literature on balanced scorecards and use-focused evaluation dictated the methods used within the evaluation tool (semi-structured interviews) to collect data from members of an organization regarding the performance of their priority setting and resource allocation process. A core three-member team led the development of the tool, while the wider research team provided feedback to ensure face and content validity.

Implementation of the tool followed a "case study" methodological tradition, whereby "an exploration of a bounded case through detailed data collection" (47) was carried out. Data from the qualitative interviews were analyzed using template analysis and content analysis to determine the strengths and weaknesses of the organization. These data were also used as evidence in the meta-evaluation of the tool itself. Debriefs with members of the organization after each interview would also provide an opportunity to examine the face validity of the tool and any other logistical issues or areas for refinement.

In each of the following sections of this chapter, the steps of this design are presented in greater detail under the three "research objective" headers: development of the tool, implementation of the tool, and refinement of the tool. The results of each section are presented in the following chapter under the same headings.
3.2 Development of an Evaluation Tool

3.2.1 The Balanced Scorecard Precedent

As described in Chapter 1, the balanced scorecard provides a framework for strategic management by transforming a company's goals into a collection of objectives and performance indicators (32). Objectives are grouped into four perspectives, and performance indicators are developed within each objective. Instruments to measure each objective are then created to populate the scorecard and create a single management report including many seemingly disparate elements of an organization's agenda (35). Given the similar goals of the scorecard (to measure organizational performance) and our research (to create a tool that measures the performance of an organization's priority setting and resource allocation process), the structural similarities of the scorecard (with four perspectives) and the high performance framework (with four domains), and the ability of the scorecard to visually translate the priorities of an organization to stakeholders, the balanced scorecard development protocol was used as a guide to operationalize the evaluation tool from the high performance framework. In this case, the four perspectives of the balanced scorecard were replaced with the four domains of the high performance framework, and the objectives within the perspectives were replaced with the elements of the domains. See Appendix A for the full high performance framework, including domains and elements.

Following the balanced scorecard development protocol, indicators and possible measures for each element of the high performance framework were identified. For example, use of criteria to evaluate investment and disinvestment proposals would be an indicator of an explicit priority setting and resource allocation process (Process domain, Cell 1).
Or, a respondent indicating that they had learned more about another department within their organization as a result of their organization's priority setting and resource allocation process would be an indicator of greater understanding among participants (Outcome domain, Cell 3). Once indicators were established for each of the elements in the framework, a measurement instrument (the evaluation tool) that could be implemented to collect data related to these indicators was developed. Since the development of the evaluation tool was also informed by evaluation theory, a description of how each evaluation theory approach (i.e., methods focused, values focused, and use focused) was incorporated into the tool is included in the following sub-section. A discussion of methods selection for tool implementation will follow.

3.2.2 Development Theoretical Framework

From an evaluation theory standpoint, the breadth of elements included in the high performance framework enabled the "methods focused" and "values focused" evaluation theory approaches to be incorporated within the evaluation tool, since each indicator used to create the evaluation tool was derived from the elements of the high performance framework. With its emphasis on causality and outcomes, the "methods focused" approach was adopted through evaluation of elements within the outcome domain, including resource re-allocation and the ability of an organization's priority setting and resource allocation process to improve population health. In addition, a "values focused" approach was incorporated by including indicators related to the involvement of external and internal stakeholders in the priority setting and resource allocation process, and the extent to which the process considered values from the community it serves. While these theories were incorporated into the tool by virtue of the content within the high performance framework, the investigators determined that a "use focused" approach
should be the underlying framework to guide the implementation of the evaluation tool, since the primary intent was to identify the strengths and weaknesses of an organization's process and deliver recommendations for improvement. In this way, a "use focused" approach, which places the intended user's perspective at the center of the evaluation in order to ensure that action is taken as a result of the evaluation report, was deemed most appropriate. Table 3.1 provides a short description of the above mentioned evaluation theories, and describes how each one was incorporated into the evaluation tool.

Table 3.1 Main Evaluation Theories

Methods-Focused Approach
- Guiding evaluation question: How to identify and measure the (causal) impact of PSRA processes on desired outcomes
- Evaluation foci: Intended objectives and outcomes; summative; evaluation of merit or worth
- Primary perspective from which to calculate benefits of PSRA: Evaluator
- Method of implementation in evaluation tool: Evaluation of PSRA outcome measures, including resource re-allocation

Values-Focused Approach
- Guiding evaluation question: How have PSRA processes affected the relative standing of different groups or social values?
- Evaluation foci: Process; unintended outcomes; equity
- Primary perspective from which to calculate benefits of PSRA: Public/Society
- Method of implementation in evaluation tool: Focused on involvement of internal and external stakeholders of PSRA; interviewed lower level managers

Use-Focused Approach
- Guiding evaluation question: What do decision makers in a particular context need to know in order to improve the usefulness to them of PSRA?
- Evaluation foci: Process; intended use; organizational learning and capacity building
- Primary perspective from which to calculate benefits of PSRA: Decision Maker
- Method of implementation in evaluation tool: Decision makers' perspectives were used to calculate the benefits of PSRA

Source: Adapted from "Using evaluation theory in priority setting and resource allocation" (1).

3.2.3 Methods for Development of Tool

Use focused evaluation theory is now most commonly associated with Michael Q. Patton's research to address the lack of use and uptake of evaluation research by knowledge users (1)(38). To address this issue, Patton proposes that the goal of the use focused approach should be to focus on "the intended use [of an evaluation] by intended users" (38). He argues that any method of evaluation could be used, but advises that intended users are more likely to use evaluations if they understand, feel ownership, and have been involved in the evaluation (38). When developing measurement instruments for indicators in balanced scorecard development and implementation, investigators recommend tailoring the type of measurement instrument to match the indicator data one is attempting to examine (35). Based on this theoretical approach and the precedent set in the literature, the core research group in this thesis determined that qualitative methods would be the most appropriate way to implement the evaluation tool. Given the controversial nature of the topics that would be discussed, individual interviews were favoured over focus groups.

The determination to use qualitative interviews was made based on several factors. First, the intended users (senior management in the test organization) would participate directly in the evaluation. Second, the data (quotations from interview transcripts) could be easily understood by the users. Third, the same methods had been used successfully in similar research contexts (24)(39). Research on balanced scorecard development acknowledges that while some methods of data collection or instruments may be ideal for collecting data related to certain indicators, at times these methods may demand too many resources to be feasibly implemented (35). Investigators in this study acknowledge that while some methods of investigation would be preferred for data collection related to indicators for certain elements (e.g.,
a full cultural assessment of the test organization to determine levels of trust among members, or a quantitative examination of each resource re-allocation that took place during the past budget cycle), these options were not feasible given the time and resources available for this particular project. Indeed, they are discussed as possibilities for future research in Chapter 5 of this thesis.

Once the methodological tradition, theoretical framework, and implementation method of the evaluation tool had been established, the research team went about operationalizing each element and their respective indicators into questions that could be posed to members of the test organization in a semi-structured interview. This process was conducted by a core group of three researchers, including the principal investigator and research coordinator for the study, and was led by the author of this thesis. Using a combination of direct (both positive and negative), indirect (both positive and negative), funnel, open, descriptive, comparison, analysis, and probing questions, the core team developed an extensive list of possible questions for each element and indicator (48)(49)(50). Table 3.2 provides a description of the question formats that were used, and examples from the evaluation tool that correspond to each format.

Table 3.2 Evaluation Tool Questions

Direct - questions that have prescribed responses (e.g., yes or no).
Example: Does your organization have a decision review process for revisiting and revising decisions made during the PSRA process?

Indirect/Open - generally begin with what, why, or how; do not have a prescribed response.
Example: How does your organization engage the public specifically in the PSRA process?

Funnel - a technique that begins with a general question and focuses in on a point within each answer, asking for more and more detail at each level.
Example: Are explicit criteria used to evaluate proposals? Could you give some examples of criteria? On a scale of 1-5, how would you rank the ability of the criteria to capture the benefits and costs of proposals?

Descriptive - generally begin with "describe", "tell", or "explain"; encourage respondents to share their experience or narrative.
Example: Describe the relationships between members of the senior management team in your organization.

Comparison - require respondents to evaluate and compare differing concepts.
Example: How does your current priority setting and resource allocation process compare to what was in place previously in your organization?

Analysis - require respondents to think critically about a concept.
Example: Would you characterize your organization as having a culture of improvement? How does that manifest itself in your priority setting?

Probing - questions used to elicit additional information on a particular topic.
Example: If you have a concern (about a decision or an aspect of the priority setting process), how comfortable do you feel voicing it to your SMT? Are all members of the SMT given the opportunity to contribute to discussions? Are members of the SMT direct and honest with each other?

Similarities between elements allowed certain themes to be addressed in different questions. By virtue of these overlaps, and through the use of positively and negatively worded questions, the tool was designed to determine whether respondents were describing organizational elements consistently. For example, the questions in "Structures Element 1: Ability and Authority to Move Financial Resources" and "Outcomes Element 1: Re-allocation of Resources" both touch on the re-allocation of resources at different points of the interview. Significantly different responses to these questions would indicate that a participant was not answering questions in a consistent manner, or perhaps did not understand the questions themselves.
Initially, the questions in the evaluation tool were developed with the intent of interviewing senior managers exclusively. In order to remain faithful to the "values" theoretical evaluation approach, and to triangulate responses from senior management, lower level managers were also interviewed. A separate questionnaire was adapted from the set of senior manager questions to capture the lower level management perspectives. This set of questions omitted certain elements that were outside the managers' scope of practice, and focused on their perspective of the organization's priority setting and resource allocation process.

After many iterations during which both versions were edited to include the questions most relevant to their corresponding elements and indicators (as determined by the core research group), both versions were sent to members of the wider research and advisory teams for feedback. The broader research team of the larger CIHR-funded study on high performance in priority setting and resource allocation comprised five health service researchers and health economists, each with considerable experience in healthcare decision-making and implementation of priority setting frameworks. A further advisory team was made up of six senior management personnel (including directors and CEOs) from health authorities across Canada. Feedback was requested regarding the clarity of the questions, as well as how well they were thought to capture the corresponding elements from the high performance framework. Revisions to language, simplification of questions, and combination of similar concepts were carried out based on the comments received.

3.3 Implementation of the Tool

3.3.1 Methodological Tradition for Implementation

A case study is a methodological approach used for exploration of a bounded system, or case, through detailed data collection. The system is bounded by time and place, and is the case being studied, e.g.
a program, event, activity, or group (47). In this case, the bounded system is the priority setting and resource allocation process of the test organization. Through data collection tailored to each particular case, a detailed description emerges as the researcher analyzes themes or issues and makes assertions about the case (51). In this way, the case study tradition lends itself well to the evaluation discipline (47). One of the main challenges inherent in this case study methodology is building trust and rapport with study participants. To address this issue, purposive sampling - sampling based on a particular characteristic (52) - was used to select a test organization that was familiar with the researchers in this study and eager to participate in an evaluation. The following section describes the test organization's priority setting and resource allocation process prior to evaluation.

3.3.2 Description of Test Organization's Process

Certain characteristics of the test organization have been deliberately withheld to protect its identity. This was done to maintain the anonymity of participants and encourage trust in participation. Between 2007 and 2010, the organization under study implemented program budgeting and marginal analysis (PBMA). As described in the introduction, PBMA is a framework for priority setting and resource allocation. It focuses on developing a set of weighted criteria based on an organization's strategic goals, and applying those criteria to investment and disinvestment proposals. In our test organization, key ethical conditions described in Accountability for Reasonableness (engagement and fairness) were also purposefully incorporated, including an attempt to focus on communication (i.e., publicity), having a clear set of criteria (i.e., relevance), and basing proposals on the best available evidence (i.e., relevance).
Implementation began in select program areas, with full organization-wide adoption taking place during the 2011/2012 fiscal year. Prior to implementation, strategic plan consultations were held to facilitate input from external stakeholders. The final version of the strategic plan was drawn upon to create criteria for assessing investment and disinvestment proposals, and forms the foundation of the organization-wide process.

During creation of investment and disinvestment proposals, managers consulted frontline staff. Initially, "short form" business case templates were prepared that included only a summary of the proposal, i.e., a general description and some (limited) supporting evidence. These abbreviated investment and disinvestment proposals were then sent to the Validation Working Group (VWG) - a group of middle managers who received additional training and performed preliminary assessment of short form proposals using a criteria-based assessment tool. The assessment tool contained weighted criteria based on the organization's values, and allowed each proposal to be given a numerical score that enabled comparison across proposals.

Proposals that received high scores on the assessment tool were advanced to the next stage, in which a "long form" business case template was completed. Long form proposals generally required additional evidence and support for evaluation at the Senior Management Team (SMT) level. Using the same criteria and assessment tool as the VWG, the SMT would then score proposals and make the ultimate decision as to whether they would be accepted. Managers who created the proposals were notified, and implementation could then begin.

Table 3.3 highlights the differences between a theoretical PBMA process, as defined in the literature, and the process followed by our test case organization.
The reader should note that the author does not intend to imply that PBMA is the ideal model of priority setting, or that this table is part of an evaluation. Rather, the table below is designed to illustrate the process that the test organization was using at the time of evaluation, and how it compared to a theoretical PBMA process.

Table 3.3 Comparison of PBMA in Theory to PBMA in Test Organization

Theoretical PBMA (2), with the test organization's corresponding process:

1. Determine the aim and scope of the priority setting exercise
   - Strategic plan developed using values elicited from internal and external stakeholder consultation.
   - Scope of exercise was determined and increased over time to reflect aims of organization.
2. Compile a program budget (i.e., map of current activity and expenditure)
   - No program budgeting performed.
3. Form marginal analysis advisory panel
   - Validation Working Group (VWG) created to evaluate condensed "short form" proposals using the Assessment Tool; "long forms" of successful proposals requested by VWG and sent to senior executive.
4. Determine locally relevant decision making criteria (decision maker, Board of Directors, and public input)
   - Decision making criteria were created using the strategic plan as a foundation.
   - Criteria were weighted and operationalized into an Assessment Tool.
5. Advisory panel to identify options in terms of areas for service growth and resource release
   - Middle and frontline managers responsible for submitting investment and disinvestment proposals; VWG then reviewed short forms before sending long forms to SMT.
6. Advisory panel to make recommendations in terms of funding growth areas and re-allocation of resources
7. Validity checks with additional stakeholders and final decisions to inform budget planning process
   - Senior executive team used the Assessment Tool to evaluate "Long Form"
proposals and make resource allocation decisions.

3.3.3 Sampling

To obtain a sample that reflected the diversity of the organization, participants were selected across geographical location, department, and level of success in submitting past proposals for investment and disinvestment (i.e., whether their proposal was approved or not approved). To accomplish this distribution of participants, a purposive-criterion sampling matrix similar to Table 3.4 was used. Every participant was placed into the matrix, and equal distribution was sought across each quadrant. The reader should note that the table is meant to be illustrative, and does not contain information from the test site.

Twenty-nine members of the organization were invited to participate, and 27 were interviewed using the evaluation tool. Two members were unable to participate due to scheduling conflicts. Participants included 3 clinical leaders, 8 managers, 12 directors, and 4 executive team members. There was representation from all three geographical regions of the organization. Clinical, non-clinical, capital, and information technology (IT) departments were also represented. Participants included members who had submitted both investment and disinvestment proposals (both accepted and rejected) in the last priority setting and resource allocation cycle, as well as individuals who had not submitted proposals at all.

Table 3.4 Sampling Matrix
Approved
  Submitted Investment Proposal: John S, Director, Location A; Alice C, Nurse Manager, Location C; Bill S, VP of Finance, Location E
  Submitted Disinvestment Proposal: James D, CIO, Location G; Jane E, IT Manager, Location B; Steph B, CEO, Location E
Not Approved
  Submitted Investment Proposal: Jason R, Clinician Manager, Location F; Jill V, Director, Location I; Kent F, VP Planning, Location E
  Submitted Disinvestment Proposal: Dave D, VP Operations, Location E; Doug L, Director, Location A; Mark J, Service Admin, Location I
No Submission
  Kendra C, Chief Medical Health, Location B; Michael K, Manager Home Health, Location G; Ron D, Surgical Lead, Location F

The interviews themselves were carried out with participants in person (n=12), through videoconference (n=13), and over the phone (n=2). The author of this thesis conducted the interviews; however, several interviews were also attended by the research coordinator to ensure rigour of data collection (i.e., that biases were not being imposed on the data and that misleading questions were not being asked).

3.3.4 Interviews

The semi-structured interviews lasted approximately 60 minutes, with 45 minutes for the evaluation and 15 minutes for participant evaluation of the tool itself (described below). Interviews were audio-taped with permission, and notes taken during the interviews served as the basis for analysis.

To ensure rigour of the qualitative analysis, a structured protocol was followed. First, a template method of analysis (which begins with a framework of existing organizing codes, or template) was used (53). In this case, the high performance framework from Smith et al. served as the "template" (44). Following this approach, responses were sorted by the researcher into the elements of high performance from the framework (e.g., training, stability, culture of improvement, etc.).
Once categorized, data were examined to determine whether the element was a "strength", "area for improvement", or "weakness" of the organization.

To make this determination, the investigator applied content analysis, "a technique for making inferences by objectively and systematically identifying specified characteristics of messages" (54). In this case, the analysis was applied to interview transcripts by examining the language in each quote and the examples provided by participants. Positive wording (e.g., "excellent", "very good", "done very well") and negative wording (e.g., "not good", "poor", "totally lost on this") were identified in data related to particular elements. In this way, quotes from participants related to a particular element were coded as "positive" or "negative". To judge whether the element was a strength or weakness, the investigator determined whether there was a preponderance of positive or negative quotes related to that specific element. Examples provided in quotes also served as downstream indicators for certain elements (e.g., knowledge of criteria by middle managers would be a downstream indicator of a successful training program; conversely, lack of criteria knowledge among middle managers would be a downstream indicator of an unsuccessful training program). Finally, descriptions of elements from the high performance framework were also used as reference points for determining the strength of elements.

Following this protocol, elements that were described by the majority of participants in a "positive" way, agreed with descriptions in the high performance framework, and had supporting examples were categorized as strengths. Negatively worded quotes related to a particular element that conflicted with the description in the high performance framework and were supported by negative examples were categorized as weaknesses.
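The preponderance step of this protocol can be sketched as follows. The counting rule is a deliberate simplification and an assumption of this sketch: in the study itself, the determination also weighed supporting examples and the framework descriptions, not quote counts alone.

```python
def classify_element(coded_quotes):
    """Sketch of the preponderance rule: classify an element from quotes coded
    as 'positive', 'negative', or 'soft_negative' (softer negative language).
    Simplified assumption: the actual determination also weighed examples and
    the high performance framework descriptions, not counts alone."""
    positive = coded_quotes.count("positive")
    negative = coded_quotes.count("negative")
    soft_negative = coded_quotes.count("soft_negative")
    if positive > negative + soft_negative:
        return "strength"
    if negative > positive and soft_negative == 0:
        return "weakness"
    return "area for improvement"

print(classify_element(["positive", "positive", "negative"]))       # strength
print(classify_element(["negative", "negative", "positive"]))       # weakness
print(classify_element(["positive", "negative", "soft_negative"]))  # area for improvement
```

Mixed or softly negative evidence falls through to "area for improvement", mirroring the middle category of the protocol.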
Elements were identified as areas for improvement when their sub-elements formed a mix of strengths and weaknesses, or when the negative language in the quotes was softer (e.g., "needs some improvement", "something we need to work on", "not where we want to be on this"). By following this protocol, the investigator used the language and examples in quotes related to particular elements, as well as the descriptions in the high performance framework, to determine whether an element was a strength, area for improvement, or weakness. Table 3.5 below gives two examples of how these determinations were made, for both a strength and a weakness of the test organization's process.

Table 3.5 Evidence Used to Make Strength and Weakness Determinations

Strength - Strong and Stable Leadership Team
- Description from high performance framework: There is relative stability of organizational structure and continuity of personnel. SMT is aware of and manages the external environment and other constraining factors, and is willing to take and stand behind tough decisions.
- Direct responses to evaluation questions:
  "Again, overall, I think that they [the SMT] are excellent. One of the main people of course is the CEO. Very bright lady and this is one of her best areas in fact." (B-25, Middle Manager)
  "Incredible role models, great people. People who care. I love the people that I work with, I love working in [this area] and I love working with [our health authority], really. We have the best CEO, not that I'm competitive or anything, but we have a fabulous CEO. We're lucky. We have great COOs and we have really, really great people." (E-19, Middle Manager)
  "So, there's actually a very good core of stability and when we have turnover, we do a very good job at bringing people into the culture in the orientation." (I-6, Senior Manager)
- Examples: Stood by difficult disinvestment decisions despite public outcry (I-6, Senior Manager); challenged mandate from government using process (C-2, Middle Manager)

Weakness - Training
- Description from high performance framework: Skill development for PSRA occurs throughout the organization including managers, directors, senior executive and clinical leaders.
- Direct responses to evaluation questions:
  "I would think that there would need to be more time spent on educating staff about the decisions and how the decisions came to be made." (I-16, Middle Manager)
  "I'm not always sure still when I say PBMA to my management group on the ground here that they all know exactly what I'm talking about." (F-22, Middle Manager)
  "So, I think as you talk to other HSAs they probably will say, 'We didn't understand enough about PBMA.' We're learning it as we go but it's not something that we sat down for a day's training on what to do." (D-21, Middle Manager)
- Examples: Middle manager who had very little knowledge of process (A-18, Middle Manager); descriptions of very informal training (D-21, Middle Manager)

To ensure accuracy of analysis and include a broader perspective, several strength and weakness determinations were carried out independently by the research coordinator and the author, and subsequently compared to test for agreement. All determinations were discussed among the core research team, and the resulting strengths and weaknesses for all elements of the organization's priority setting and resource allocation process are presented in the results section.

3.4 Refinement of the Tool

Refinement of the tool was informed in three ways. The first was investigator field notes collected during interviews with participants. The investigator noted the ability of participants to comprehend questions, and potential logistical issues with the timing or running of interviews were also documented. These notes were used to guide refinements of the tool for future application. In addition, 15 minute debriefs were carried out at the end of each interview. This provided an opportunity to question each participant directly about the content and delivery of the evaluation tool.
The questions posed in this debrief are found in Table 3.6.

Table 3.6 Follow-up Questionnaire: questions posed during evaluation of the tool with participants from the test healthcare organization

1. Were there elements of high performance in PSRA that you felt were not included in the interview? Were there elements addressed that you felt should not be included? What would you add or remove?
2. Do you think that the wording of the questions will elicit honest and complete responses from participants? Were they too vague or not specific enough?
3. Can you think of a medium that would be more suitable for the diagnostic tool to be implemented?
4. Was the length of the interview manageable?
5. Did you feel comfortable sharing the weaknesses of your organization in certain aspects of priority setting? Would you feel as comfortable if it was someone internal to your organization doing the interview?
6. How can we ensure validity of the tool? Or make sure that the questions we are asking are being answered truthfully?
7. How would you like to see recommendations from this evaluation presented? What would make the most impact for you?

Finally, observations based on the analysis of interview data were used to inform refinements. A thematic approach allowed new elements of high performance to be highlighted and incorporated into the existing tool.

3.5 Summary

Investigators used a previously defined high performance framework (Appendix A) to develop a tool that could be used to evaluate the strengths and weaknesses of a healthcare organization's priority setting and resource allocation process. The theoretical framework for developing the evaluation tool included methods, values, and use-focused approaches to evaluation. Seven different question formats were used, and two different versions of the tool were created for senior and lower level managers.
A sampling matrix was used to select participants from a test organization for 60 minute interviews that were audio-taped and transcribed. Template analysis was used to categorize data from the interviews, and a protocol for determining strengths and weaknesses was implemented. At the end of each interview, a 15 minute evaluation of the tool itself by participants served as the primary basis for refinements for future application.

In the next chapter, the results will be presented. First, the evaluation tool itself will be described; following this, results from the interviews (including strengths and weaknesses) will be presented. Finally, refinements to the tool will be outlined to enable effective application in the future.

Chapter 4: Results

4.1 Development of the Tool

As described in the methods section, all three evaluation theories were incorporated into the tool: a methods approach was implemented by eliciting responses pertaining to the "re-allocation of resources" and "health outcomes" related to the organization's priority setting process; the values approach was incorporated by investigating elements that affected downstream stakeholders, including engagement, training, and communication; and the use-focused approach formed the theoretical framework for implementation of the tool itself. Using this theoretical framework, and drawing on balanced scorecard development as a related example, the core research team operationalized the elements and indicators from the high performance framework for priority setting and resource allocation into a set of semi-structured interview questions that could be used to collect qualitative data from members of the test organization. The research team and decision maker advisory group from the larger CIHR study, of which this thesis work was a part, delivered feedback on the face and content validity of the tool.
As a result, all 19 of the elements from the high performance framework were operationalized and edited according to this protocol. Some sub-elements were also converted into questions. The sequence of domains in the tool was altered from the high performance framework so that the process domain was addressed first, followed by structures, attitudes/behaviours, and outcomes. This alteration was made because the questions in the process domain were thought to be less controversial relative to the other domains, and it was felt that this would be a good way to ease participants into the evaluation.

Elements generally contained two to three questions, and the entire interview was designed to last approximately 45 minutes, with 15 minutes left over for evaluation of the tool itself. A copy of the evaluation tool in both the senior and middle manager forms can be found in Appendices C and D. The results of the evaluation tool implementation are detailed in the following sections.

4.2 Implementation of the Tool

Following a "case study" methodological tradition, semi-structured interviews were carried out with the 27 members of the test organization as described in the methods section. Interviews were recorded and transcribed. These data were then analyzed using template and content analysis, and the results were presented to senior management in a report. The report included a description of the organization's current process; a colour-coded dashboard of the framework for high performance representing the strengths, weaknesses, and areas for improvement in the organization; in-depth descriptions of strengths and weaknesses; and recommendations for improvement.

Table 4.1 is the "dashboard" of strengths and weaknesses presented to senior management. The purpose of the dashboard is to give decision makers a comprehensive view of the strengths and weaknesses of their priority setting and resource allocation process on a single page.
This is presented using the high performance framework referred to above, with each element colour-coded as follows: strength = green, area for improvement = yellow, weakness = red.

The number of green elements (strengths) reflects the relative maturity of this organization's formal priority setting and resource allocation process. Notable areas of strength include the culture and structures that the organization has put in place. However, the yellow and red elements clearly indicate that the organization has areas for improvement that need to be addressed to enhance its current process.

Table 4.1 Pilot Organization Dashboard

In-depth descriptions of the organization's strengths and weaknesses are included below. Each strength and weakness is tagged to an element on the dashboard. For example, "Communication to frontline staff is lacking - P2" is an area of weakness that needs to be addressed (as indicated by the red colouring), and refers directly to P2 ("process cell 2") in the dashboard. While the majority of the elements from the framework are addressed in the results section, there are exceptions; these omissions will be addressed in the discussion chapter. Having presented the overall summary of the evaluation in the test organization, the thesis will now turn to addressing the key strengths and weaknesses in some detail.

4.2.1 Strengths

In this section, strengths of the test organization's priority setting and resource allocation process are presented. Elements that were described by participants in interviews using positive language and that closely matched descriptions in the high performance framework were characterized as strengths. Table 4.2 provides a summary and short description of all the strengths, including coloured tags that correspond to the elements from the dashboard in Table 4.1. The order of strengths presented in Table 4.2 is intended to follow a chronological flow, i.e.,
to develop a strategic plan, an organization must consult the public; public opinion is then used to craft a strategic plan, which is in turn used to develop the criteria for assessment of proposals and an assessment tool; frontline staff are then involved in proposal creation, which must be supported by evidence; in order for the process to perform at a high level, it must be led by a strong and stable team in a culture of improvement; and if these strengths are in place, re-allocation of resources from areas of low priority to areas of high priority should be able to take place.

Table 4.2 Strengths of the Test Organization's Priority Setting and Resource Allocation Process

Elements of high performance, with their description in the test organization:

1. Public Engagement - A4, A3: Used town hall meetings to elicit public values and priorities that informed the creation of the strategic plan.
2. Criteria and Assessment Tool - P1, A3, O4: Criteria for evaluating proposals are linked to the strategic plan and applied consistently in a weighted Assessment Tool.
3. Frontline Staff Involvement - S2: Frontline staff are engaged at the proposal development stages.
4. Evidence Based - P1: Data collection is supported, and multiple forms of evidence are used to inform proposals.
5. Leadership Team - S4, A1, O2: A strong and stable senior management team supports the process.
6. Culture of Improvement - A2: A desire to improve and enhance existing mechanisms is pervasive in the organization.
7. Ability and Authority to Move Resources - S1, O1: Managers have the ability and authority to re-allocate resources.

In the following sub-sections, each of the strengths from Table 4.2 is described in greater detail. Each sub-section begins with a brief description of why a particular strength is important to priority setting and resource allocation, drawing on existing literature, and then describes how the strength was operationalized in the test organization, with quotes from evaluation interviews. Quotes are categorized by the geographical location of the participant in the test organization (represented by a letter), their role in the organization (middle or senior manager), and a number for identification purposes.

While the practice of presenting material from existing literature in the results section of a paper is admittedly unorthodox, this has been done to provide context for the reader in each sub-section by illustrating the importance of each strength. The reader should also note that some of the strengths of this test organization's process could be refined even further; that is, simply because a particular element is a "strength" does not mean it is "perfect" and cannot be tweaked to deliver additional performance. Indeed, the coloured tags in Table 4.2 clearly illustrate that some strengths contain elements from the dashboard in Table 4.1 that are also "areas for improvement". To reflect this, the presentation of some strengths in the following sub-sections will include findings from evaluation interviews that identify areas for improvement.

4.2.1.1 Public Engagement

As taxpayers and patients, the public represent the payers and users of the health system. Both ideological and pragmatic arguments can be made for their engagement in priority setting processes, including social justice and democracy (3)(55)(56)(57), the unique "lived" experience they bring (58)(59)(60), and the fact that early involvement can generate buy-in for later decisions (58)(60). As such, public engagement is key for an organization that desires long-term strategic alignment and fit with social and community values. Both of these elements are located in the attitudes and behaviours domain of the evaluation tool in Table 4.1 and the high performance dashboards in Appendix A.

In the test organization, public engagement took place during the development of the organization's strategic direction, in the form of town hall meetings.
"So, we engage the public in setting our strategic direction. We engage the public in talking about issues that we know are important to them and trying to understand where they're coming from" (I-4, Senior Manager).

Questions posed to community members during these consultations included "What are your priorities? What do you think your community needs? What things is the organization doing well?" (C-2, Middle Manager). Their responses informed the creation of the organization's strategic plan, which in turn formed the foundation of the entire priority setting and resource allocation process.

"So, [criteria are] based on the mission and vision for us. It's directly connected to the strategic plan, the focus on our people." (C-2, Middle Manager)

To ensure continuing alignment of values, the organization has also held ongoing meetings with certain sub-groups of the population it serves. One example noted by a manager included meetings with aboriginal health improvement councils. These meetings reportedly promoted "meshing" of values, and have increased the level of "ownership" among members of the councils (B-3, Middle Manager).

In addition to these consultations, input from the public has also been solicited to provide data to support certain proposals within the priority setting and resource allocation process. In several cases this took the form of patient satisfaction surveys in particular service areas (e.g., mental health, senior services, primary healthcare) (I-4, Senior Manager).

Despite this involvement in a consultative capacity, members of the public had not been involved directly in the organization's decision making. Further engagement could be trialed by including representation from members of the public on the Validation Working Group. One manager suggested that this may be a possibility in the future, but it had not been tested thus far because their process "hadn't fully cemented" and it was still "too early in the process"
to directly involve members of the public (C-2, Middle Manager).

4.2.1.2 Criteria and Assessment Tool

If the mechanisms employed to guide resource allocation are inequitable or non-explicit, the distribution of resources will potentially be sub-optimal and will certainly lack transparency (3)(61). Within these mechanisms, the criteria used to evaluate proposals are likely to have profound implications for the final decisions made (61). By linking criteria to the strategic plan of the organization and consistently applying them to each proposal using a formal assessment tool, organizations can better direct their resources to high priority areas that will meet their long-term goals and objectives (44). All three of these elements are located in the Process, Attitudes, and Outcomes domains of the evaluation tool in Table 4.1 and the high performance framework in Appendix A.

Prior to the implementation of their priority setting and resource allocation process, managers in the test organization reported that "there was no objective criteria" used in the resource allocation process. "It was basically whoever could yell the loudest" (I-5, Middle Manager). Since introducing a more formal approach to resource allocation, the organization has developed criteria "that are tied to [its] strategic plan" (I-6, Senior Manager). Managers agreed that proposals which did not reflect the strategic plan were not funded within the process (H-7, Middle Manager)(I-8, Middle Manager). "Strategic planning is obviously the key ... and it forms the basis for the criteria" (I-9, Middle Manager).

To ensure consistency in applying the criteria, the test organization developed an assessment tool based on the criteria and applied it to funding decisions. "I would also say that the criteria are explicitly stated in the process and are part of the formal ranking tool that applies to every submission" (I-6, Senior Manager).
Further, within the assessment tool, criteria are defined and weighted, and scoring guidelines are provided to users. In addition to self-scoring by the proposal authors, the validation group (consisting of mid-level managers and clinicians) also evaluated and scored each proposal using the assessment tool. In this way, a peer-review check encouraged validity and consistency in scoring.

"With the development of the validation working group last year, I believe that we've seen a lot closer adherence to the criteria for scoring, yes. Previous to that when it was self-scoring it was challenging to be sure that there was validity to the scoring" (C-2, Middle Manager).

Although the criteria used in this organization's priority setting and resource allocation process were developed using their strategic plan and implemented in a consistent manner within an assessment tool, there still remains room for refinement. For example, the priority setting and resource allocation literature advises that criteria must be specific enough to distinguish between proposals, yet broad enough to ensure that important areas are not neglected (61)(24). In this organization, there was general agreement that the existing criteria do not capture the benefits of non-clinical (i.e., corporate and/or administrative) proposals as well as they could.

"I would say that on the clinical side it's easier than the non-clinical side. And so, it seems to have a bit of a bias in the structure the way that it's applied in healthcare. So, that the support services on the admin side seem to rank less in the big picture" (I-10, Senior Manager).

As a result, many middle managers in non-clinical areas reported difficulty aligning the benefits of their proposals with the clinically focused criteria.

"[The criteria] lead to a bit of challenges when it comes to administrative kinds of proposals versus clinical... We propose to do this and we hope it will improve our quality over time.
But I can't say that because if I do this quality education program I'll save 50 lives" (I-11, Senior Manager).

Another respondent offered a similar perspective: "So, laundry, housekeeping, we can't run hospitals and organization without that service. But if you look at the strategic plan I don't know where they find themselves other than a high-quality services but then again they're one step removed from the direct patient care but that can't happen without it. Or even in finance. What if they need to invest in something, a new system or a different way of doing something? They're indirectly supporting patient care but really they're part of making the organization run and how would they fall in that criteria?" (I-12, Middle Manager).

Further, both middle and senior managers reported that some proposals were approved without passing through the formal assessment process. These proposals were either directed towards external mandates or simply did not score well against the existing criteria, but were deemed necessary by the senior executive (G-1, Middle Manager)(A-13, Middle Manager).

"One example might be there was a clerk for physician recruitment... We actually had physician recruiters but they were doing a lot of administrative work -- booking flights for physicians and whatnot. We were able to take that work from them, give it to the clerk at a lower cost and basically expand the number of hours that these folks now had. So, the number of physicians we were able to recruit grew. Growing the number of physicians has a positive impact on clinical [outcomes]. But it didn't score high against the criteria because the criteria was kind of clinically driven" (I-14, Senior Manager).

These data on criteria suggest that while there were clearly a number of key strengths (i.e., having identified criteria, use of a formal assessment tool), there were again, just as was the case for public engagement, some areas for improvement.
In particular, the test organization should address the fit of the criteria with non-clinical proposals, and further ensure that "end-runs" (i.e., the senior executive making "out of bounds" decisions) are either minimized or made in a transparent manner with clear rationale attached.

4.2.1.3 Frontline Staff Involvement

By soliciting proposals for investment and disinvestment from the frontline, organizations can engage staff in priority setting and resource allocation decisions. In doing so, they can take advantage of direct experience to deliver more accurate assessments of proposals, and potentially improve buy-in to final resource allocation decisions (44). As such, this element is located in the structures domain of the evaluation tool in Table 4.1 and the high performance framework in Appendix A.

In the test organization, staff participation in the priority setting and resource allocation process was evident in several ways. The most common method appeared to be during the creation and refinement of proposals.

"You may have a working proposal and it's very much then refined by discussion with staff as to feasibility. You also need to operationalize that plan. And obviously the input from staff is really important to be able to do that" (H-7, Middle Manager).

Even in the creation of funding proposals, managers reported that staff were very helpful with suggestions. For example, "We don't need this extra person on this four hours," or, "we continue to throw away a third of this because it doesn't get used before it's outdated" or, "you buy all these things and it's really expensive and we should be going back to these because they're cheaper and they're just as good" (G-1, Middle Manager)(I-16, Middle Manager). This "more open and transparent" approach, which facilitated staff engagement, was very different from previous years when managers "had 48 hours to come up with ways to cut 2, 3, 4%"
arbitrarily from their budget and "kept it under wraps until [they] were ready to move" (D-17, Middle Manager)(B-3, Middle Manager).

Senior managers also highlighted LEAN -- a quality improvement methodology that encourages waste reduction and a focus on patients -- as "another tool that really does help with engagement of front-line staff", and one that provides a methodical way to work through becoming more efficient (I-4, Senior Manager). Incentives for staff participation included paid time for attendees and food at meetings.

In addition to contributing to proposal creation, a group of staff internal to this organization also participated directly in proposal evaluation by sitting on the validation working group (VWG). This group consisted of middle and frontline managers, and was headed by a senior lead. Each member received training in the organization's priority setting and resource allocation process, and in each budget cycle they were tasked with evaluating all the proposals put forward by staff using the organization's assessment tool. One member of the group who had come from a clinical background was reportedly very skeptical of the process at first, but ended up describing her experience on the VWG as "an amazing opportunity" that she would recommend to any of her colleagues (C-2, Middle Manager).

Similar to "public engagement" and "criteria", frontline staff involvement in the test organization had some areas for improvement. Challenges faced by this organization's engagement strategy included loss of interest among staff when the lag time "between when they were asked for input and information and when a decision was implemented" was too long (C-2, Middle Manager). Soliciting input from physicians and support staff was also highlighted as a weak point in the overall engagement strategy. Finally, several participants reported that "every manager has a different way of engaging their frontline staff"
and recommended greater standardization since some methods were "highly successful and doing extremely well and others [were] more 'traditional'" (A-13, Middle Manager). Two contributing factors to this lack of engagement are the training and communication related to this organization's priority setting and resource allocation process. Both these weaknesses will be discussed in greater detail in Section 4.2.2.

Thus far, findings from the public engagement, criteria, and staff involvement elements have been presented. In the remainder of this section, strengths in this organization relating to evidence, leadership, culture, and the ability to re-allocate resources are addressed. Subsequently, the weaknesses of this organization's priority setting and resource allocation process will be presented and described in a similar manner.

4.2.1.4 Evidence Based

Whether in the form of a Cochrane review or qualitative input from staff, evidence is needed to support proposals in priority setting and resource allocation processes. Comparative assessment of proposals can be fostered through clear presentation of relevant data (3); however, traditional scientific evidence may be unavailable in particular areas, including those outside of acute or clinical care (44). Organizational data collection can be facilitated by supports including data analysts, mechanisms for evaluating patient outcomes, and research services. An evidence basis for decisions is a key sub-element of the explicit process element in the process domain.

When developing proposals for investment or disinvestment, both senior and middle managers in this organization acknowledged that in an ideal world they would have "published, randomized controlled studies demonstrating absolutely beyond a shadow of a doubt" whether a proposal should be adopted, but that they "rarely get that" (I-6, Senior Manager).
When this type of evidence is unavailable, managers reported that clinician and staff experience, the experience of other health authorities, input from communities, and administrative data were among the forms of evidence put forward to support proposals (A-18, Middle Manager)(I-6, Senior Manager).

Middle managers who used available support personnel (e.g., data and budget analysts, research specialists) typically found them to be very helpful in collecting and compiling data (D-17, Middle Manager)(I-16, Middle Manager). The same managers also acknowledged that their peers with less experience developing business cases or performing literature searches often struggled with this portion of the process (I-6, Senior Manager).

Despite the multiple forms of evidence used to support proposals and the availability of supports, challenges remained for middle managers when developing the evidence base for their proposals. It was widely acknowledged that those with non-clinical portfolios struggled far more than their clinical counterparts when developing proposals. The differences stemmed from the fact that acute and clinical programs have benefitted from "robust statistics at the ministry level for quite some time", while "there are challenges in the community programs with statistical collection and analysis" (G-1, Middle Manager). This gap in data availability and quality was reported as a significant challenge for middle managers working in non-clinical areas.

4.2.1.5 Leadership Team

Without a functional leadership team that has some level of stability, high performance in priority setting and resource allocation, or any other organizational process, would be difficult to achieve (44). Indeed, as Peacock (1998) claims, "instability in health service organizations almost inevitably results in the failure of priority setting processes" (42).
Given that leadership affects many aspects of an organization, the elements from the evaluation tool and high performance framework included in this strength span the structure, attitudes and behaviours, and outcome domains.

Middle managers described their senior leadership team as "engaged, committed, educated, compassionate", and agreed that they all worked within a culture of respect (A-18, Middle Manager)(A-13, Middle Manager).

"Incredible role models, great people. People who care. I love the people that I work with, I love working in [this area] and I love working with [our health authority], really. We have the best CEO, not that I'm competitive or anything, but we have a fabulous CEO. We're lucky. We have great COOs and we have really, really great people" (E-19, Middle Manager).

Senior managers agreed that they have created a "strong, collegial, supportive, and innovative environment" with a "shared vision, trust and collaborative attitudes" (I-10, Senior Manager).

"I think we have a collective feel for that. I would say we have a healthy team of relationships. We wouldn't get a message from your CEO that says, 'I'm going to give you this money, but don't tell anybody else that you have it.' That wouldn't happen" (I-10, Senior Manager).

Senior managers also credited their organizational stability as a key factor in being able to "maintain movement forward" and focus on a shared vision for priority setting and resource allocation in their organization (I-6, Senior Manager)(I-11, Senior Manager). Both middle and senior managers agreed that strong leadership was displayed in "going down the road of PBMA" towards a more formal and explicit priority setting and resource allocation process, but that the process must be pushed even harder "past its infancy and newness" by the senior management team (G-1, Middle Manager).

Respondents also felt that the process could be taken to "the next level"
where the health authority would support disinvestments in certain low priority services to fund higher priority programs, and support these decisions in the face of government pressure (G-1, Middle Manager). In a move in this direction, several senior managers described instances where they turned down money from the government to fund an initiative that did not fit their strategic criteria, and where they cut services to a low priority area despite intense public pressure (I-6, Senior Manager)(I-14, Senior Manager).

"It's sometimes hard to stand up in public and say we're not investing in these services, but it's way easier to stand up and say, 'And here's the process that we went through, this is why we're doing it. This is where the money is going'" (I-6, Senior Manager).

4.2.1.6 Culture of Improvement

A culture of change and improvement has been documented as a critical factor for the facilitation of priority setting and resource allocation in healthcare organizations (22)(62). As an element, it is located in the attitudes and behaviours domain of the evaluation tool in Table 4.1 and the high performance framework in Appendix A.

In the test organization, senior managers reported a continual focus on the future and on ensuring success through ongoing refinement -- both indicators of an improvement culture.

"This year we've started the process earlier. Next year it'll be better as well. We haven't finished this year and we're already talking about next year. So, as you see it's getting better and better and it goes back to building it into the day-to-day activities as well, the expected way we do business" (I-6, Senior Manager).

These efforts seem to be recognized and felt by middle managers, who described a process that is "becoming more refined" after multiple applications (H-7, Middle Manager).
"We're really a couple of years into this, but my experience this year was I think the process certainly ran smoother than it did last year, and I think people had a better understanding of it" (H-7, Middle Manager).

Although a "culture of improvement" can mean very different things to different members of an organization, both senior and middle managers from the test organization seemed to agree that they were all committed to improvement. A senior manager summed up their overall attitude in the following quote:

"I think when you have 700 million dollars to spend, you definitely can never fool yourself into thinking that you're 100% efficient... I think we can always improve" (I-4, Senior Manager).

This same openness to change and awareness of imperfection have been documented in several studies of priority setting and resource allocation as necessary attitudes for improvement (41)(62)(42). Indeed, based on responses from middle and senior managers, the test organization seems to have developed a strong and stable leadership team that has facilitated and driven a strong culture of improvement.

The final sub-section below describes how this organization has also been able to re-allocate resources within its priority setting and resource allocation process. The following section then transitions from the strengths of the test organization to its weaknesses.

4.2.1.7 Ability and Authority to Move Resources

Maximizing the benefits of health services offered to an entire population requires re-allocation of resources from low-yield programs to higher-yield programs (3). As Smith argues, "If the senior management team is constrained in its ability to make these organization-wide trade-offs, optimal distribution of resources may not be achieved" (44). This key outcome of priority setting and resource allocation processes is addressed in both the structure and outcome domains of the evaluation tool and high performance framework.
Many senior managers in this organization agreed that they had the technical capacity to re-allocate resources and that there "is still the availability to move money around and make resource allocation decisions" (I-6, Senior Manager). Some acknowledged that before they make a decision they have to "make sure that it aligns with government's vision and change agenda" (I-4, Senior Manager). As a result, their authority to re-allocate resources may be somewhat limited.

"I think some of the challenges are political in nature even though the organization doesn't like to be at the mercy of a government directive. Government can be very reactive to bad news stories, especially leading up to an election year and may give directions about clinical things that they all of a sudden the story gets out that hip and knee replacements -- the waiting list in British Columbia is twice that that is in Saskatchewan and then word comes down, 'Must do more knee and hip replacements.' And so money that may have been saved in that or in another area is being reinvested in something that we may not have seen as a priority" (C-2, Middle Manager).

Despite these limitations, all of the senior managers interviewed reported that re-allocations of resources took place during their last priority setting and resource allocation cycle.

4.2.1.8 Summary of Strengths

Overall, the respondents in our sample viewed the test organization's process for priority setting and resource allocation positively. For the most part, proposals follow a consistent process of evaluation that is well supported by the structures and culture of the organization. As would be expected, despite these strengths, areas for improvement do exist and the process is not without its weaknesses. Additional findings related to both these areas will be discussed in the following section.
4.2.2 Weaknesses

Having described the strengths of the test organization's priority setting and resource allocation process in detail, the thesis will now focus on the weaknesses identified by participants in the evaluation interviews. Elements that participants described in negative language that did not match the descriptions in the high performance framework were characterized as weaknesses. Table 4.3 provides a summary and short description of the weaknesses in this organization's process. Once again, the coloured tags following each weakness identify the corresponding element from the dashboard in Table 4.1.

Table 4.3 Weaknesses of the Test Organization's Priority Setting Process

Elements of High Performance -- Description in Test Organization
1. Training and Education (P3, O3) -- No formal education for the PSRA process and a lack of understanding among mid-level to frontline managers
2. Communication (P1, P2, A5, O3) -- PSRA process communication is lacking, and in special need of attention at the proposal feedback stage
3. Timeline and Deadlines (S2) -- Unclear, condensed, and non-harmonized timelines
4. Monitoring and Oversight (P4) -- Variable levels of monitoring and oversight for accepted proposals
5. Coordination of Resource Allocation Processes (S3) -- Lack of coordination between Operational, Capital, and IT PSRA processes
6. Program Budgeting (P1) -- Very limited program budgeting taking place within the organization

As the coloured tags in Table 4.3 indicate, only two elements were deemed to be "red" weaknesses -- indicating a substantial issue or challenge in the organization. The remaining elements were in the "yellow" category. This suggests that some components of these elements could be improved, but that they are less serious than the "red" elements.
Elements that participants described in softer negative language, and that did not quite match the descriptions in the high performance framework, were characterized as these yellow areas for improvement. Each of the weaknesses in Table 4.3 will be addressed in turn within the following sub-sections. Supporting quotes are presented from members of the test organization. Recommendations for improvement that were developed as part of the report to the senior management team in this organization are included in Appendix D.

4.2.2.1 Lack of Formal Training and Education

Without a formalized education program for stakeholders, organizations may have difficulty engaging their staff members and implementing a priority setting and resource allocation process (22)(63). Studies of past implementations have revealed that education is very much needed to ensure understanding and build acceptance (64). As such, this element is included in the process domain, and its effects are included in the outcome domain, of the evaluation tool and high performance framework.

Since the priority setting and resource allocation process in this organization was introduced relatively recently, "there's a bit of a learning curve there for everybody" (A-18, Middle Manager). However, "new staff orientation is definitely a weakness" for the organization (A-18, Middle Manager). Currently, a learning-on-the-job approach is common whereby managers are told that "this is the process and here is the document... fill it in" with minimal or no prior training given (D-21, Middle Manager)(I-20, Middle Manager).

Even when their formal process for priority setting and resource allocation was introduced to the organization, managers who did receive training reported that their instructors were "very green" since the process was "new to them as well" (G-1, Middle Manager). "We were blind leading the blind because we both had never done [PBMA-based funding proposals] before"
(I-20, Middle Manager).

Senior management agreed that no formal priority setting and resource allocation education exists for staff in the organization, and that "most front-line staff would not know exactly how resource decisions get made" (I-4, Senior Manager). Tellingly, mid-level managers also reported confusion about the process, and described lower-level management and staff who were uncertain even when it came to the name of the priority setting process: "I'm not always sure still when I say PBMA to my management group on the ground here that they all know exactly what I'm talking about" (F-22, Middle Manager). This lack of understanding was clearly demonstrated when one of the mid-level managers interviewed was completely unaware of key aspects of the process, including the use of a criteria-based assessment tool to score and rank proposals (A-18, Middle Manager). One senior manager reiterated this: "Staff, no, I don't think our staff really understands the PBMA process" (I-14, Senior Manager).

With an informal education process in play, much of the responsibility for educating frontline managers has fallen to mid-level managers. However, this approach has introduced a significant degree of "inconsistency across the organization" (I-4, Senior Manager) since some managers "have not spent a lot of time explaining the nuances of [the organization's priority setting process] to [frontline managers and staff] at all" (I-5, Middle Manager). "I don't think the work has been done to explain that and coach people in that so that they're comfortable with it" (I-5, Middle Manager).

Middle managers also noted that there is a particular challenge to learning "for the folks whose background... is in the clinical world" and who are not well versed in the financial aspects of healthcare delivery (C-2, Middle Manager). In the latest cycle of the process, these "clinical folks"
were described as having difficulty "get[ting] their heads around if we do something different and save resources then we have an opportunity to expand something else that would be of greater value" (C-2, Middle Manager).

Participants also recognized the lack of training as a risk to the integrity and effectiveness of their priority setting and resource allocation process. Without an understanding of their own organization's process, frontline managers and staff are likely to "lose confidence in the process" (B-3, Middle Manager), create "proposals that are not strategically oriented at all" (I-11, Senior Manager), and "game the system" (G-1, Middle Manager). Participants reported all of these behaviours to different extents during the interviews.

"I sometimes think that if we educated the staff why we've gone down this path of resource allocation or disinvestment, they would have better buy-in" (I-16, Middle Manager).

Given that the test organization's formal approach to priority setting and resource allocation was introduced relatively recently, their training program has yet to fully develop. Potential consequences of an underdeveloped training and education process include lack of trust, attempted gaming of the process, and proposals that are not strategically oriented.

4.2.2.2 Communication

Effective communication has been shown to be essential throughout the entirety of a priority setting and resource allocation process. This includes ensuring that the criteria used to make decisions, the decisions themselves, and the implementation plans are well understood (44). In the event of proposal rejection, highlighting the rationale for decisions will also enable staff to understand why they did not get funding and may strengthen future attempts (22). Since communication affects multiple aspects of an organization, the elements included in this weakness also span the process, attitudes and behaviours, and outcome domains.
In the case of our test organization, communication issues were apparent in several areas of the priority setting and resource allocation process, including initial messaging, process transparency, and feedback.

4.2.2.2.1 Initial Messaging

Despite being designed for both investment and disinvestment, when the test organization's priority setting and resource allocation process began it was perceived as being "introduced actually as a disinvestment process, not an investment process". As a result, it did not provide the "confidence that was needed for [members of the organization] to buy into the process" (A-18, Middle Manager). Senior managers recognized this gap in initial communication, and suggested that their language around disinvestments had been misinterpreted in previous cycles. While some managers thought the message from the executive was that their health authority had "a 20 million dollar deficit and is looking to cut 3% of their programs", in fact the executive "wanted 20 million dollars of ideas that potentially could be disinvestments", not all of which would necessarily be implemented (I-14, Senior Manager). These subtleties were not well understood by staff in general.

Managers also reported difficulties communicating the scarcity of healthcare resources to their staff. As one manager stated: "There is only so much money in the bank. This is why I think the staff needs to understand that there's only so much given to each of our health authorities and divvied out within the hospital. There needs to be budget cuts made" (I-16, Middle Manager).

4.2.2.2.2 Process Transparency

"I wouldn't say our process is clear at this point" (D-21, Middle Manager).

Several middle managers who were new to the organization's priority setting and resource allocation process reported similar experiences when submitting their proposals for investment or disinvestment without any idea of when the proposal would be returned or what the general process was --
"you send this thing off into the big black hole and they say basically... don't call us, we'll call you" (A-18, Middle Manager). These same managers were uncertain as to what made an "acceptable proposal", and were frustrated with their lack of understanding of the process in general.

"The messaging is so confusing at times that it's not like you know exactly what is required of you and how to move forward" (G-1, Middle Manager).

This lack of clarity has reportedly led to the creation of proposals that are not aligned with the strategic priorities or criteria of the organization.

"What we could be stronger at is actually linking the proposals to the criteria... We've got a strategic evaluator, a strategic executive, looking at it. But we're looking at a bunch of proposals that are or are not sort of strategically oriented at all" (I-11, Senior Manager).

Uncertainty extends to the frontline staff as well, since the organization's priority setting and resource allocation process was not "something that's widely advertised... We don't offer a whole lot of information unless somebody asks... and [staff] don't ask questions" (D-17, Middle Manager). This lack of information exchange was recognized by middle and senior managers as potentially damaging to trust in the process, and could ultimately affect engagement and buy-in. For instance, a senior manager described a case where there were service cuts in the budget alongside extra money for a one-time purchase.

"How do I sell the fact that somebody lost their job in the same year I'm buying a whole bunch of things? Well, these chairs are .001% of the entire budget, and this one time, versus a job needs to have money every year." The manager recognized how frontline managers could perceive this, and recommended "the more people who understand the budgeting process, the better" (D-21, Middle Manager).
Without proper communication and understanding among members of the organization, some staff could perceive that this organization has "a very informal process" that is "inadequate in that it lends itself to influence via a relationship" where one is "submitting to a set of people versus an actual set of criteria" (A-18, Middle Manager).

For frontline managers and staff members who did participate in the process, the "time lag between when they were asked for input and information [and] when a decision is implemented" was often "quite long" (C-2, Middle Manager).

"If action doesn't result and it's just another dialogue, then I don't really want to be at the table. It's boring. Nothing happens. So, making sure that linkages are made from the table to the action into clinical care, right?" (I-10, Senior Manager).

Senior managers who were interviewed recognized this lag in the previous cycle: "external factors got in the way that caused meeting times to be changed or cancelled... and there was a lot of back-and-forth noise that makes [the priority setting and resource allocation process] look like it's less important to the organization" (I-10, Senior Manager).

Although the test organization intended to implement a more formal process for priority setting and resource allocation, the initial perception by staff was that this process was simply another cost-cutting exercise. This lack of understanding with respect to the intention of the process was followed by a lack of clarity relating to the process itself. As a result, participants in evaluation interviews reported managers and frontline staff who were experiencing feelings of alienation and distrust.

"I'm not clear because I'm not -- I'm not privy to the intricacies of how they're determining or which criteria is used to determine when a successful proposal is or not. And be -- because I don't know -- know that, then it makes me wonder why some are approved over others" (A-18, Middle Manager).
4.2.2.2.3 Feedback

There was virtually unanimous agreement among members of the organization that one of the weakest aspects of communication was the rationale for resource allocation decisions. Middle managers who created proposals reportedly did not receive "any communications back from the committee" or have any way "to go back to the committee and say, 'Why didn't you fund [my proposal]?'" (G-1, Middle Manager)(D-21, Middle Manager). This lack of communication once a resource allocation decision has been made could perhaps be addressed in part by a formal appeals process. Not only does the absence of such a process negate the ability of the organization to revisit and revise decisions in light of further evidence or obvious misinterpretation, it may also leave managers uncertain as to whether they should pursue a proposal further if they have not received any rationale for the decision not to fund it. Ultimately, engagement levels and trust in the process may be detrimentally affected if this issue remains unresolved.

Lack of training and lack of communication are two key weaknesses in the test organization's priority setting and resource allocation process. Timelines, oversight, coordination, and program budgeting are also areas that require further improvement, and each of these is presented in turn in the following sub-sections.

4.2.2.3 Process Timeline and Deadlines

The importance of timelines has been highlighted by several studies that have identified a "lack of time" as a significant barrier to effective priority setting and resource allocation (39)(65)(66). By structuring priority setting and resource allocation processes to allow adequate time for preparation and implementation, organizations can encourage greater participation among staff and avoid rushed decision making in which proposals are not properly researched or debated (67).
This element is therefore best captured in the structure domain of the evaluation tool in Table 4.1 and the high performance framework in Appendix A.

Members of the test organization agreed that "sometimes we have trouble keeping on our timeline target" and "there's often not enough time or the time is poorly communicated" (F-22, Middle Manager)(I-5, Middle Manager). Issues with the timeline were reported even in the preparation stages of the process. Discussions around potential proposals were reportedly occurring "in the summer when a lot of our [frontline] managers aren't available", forcing decisions to be made without their input (G-1, Middle Manager). Lack of clarity and time restrictions also prevented managers from engaging with their staff.

"I think we just have to attach guidelines to the steps and that might really support getting an early start and having time to actually listen to the staff. It's not that managers don't want to ask or engage or hear from their staff. I think we just set up really tight timelines and there's no time to do it" (A-13, Middle Manager).

As described above, proposal development in this organization was split into two phases. First, managers submitted a "short form" of their proposal to the VWG, which used an assessment tool to evaluate each proposal. If approved, the committee would request a "long form" of the proposal - with greater supporting detail - to send to the senior executive for deliberation. Several middle managers reported frustration with the length of time it took for the validation committee to approve short forms and request long form proposals. They also reported that the request for long forms lacked harmonization with other departmental duties, since demands from different areas often coincided in ways that generated considerable workloads at certain times.

"We have stuff coming out of sort of our quality improvement team. We have stuff coming out of organizational development.
We have stuff coming out of clinical practice. We have all these different things that are coming at our managers from different portfolios within our organization and there's not a lot of understanding within the different portfolios how much is getting dumped. So, the managers all feel it all because they're the receivers of it all" (A-13, Middle Manager).

Middle managers also expressed frustration with the timing of proposal evaluation at the senior management level. If their proposal was accepted, middle managers often had very condensed timeframes to implement the decisions, not allowing for proper change management and potentially reducing future engagement of staff.

"You send a proposal in for a disinvestment. You don't get the approval until last minute, so it really shortens your implementation time, which then affects the staff, which then affects... engagement. If it's slow up top, it'll speed it up in the bottom" (A-18, Middle Manager).

Unintended consequences of condensed timelines could include a lack of opportunity for "somebody to argue, or to appeal a decision that was made" (I-5, Middle Manager) and may not allow for major decisions to be made if there is "not enough time to fully vet [decisions], and have the necessary conversations that need to be had" (I-5, Middle Manager). Constructively, senior managers agreed that middle managers needed more clarity on timelines and more timely responses.

"Managers need more clarified timelines, I think, as far as what it is they're expected to do by when. And then, whatever group it is that does the ranking -- and currently too much of it still falls to the executive -- we need to take the time to do it" (I-11, Senior Manager).
While condensed timelines aren't always preventable and no process can be carried out to meet the individual needs of every participant, understanding how upstream decisions can affect lower levels of the organization and attempting to harmonize timelines may be very helpful in preventing reductions in engagement and may facilitate high quality proposals and implementation.

4.2.2.4 Lack of Monitoring and Oversight

While the literature has tended to focus on the preparation and decision making phases of priority setting and resource allocation processes, an equally important component is the follow-through and execution of the decisions that are made (44). In order to ensure the high quality and sustainability of these resource allocation decisions, continual monitoring and evaluation of the decisions themselves are essential. As such, the element related to this aspect is located in the process domain of the evaluation tool and high performance framework.

While some middle managers described a very thorough and formal process in which the organization's finance department monitored the organization's performance on a bi-annual basis, others admitted that they have not had to report on any of their accepted investment proposals (D-17, Middle Manager)(I-5, Middle Manager). There was agreement that even though greater follow-up may cause some additional reporting and work on their part, "in the interest of the integrity of the program... [evaluation] needs to be done to have [the process] make sense to people and be respected" (I-5, Middle Manager).

4.2.2.5 Coordination of the Process Across the Organization

The literature has suggested that priority setting and resource allocation should be conceived of as a management process rather than implemented as a set of discrete tools (68). Therefore, it should be coordinated with other management processes that affect resource use, including capital and information technology (IT) processes.
"If priority setting mechanisms conflict with budget setting mechanisms within provider organizations, it is unlikely that priority setting methods will lead to changes in the allocation of resources" (69).

Coordination of processes is included as an element in the structure domain of the evaluation tool and high performance framework.

For this organization, capital and IT represented the two biggest draws on resources after operational costs. In both of these areas, processes had been "not related" and "pretty much separate" from each other and from the operational priority setting and resource allocation process (I-4, Senior Manager)(I-23, Middle Manager). In fact, despite fundamental similarities between the IT and operational priority setting and resource allocation processes (with common steps including ranking criteria and the evaluation of proposals), representatives from IT had a very poor understanding of the operational priority setting and resource allocation process.

"I just don't know how they do a weighted decisions between things. I haven't seen that methodology. So, how do you decide that a pitch for a new IT resource is more important than the pitch for labs? I'm unclear how that happens. Is it whoever writes the best proposal or gets the most executive support? Is it - I don't know" (I-23, Middle Manager).

Both the IT and operational processes also experienced similar challenges, including difficulties mapping their services, collecting evidence to justify proposals, and re-allocating resources from areas of low priority to areas of high priority. While improvements in coordination have brought the processes closer together compared to past cycles when they were "ships in the night" (I-24, Middle Manager), independent timelines for each process have reportedly resulted in some capital purchases going forward without the operational resources and support required.
"The PBMA process is going on right now, so as long as everything is incorporated in that process, then we can get capital planning approval processes in March for April start. If they were to get out of sync, we have a problem. For example, you've bought this piece of equipment, but you don't have the extra $20,000 for the maintenance agreement that you've been found to require. You're going to find that out of your operating budget somewhere else. That's a bit of a tough consequence, but it's happened a few times now" (I-8, Middle Manager).

In order to treat organizational priority setting and resource allocation as a management process (rather than a discrete set of tools), both capital expenditures and IT expenditures must be coordinated with an organization's clinical and corporate operations. Despite fundamental similarities between these processes in the test organization, limited coordination has led to unforeseen hidden costs and "tough consequences" for the organization. One of the challenges faced by participants in all three of these processes was program budgeting prior to proposal development. The following sub-section will address this area for improvement in greater detail.

4.2.2.6 Program Budgeting

In order to facilitate explicit comparison of services, a map of current activity and expenditure is recommended in the literature. Program budgets document how resources are being spent within organizations, and can be used as a starting point for managers to identify high and low priority programs in their respective portfolios (3). To create a program budget, activity and cost data from each service must be accessible through administrative data or be collected prospectively. High-level costing will generally suffice, since fine precision is not the aim of program budgeting (3)(42). Once an overview of resource expenditure has been collated, organizational criteria can be applied to each service to create a ranking of programs.
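For illustration only, the weighted-criteria ranking just described can be sketched in a few lines of code. All program names, criteria, weights, and scores below are hypothetical and are not taken from the test organization or the PBMA literature; the sketch simply shows how per-criterion scores collapse into a single ranking.

```python
# Hypothetical sketch of a weighted-criteria program ranking for program
# budgeting. Names, criteria, weights, and scores are illustrative only.

# Organizational criteria and their relative weights (sum to 1.0 here).
WEIGHTS = {"strategic_alignment": 0.40, "quality_of_care": 0.35, "efficiency": 0.25}

# Each program (or category of functions) is scored 1-5 against each criterion.
programs = {
    "Community Outreach": {"strategic_alignment": 5, "quality_of_care": 4, "efficiency": 3},
    "Legacy Reporting":   {"strategic_alignment": 1, "quality_of_care": 2, "efficiency": 2},
    "Acute Care Support": {"strategic_alignment": 4, "quality_of_care": 5, "efficiency": 4},
}

def weighted_score(scores):
    """Collapse per-criterion scores into a single weighted value."""
    return sum(WEIGHTS[criterion] * score for criterion, score in scores.items())

# Sort from highest to lowest priority; the bottom of the list becomes
# the natural pool of disinvestment candidates.
ranking = sorted(programs, key=lambda name: weighted_score(programs[name]), reverse=True)

for name in ranking:
    print(f"{name}: {weighted_score(programs[name]):.2f}")
```

Under these made-up weights, "Legacy Reporting" falls to the bottom of the ranking and would surface as a disinvestment option, while the two higher-scoring programs would be protected or targeted for investment.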
Low priority services, or aspects of services, can then become options for disinvestment.

Despite operationalizing its strategic plan in the form of weighted criteria applied consistently to proposals using an assessment tool, the test organization has not trained or educated its managers to perform reviews of their portfolios so that they might better identify low and high priority services. Both middle and senior managers reported that significant variation exists in the extent to which different portfolios map their services, since mapping is performed at the "discretion" of individuals and is often not applied "the way it is supposed to be" in a comprehensive manner (I-5, Middle Manager)(G-1, Middle Manager).

As described in the communication weakness section, a misunderstanding took place between middle managers and senior managers as to the definition of the 3% disinvestment request (I-11, Senior Manager)(I-14, Senior Manager). Some middle managers misinterpreted the request by senior management for disinvestment options and determined that they were required "to find 3% cuts everywhere" (I-23, Middle Manager). Senior managers recognized that the message that went out was that "[senior managers were] looking to cut 3% off programs, which was never what we were trying to do. We just wanted 20 million dollars of ideas that potentially could be disinvestments" (I-14, Senior Manager).

"My sense is that it's not well understood [at lower management levels]. That it's not really being viewed any differently than any proposal around disinvestment that might have been in the past where managers were asked to, 'Okay, we've got to trim money from the budget, come up with where you would trim it,' and people put forward things from that approach rather than the approach of what aligns and what doesn't align as well and therefore from a value-add perspective in question" (I-5, Middle Manager).

This "trimming off the budget"
approach was readily reported among middle managers. Rather than performing a comprehensive review of their portfolios and identifying low and high priority services, managers admitted to very informal processes for determining which ideas were put forward for disinvestment.

"Okay so this actually is sort of rising to the top, let's put forward this, and nobody's really utilizing this or the service is not aligning well, so let's score that one and see if we disinvest" (I-5, Middle Manager).

In this way, managers were simply putting forward easy disinvestments, or whatever was "rising to the top", rather than ranking all of their services and submitting the low priority programs. Easy disinvestments included items "like inventory and some education or travel" (I-5, Middle Manager). For example, the middle manager in the following quote recognized that her easy disinvestment was very poorly aligned with the organization's strategic plan, and that it would be difficult to deal with in the coming year. She submitted it nonetheless because it was easier than going through a process of identifying low priority services.

"We cut out $50,000 from our supplies budget last year... Well, you know why we didn't use $50,000? Because I was short two FTEs. The nuclear meds person went on vacation or was sick for six months and we couldn't replace him. So, we didn't have to purchase all the isotopes and all this kind of stuff because he was gone. But next year I'm going to actually need that $50,000. So, there's a lot of apprehension that once we actually do fill those vacant positions then what is that even more so going to mean to us when we've had to cut costs, because then we'll actually be over budget because we've actually been able to successfully recruit into those positions" (G-1, Middle Manager).
Although this approach has been passable in previous priority setting and resource allocation cycles, both middle and senior managers acknowledged that a "comprehensive review [of portfolios] is needed" because "next year [they] will have skinned off all of that low-hanging fruit" (G-1, Middle Manager)(I-5, Middle Manager), and that without a "firmer framework" for mapping and ranking portfolios, disinvestments will continue to be "more in the order of efficiencies rather than... the lowest priority programs" (I-11, Senior Manager)(I-4, Senior Manager).

Although the lack of program budgeting was demonstrated most starkly in the approach to, and quality of, disinvestments, it stands to reason that investment proposals are not optimally aligned with the strategic priorities of the organization either. This would be less evident, since issues with disinvestments are generally more controversial and visible within an organization. Comprehensive program budgeting would allow managers to bring their portfolios closer to the organization's vision by identifying low priority programs for disinvestment and high priority programs for investment.

Possible reasons put forward for the lack of program budgeting included a lack of the skills and time needed to perform a comprehensive review. Managers described the task of creating a program budget for their portfolios as "daunting", "overwhelming", and "definitely not something that could be done off the side of one's desk" (I-6, Senior Manager)(I-5, Middle Manager)(I-4, Senior Manager). Without an explicit process for identifying disinvestments, indirect consequences of this informal approach have reportedly included disheartened staff, lack of buy-in, and reduced trust in the process (G-1, Middle Manager)(I-5, Middle Manager).

Although the test organization as a whole was largely unsuccessful in implementing program budgeting, one department did stand out in its ability to create a comprehensive program budget for its portfolio.
The following sub-section describes this department's approach, and how it incorporated program budgeting and priority setting into its strategic plan.

4.2.2.6.1 Example of Successful Program Budgeting

Before launching into a program budget, this department first considered how it could operationalize the organization's strategic plan in its own area, and what work needed to be done "to make [the plan] a reality" (I-9, Middle Manager). As a team, they then grouped all of their 250 functions into 26 categories, and used the organizational evaluation criteria to rank each category. By assigning each category a score based on the evaluation criteria, they were able to easily identify the categories that were high priority. Low priority programs naturally manifested as by-products of the exercise, and became candidates for disinvestment. The process reportedly took the equivalent of two days of work (I-9, Middle Manager). In this way, a staff-driven approach to program budgeting created a ranking of services in the department with clear options for disinvestment. This approach reportedly benefited from significant staff engagement as well.

"[The staff] understand what kind of work needs to be done on [the strategic plan] to make it a reality. And I can tell you something, it generates a lot of excitement because people focus to say these are the things in the organization that has to be done. They see the bigger picture and they see their role of actually making these things a reality" (I-9, Middle Manager).

This example clearly demonstrates that although an organization can be struggling with a particular element of its priority setting and resource allocation process as a whole, certain pockets of the organization may in fact be excelling within these same elements.
Further investigation may reveal that particular groups within the organization have a template for training or communication that could be standardized and implemented across the entire organization.

4.2.3 Recommendations

To help move the test organization toward this goal of improvement, recommendations were created to address the weaknesses and areas for improvement reported above. Since the recommendations delivered to the test organization are not directly relevant to the research questions in this thesis, they are not included in the results section. However, because they are significant in the context of the test organization's evaluation, the author has made them available in Appendix D. Table 4.4 provides an overview of those recommendations.

Table 4.4 Recommendations for Improvement

Elements of High Performance | Recommendation in Test Organization
1. Training - P3, A5, O3 | Expand and enhance training provided to lower level managers; leverage experience of the VWG
2. Communication - P2, A5, O3 | Increase and enhance communication, with a special focus on the rationale for proposal decisions
3. Circumvention of Proposals | Document proposals that circumvent the normal PSRA process
4. Monitoring and Oversight - P4 | Increase the monitoring and evaluation of proposals
5. Program Budgeting - P1 | Build upon existing success to expand the practice of program budgeting to other departments in the organization

Having addressed the strengths and weaknesses identified in the test organization's priority setting and resource allocation process, the thesis now turns to the final research objective: how the tool itself might be refined in order to be used more appropriately in other organizations going forward. The following section details changes that were made to the evaluation tool after this test application.
4.3 Refinement of the Tool

Over the course of the evaluation tool's implementation, revisions and refinements were made to the tool. Edits were based on 1) observations by the investigator during interviews; 2) results of data analysis; and 3) comments by participants during interview debriefs. In the following sub-sections, data from each source are presented along with the refinements made to the tool itself.

4.3.1 Observations During Interviews

During interviews, field notes and observations were recorded to guide refinements of the evaluation tool. In addition to noting a lack of time for certain interviews and technical difficulties with video and tele-conferencing, the interviewer also made note of interviewee responses. Questions that yielded short responses or were difficult for some participants to comprehend were recorded.

To address these issues, changes were made to the language and wording of several interview questions. Additional probing questions and some formatting changes (e.g., including 1-5 Likert scales) were the most common adaptations. One element that was modified to a larger extent was outcome element #4, which included "Improved Health of the Population" as a target. During interviews, the investigator observed that many participants were giving one-word answers to the question in this element: "Is your current process achieving better health for your population than another process?" As a result, the intended information was not being elicited, and in order to improve the line of questioning, additional items were added. See Table 4.5 for a before and after comparison:

Table 4.5 Version 1 and 2 of Outcomes Element #4

Element Description: Resource allocation decisions are justified in light of the organization's established and agreed upon core values. Progress is made toward identified strategic goals and objectives.
Improved health (broadly defined) is achieved as a result of decisions made through the RA process. Effective, efficient, equitable, accessible, safe, and high quality care is delivered. (The element description was unchanged between Version 1 and Version 2.)

Questions (Version 1): Would you agree that your organization's core values and strategic goals were reflected in the last priority setting cycle? Do you feel that you will be able to meet those objectives with your current process? Based on available measures and your perception, would you say that your current process for priority setting and resource allocation is improving the health of the population you serve? What available evidence is there to support that claim?

Questions (Version 2): Would you agree that your organization's core values and strategic goals helped guide the last priority setting cycle? (yes/no/don't know) On a scale of 1-5 (1 being "not at all" and 5 being "definitely yes"), do you feel that you will be able to meet those objectives with your current process? Would you say that your current process for priority setting and resource allocation leads to decisions that will improve the health of the population you serve? Or, if they have recently launched a new process: Thinking back to the resource allocation decisions made in the last budget cycle, do you think your organization would have made the same decisions without your current process? Do you think these decisions have led to better overall health for the population you serve? What evidence would you use to support that claim?

The second version of the evaluation tool (with these changes) can be found in Appendices F and G.
Both the theme of resource constraints and the optimal medium for evaluation tool delivery will be addressed in the discussion section of the thesis.

4.3.2 Results of Data Analysis

While no elements were removed, analysis of participants' responses led to the addition of two elements. As detailed in section 4.2.2.6, program budgeting was a weakness of this organization's process. While mapping of services is a key aspect of any PBMA exercise, program budgeting was not included explicitly in the first version of the evaluation tool as an element or sub-element. Given its ability to help managers identify investment and disinvestment options more easily, and the clear impact it had in this test organization, program budgeting was added as a sub-element to the first element, "Formal Process", of the "Process" domain.

Given the difficulty in measuring tangible outcomes of the test organization's process, the author of the thesis felt that an intermediary outcome (one that still maintains a causal relationship with the quality of the process) could be more easily measured. Transcript analysis identified several senior managers who referred to a "lack of proposal alignment with organizational strategic priorities" as an issue in their process. "Investment and Disinvestment Proposal Quality" was therefore added as an element to the evaluation tool in the "Outcome" domain. "Good" proposals would be well supported by evidence, feasible, reflective of the organization's strategic goals and criteria, and based on the investment and disinvestment priorities of the organization. In the semi-structured interviews, SMT members would be questioned about the investment and disinvestment proposals they receive. Poor assessments would suggest that there are one or several issues with the current process (e.g., managers creating the proposals do not have appropriate training, communication is inadequate, leadership is lacking).
While the complete second version of the evaluation tool can be found in Appendices F and G, Table 4.6 below gives a summary of all the changes made as a result of interview observation and transcript analysis.

Table 4.6 Summary of Changes Made to Elements in Evaluation Tool

Domain: Process
  P1a: Explicit Process - *new sub-element; addition of 1-5 scale; additional sub-element
  P1b: Weighted Criteria - language changed
  P1c: Assessment Tool - additional probing questions
  P1d: Evidence - additional probing questions
  P1e: Decision Review - additional probing questions
  P2: Effective Communication - addition of 1-5 scale; additional probing questions
  P3: Skill Development - addition of 1-5 scale
  P4: Follow Through - addition of 1-5 scale; additional probing questions
  P5: Project Coordinator - addition of 1-5 scale; additional probing questions

Domain: Structures
  S1: Ability to Re-allocate Financial Resources - (no change)
  S2: Engagement of Staff - addition of 1-5 scale; language changed
  S3: Coordination - additional probing questions; language changed
  S4: Stability - (no change)
  S5: Time and Resources - additional probing questions

Domain: Attitudes and Behaviours
  A1: Relationships - additional probing questions
  A2: Culture of Improvement - (no change)
  A3: Strategic Alignment - (no change)
  A4: Social and Community Values - (no change)
  A5: Strong Leadership - addition of 1-5 scale; additional probing questions

Domain: Outcomes
  O1: Re-allocation - addition of 1-5 scale
  O2: Endorsement - addition of 1-5 scale
  O3: Understanding - (no change)
  O4: Proposal Quality - *new element
  O5: Strategic Alignment and Outcomes of Decisions - addition of 1-5 scale; additional probing questions

4.3.3 Interview Debriefs

Participants felt that the evaluation tool was based on a comprehensive set of elements that captured the relevant aspects of a high performing priority setting and resource allocation process.
While some participants appreciated the global approach to priority setting and resource allocation, several suggested that certain elements of high performance, including follow-through and leadership, could have been explored in greater depth.

"I think there was a good global view and a good kind of global discussion. I think they're (all the elements of high performance discussed) all very pertinent" (H-7, Middle Manager).

Respondents also felt that the questions used in the evaluation tool were clear and meaningful, and that the interview structure allowed for "a little bit of drift that will keep the flow of conversation as opposed to keeping it very narrow" (B-3, Middle Manager).

"I think the questions were balanced in gathering information from the domains as you outlined them and that they gave me an opportunity to explain why my answer would be one way or the other. I didn't feel there were any leading questions or any things that were biased in any way" (C-2, Middle Manager).

Some participants did acknowledge that conducting every interview face-to-face might be too resource intensive, and that video or tele-conferencing would be viable alternatives. A few also admitted that they would not provide responses as rich if they were to answer the same questions in an online survey.

"For me, personally, online I wouldn't give you online as much as I gave in person, I would say. And I suppose physically in person could be better but I'm comfortable with this venue [video conferencing]" (D-21, Middle Manager).

Based on the experience of the interviewer, conducting the interviews face-to-face was preferable but not necessary for data collection. However, a use-focused evaluation methodologist might suggest that face-to-face evaluation is more likely to elicit action on the part of knowledge users than a tele- or video conference.
All interviewees agreed that one hour was a reasonable amount of time to invest in the evaluation of their organization's priority setting and resource allocation process.

"With the importance of this in our everyday life... then I think the time is well worth ensuring that there was a robust - the ability to have this number of questions. So, yes, I think it was quite manageable" (G-1, Middle Manager).

"One hour is perfect" (I-16, Middle Manager).

When asked about their level of comfort during the interviews, many participants stated that they were comfortable sharing the weaknesses of their organization's priority setting and resource allocation process. However, they articulated mixed feelings as to whether they would be as comfortable sharing those weaknesses if the interviewer were someone internal to their organization. The general consensus seemed to be that, due to the controversial nature of some of the questions, they would be less frank in their responses with an internal interviewer.

"The fact that you're not part of my organization is helpful." Interviewer: "Do you think if it was someone from your organization that you'd give a different response?" Participant: "I'd probably be a little more guarded. I think some of those questions are delicate questions, especially when you start talking about alignment with strategic direction" (E-19, Middle Manager).

As part of the meta-evaluation and the measurement of the tool's face and content validity, interviewees were asked at the end of the interview whether any elements of high performing priority setting and resource allocation processes were missing from the evaluation tool.
After this application, investigators felt that this purpose would be better served by asking interviewees what they thought the strengths and weaknesses of their organization's process were before the interview took place. The responses given could then be compared against the existing tool to determine whether any gaps existed. In addition to this "existing strengths and weaknesses" question, the following were also added to transition the participant into the rest of the tool: Could you give a quick description of your role in your organization? How would you describe your priority setting and resource allocation process? What is your role in the process? What are the strengths and weaknesses of your process?

4.3.4 Summary of Changes

Overall feedback from participants suggested that the evaluation tool was comprehensive and implemented in an appropriate manner. Several respondents suggested that replacing the external interviewer with someone internal to the organization may create some discomfort among respondents, and that an online medium of implementation would yield responses that were "less rich". The majority of changes to the tool were minor adaptations to language and format. The most notable transformation was the restructuring of questions in outcome cell #4 and the addition of an element and sub-element to both the outcome and process domains. The following discussion section will address the three research objectives described at the beginning of the thesis, including whether an evaluation tool could be developed from a high performance framework and applied to a test organization, and what refinements are needed to improve the tool for future application.

Chapter 5: Discussion

The evaluation tool was successfully implemented in the test organization to identify strengths and weaknesses in their priority setting and resource allocation process.
Beginning with one of the most comprehensive frameworks for high performance in priority setting and resource allocation to date, investigators were able to operationalize all 19 elements from the framework into an evaluation tool that could be implemented in the form of semi-structured interviews. A test implementation was carried out in a healthcare organization, where strengths and weaknesses of its process, as well as further refinements to the tool, were identified. The following sections will address each of these research objectives in greater depth by discussing results from the development and implementation of the evaluation tool.

5.1 Development of the Tool

5.1.1 Change in Structure

In the same way that the balanced scorecard extended performance indicators beyond financial measurements to include customers, processes, and growth, the high performance framework (that was used to create the evaluation tool) expanded upon existing conceptual frameworks to include a broader set of elements for high performance in priority setting and resource allocation. The elements and sub-elements within the framework provided a solid foundation for developing the evaluation tool. Once the elements of the high performance framework were operationalized into questions, the only other significant change was made to the sequence of domains. In the high performance framework, the order of domains from left to right was structures, processes, attitudes/behaviors, and outcomes. In the evaluation tool, the process domain was placed above the structures domain. Since the process domain included more objective and less controversial questions than the other domains, the investigator felt that this was a more appropriate way to begin the interview with participants. Both a methods and a values approach to evaluation were incorporated into the content of the tool by including outcomes of a priority setting process (e.g.,
re-allocation of resources, health outcomes) and aspects of the process related to lower-level stakeholders (e.g., engagement, training, communication, alignment with community values). In this way, the evaluation tool was successfully developed from the framework for high performance and a sound theoretical frame. The final evaluation theory applied, a use-focused approach, was drawn upon most heavily during the implementation of the tool. The reader should note that the evaluation tool is meant to be applied at the budgeting level of priority setting and resource allocation rather than at a more strategic level. While certain elements are directly correlated with a strong strategic vision for the organization, this level of priority setting was not directly targeted by the tool.

5.2 Implementation of the Tool

During each stage of implementation, methodological rigor was incorporated to ensure the validity of results. A sampling matrix was used during the selection of participants for interviews in order to capture perspectives from different geographical areas, roles, and levels of expertise within the organization. To ensure that data were being collected consistently and accurately, a more experienced investigator conducted an audit of several interviews. A similar approach was used during data analysis, where several strength and weakness determinations were carried out by two investigators independently and subsequently compared to test for agreement. Debriefs with individual participants after evaluation interviews served as a method to test and refine the tool. A use-focused approach prescribes that a debrief with the senior management team should also be held to discuss the final report. This will be addressed in greater detail in the "future research" section.
Despite challenges inherent to healthcare (e.g., resistance to change, turnover of managers) and geography (e.g., dispersion of population and services, difficulty filling positions), our test organization successfully introduced a formal process for priority setting and resource allocation between 2010 and 2012. While key weaknesses were identified, their process is backed by a strong link to their strategic plan, trust in leadership, and an explicit framework to assess and evaluate resource allocation proposals. Key areas for improvement include training, communication, program budgeting, and process integrity (i.e., staying true to the process). Detailed recommendations delivered to the test organization to address these areas are included in Appendix E. Although further study is required to determine how the test organization will choose to implement these recommendations and their impact on the organization itself, based on this application it is clear that the tool was able to identify the strengths and weaknesses of the organization's process and achieve the second thesis objective.

5.3 Refinement of the Tool

Refinements to the tool were based on observations during interviews, results of data analysis, and comments by participants during debriefs after the evaluation interviews. Feedback from participants during these debriefs suggests that the tool was comprehensive and logistically sound. While the tool itself proved successful in capturing the strengths, areas for improvement, and weaknesses of this organization's priority setting and resource allocation process, refinements were also identified for future applications. These included minor changes to language and formatting as well as several additional questions at the outset to more smoothly transition participants into the interview. One element and one sub-element were also added to capture new facets of an organization's process.
In this respect, the third objective of the thesis was achieved. Further discussion of the limitations and areas for future research related to these refinements is included in the following sub-sections.

5.4 Broader Context

In this section, some of the themes that emerged during the application of the tool are discussed, along with questions that arose for future research. As mentioned in the introduction, the most comprehensive effort to evaluate priority setting and resource allocation to date has been the work of Sibbald et al. in 2010. Their set of evaluation criteria was developed using a Delphi approach and was implemented in one pilot hospital in Ontario. The evaluation tool described in this thesis was created from a high performance framework that followed the recommendation of Sibbald et al.'s study to capture lessons from additional priority setting experiences and create a more robust set of industry "best practices". In this way, the evaluation tool builds upon the current priority setting and resource allocation literature and incorporates the experiences of six case study organizations. The fact that many of the evaluation tool elements agree with the criteria in Sibbald et al.'s tool (despite disparate methods and geographical areas of development) suggests that we are collectively moving closer to a unified framework for high-performing priority setting and resource allocation processes. The application of these tools in healthcare organizations of differing sizes and in different provinces (a regional health authority in our study versus an Ontario hospital in Sibbald et al.'s study) suggests that wider applicability of these evaluation frameworks is not unreasonable. While implementation across Canada may be warranted, the scope of application could also extend to other countries, given similarities between this tool and frameworks developed internationally (70,71,43).
One issue that priority setting and resource allocation evaluation initiatives seem to struggle with is the "methods approach" to evaluation, which poses the question "does a particular type of priority setting or resource allocation process make an organization or population better off than another process?" (1). As part of this approach, an evaluator must search for measures of priority setting or resource allocation accomplishments that go beyond outputs to outcomes (e.g., morbidity rates in the population served by the health authority). However, since priority setting and resource allocation processes are complex interventions that occur within dynamic organizations affected by internal and external politico-social factors, it would be very challenging to ascribe any causal relationship between a change in morbidity and a change in priority setting and resource allocation practice (1). It is perhaps for this reason that "health outcomes" were not included within Sibbald et al.'s evaluation framework (11). In this study, investigators attempted to address the health outcomes aspect in element four (O4) of the outcomes domain. However, interviews with participants quickly demonstrated that questions from this element were not providing clear responses that could be used to evaluate the priority setting and resource allocation process on this measure. To improve this line of questioning, this part of the tool now asks respondents to compare their current process against a past process and determine which one provided better health outcomes for the population their organization serves (rather than requiring a "yes" or "no" response). Although this approach still relies heavily on the subjective accounts of respondents, in a time-constrained evaluation it may be better than simply asking for a "yes, improved" or "no, did not improve" response.
To carry out the type of assessment clearly desired here, a longer-term evaluation with additional in-depth data collection is needed (1). Since this would require greater resource investment (and the goal of the entire priority setting exercise is to achieve optimal resource allocation), it is possible that the methods approach may be cost prohibitive, or at least not cost effective, for many organizations. An alternative is the use-focused approach that has been used by this study and others (13). One final aspect of our study is debriefing with the intended users (i.e., the senior executive team) of the evaluation report, which is discussed in the "future research" sub-section below.

Another issue addressed by Sibbald et al. was the method of data collection during the evaluation (39). In their research they suggested that, in order to streamline the process of evaluation, online surveys should be used rather than interviews, the rationale being that online surveys conducted by individuals internal to the organization would be more time and cost efficient. As described in the methods section, the theoretical framework and the balanced scorecard development process from the literature led investigators to choose qualitative interviews as the most appropriate method of data collection. To investigate this "method of implementation" question further, debriefs with participants (after the evaluation interview) focused on their preferences for evaluation medium and interviewer. The general consensus was that participants would feel more comfortable speaking to someone external to the organization, and that they would not provide as in-depth responses through an online medium. While Sibbald et al.'s cost-efficiency argument is well taken, and an online survey is perhaps necessary to achieve broader application, lessons from the debriefings with participants in this research should be noted and incorporated.
For example, an online self-assessment tool should limit the number of open-ended questions, since participants may be less likely to provide in-depth responses, and should perhaps include more Likert-style questions (e.g., "How would you rate your understanding of your organization's priority setting and resource allocation process?" on a 1-5 scale). Additionally, participant comfort (and perhaps engagement) could be increased by third-party administration and analysis of the data. In this way, a trade-off between usability and cost may be required in choosing an appropriate evaluation approach.

To summarize, a gap in practice currently exists, with a large proportion of Canadian healthcare organizations lacking a formal priority setting and evaluation process. Consequences of this gap could include sub-optimal allocation of resources and a failure to improve current activity. In order to address this issue and advance the literature on this topic, this thesis work has operationalized a framework for high performance in healthcare priority setting and implemented the resulting evaluation tool in a test organization. This implementation has clearly shown that an evaluation tool can be developed from a high performance framework and, further, that it can be implemented in a manner that provides direct insight into strengths, weaknesses, and areas for improvement with respect to priority setting and resource allocation activity. The test implementation has also identified several possible refinements to the tool itself, as well as areas for future research, which are addressed below.

5.4.1 Limitations

A possible criticism of the tool could be the subjectivity inherent in the determination of elements as strengths, areas for improvement, or weaknesses. Prior to this determination, transcript data from evaluation interviews must be categorized into the elements of the high performance framework (performed using template analysis in this study).
In our application of the tool, members of the test organization provided consistent descriptions of elements during evaluation interviews. This made the process of template analysis and language examination relatively uncomplicated. However, the final step of determining whether elements were "strengths", "areas for improvement", or "weaknesses" (using quotes and examples from interviews as well as the high performance framework definitions) was more subjective. Since there was very little disagreement among quotes from participants related to particular elements in this organization, determining whether an element was a strength or weakness was fairly straightforward. An application of the evaluation tool in a different organization, however, may not find this same level of agreement among participants, in which case determining whether an element is a "strength" or a "weakness" may not be as clear-cut. Indeed, even in this test organization the line between an element that was an "area for improvement" and one that was a "weakness" was blurred at times. To address this, each determination was discussed with the research team during analysis. Quotes and examples that supported each determination were examined, and each decision was agreed upon unanimously.

Certain evaluation tool refinements could aid evaluators in making these determinations during future applications as well. First, the addition of 1-5 scales for many of the elements could provide a quantitative framework for determining strength. After calculating an average score across all participants in the organization, thresholds (e.g., < 1.5 = weakness, 1.5-3 = area for improvement, > 3 = strength) could be established to guide evaluators. Specific reference points for certain elements could also help. For example, if an organization does not have a list of formal criteria for the assessment of proposals, its explicit process (P1) is weak.
Or, if any middle managers from a given department cannot describe how operational priority setting and resource allocation is performed, then coordination of processes (S3) is an area for improvement in that organization. By explicitly stating these guidelines, the evaluation tool could be implemented by evaluators with less expertise and applied on a wider scale.

Limited resources and time constraints also prevented interviews with a larger sample of members of the test organization. Although a large number of senior and middle managers were interviewed, a values-based evaluation approach would dictate that the views of frontline workers, the public, and patients be included to ensure that broader social impacts are considered. In this study, senior and middle management seemed to have a good understanding of the issues facing their frontline workers and population through various engagement efforts, as evidenced by the fact that senior managers raised the same issues as middle managers during interviews. While engagement could certainly be improved, conducting evaluation interviews with a broader set of groups may not have altered the determinations of strength and weakness. Rather, additional interviews may have offered greater insight into particular elements (e.g., engagement, communication, alignment with societal values) and could have enhanced certain recommendations to improve the test organization's process. It is also the case that selecting representative members of these groups may be challenging for evaluators, and the additional interviews would require more resources. Ultimately, however, future applications of the tool should consider including representation from these groups and weigh the benefits of doing so against the costs.
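The quantitative refinement proposed above (averaging participants' 1-5 ratings for each element and applying thresholds) could be sketched as follows. This is a minimal illustration only: the element names, ratings, and exact cut-offs are hypothetical, since the thesis proposes the scheme but does not prescribe an implementation.

```python
from statistics import mean

# Hypothetical thresholds from the proposed refinement:
# average < 1.5 -> weakness, 1.5-3 -> area for improvement, > 3 -> strength
def classify_element(scores):
    """Classify an element from participants' 1-5 ratings."""
    avg = mean(scores)
    if avg < 1.5:
        return "weakness"
    elif avg <= 3:
        return "area for improvement"
    else:
        return "strength"

# Illustrative ratings for two elements (not real study data)
ratings = {
    "P1 explicit process": [4, 5, 4, 3],
    "S3 coordination of processes": [2, 2, 3, 1],
}
for element, scores in ratings.items():
    print(element, "->", classify_element(scores))
```

A scheme like this would make determinations reproducible across evaluators, though the thresholds themselves would still need to be validated against expert judgments in future applications.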
In post-interview debriefs, several participants suggested that certain elements of the evaluation tool could have been explored in greater depth during the interviews themselves. This was to be expected, since the tool is designed to provide a high-level evaluation of organizational elements related to priority setting and resource allocation. While the strengths, weaknesses, and recommendations delivered would give management an overview of their process, they may desire more detailed analysis of particular areas. To enable this, evaluation tools with greater specificity that focus on a particular element should be designed and implemented. In the future, a priority setting and resource allocation process evaluation might start with the implementation of the evaluation tool developed in this thesis, followed by micro-evaluations of individual elements using various other tools. This is one of the many opportunities for future research that will be discussed in the following section.

5.4.2 Future Research

In order to continue to refine the tool and encourage process improvement on a wider scale, the study investigators would like to extend the use of this tool to other organizations across Canada. While this test implementation was in a single health authority, future applications of the tool could include other organizations and agencies such as hospitals and community health programs. As Sibbald et al. said, "by implementing the evaluation process in other organizations in different healthcare contexts, we could compare lessons between hospitals and understand the problems faced in different hospital contexts" (39). In these future applications, a wider sample including a greater number of frontline managers and staff may be sought if time and resources permit.
While decision making for priority setting and resource allocation often occurs at the apex of an organization, perspectives from lower levels of management could provide deeper insight into particular issues (e.g., training, communication, leadership). Organizations that are particularly struggling with elements related to the public (e.g., engagement, alignment with community values) should also consider including members of the public and patients in their interview sample.

Future applications should also consider the pre-existing relationships between new test organizations and the researchers performing the evaluation. Although participants in the test organization for this thesis were more comfortable with an external evaluator than with an internal one, it is possible that this comfort was established through previous collaborations (projects co-led by members of the research team and members of the test organization). This may have led to greater trust (an essential component of case study methodology), which may not exist in an organization that is unfamiliar with the researchers prior to an evaluation. Creating an internally implementable self-assessment tool for organizations that are new to priority setting and resource allocation evaluation may be a first step in building the rapport and trust needed for a more robust external evaluation. This could be especially valuable if organizations are anxious about their performance and are unlikely to let an external evaluator perform a full evaluation. As a start, these self-assessments may also be less resource intensive.

In addition to adaptations in future applications, further analysis of the data obtained from this organization could also be carried out. This could include comparisons of responses between members from different geographical regions and roles within the organization.
By comparing responses, evaluators could determine where weaknesses or strengths were most prominent across distinct areas. While investigators in this study determined that such analysis would not be well supported, due to the limited representation from differing areas and roles, additional interviews with members of particular areas or levels of management could add the data needed to justify it. This would also extend to the role that politics played in the test organization's priority setting and resource allocation process. As described in the results section, this organization was able to use its formal process to challenge some of the external political factors. However, this does not mean that politics had no effect on its priority setting. In fact, many participants described political implications as an underlying criterion in every resource allocation decision. Since this evaluation followed a use-focused theoretical approach, the strengths and weaknesses presented were all actionable by the organization. For this reason, an analysis of political factors was not included, since it would be beyond the abilities of the test organization to change the political nature of healthcare. Nonetheless, the data collected in these interviews may be very valuable for investigating political influence on priority setting and resource allocation, and further analysis to this end is justified.

Ultimately, investigators in this project would like to develop a standardized evaluation tool that can be applied to healthcare organizations across Canada. Part of this initiative could include a self-assessment tool, and perhaps a network for organizations to share experiences and learn from one another.
While the current version of the evaluation tool requires external expertise, further application and refinement of the tool, resulting in quantifiable scores using Likert scales and specific descriptions of different levels of performance, would greatly facilitate implementation by internal members of healthcare organizations. Longer-term follow-up with organizations would also allow investigators to determine the impact of the evaluation process and examine the sustainability of changes to a priority setting and resource allocation process.

Chapter 6: Conclusion

The lack of formal priority setting and resource allocation process implementation and evaluation, combined with significant resource constraints, has created an onerous environment for Canadian healthcare decision makers. Having a tool to improve this practice is thus crucial to maximizing the limited resources available. It is evident from the results of this study that an evaluation tool can indeed be developed from a high performance framework for priority setting and resource allocation. While future application may precipitate further refinement of its content, assessment suggests that the tool currently encompasses the core elements of this process. Implementation of the tool clearly identified areas of strength, improvement, and weakness in the test organization's priority setting and resource allocation process. Given the size and location of the organization, this is the first evaluation tool implementation of its kind. Since this application was truly novel, it is not surprising that potential refinements were discovered and incorporated into the tool for future use. Among the changes were refinements to question language and the addition of elements. Additional application in disparate contexts is certainly warranted to further refine the tool for broader use.
Most immediately, documenting the response of the test organization to the evaluation report presented to the senior management team can advance this research. Discussions with the senior management team will be held to accomplish this goal in the near term. Future research should certainly include further application of the tool to aid decision makers in other organizations, as well as efforts to develop a self-assessment tool for organizations to be able to evaluate their own priority setting and resource allocation processes independently.

Bibliography

1. Smith N, Mitton C, Cornelissen E, Gibson J, Peacock S. Using evaluation theory in priority setting and resource allocation. J Health Organ Manag. 2012 Aug 31;26(5):655-71.
2. Mitton C, Donaldson C. Health care priority setting: principles, practice and challenges. Cost Eff Resour Alloc. 2004 Apr 22;2(1):3.
3. Donaldson C, Mitton C. Priority Setting Toolkit: Guide to the Use of Economics in Healthcare Decision Making. London: BMJ Publishing Group; 2004.
4. Farrar S, Ryan M, Ross D, Ludbrook A. Using discrete choice modelling in priority setting: an application to clinical service developments. Soc Sci Med. 2000 Jan;50(1):63-75.
5. Spending and Health Workforce: National Health Expenditure Trends, 1975 to 2012. Canadian Institute for Health Information; 2012.
6. Health funding formula helps Ottawa, burdens provinces - Politics - CBC News [Internet]. [cited 2013 Aug 3]. Available from: http://www.cbc.ca/news/politics/story/2012/01/12/pol-cp-pbo-health-transfers-cut.html
7. Matier C. Renewing the Canada Health Transfer: Implications for Federal and Provincial-Territorial Fiscal Sustainability [Internet]. 2012 [cited 2013 Aug 3]. Available from: http://www.parl.gc.ca/pbo-dpb/documents/Renewing_CHT.pdf
8. Stewart M, Beckman K, Hodgson G. Ontario's Economic and Fiscal Prospects: Challenging Times Ahead [Internet]. The Conference Board of Canada; 2012 [cited 2013 Aug 3].
Available from: http://www.conferenceboard.ca/temp/ce7e459a-d513-4641-8ecd-b185900810ab/12-205_ontario-cashc-rpt.pdf
9. Canada's wealthiest province cuts deep as Alberta embraces austerity [Internet]. The Globe and Mail. [cited 2013 Aug 4]. Available from: http://www.theglobeandmail.com/news/national/canadas-wealthiest-province-cuts-deep-as-alberta-embraces-austerity/article9474700/
10. Budget and Fiscal Plan (2012/13 - 2014/15) [Internet]. Ministry of Finance; [cited 2013 Aug 29]. Available from: http://www.bcbudget.gov.bc.ca/2012/bfp/2012_Budget_Fiscal_Plan.pdf
11. Sibbald SL, Singer PA, Upshur R, Martin DK. Priority setting: what constitutes success? A conceptual framework for successful priority setting. BMC Health Serv Res. 2009;9:43.
12. Mitton C, Donaldson C. Setting priorities in Canadian regional health authorities: a survey of key decision makers. Health Policy. 2002 Apr;60(1):39-58.
13. Mitton CR, Donaldson C. Setting priorities and allocating resources in health regions: lessons from a project evaluating program budgeting and marginal analysis (PBMA). Health Policy. 2003 Jun;64(3):335-48.
14. Birch S, Chambers S. To each according to need: a community-based approach to allocating health care resources. CMAJ. 1993 Sep 1;149(5):607-12.
15. Mitton C, Prout S. Setting priorities in the south west of Western Australia: where are we now? Aust Health Rev. 2004 Dec 13;28(3):301-10.
16. Daniels N, Sabin J. Limits to health care: fair procedures, democratic deliberation, and the legitimacy problem for insurers. Philos Public Aff. 1997;26(4):303-50.
17. Mitton C, Donaldson C. Twenty-five years of programme budgeting and marginal analysis in the health sector, 1974-1999. J Health Serv Res Policy. 2001 Oct;6(4):239-48.
18. Gibson JL, Martin DK, Singer PA. Evidence, economics and ethics: resource allocation in health services organizations. Healthc Q. 2005;8(2):50-9, 4.
19. Gibson J, Mitton C, Martin D, Donaldson C, Singer P. Ethics and economics: does programme budgeting and marginal analysis contribute to fair priority setting? J Health Serv Res Policy. 2006 Jan;11(1):32-7.
20. Daniels N. Just health care: studies in philosophy and health policy. Cambridge [Cambridgeshire]; New York: Cambridge University Press; 1985.
21. Menon D, Stafinski T, Martin D. Priority-setting for healthcare: who, how, and is it fair? Health Policy. 2007 Dec;84(2-3):220-33.
22. Teng F, Mitton C, MacKenzie J. Priority setting in the provincial health services authority: survey of key decision makers. BMC Health Serv Res. 2007 Jun 12;7:84.
23. Mitton C, McGregor J, Conroy M, Waddell C. Making choices in healthcare: the reality of scarcity. Hosp Q. 2002;6(1):48-54.
24. Dionne F, Mitton C, Smith N, Donaldson C. Evaluation of the impact of program budgeting and marginal analysis in Vancouver Island Health Authority. J Health Serv Res Policy. 2009 Oct;14(4):234-42.
25. Mitton C, Dionne F, Damji R, Campbell D, Bryan S. Difficult decisions in times of constraint: criteria based resource allocation in the Vancouver Coastal Health Authority. BMC Health Serv Res. 2011;11:169.
26. Smith N, Mitton C, Bryan S, Davidson A, Urquhart B, Gibson JL, et al. Decision maker perceptions of resource allocation processes in Canadian health care organizations: a national survey. BMC Health Serv Res. 2013;13:247.
27. Nieva VF, Sorra J. Safety culture assessment: a tool for improving patient safety in healthcare organizations. Qual Saf Health Care. 2003;12(suppl 2):ii17-ii23.
28. Yusof MM, Kuljis J, Papazafeiropoulou A, Stergioulas LK. An evaluation framework for Health Information Systems: human, organization and technology-fit factors (HOT-fit). Int J Med Inf. 2008 Jun;77(6):386-98.
29. Graham ID, Harrison MB, Brouwers M, Davies BL, Dunn S.
Facilitating the Use of Evidence in Practice: Evaluating and Adapting Clinical Practice Guidelines for Local Use by Health Care Organizations. J Obstet Gynecol Neonatal Nurs. 2002;31(5):599–611.
30.  Glickman NJ, Servon LJ. More than bricks and sticks: Five components of community development corporation capacity. Hous Policy Debate. 1998;9(3):497–539.
31.  Using the Balanced Scorecard as a Strategic Management System [Internet]. [cited 2013 Aug 7]. Available from: http://hbr.org/2007/07/using-the-balanced-scorecard-as-a-strategic-management-system/ar/1
32.  Kershaw R, Kershaw S. Developing a Balanced Scorecard to Implement Strategy at St. Elsewhere Hospital. Manag Account Q. 2001 Winter;2(2):28–35.
33.  Jones C. Developing a Scorecard for Service Quality. Manag Serv. 2004 Apr;48(4):8–13.
34.  Kaplan RS, Norton DP. The Balanced Scorecard: Measures That Drive Performance. Harv Bus Rev. 1992 Jan;70(1):71–9.
35.  Baker GR, Pink GH. A Balanced Scorecard for Canadian Hospitals. Healthc Manage Forum. 1995;8(4):7–13.
36.  Alkin MC, Christie CA. An evaluation theory tree. In: Evaluation Roots: Tracing Theorists' Views and Influences. 2004. p. 12–65.
37.  Government of Canada, Canadian Institutes of Health Research. A Guide to Evaluation in Health Research - CIHR [Internet]. 2012 [cited 2013 Feb 22]. Available from: http://www.cihr-irsc.gc.ca/e/45336.html
38.  Patton MQ. Utilization-focused evaluation. Thousand Oaks, Calif.: Sage Publications; 2008.
39.  Sibbald SL, Gibson JL, Singer PA, Upshur R, Martin DK. Evaluating priority setting success in healthcare: a pilot study. BMC Health Serv Res. 2010 May 19;10(1):131.
40.  Stufflebeam DL, editor. Evaluation Models: New Directions for Evaluation, Number 89. 1st ed. Jossey-Bass; 2001.
41.  Shadish WR, Cook TD, Leviton LC. Foundations of Program Evaluation: Theories of Practice. Sage Publications; 1990.
42.  Peacock S. An evaluation of program budgeting and marginal analysis applied in South Australian hospitals. Melbourne: Centre for Health Program Evaluation, Monash University; 1998.
43.
Kapiriri L, Martin DK. Successful priority setting in low and middle income countries: a framework for evaluation. Health Care Anal. 2010 Jun;18(2):129–47.
44.  Smith N. High Performance in Healthcare Priority Setting and Resource Allocation: Key Elements (Manuscript Under Review).
45.  High Performing Healthcare Systems: Delivering Quality by Design. Longwoods Publishing; 2008.
46.  Smith N, Hall W, Mitton C, Bryan S, Urquhart B. Narratives of High Performance in Priority Setting and Resource Allocation (Manuscript Under Review).
47.  Creswell JW. Qualitative inquiry and research design: choosing among five traditions. London: SAGE; 2007.
48.  Questioning Techniques - Communication Skills Training from MindTools.com [Internet]. [cited 2013 Jun 20]. Available from: http://www.mindtools.com/pages/article/newTMC_88.htm
49.  Questioning Techniques [Internet]. Gnosis Learning; 2009 [cited 2009 Aug 8]. Available from: http://www.gnosislearning.com/_document/Questioning+Techniques.pdf
50.  Questions and Questioning Techniques [Internet]. Yale Graduate School; [cited 2013 Aug 8]. Available from: http://www.yale.edu/graduateschool/teaching/forms/QuestionsandQuestioning.pdf
51.  Stake RE. The Art of Case Study Research. SAGE; 1995.
52.  Qualitative Research & Evaluation Methods [Internet]. SAGE. [cited 2013 Oct 21]. Available from: http://www.sagepub.com/books/Book9906?prodId=Book9906
53.  Cassell C, Symon G. Essential guide to qualitative methods in organizational research. London; Thousand Oaks: SAGE Publications; 2004.
54.  Holsti OR. Content analysis for the social sciences and humanities. Addison-Wesley Pub. Co.; 1969.
55.  Ocloo JE, Fulop NJ. Developing a "critical" approach to patient and public involvement in patient safety in the NHS: learning lessons from other parts of the public sector? Health Expect. 2012;15(4):424–32.
56.  Abelson J, Forest P-G, Eyles J, Smith P, Martin E, Gauvin F-P.
Deliberations about deliberative methods: issues in the design and evaluation of public participation processes. Soc Sci Med. 2003 Jul;57(2):239–51.
57.  Contandriopoulos D. A sociological perspective on public participation in health care. Soc Sci Med. 2004 Jan;58(2):321–30.
58.  Bruni RA, Laupacis A, Levinson W, Martin DK. Public involvement in the priority setting activities of a wait time management initiative: a qualitative case study. BMC Health Serv Res. 2007;7:186.
59.  Litva A, Coast J, Donovan J, Eyles J, Shepherd M, Tacchi J, et al. "The public is too subjective": public involvement at different levels of health-care decision making. Soc Sci Med. 2002 Jun;54(12):1825–37.
60.  Bruni RA, Laupacis A, Levinson W, Martin DK. Public views on a wait time management initiative: a matter of communication. BMC Health Serv Res. 2010;10:228.
61.  Guindo LA, Wagner M, Baltussen R, Rindress D, van Til J, Kind P, et al. From efficacy to equity: Literature review of decision criteria for resource allocation and healthcare decisionmaking. Cost Eff Resour Alloc. 2012 Jul 18;10(1):9.
62.  Dionne F, Mitton C, Smith N, Donaldson C. Decision maker views on priority setting in the Vancouver Island Health Authority. Cost Eff Resour Alloc. 2008;6:13.
63.  Mitton C, Mackenzie J, Cranston L, Teng F. Priority setting in the Provincial Health Services Authority: case study for the 2005/06 planning cycle. Healthc Policy. 2006 Jul;2(1):91–106.
64.  Mitton CR, Donaldson C, Waldner H, Eagle C. The evolution of PBMA: towards a macro-level priority setting framework for health regions. Health Care Manag Sci. 2003 Nov;6(4):263–9.
65.  McCafferty S, Williams I, Hunter D, Robinson S, Donaldson C, Bate A. Implementing world class commissioning competencies. J Health Serv Res Policy. 2012 Jan;17 Suppl 1:40–8.
66.  Gibson J, Mitton C, DuBois-Wing G. Priority setting in Ontario's LHINs: ethics and economics in action. Healthc Q.
2011;14(4):35–43.
67.  Hunter DJ. Coping with uncertainty: decisions and resources within health authorities. Sociol Health Illn. 1979;1(1):40–68.
68.  Donaldson C, Bate A, Peacock S, Ruta D. Priority setting in the public sector: turning economics into a management process. 2006 [cited 2013 May 29]; Available from: http://eprint.ncl.ac.uk/pub_details2.aspx?pub_id=4483
69.  Peacock SJ, Mitton C, Ruta D, Donaldson C, Bate A, Hedden L. Priority setting in healthcare: towards guidelines for the program budgeting and marginal analysis framework. Expert Rev Pharmacoecon Outcomes Res. 2010 Oct;10(5):539–52.
70.  Williams I, Dickinson H, Robinson S. Rationing in health care: the theory and practice of priority setting. Bristol; Chicago: Policy Press; 2012.
71.  Austin D. Priority Setting: an Overview [Internet]. [cited 2013 Sep 15]. Available from: http://www.nhsconfed.org/Publications/Documents/Priority%20setting%20an%20overview.pdf

Appendices

Appendix A  Elements of High Performance in Organization-wide Resource Allocation

Appendix B  Conceptual Criteria from Sibbald et al.

Process Criteria:
- Stakeholder Engagement
- Explicit Process
- Clear and Transparent Information Management
- Consideration of Values and Context
- Revision or Appeals Mechanism

Outcome Criteria:
- Stakeholder Understanding
- Shifted Resources
- Decision Making Quality
- Stakeholder Acceptance and Satisfaction
- Positive Externalities

Appendix C  Original Evaluation Tool - Senior Manager Version

Processes
(Element / Questions for SMT / Responses)

Process, Cell 1: PSRA at the organization-wide level is based on economic and ethical principles.
Which of the following options is the best description of priority setting and resource allocation in your organization?
1. We have a formal process that we use to set priorities and allocate resources
2. Each department and program expects to receive the same amount of resources as in past years
3.
Our spending patterns are largely determined by external provincial government requirements and expectations

Process, Cell 1a: Well-defined, weighted criteria which reflect the organization's values and strategic priorities.
When evaluating resource allocation proposals, does the senior management team (SMT) use explicit criteria?
Could you give some examples of criteria that are commonly considered and how they are applied in decision making?
On a scale of 1-5, 1 being poor and 5 being excellent, how would you rate the criteria which are applied in your organization's priority setting process with respect to:
- Clarity
- Their ability to facilitate ranking of funding options
- Their ability to capture the benefits and costs of proposals

Process, Cell 1b: Use of an assessment tool to operationalize criteria in ranking individual proposals.
Do you have a formal tool that enables the consistent application of criteria (eg. one that is based on assigning points to proposals based on criteria, and where you could easily rank proposals based on their scores)? (Yes/No/Don't know)

Process, Cell 1f: A decision review mechanism.
Is there a decision review process for revisiting and revising decisions made during priority setting? [Yes/No/Don't know]
Could you please describe it? How would a member of your organization appeal for a decision review?

Process, Cell 3: Skill development occurs throughout the organization including managers, directors, senior executive and clinical leaders, focusing on the process as well as underlying economic and ethical principles.
Could you give examples of how your organization is supporting skill development/ongoing training around priority setting?
[Eg. workshops, handouts or summaries, peer support sessions etc.]
Process, Cell 4: Follow through on decisions: SMT puts in place appropriate change management strategies, with performance measurement, tracking of outcomes, and responds as needed.
After resource allocation decisions are made, how would you rate your organization's ability to implement these decisions (1=poor, 5=excellent)?
Describe the monitoring your organization has in place to evaluate funding decisions once they have been implemented.
What barriers does your organization face when implementing priority setting decisions?

Process, Cell 5: Facilitation of priority setting process by a skilled project manager/coordinator.
Is there someone in your organization who oversees/coordinates the entire priority setting process? [Yes/No/Don't Know]
[eg. organizes meetings, keeps everyone on track, plans for next budget cycle, coordinates investment proposal review]

Structures
(Element / Questions for SMT / Responses)

Structure, Cell 1: SMT has the ability and authority to move financial resources within and across silos.
Does your SMT have the authority/ability to re-allocate resources?
[Eg. take resources from an area of lesser health gain, and deploy those resources to an area with higher health gain]
What are some of the key barriers to re-allocation in your organization?

Structure, Cell 2: Mechanisms are established for engagement of staff (clinical and non-clinical) in PSRA decisions, with particular though not exclusive attention to physicians.
How do you involve lower level staff in decision-making?
How do you think involvement could be improved?
[eg. Better engagement tools, wider engagement etc.]
When resource allocation decisions are implemented, is resistance often encountered from physicians or other staff?
Structure, Cell 2a: Mechanisms may include the use of incentives to encourage participation and foster active engagement.
What, if any, incentives does your organization use to foster participation in priority setting?
[eg. financial incentives - if you disinvest x dollars we will give you a percentage back to re-invest as you see fit; threats - if you don't participate we will make these decisions for you]

Structure, Cell 3: Coordination of priorities/criteria/processes across all organizational planning processes (eg. HR, IT, capital).
How well coordinated is your operational priority setting process with other organizational processes that impact the allocation of resources? [Eg. IT, Capital, HR]
Can you think of a resource allocation decision that did not consider impacts on other organizational processes? What mistakes were made in that case? Why wasn't there cooperation?

Structure, Cell 4: Relative stability of organizational structure and continuity of personnel.
How much change in senior management and board of governors personnel has your organization undergone in the last three years?
- Major Change (about half or more of senior management positions have turned over)
- Moderate Change (about a quarter of senior management positions have turned over)
- Minimal Change (a few positions have changed, but generally quite stable)
How do you think this stability/turnover of the team has affected priority setting in your organization?

Structure, Cell 5: Adequate but not excessive time and resources are committed to support PSRA at the SMT and staff level.
Are the time and resources currently allocated to priority setting by your organization adequate?
Attitudes & Behaviours
(Element / Questions for SMT / Responses)

Attitudes Cell 1: Working relationships within the SMT are respectful and characterized by jointly addressing challenges, mutual trust, honesty, and the open and frank exchange of views.
If you have a concern (about a decision or an aspect of the priority setting process), how comfortable do you feel voicing it to your SMT?
Are all members of the SMT given the opportunity to contribute to discussions?
Are members of the SMT direct and honest with each other?
Do members of the SMT accept criticism without becoming defensive?
Do you feel that backroom deals and collusion are common in your priority setting process?

Attitudes Cell 2: There is a culture of improvement. The SMT strives for excellence, and is willing to seek out and learn from what peers and leading organizations are doing.
Would you characterize your organization as having a culture of improvement (ie. wanting to always be better/the best)?
How does that manifest itself in your priority setting?
[eg. opportunities for skill development, benchmarking, focus on data collection]

Attitudes Cell 3: Decisions are made with a system-wide perspective, and a view to their long-term strategic alignment:
- Senior leaders adopt a system-wide point of view while considering how decisions will be experienced across departments and over a multi-year timeframe.
- SMT is willing to look beyond incremental spend to re-assess base budgets, i.e., to pursue marginal analysis and disinvestment opportunities.
How are your organization's strategic vision, goals, and objectives integrated in your priority setting and resource allocation decisions?
Is there a history of releasing resources from your base budget to re-invest in strategic priorities?
Would you characterize the approach to resource allocation of individual members of the SMT as siloed or collective?
Attitudes Cell 4: Fit of priority setting decisions with social and community values is sought:
- Public participation and input is valued; it is sought and integrated into decisions in meaningful ways.
- Alignment with external partners and the larger regional or provincial health system exists.
Do you engage the public specifically in the priority setting process? How?
Does your process take the actions of external partners into account? Does your organization make an effort to collaborate or fit decisions with the actions of external partners?
[eg. other hospitals or health authorities, community care centers]

Attitudes Cell 5: SMT displays strong leadership for PSRA: SMT is aware of and manages the external environment and other constraining factors, and is willing to take and stand behind tough decisions.
Does the SMT do a good job at motivating and communicating the need for explicit priority setting?
In the face of external pressures and tough decisions, how does the SMT respond? Could you give some examples?

Outcomes
(Element / Questions for SMT / Responses)

Outcome Cell 1: Actual reallocation of financial resources is achieved.
In the past budget cycle, were resources removed from one area of the organization and invested into another (that would ostensibly yield greater health gains)?

Outcome Cell 2: The process has the understanding and endorsement of key internal and external stakeholders (e.g., Board of Directors, staff and medical leadership, Ministry, public).
Is your process well understood by your internal and external stakeholders?
[e.g., Board of Directors, staff and medical leadership, Ministry, public]
Does your priority setting process have their endorsement?

Outcome Cell 3: There is greater understanding among participants of the organization as a whole, and of PSRA practice.
After undergoing a priority setting and resource allocation process, do you feel you have a greater understanding of your organization and the process you have gone through?

Outcome Cell 4: Resource allocation decisions are justified in light of the organization's established and agreed upon core values. Progress is made toward identified strategic goals and objectives.
Would you agree that your organization's core values and strategic goals helped guide the last priority setting cycle?
Do you feel that you will be able to meet those objectives with your current process?

Outcome Cell 5: Improved health (broadly defined) is achieved as a result of decisions made through the RA process. Effective, efficient, equitable, accessible, safe, and high quality care is delivered.
Would you say that your current process for priority setting and resource allocation leads to decisions that will improve the health of the population you serve? What available evidence would you use to support that claim?

Appendix D  Original Evaluation Tool - Middle Manager Version

Processes
(Element / Questions for Staff / Responses)

Process, Cell 1: PSRA at the organization-wide level is based on economic and ethical principles.
Which of the following options is the best description of priority setting in your organization?
1. We have a formal process that we use to set priorities and allocate resources
2. Each department and program expects to receive the same amount of resources as in past years
3. Our spending patterns are largely determined by external provincial government requirements and expectations

Process, Cell 1a: Well-defined, weighted criteria which reflect the organization's values and strategic priorities.
Are you aware of the criteria that the SMT uses to make priority setting decisions?
Could you give some examples of criteria that are considered?
Do you think those criteria are appropriate (ie. do they capture all the benefits of proposals)?
Do you think that they are consistently applied to every proposal in the same way?

Process, Cell 1b: Use of an assessment tool to operationalize criteria in ranking individual proposals.

Process, Cell 1e: Mechanisms for incorporating best available evidence.
When you are creating an investment or disinvestment proposal, what are some of the common types of evidence that you include as part of the application?
[eg.
- Published Research Evidence
- Ministry Data
- Actions of peer organizations
- Formal evaluations of own programs
- Program Statistics/Performance Measurement and monitoring
- Experience and Opinions of Program staff
- Experience and Opinions of Patients
- Public Opinion in community]
Does the SMT find these types of evidence sufficient or are you often asked to collect more evidence?
How would you rate your abilities to collect the necessary evidence for investment/disinvestment proposals?

Process, Cell 1f: A decision review mechanism.
Is there a decision review process for revisiting and revising decisions made during priority setting? [Yes/No/Don't Know]
Have you ever used this process to revisit a decision? With what result? Were you satisfied with the process?

Process, Cell 2: SMT ensures effective communication (both internally and externally) around its priority setting and resource allocation, leading to transparency.
Do you feel that internal communication about the priority setting and resource allocation process is adequate?
[eg. outcomes of decisions, reasoning, next steps, decision review, general steps]
Which aspects of the process would you like to have more information about?

Process, Cell 3: Skill development occurs throughout the organization including managers, directors, senior executive and clinical leaders, focusing on the process as well as underlying economic and ethical principles.
Are you aware of or have you taken part in skill development or capacity building around priority setting in your organization?
Do you think you or other colleagues would benefit from more training in this area?
[Eg. workshops, peer support, handouts/summaries]

Process, Cell 4: Follow through on decisions: SMT puts in place appropriate change management strategies, with performance measurement, tracking of outcomes, and responds as needed.
After resource allocation decisions are made, how would you rate your organization's ability to implement these decisions (1=poor, 5=excellent)?
What barriers does your organization face when implementing priority setting decisions?

Process, Cell 5: Facilitation of priority setting process by a skilled project manager/coordinator.
Is there someone in your organization who oversees/coordinates the entire priority setting process? [Yes/No/Don't Know]

Structures
(Element / Questions for Staff / Responses)

Structure, Cell 1: SMT has the ability and authority to move financial resources within and across silos.
Have you observed resources being re-allocated from one area of your organization to another in the past year?

Structure, Cell 2: Mechanisms are established for engagement of staff (clinical and non-clinical) in PSRA decisions, with particular though not exclusive attention to physicians.
How does the SMT involve frontline staff in decision-making?
How do you think involvement could be improved?
[eg. Better engagement tools, wider engagement etc.]
When resource allocation decisions are implemented, is resistance often encountered from physicians or other staff?

Structure, Cell 2a: Mechanisms may include the use of incentives to encourage participation and foster active engagement.
Are there any formal incentives that encourage you to participate in your organization's priority setting process? If so, what are they?
Structure, Cell 3: Coordination of priorities/criteria/processes across all organizational planning processes (eg. HR, IT, capital).
How well coordinated do you think your priority setting process is with other organizational processes that impact the allocation of resources? [Eg. IT, Capital, HR]
Can you think of a resource allocation decision that did not consider impacts on other organizational processes? What happened?

Structure, Cell 4: Relative stability of organizational structure and continuity of personnel.
How much change in senior management personnel has your organization undergone in the last three years?
- Major Change (about half or more of senior management positions have turned over)
- Moderate Change (about a quarter of senior management positions have turned over)
- Minimal Change (a few positions have changed, but generally quite stable)
How do you think this stability/turnover of the team has affected priority setting in your organization?

Structure, Cell 5: Adequate but not excessive time and resources are committed to support PSRA at the SMT and staff level.
Do you feel that you have enough time and resources to complete priority setting tasks?
(eg. research supporting evidence for proposals, filling out investment proposals, implementing resource allocation decisions, following up on decisions, planning to meet strategic priorities)

Attitudes & Behaviours
(Element / Questions for Staff)

Attitudes Cell 1: Working relationships within the SMT are respectful and characterized by jointly addressing challenges, mutual trust, honesty, and the open and frank exchange of views.
How would you describe the relationships between members of the SMT?
[eg. honest, trusting, belligerent, combative, on-edge]

Attitudes Cell 2: There is a culture of improvement. The SMT strives for excellence, and is willing to seek out and learn from what peers and leading organizations are doing.
Would you characterize your organization as having a culture of improvement?
Do you feel that learning and professional development (related to priority setting and resource allocation) is encouraged in your organization?

Attitudes Cell 3: Decisions are made with a system-wide perspective, and a view to their long-term strategic alignment:
- Senior leaders adopt a system-wide point of view while considering how decisions will be experienced across departments and over a multi-year timeframe.
- SMT is willing to look beyond incremental spend to re-assess base budgets, i.e., to pursue marginal analysis and disinvestment opportunities.

Attitudes Cell 4: Fit of priority setting decisions with social and community values is sought:
- Public participation and input is valued; it is sought and integrated into decisions in meaningful ways.
- Alignment with external partners and the larger regional or provincial health system exists.

Attitudes Cell 5: SMT displays strong leadership for PSRA: SMT is aware of and manages the external environment and other constraining factors, and is willing to take and stand behind tough decisions.
How would you characterize the leadership of your organization with respect to priority setting?
Could you give examples of when leadership was present or lacking?

Outcomes
(Element / Questions for Staff)

Outcome Cell 1: Actual reallocation of financial resources is achieved.

Outcome Cell 2: The process has the understanding and endorsement of key internal and external stakeholders (e.g., Board of Directors, staff and medical leadership, Ministry, public).
Overall, do you understand and endorse your organization's priority setting process?

Outcome Cell 3: There is greater understanding among participants of the organization as a whole, and of PSRA practice.
After undergoing a priority setting and resource allocation process, do you feel you have a greater understanding of your organization and the process you have gone through?

Outcome Cell 4: Resource allocation decisions are justified in light of the organization's established and agreed upon core values. Progress is made toward identified strategic goals and objectives. Improved health (broadly defined) is achieved as a result of decisions made through the RA process. Effective, efficient, equitable, accessible, safe, and high quality care is delivered.
Would you agree that your organization's core values and strategic goals were reflected in the last priority setting cycle?
Do you feel that you will be able to meet those objectives with your current process?
Based on available measures and your perception, would you say that your current process for priority setting and resource allocation is improving the health of the population you serve?
What available evidence is there to support that claim?

Appendix E  Recommendations From Test Organization Evaluation Report

(Elements of High Performance / Recommendation in Pilot Organization)
1. Training (P3, A5, O3): Expand and enhance training provided to lower level managers. Leverage experience of the VWG.
2. Communication (P2, A5, O3): Increase and enhance communication with a special focus on rationale for proposal decisions.
3. Circumvention of Proposals: Document proposals that circumvent the normal PSRA process.
4. Monitoring and Oversight (P4): Increase the monitoring and evaluation of proposals.
5. Program Budgeting (P1): Build upon existing success to expand the practice of program budgeting to other departments in the organization.

1. Training - Without a formal training program for PSRA, this organization runs the risk of losing trust and engagement from staff that could manifest in the form of attempts to game the process and poorly constructed proposals.
Although training for members of the organization in general was lacking, the education and skill development given to members of the Validation Working Group (VWG) was much more comprehensive. Members of the VWG were introduced to the process and coached by an external expert in PSRA who also served as a mentor during implementation. They also had the opportunity to communicate directly with members of the senior management team, and one member in particular found it very useful to get the experience of ranking several proposals using the organization's assessment tool.

While this level of training is perhaps excessive for a frontline staff member, elements of VWG training could be used to enhance general education for members of the organization.

This organization could also leverage the knowledge gained by VWG members by encouraging them to act as supports for staff and managers with less experience in the PSRA process. Mentorship from VWG members would be of particular strategic value in areas where many of the managers were relatively new to their positions and the process.

This organization should also consider encouraging as many members of the organization as possible to join the VWG in order to create an even greater pool of expert mentors. Special consideration should also be given to frontline clinicians and support services personnel (eg. capital and IT) who are interested in joining the group in order to foster greater engagement from this subset of the organization.

For the remaining members of the organization, a more formalized training program could include "PowerPoint presentations, practical/hands-on workshops, working through proposals together, figuring out required information for proposals, available resources, and a how-to manual." (C-2, Middle Manager)
Since reports from participants suggest that the organization could benefit from greater integration of their operational, capital and IT processes, particular effort should be placed on educating members of the capital and IT teams. Not only could greater involvement from these groups encourage collaboration, but it could also help ensure that hidden costs are not overlooked in future proposals.

One manager suggested that any formal training that takes place should occur early so that individuals "have lots of time and it won't be a hurry up and get it done and sorry we can't get out to you or we can only do it for a day and a half" (G-1, Middle Manager). Their innovation and development commons (IDC), which runs brown paper bag education sessions, was recommended, along with champions from areas that have achieved high levels of performance, to be included in the planning and implementation of the training program (G-1, Middle Manager).

A final suggestion was that staff participants should be given paid time to attend sessions.

"They are going to get paid for it and I would have it during work hours. I would not make them come in on a weekend or an evening. That would be coup de grace or fate of death if you ask people to come in on off time. That's what I would do." (I-16, Middle Manager)

If implemented, a more formal training process would deliver greater understanding for members of the organization, facilitate wider participation, and encourage greater collaboration in the process.

2. Communication - During the evaluation, both frontline managers and middle managers reported issues with process understanding, frustration with timelines, and a desire for more feedback regarding the rationale for resource allocation decisions.
To address these issues, investigators suggest that the organization re-focus on distributing reminders (both in person and online) to managers before the PSRA process begins, and on the status of their proposals once it has begun. Keeping managers well informed of the status of their proposals would allow them to relay this information to frontline staff and managers to maintain engagement, and allow them to prepare for future steps in the process or the potential implementation of their proposal.

Since there was very high demand from participants to understand why their proposals were accepted or rejected, investigators recommend that a formal one-page document be developed, to be filled out by senior management after a decision is reached on each individual proposal. This document would detail why a proposal was accepted or rejected and provide the scores from the assessment tool. It would also address whether a manager should re-submit a proposal and whether additional evidence is required to support it. This formal tool would standardize communication related to decisions, give managers the information they requested, and allow them to pursue or abandon proposals depending on the feedback received.

3. Circumvention of some proposals - Since some proposals will be necessary but may never score well enough against organizational criteria in the assessment tool, a circumvention process may be needed. However, to increase transparency, these "fast-tracked" proposals should be documented and a rationale should be provided as to why they were pre-approved.

4. Monitoring and Oversight - To ensure that funds are being allocated appropriately, and that managers are receiving adequate support to implement allocation decisions, monitoring and oversight are necessary. While some managers reported excellent oversight, others reported a deficiency in follow-up from senior management and finance.
Senior managers admitted that, several years ago, next to no oversight and evaluation of decisions was taking place, and that currently "the level of monitoring depends on the amount of resources shifting" (I-6, Senior Manager). To foster trust in the process and ensure decisions are implemented appropriately across the whole organization, this trend of improvement should continue by introducing a minimum level of monitoring for all proposals (while maintaining heightened levels for decisions involving large quantities of resources).

5. Program Budgeting - Although a "trimming" approach to disinvestment has dominated this organization's PSRA process, both senior and middle managers recognize that some form of program budgeting is necessary to identify disinvestments in low-priority programs. Recommendations include improving communication, building capacity, and learning from prior successes.

Communication: To clarify the intentions of senior management, requests could be made to lower-level managers for "low priority programs" rather than "disinvestment proposals". While "disinvestments" were interpreted by managers as cuts no different from past budget cycles, "low priority programs" is much more specific. It also prompts managers to carry out some form of program budgeting exercise in their portfolio to identify programs that are low priority. The communication to managers should clearly state that even certain services within high-priority programs should be submitted in a disinvestment proposal if their marginal cost will be less than the marginal gain were the resources invested elsewhere: for example, reducing the evening shift of a very effective clinic if very few patients are seen in the evening, and the clinic's resources could be used more effectively in a very busy local hospital at that time. Leadership must also communicate their support and desire for disinvestment proposals even if they are politically contentious.
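The marginal-analysis idea above depends on being able to compare services and proposals on a common scale, which is what the organization's weighted-criteria assessment tool provides. As an illustration only (the criterion names, weights, and 1-5 ratings below are hypothetical, not the test organization's actual tool), such a scoring scheme can be sketched in a few lines:

```python
from dataclasses import dataclass

# Hypothetical criteria and weights; a real organization would derive
# these from its strategic plan and values (weights sum to 1.0).
WEIGHTS = {
    "health_gain": 0.4,
    "strategic_alignment": 0.3,
    "cost_impact": 0.2,
    "feasibility": 0.1,
}

@dataclass
class Proposal:
    name: str
    scores: dict  # criterion name -> rating on a 1-5 scale

def weighted_score(p: Proposal) -> float:
    """Collapse per-criterion ratings into one comparable number."""
    return sum(WEIGHTS[c] * p.scores[c] for c in WEIGHTS)

def rank(proposals):
    """Highest-scoring first; the bottom of the list flags candidates
    for disinvestment (low-priority programs)."""
    return sorted(proposals, key=weighted_score, reverse=True)

# Illustrative proposals echoing the evening-clinic example above.
proposals = [
    Proposal("Evening clinic shift",
             {"health_gain": 2, "strategic_alignment": 2,
              "cost_impact": 3, "feasibility": 4}),
    Proposal("Hospital surge staffing",
             {"health_gain": 5, "strategic_alignment": 4,
              "cost_impact": 3, "feasibility": 3}),
]
for p in rank(proposals):
    print(f"{p.name}: {weighted_score(p):.2f}")
```

Scores like these are decision aids, not decisions: as the circumvention discussion notes, some necessary proposals will never score well against the criteria and require a documented fast-track instead.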
Building Capacity: Interviews with managers revealed that they lack the time and skills to conduct program budgeting in their respective departments. In the upcoming PSRA cycle, training and support should be delivered to managers responsible for submitting proposals well ahead of the submission deadline. Although all levels of management should ideally understand and be able to perform program budgeting, a first step might be to educate regional middle managers and have them create high-level program budgets. In future cycles, training could be disseminated to lower levels of management, creating an increasingly detailed map of organizational spending.

Learning from prior success: Despite struggles with program budgeting as an organization, one department showed outstanding success in this regard. To leverage and expand upon this success, the head of that department should be consulted in developing a standardized program budgeting plan for the entire organization. Based on their approach, the following lessons should be considered:
1. Hold discussions with staff to determine how one's department can operationalize the organization's strategic plan in its own area
2. Map functions and services in the department to determine how resources are being spent and the outcomes of that expenditure; categorize if necessary
3. Develop ranking criteria
4. Use the ranking criteria to score each service and determine low- and high-priority programs

Although implementing a more robust program budgeting strategy will be resource intensive initially, these maps and rankings of current resource expenditure can be used to develop investment and disinvestment proposals for years to come.

"Doing a comprehensive, robust review that we chunk out a block of time and we say, and then that can guide us for up to probably the next two or three or four years to say, 'We did this huge review.
This is what we set for priority programs and all that kind of stuff for our HSDA and how we're going to manage services and impacts to other programs and things like that'" (G-1, Middle Manager).

Appendix F: Adapted Evaluation Tool - Senior Manager Version

Introductory Questions
- Could you give a quick description of your role in your organization?
- How would you describe your PSRA process?
- What is your role in the process?
- What are the strengths and weaknesses of your process?

Processes

Process, Cell 1: PSRA at the organization-wide level is based on economic and ethical principles
Questions for SMT:
On a scale of 1-5 (1 "not at all" and 5 "very well"), how well does each of the following statements describe your organization's priority setting and resource allocation process?
1. We have a formal process that we use to set priorities and allocate resources (1-5)
2. Each department and program expects to receive the same amount of resources as in past years (1-5)
3. Our spending patterns are largely determined by external provincial government requirements and expectations (1-5)

Process, Cell 1a: Well-defined, weighted criteria which reflect the organization's values and strategic priorities
Questions for SMT:
- When evaluating resource allocation proposals, does the senior management team (SMT) use explicit criteria?
- Could you give some examples of these criteria and how they are applied in decision making?
- On a scale of 1-5 (1 "poor" and 5 "excellent"), how would you rate the criteria applied in your organization's priority setting process with respect to:
  - Clarity (1-5)
  - Their ability to facilitate ranking of funding options (1-5)
  - Their ability to capture the benefits and costs of proposals (1-5)

Process, Cell 1b: Use of an assessment tool to operationalize criteria in ranking individual proposals
Questions for SMT:
- Do you have a formal tool that enables the consistent application of criteria (e.g. one based on assigning points to proposals against criteria, so that proposals can easily be ranked by their scores)? (Yes/No/Don't know)
- Overall, how would you rate the proposals that come to you to be assessed?
  - Do they have adequate evidence support?
  - Are they tied to your organization's strategic values?
  - Are they well formatted and easily understood?
  - Are the benefits and weaknesses of the proposals clear?

Process, Cell 1e: Mechanisms for incorporating best available evidence
Questions for SMT:
- When considering investment proposals, what are some of the common types of evidence that your SMT uses for evaluation? Could you describe how these types of evidence are considered in decision making? [e.g. accreditation standards from Accreditation Canada; published research evidence; Ministry data; actions of peer organizations; formal evaluations of own programs; program statistics/performance measurement and monitoring; experience and opinions of program staff; experience and opinions of patients; public opinion in the community]
- Does your organization provide support for collecting evidence to support proposals? How would you rate that support on a scale of 1-5 (1 "unsupported" and 5 "well supported")?

Process, Cell 1f: A decision review mechanism
Questions for SMT:
- Is there a decision review process for revisiting and revising decisions made during priority setting? (Yes/No/Don't know)
- Could you please describe it? How would a member of your organization appeal for a decision review?
- At what stage of the process would this decision review take place?

Process, Cell 2: SMT ensures effective communication (both internally and externally) around its priority setting and resource allocation, leading to transparency
Questions for SMT:
- On a scale of 1-5 (1 "poor" and 5 "excellent"), how would you rate the communication of the priority setting process in your organization to a) staff and b) external stakeholders? [e.g. public, Ministry, other healthcare organizations in your region]
- On a scale of 1-5 (1 "poor" and 5 "excellent"), how well understood is the process by staff and external stakeholders?
- How well do you think staff understand how the SMT arrives at its priority setting decisions, on a scale of 1-5 (1 "not at all" and 5 "completely")?
- How do you think communication could be improved?

Process, Cell 3: Skill development occurs throughout the organization including managers, directors, senior executive and clinical leaders, focusing on the process as well as underlying economic and ethical principles
Questions for SMT:
- Could you give examples of how your organization is supporting skill development/ongoing training around priority setting? [e.g. workshops, handouts or summaries, peer support sessions]
- On a scale of 1-5 (1 "poor" and 5 "excellent"), how would you rate the effectiveness of this training?
- How do you think this training could be improved? What type of training would you benefit most from?

Process, Cell 4: Follow through on decisions: SMT puts in place appropriate change management strategies, with performance measurement, tracking of outcomes, and responds as needed
Questions for SMT:
- After resource allocation decisions are made, how would you rate your organization's ability to implement these decisions (1 "poor" and 5 "excellent")?
- What barriers does your organization face when implementing priority setting decisions? How is resistance from staff addressed?
- Describe the monitoring your organization has in place to evaluate funding decisions once they have been implemented.
- How would you rate the monitoring, evaluation and reporting of proposals once they have been implemented, on a 1-5 scale?
- Are priority setting tasks part of your performance review?
Process, Cell 5: Facilitation of priority setting process by a skilled project manager/coordinator
Questions for SMT:
- Are there explicit timelines for your process?
- On a scale of 1-5 (1 "never" and 5 "always"), are priority setting process deadlines enforced and respected in your organization?
- Is there someone in your organization who oversees/coordinates the entire priority setting process? (Yes/No/Don't know) [e.g. organizes meetings, keeps everyone on track, plans for the next budget cycle, coordinates investment proposal review]

Structures

Structure, Cell 1: SMT has the ability and authority to move financial resources within and across silos
Questions for SMT:
- Does your SMT have the authority/ability to re-allocate resources? [e.g. take resources from an area of lesser health gain, and deploy those resources to an area with higher health gain]
- What are some of the key barriers to re-allocation in your organization?

Structure, Cell 2: Mechanisms are established for engagement of staff (clinical and non-clinical) in PSRA decisions, with particular though not exclusive attention to physicians
Questions for SMT:
- How do you involve lower-level staff in decision-making?
- On a scale of 1-5 (1 "poor" and 5 "excellent"), how would you rate the engagement of staff in the process?
- How do you think involvement could be improved? [e.g. better engagement tools, wider engagement]

Structure, Cell 2a: Mechanisms may include the use of incentives to encourage participation and foster active engagement
Questions for SMT:
- What, if any, incentives does your organization use to foster participation in priority setting? [e.g. financial incentives ("if you disinvest x dollars we will give you a percentage back to re-invest as you see fit") or threats ("if you don't participate we will make these decisions for you")]

Structure, Cell 3: Coordination of priorities/criteria/processes across all organizational planning processes (e.g. HR, IT, capital)
Questions for SMT:
- How well coordinated is your operational priority setting process with other organizational processes that impact the allocation of resources? [e.g. IT, capital, HR]
- Can you think of a resource allocation decision that did not consider impacts on other organizational processes? What mistakes were made in that case? Why wasn't there cooperation?

Structure, Cell 4: Relative stability of organizational structure and continuity of personnel
Questions for SMT:
- How much change in senior management and board of governors personnel has your organization undergone in the last three years?
  - Major change (about half or more of senior management positions have turned over)
  - Moderate change (about a quarter of senior management positions have turned over)
  - Minimal change (a few positions have changed, but generally quite stable)
- How do you think this stability/turnover of the team has affected priority setting in your organization?

Structure, Cell 5: Adequate but not excessive time and resources are committed to support PSRA at the SMT and staff level
Questions for SMT:
- Are the time and resources currently allocated to priority setting by your organization adequate?
- If you had an extra two hours in the day, or could clone yourself, what priority setting tasks would you spend more time on?

Attitudes & Behaviours

Attitudes Cell 1: Working relationships within the SMT are respectful and characterized by jointly addressing challenges, mutual trust, honesty, and the open and frank exchange of views
Questions for SMT:
- If you have a concern (about a decision or an aspect of the priority setting process), how comfortable do you feel voicing it to your SMT?
- Are all members of the SMT given the opportunity to contribute to discussions? Do they contribute?
- Are members of the SMT direct and honest with each other?
- Do members of the SMT accept criticism without becoming defensive?
- Do you feel that backroom deals and collusion are common in your priority setting process?
- Would you describe the attitudes of individual members of the SMT as more "collective" (interested in what's best for the organization) or "siloed" (interested in what's best for their department)? Is there a way to make their attitudes more collaborative?

Attitudes Cell 2: There is a culture of improvement. The SMT strives for excellence, and is willing to seek out and learn from what peers and leading organizations are doing
Questions for SMT:
- Would you characterize your SMT as having a culture of improvement (i.e. wanting to always be better/the best)? How does that manifest itself in your priority setting? [e.g. opportunities for skill development, benchmarking, focus on data collection]

Attitudes Cell 3: Decisions are made with a system-wide perspective, and a view to their long-term strategic alignment:
- Senior leaders adopt a system-wide point of view while considering how decisions will be experienced across departments and over a multi-year timeframe.
- SMT is willing to look beyond incremental spend to re-assess base budgets, i.e. to pursue marginal analysis and disinvestment opportunities.
Questions for SMT:
- How are your organization's strategic vision, goals and objectives integrated into your priority setting and resource allocation decisions?
- Is there a history of releasing resources from your base budget to re-invest in strategic priorities?
- Would you characterize the approach to resource allocation of individual members of the SMT as siloed or collective?

Attitudes Cell 4: Fit of priority setting decisions with social and community values is sought:
- Public participation and input is valued; it is sought and integrated into decisions in meaningful ways.
- Alignment with external partners and the larger regional or provincial health system exists.
Questions for SMT:
- Do you engage the public specifically in the priority setting process? How?
- Does your process take the actions of external partners into account? Does your organization make an effort to collaborate or align decisions with the actions of external partners? [e.g. other hospitals or health authorities, community care centres]

Attitudes Cell 5: SMT displays strong leadership for PSRA. SMT is aware of and manages the external environment and other constraining factors, and is willing to take and stand behind tough decisions
Questions for SMT:
- Does the SMT do a good job of motivating and communicating the need for explicit priority setting?
- In the face of external pressures and tough decisions, how does the SMT respond? Could you give some examples?
- On a scale of 1-5 (1 "poor" and 5 "excellent"), how would you rank your SMT's knowledge, ease, comfort, and confidence with priority setting and your organization's process?
- Could you give examples of when leadership was present or lacking, or where you would like more direction?
- What are the consequences for staff or members of the SMT of not participating in the process?

Outcomes

Outcome Cell 1: Actual reallocation of financial resources is achieved
Questions for SMT:
- In the past budget cycle, were resources removed from one area of the organization and invested into another (that would ostensibly yield greater health gains)?
- On a scale of 1-5 (1 "the exact same place" and 5 "much more advanced towards your strategic objectives"), do you think you would be in the same place without your current process for priority setting?

Outcome Cell 2: The process has the understanding and endorsement of key internal and external stakeholders (e.g. Board of Directors, staff and medical leadership, Ministry, public)
Questions for SMT:
- On a scale of 1-5 (1 "poor" and 5 "excellent"), how would you rank the understanding of your process by a) your internal and b) your external stakeholders?
  [e.g. Board of Directors, staff and medical leadership, Ministry, public]
- On a scale of 1-5 (1 "not at all" and 5 "definitely yes"), does your priority setting process have their endorsement?

Outcome Cell 3: There is greater understanding among participants of the organization as a whole, and of PSRA practice
Questions for SMT:
- After undergoing a priority setting and resource allocation process, do you feel you have a greater understanding of your organization and the process you have gone through?

Outcome Cell 4 (*new): Proposals submitted for evaluation are of high quality with respect to their use of evidence and diversity (disinvestment proposals include service cuts, not exclusively supports, e.g. supplies, travel, education)
Questions for SMT:
- On a scale of 1-5, how would you rank the quality of proposals that come to you for evaluation?
  - Do they have adequate evidence support?
  - Are they tied to your organization's strategic values?
  - Are they well formatted and easily understood?
  - Are the benefits and weaknesses clear?

Outcome Cell 5: Resource allocation decisions are justified in light of the organization's established and agreed-upon core values. Progress is made toward identified strategic goals and objectives. Improved health (broadly defined) is achieved as a result of decisions made through the RA process. Effective, efficient, equitable, accessible, safe, and high quality care is delivered
Questions for SMT:
- Would you agree that your organization's core values and strategic goals helped guide the last priority setting cycle? (Yes/No/Don't know)
- On a scale of 1-5 (1 "not at all" and 5 "definitely yes"), do you feel that you will be able to meet those objectives with your current process?
- Would you say that your current process for priority setting and resource allocation leads to decisions that will improve the health of the population you serve?
Or, if the organization has recently launched a new process:
- Thinking back to the resource allocation decisions made in the last budget cycle, do you think your organization would have made the same decisions without your current process?
- Do you think these decisions have led to better overall health for the population you serve? What evidence would you use to support that claim?

Appendix G: Adapted Evaluation Tool - Middle Manager Version

Introductory Questions
- Could you give a quick description of your role in your organization?
- How would you describe your PSRA process?
- What is your role in the process?
- What are the strengths and weaknesses of your process?

Processes

Process, Cell 1: PSRA at the organization-wide level is based on economic and ethical principles
Questions for staff:
On a scale of 1-5 (1 "not at all" and 5 "very well"), how well does each of the following statements describe your organization's priority setting and resource allocation process?
1. We have a formal process that we use to set priorities and allocate resources (1-5)
2. Each department and program expects to receive the same amount of resources as in past years (1-5)
3. Our spending patterns are largely determined by external provincial government requirements and expectations (1-5)

Process, Cell 1a: Well-defined, weighted criteria which reflect the organization's values and strategic priorities
Questions for staff:
- Are you aware of the criteria that the SMT uses to make priority setting decisions?
- Could you give some examples of criteria that are considered?
- Do you think those criteria are appropriate (i.e. do they capture all the benefits of proposals)?
- Do you think they are consistently applied to every proposal in the same way?

Process, Cell 1b: Use of an assessment tool to operationalize criteria in ranking individual proposals
Process, Cell 1e: Mechanisms for incorporating best available evidence
Questions for staff:
- When you are creating an investment or disinvestment proposal, what are some of the common types of evidence that you include as part of the application? [e.g. published research evidence; Ministry data; actions of peer organizations; formal evaluations of own programs; program statistics/performance measurement and monitoring; experience and opinions of program staff; experience and opinions of patients; public opinion in the community]
- Does the SMT find these types of evidence sufficient, or are you often asked to collect more evidence?
- How would you rate your ability to collect the necessary evidence for investment/disinvestment proposals?
- Is there support available to help you collect evidence when constructing a proposal?

Process, Cell 1f: A decision review mechanism
Questions for staff:
- Is there a decision review process for revisiting and revising decisions made during priority setting? (Yes/No/Don't know)
- Have you ever used this process to revisit a decision? With what result? Were you satisfied with the process?

Process, Cell 2: SMT ensures effective communication (both internally and externally) around its priority setting and resource allocation, leading to transparency
Questions for staff:
- Do you have a good idea of what the different steps in your organization's priority setting process are? Do you know when they will take place each year?
- How would you characterize your organization's priority setting process timeline?
- Do you feel that internal communication about the priority setting and resource allocation process is adequate? [e.g. outcomes of decisions, reasoning, next steps, decision review, general steps]
- Which aspects of the process would you like to have more information about?
Process, Cell 3: Skill development occurs throughout the organization including managers, directors, senior executive and clinical leaders, focusing on the process as well as underlying economic and ethical principles
Questions for staff:
- Are you aware of, or have you taken part in, skill development or capacity building around priority setting in your organization?
- Do you think you or other colleagues would benefit from more training in this area? [e.g. workshops, peer support, handouts/summaries]
- How do you think the training could be improved?

Process, Cell 4: Follow through on decisions: SMT puts in place appropriate change management strategies, with performance measurement, tracking of outcomes, and responds as needed
Questions for staff:
- After resource allocation decisions are made, how would you rate your organization's ability to implement these decisions on a 1-5 scale (1 "poor" and 5 "excellent")?
- What barriers does your organization face when implementing priority setting decisions?
- After your proposal is accepted, is there monitoring or evaluation that takes place?

Process, Cell 5: Facilitation of priority setting process by a skilled project manager/coordinator
Questions for staff:
- Does your organization have explicit timelines for its PSRA process?
- On a scale of 1-5 (1 "never" and 5 "always"), are priority setting process deadlines enforced and respected in your organization?
- Is there someone in your organization who oversees/coordinates the entire priority setting process? (Yes/No/Don't know) [e.g. organizes meetings, keeps everyone on track, plans for the next budget cycle, coordinates investment proposal review]

Structures

Structure, Cell 1: SMT has the ability and authority to move financial resources within and across silos
Questions for staff:
- Have you observed resources being re-allocated from one area of your organization to another in the past year?
Structure, Cell 2: Mechanisms are established for engagement of staff (clinical and non-clinical) in PSRA decisions, with particular though not exclusive attention to physicians
Questions for staff:
- How does the SMT involve frontline staff in decision-making?
- On a scale of 1-5 (1 "poor" and 5 "excellent"), how would you rate the engagement of staff in the process?
- How do you think involvement could be improved? [e.g. better engagement tools, wider engagement]

Structure, Cell 2a: Mechanisms may include the use of incentives to encourage participation and foster active engagement
Questions for staff:
- Are there any formal incentives that encourage you to participate in your organization's priority setting process? If so, what are they?

Structure, Cell 3: Coordination of priorities/criteria/processes across all organizational planning processes (e.g. HR, IT, capital)
Questions for staff:
- How well coordinated do you think your organizational PSRA process is with other organizational processes?
- Can you think of a resource allocation decision that did not consider impacts on other organizational processes? What happened?

Structure, Cell 4: Relative stability of organizational structure and continuity of personnel
Questions for staff:
- How much change in senior management personnel has your organization undergone in the last three years?
  - Major change (about half or more of senior management positions have turned over)
  - Moderate change (about a quarter of senior management positions have turned over)
  - Minimal change (a few positions have changed, but generally quite stable)
- How do you think this stability/turnover of the team has affected priority setting in your organization?

Structure, Cell 5: Adequate but not excessive time and resources are committed to support PSRA at the SMT and staff level
Questions for staff:
- Do you feel that you have enough time and resources to complete priority setting tasks?
- If you had an extra two hours in the day, or could clone yourself, what priority setting tasks would you spend more time on? [e.g. researching supporting evidence for proposals, filling out investment proposals, implementing resource allocation decisions, following up on decisions, planning to meet strategic priorities]

Attitudes & Behaviours

Attitudes Cell 1: Working relationships within the SMT are respectful and characterized by jointly addressing challenges, mutual trust, honesty, and the open and frank exchange of views
Questions for staff:
- How would you describe the relationships between members of the SMT? [e.g. honest, trusting, belligerent, combative, on-edge]

Attitudes Cell 2: There is a culture of improvement. The SMT strives for excellence, and is willing to seek out and learn from what peers and leading organizations are doing
Questions for staff:
- Would you characterize your organization as having a culture of improvement?
- Do you feel that learning and professional development (related to priority setting and resource allocation) are encouraged in your organization?

Attitudes Cell 3: Decisions are made with a system-wide perspective, and a view to their long-term strategic alignment:
- Senior leaders adopt a system-wide point of view while considering how decisions will be experienced across departments and over a multi-year timeframe.
- SMT is willing to look beyond incremental spend to re-assess base budgets, i.e. to pursue marginal analysis and disinvestment opportunities.

Attitudes Cell 4: Fit of priority setting decisions with social and community values is sought:
- Public participation and input is valued; it is sought and integrated into decisions in meaningful ways.
- Alignment with external partners and the larger regional or provincial health system exists.

Attitudes Cell 5: SMT displays strong leadership for PSRA.
SMT is aware of and manages the external environment and other constraining factors, and is willing to take and stand behind tough decisions
Questions for staff:
- How would you characterize the leadership of your organization with respect to priority setting?
- On a scale of 1-5 (1 "poor" and 5 "excellent"), how would you rank your SMT's knowledge, ease, comfort, and confidence with priority setting and your organization's process?
- Could you give examples of when leadership was present or lacking?
- What are the consequences for staff or members of the SMT of not participating in the process?

Outcomes

Outcome Cell 1: Actual reallocation of financial resources is achieved

Outcome Cell 2: The process has the understanding and endorsement of key internal and external stakeholders (e.g. Board of Directors, staff and medical leadership, Ministry, public)
Questions for staff:
- Overall, do you understand and endorse your organization's priority setting process?

Outcome Cell 3: There is greater understanding among participants of the organization as a whole, and of PSRA practice
Questions for staff:
- After undergoing a priority setting and resource allocation process, do you feel you have a greater understanding of your organization and the process you have gone through?

Outcome Cell 4 (*new): Proposals submitted for evaluation are of high quality with respect to their use of evidence and diversity (disinvestment proposals include service cuts, not exclusively supports, e.g. supplies, travel, education)

Outcome Cell 5: Resource allocation decisions are justified in light of the organization's established and agreed-upon core values. Progress is made toward identified strategic goals and objectives. Improved health (broadly defined) is achieved as a result of decisions made through the RA process. Effective, efficient, equitable, accessible, safe, and high quality care is delivered
Questions for staff:
- Would you agree that your organization's core values and strategic goals were reflected in the last priority setting cycle?
- Do you feel that you will be able to meet those objectives with your current process?
- Based on available measures and your perception, would you say that your current process for priority setting and resource allocation is improving the health of the population you serve?
- Do you think that you would have been able to improve the health of the population you serve to the same extent with a different/past process? What available evidence is there to support that claim?
