
Dialectical Program Planning Model: An Approach to Capacity Building in Planning and Evaluation of Public Health Programs and Projects

State: TX Type: Promising Practice Year: 2018

Houston Health Department
Location and demographics of population served

The City of Houston, the fourth largest city in the nation, is home to approximately 2.3 million people. 2016 American Community Survey data indicate the following racial/ethnic proportions for Houston: white 24.1%, black 22.0%, Hispanic 44.8%, and Asian 7.4%. 20.8% of residents live below the Federal Poverty Level (FPL), compared to 14.0% of U.S. residents. Poverty is highest among children under 18 years of age, and among minority children.

Public Health Issue

As the public health landscape evolves, particularly with respect to funders' demands for evidence-based, impact-oriented interventions, it is important that HHD play an active role in shaping the trajectories of its community stakeholders.

Goal: To establish an outcome-oriented planning and evaluation process across all HHD and its community stakeholders' programs and projects

Objectives:
1. Develop an innovative model that incorporates all components for outcome-driven program/project planning
2. Facilitate a capacity building strategy using the Dialectical Program Planning Model (DPPM) on HHD programs and those of community-based organizations.

How was the practice implemented?

Implementation began with preliminary meetings between the HHD Capacity Building Team and the leadership of the Fifth Ward Enrichment Program (FWEP) to gauge the nature of FWEP's needs. The project team then crafted and presented a proposal that aligned its capacities with the client's needs, along with a project charter that clarified the roles and expectations of team members. These were followed by a 12-session capacity building process aimed at enhancing program staff's ability to plan and evaluate programs. Each two-hour session explored fundamental aspects of program planning and evaluation through hands-on experience; accordingly, each session was designed to yield a product as its deliverable and required the active engagement of every participant.
Starting with program foundations, primarily problem definition and goal setting (5 sessions), the sessions moved sequentially through the selection of strategic approaches (2 sessions); the crafting of strategic objectives (1 session); the elaboration of activities, their associated processes, and the drafting of operational objectives for each activity (1 session); and the creation of an evaluation plan (3 sessions). After the program plans were produced, the HHD Capacity Building Team's primary role shifted to that of technical advisor and reviewer of subsequent drafts of program plan components.

Results and Outcomes

- Formal assessment of the evaluability of FWEP
- Staff clarifying the need statement for FWEP
- FWEP operational staff and administration working as a team
- Participants gaining analytical skills, such as the use of "Connection Circles" to identify the main issues FWEP seeks to address
- Staff identifying and writing the need statement for FWEP
- Staff developing and writing the specific goal for FWEP
- Staff identifying the S.M.A.R.T. objectives for FWEP
- Staff engagement in documenting FWEP processes (SOPs)
- Staff engagement in developing the Evaluation Plan for FWEP

Goals and objectives were met.

Factors leading to success

The goals and objectives of the DPPM were met because:
1. The facilitation techniques allowed participants to dynamically generate diverse ideas, make their assumptions explicit, engage in constructive conflict, and build consensus around coherent, comprehensive models.
2. The set of tools and guidance through the planning process was based on strong evidence.
3. The environment was interactive, participatory, and team-building (all levels of responsibility were involved in the process).
4. An emphasis on continuous quality improvement ensured a high level of responsiveness.
5. A significant amount of time (~2 hours for each hour of in-session time) went into meticulously planning the content of sessions, ensuring its relevance.
6. Adequate time was allocated for the delivery of session content.
7. Adequate resources (including space and materials) were available both for session planning and the delivery of session content.
8. The stakeholders were committed to embracing challenges along the way.
9. The intimacy of in-person sessions and hands-on activities facilitated deeper learning.
10. The HHD Capacity Building Team brought to bear its program planning expertise, supplementing program staff's practice wisdom with research from the evidence base, best practices, and useful frameworks for planning, documenting, and evaluating effective programs.

Public health impact on practice

Employing the DPPM, programs improved their ability to monitor the progress of implementation, manage process documentation, and pursue quality improvement. Furthermore, the evaluations will document changes in the communities where programs were implemented. Programs that benefited from capacity building will be multipliers in their contexts; as an example, FWEP will expand to other communities in Houston through the My Brother's Keeper initiative. The HHD Capacity Building Team will continue to provide technical assistance to our community partners.

LHD WEBSITES
http://www.houstontx.gov/health/
http://www.houstonstateofhealth.com/
PUBLIC HEALTH ISSUE

Capacity building around program planning and evaluation has far-reaching implications for the Houston Health Department, both internally and externally. There are myriad opportunities for HHD to shore up the planning and evaluative capacities of its own program staff and thereby enhance the quality of the services it offers to the public. With the recent undertaking of initiatives such as the 1115 Waiver Program, planning and evaluation have become an even more crucial aspect of the work in which HHD engages. Moreover, scaling up its capacity building efforts internally and externally will contribute to HHD's goal of being nimble and adaptable to the rapidly changing public health landscape. As that landscape evolves, particularly with respect to funders' requirements for evidence-based, impact-oriented interventions, it is important that HHD play an active role in shaping the trajectories of its community stakeholders. Capacity building in the realm of program planning and evaluation offers such an opportunity.

Continuing challenges in conceptualizing the program planning process and achieving buy-in from program staff served as the impetus to create a more effective model. Only a limited number of staff were engaged in the planning process, and those tasked with implementation rarely had insight into the activities that generated the roadmaps they were to follow. Additionally, program activities were rarely contextualized for staff in a meaningful and relatable way. The lack of a shared mental model of the problem meant that program staff did not know how they were contributing to achieving the mission of the organization. To address this issue, particularly for programs that already existed, retroactive planning activities and a complete reconceptualization of program foundations were required.
This clear need for a tangible and coherent link between everyone, the program, and the larger context fueled the adoption of a system dynamics-based approach to program planning. Typically, HHD programs that are evaluated regularly do so primarily in compliance with program audits or grant/funding requirements. Often, programs were not evaluated because evaluation was not required by their funding sources, because they lacked budgetary allocations for program evaluation, because staff did not meet core evaluation competencies, or because their programs were being restructured. This trend in program planning and evaluation internal to HHD has been slow to shift. Externally, when small nonprofit organizations talk about program evaluation, they often feel overwhelmed. Too often we assume program evaluation requires lots of time and resources that smaller organizations simply do not have (Aikins, Lloyd & Joyner, 2007). HHD's mission is to work in partnership with the community to promote and protect the health and well-being of Houstonians and the environment in which they live. Aligned with its role as a facilitator for the advancement of population health, HHD seeks to enhance the ability of its partner non-profit organizations to fulfill their missions efficiently through capacity building. In the summer of 2016, the Fifth Ward Enrichment Program (FWEP), a Houston-based youth leadership and development program for at-risk males ages 12-19, approached the HHD Director, Mr. Stephen Williams, MPA, for assistance in evaluating their program for effectiveness. Following a series of meetings between the HHD Capacity Building Team and FWEP leadership, the FWEP Capacity Building project was initiated.
The purpose of the project was identified as follows: to guide Fifth Ward Enrichment Program (FWEP) leadership and staff through a set of didactic and interactive sessions intended to enhance their capacity to document and evaluate program processes, thereby enabling FWEP to contribute to an emerging database of youth-related intervention indicators for the City of Houston. By the end of the project timeline, FWEP leadership and staff will have produced a functional monitoring and evaluation tool that will help in the continued structuring and evolution of their intervention.

TARGET POPULATION

All institutions and organizations that implement interventions with public health implications are considered targets. Of particular interest are non-profit organizations and smaller health departments. This type of capacity building is most pertinent at the organizational level.

WHAT HAS BEEN DONE?

HHD provides services designed to promote and protect the health and well-being of Houstonians and the environment in which they live. However, program evaluation within HHD has been irregular. HHD's progress toward program planning and evaluation is summarized below:

In 2007, a study showed that 55% of HHD's programs were evaluated regularly as part of program audits or grant requirements. The remaining programs were not evaluated because of the absence of such requirements from funding sources, the lack of budgetary allocations for program evaluation, unskilled staff, and program restructuring (Aikins, Lloyd & Joyner, 2007).

From 2013 to date, HHD has been a recipient of CMS funds supporting fifteen (15) 1115 Waiver programs focused on the triple aim: reducing the cost of care, improving the delivery of health care services, and improving the quality of life of the population. The deliverables include program plans (2013), implementation and data collection (2014 onwards), and program outcome evaluation (2016 onwards).
Program managers/administrators are engaging in program outcome evaluations as required by the 1115 Waiver funding.

In 2014, HHD received accreditation from the Public Health Accreditation Board (PHAB), becoming the first health department in Texas, and the second in a large U.S. city, to be accredited. The public health accreditation process uses PHAB's standards and measures as the framework for evaluating the local health department's processes and services, their outcomes, and progress toward specific goals and objectives.

Community Health Assessment (CHA) and Community Health Improvement Plan (CHIP): For both the accreditation and reaccreditation processes, HHD conducted the CHA (in 2013 and 2016, respectively). Stakeholders including residents, businesses, nonprofit organizations, and government agencies provided input to the CHA. Data from the CHA were analyzed and used to inform community decision-making, the prioritization of health problems, and the development and implementation of the CHIP.

1115 Medicaid Waiver: Since 2013, HHD has implemented 15 waiver projects that meet the HHS and CMS aim criteria of 1) improving population health, 2) delivering better health care, and 3) reducing health system costs.

2017-2021 HHD Strategic Plan: Every five years, the Houston Health Department (HHD) conducts an extensive assessment and planning process to determine the goals and priorities that will guide its activities for the coming years. The current Strategic Plan for 2017-2021 is based on the CHA and internal stakeholder input [Strengths, Weaknesses, Opportunities, Threats (SWOT) analysis].

WHY IS THE CURRENT PRACTICE BETTER?

Considerable efforts have been made to improve the performance of public health programs and projects both nationally and locally. Significant progress has been made, but the biggest challenge continues to be moving from output reporting to evidence-based analysis of outcome-driven strategies.
HHD's Dialectical Program Planning Model, a practical approach to capacity building, is supported by tools that provide flexibility and instructional guidance along the different stages of planning, while also serving as a template for documenting the planning process. The HHD Evaluation Guidance Manual was the first locally contextualized guideline for use in the overall planning and implementation of outcome and impact evaluation at HHD; it provides a framework for participating programs and projects to plan and conduct their own outcome evaluations. Similarly, the HHD Program Planning Template allows flexibility while assuring rigor and quality, linking objectives, activities, and outcomes with resources. The DPPM includes all the components of program planning up to and including evaluation and quality improvement. These two documents have met the planning needs of both internal and community-based programs.

IS THE CURRENT PRACTICE INNOVATIVE?

The Dialectical Program Planning Model is innovative in its approach to delivering a program planning curriculum. Rather than focusing narrowly on program processes and outcomes in isolation, the methodology expressly cultivates a systems-based understanding of the specific planning activities. Stakeholders are thereby challenged to conceptualize their efforts within a larger ecosystem of actors and forces. Throughout the process of engagement, clear and tangible links are highlighted between the individual, the program, the coalition, and the ecosystem in which these entities exist. This extends traditional notions of "stake" or "buy-in" to include staff at every programmatic level, thereby emphasizing the critical nature of everyone's contributions to the planning process. Moreover, this extension opens the planning process to individuals who may not traditionally participate actively; indeed, it necessitates this kind of radical openness.
This model promotes the integration of grassroots practice wisdom with the academic evidence base upon which public service organizations have typically relied.

IS THE PRACTICE EVIDENCE-BASED?

Yes, the DPPM is evidence-based. As public health institutions continue to strategically reposition themselves within their local ecosystems according to the Public Health 3.0 framework, new models for engagement must emerge to ensure continuity of support for the communities they serve, even as the nature of these relationships evolves (Public Health 3.0, 2017). Program planning represents an area where public health departments' expertise aligns with the needs of community organizations. The HHD Capacity Building Team adapted the Framework for Program Evaluation in Public Health (CDC, 1999) to develop a locally contextualized manual, the OPERE Evaluation Guidance Manual (available upon request). This manual has been used to provide training to different programs and services of the department. Furthermore, the HHD Program Planning Template, an instructive planning tool, was developed by combining other existing models (available upon request). This tool is consistent with others, such as the RAND Corporation health planning tool (Published Research, 2016). The CDC's Framework for Program Evaluation in Public Health guides public health professionals in their use of program evaluation. It is a practical, non-prescriptive tool designed to summarize and organize the essential elements of program evaluation. The CDC developed the framework to ensure that, amidst the complex transition in public health, programs and projects remain accountable and committed to achieving measurable health outcomes. Announced by Dr. Frieden in November 2012, the evaluation guidelines are a set of recommendations to inform evaluation planning and implementation.
Within the Dialectical Program Planning Model, most of the program evaluation components referenced are based on concrete evidence. Data collection, analysis, and interpretation are essential parts of the monitoring and evaluation process, and the evidence obtained through implementation is the most important source for quality improvement planning. The DPPM therefore facilitates HHD's efforts to fulfill its essential functions as a public health agency. These functions, as outlined in its mission, have direct implications for evidence-based practice. The list of functions includes, but is not limited to, the following:

- Supporting community-based and not-for-profit organizations/coalitions that engage in the design, implementation, evaluation, and dissemination of strategies addressing health disparities;
- Providing infrastructure for successful practices and evidence-based approaches to communities' health problems;
- Supporting national organizations and agencies by providing accurate information and data relevant to current national goals and objectives for eliminating disparities of any kind that affect the health and wellbeing of our diverse community.

Information about the different HHD programs/projects and collaborations is available at: https://www.houstontx.gov/health

HHD programs and projects also guarantee local health sector response by embracing the essential functions of public health:
- To inform, educate, and empower people about health issues, and
- To create and mobilize community partnerships to identify and solve health problems.

References

Aikins, J., Lloyd, L., & Isaac, J. (2007). A scan of program evaluation at the Houston Department of Health and Human Services. Public Health Reports, 122(5), 707-711.

Chapman, B. (2010). How long does it take to create learning? Chapman Alliance. Available at: http://www.chapmanalliance.com/howlong/ (accessed 30 July 2015).

Hovmand, P. S. (2014). Community based system dynamics. New York: Springer.

Koplan, J. P., Milstein, R. L., & Wetterhall, S. (1999). Framework for program evaluation in public health. Atlanta, GA: U.S. Department of Health and Human Services.

Mattox, T., & Kilburn, M. R. (2016). Supporting effective implementation of evidence-based practices.

Scheirer, M. A. (2005). Is sustainability possible? A review and commentary on empirical studies of program sustainability. American Journal of Evaluation, 26(3), 320-347.

Scott, R. J., Cavana, R. Y., & Cameron, D. (2016). Recent evidence on the effectiveness of group model building. European Journal of Operational Research, 249(3), 908-918.

Vennix, J. A. M. (1996). Group model building: Facilitating team learning using system dynamics.
Dialectical Program Planning Model: An approach to capacity building in planning and evaluation of public health programs and projects.

GOALS AND OBJECTIVES

Goal: To establish an outcome-oriented planning and evaluation process across all HHD and its community stakeholders' programs and projects

Objectives:
1. Develop an innovative model that incorporates all components for outcome-driven program/project planning
2. Facilitate a capacity building strategy using the Dialectical Program Planning Model (DPPM) on HHD programs and those of community-based organizations.

WHAT DID YOU DO?

The following documentation represents the steps taken to validate the Dialectical Program Planning Model with a local community-based organization.

Document A - Project Charter

Project Name: HHD Planning with Fifth Ward Enrichment, Director's Office
Prepared By: OPERE (Deborah Banerjee, Ekundayo Azubuike, Angela Gala Gonzalez, Janet Aikins)
Date Issued: June 2016
Charter Issued By: OPERE

Project Scope

Project Objective/Purpose/Task/Charge (Describe briefly the purpose of the project. What objective and tangible result is to be achieved?)
The purpose of this project is two-fold:
- To guide Fifth Ward Enrichment Program (FWEP) leadership and staff through a set of didactic sessions intended to enhance their capacity to document and evaluate program processes. By the end of the project timeline, FWEP leadership and staff will have produced a functional monitoring and evaluation tool that will help in the continued structuring and evolution of their intervention. (See the "Planning for Evaluation" addendum for the detailed session schedule and specific learning objectives.)
- To provide HHD with an opportunity to build evidence in support of the effectiveness of the Dialectical Program Planning Model.

Project Boundaries:
1. Delivery of facilitated planning sessions
2. Revision of program planning and evaluation document drafts

Project Customers (Identify who will use the final deliverable):
1. Fifth Ward Enrichment Program leadership and staff
2. Houston Health Department staff

Customer Needs (What is the problem that is supposed to be solved by the final deliverable?):
- FWEP seeks to clarify and document its program processes and evaluation plan.
- The Houston Health Department has not consistently applied a coherent framework with which it can facilitate capacity building related to planning and evaluation for programs and projects.

Final Deliverable (Identify the product, service, process or plan that will be delivered. This must satisfy customer needs and requirements.):
1. Twelve facilitated planning sessions
2. Problem statement and justification
3. Causal map of issues to be addressed by the program
4. Logic model
   a. S.M.A.R.T. objectives
   b. List of activities and associated resources
   c. Outcomes
5. Evaluation plan

Customer's Criteria for Acceptance

HHD and Plan Houston Strategic Priorities (Which HHD priority does this project seek to affect?)

HHD:
- Goal 4: Give Children a Healthy Start
  - 4.2: Develop and Implement Best Practice Child Safety and Injury Prevention Strategies
- Goal 7: Reduce Health Disparities
  - 7.1: Act as a change agent to facilitate the elimination of health disparities

Plan Houston:
- People
  - 1: Equal access to opportunity and prosperity.
  - 3: Strong social ties supported by social, civic, and faith organizations.
  - 5: Supportive services for disadvantaged and at-risk groups.
- Place
  - 8: A safe, secure community.
  - 12: High-quality community facilities that provide for the diverse needs of residents.
  - 13: A city that enables healthy, active lifestyles, and social well-being.
- Culture
  - 16: A thriving local arts and creative community.
- Education
  - 17: Quality learning opportunities from early childhood onward.
- Public Services
  - 29: Active regional cooperation and collaboration among governments, community leaders, and residents.

Project Stages
- Stage I (6/13/2016-6/30/2016): responsible party - OPERE
  - Preliminary informational meetings
  - Project planning
    - Establish learning objectives
    - Draft curricular outline
  - Draft project charter
  - Review project charter
- Stage II (7/14/2016-4/28/2017): responsible parties - OPERE, FWEP
  - Facilitated planning sessions
  - Planning and evaluation document authorship
  - Client evaluation of sessions
- Stage III (5/26/2017): responsible party - OPERE
  - Document review and feedback
- Stage IV (6/2017): responsible parties - OPERE, FWEP
  - Final product review

Interim Reports
- 7/4/2016, responsible parties: OPERE, FWEP
  - Final project charter
  - Final schedule of sessions
- After each session, responsible party: OPERE
  - Planning session output feedback
- 6/2017, responsible parties: OPERE, FWEP
  - Review of final products

Project Resources and Dependencies
- People: OPERE team, FWEP leadership and staff
- Supplies: meeting space, laptop, projector, flip chart, markers, Internet connection, pens, paper
- Expertise: knowledge of FWEP, program planning, evaluation
- Assumptions:
  - All members of the OPERE team will be present at all sessions.
  - All members of the core FWEP leadership and staff will be present at all sessions.
  - All parties will remain flexible and amenable to changes in the scheduling or content of sessions based upon ongoing feedback.

Risks and Failure Points (What can cause this project to fail? What will happen if the project fails?)
- Potential failure points:
  - Lack of clarity around project goals and processes
  - Lack of commitment to full participation in the process on the part of the OPERE team or FWEP leadership and staff
  - Insufficient communication of specific needs during project planning or implementation
- If the project fails:
  - OPERE may lose the opportunity to collaborate effectively with a valuable community partner.
  - The relationship between HHD and FWEP may be negatively impacted.
  - A key community player may miss out on an opportunity to increase their impact and enhance their value to stakeholders.
  - The larger Houston community will be deprived of a model for planning and evaluation of interventions.

Document B - DPPM Curriculum

Unit 1: Program Foundations
- Unit Summary: This unit addresses the foundational aspects of the program's theory of change: problem identification, population specification, mission articulation, and program objective setting.
- Unit Essential Questions: What is the problem your program ultimately seeks to address? How do you know that this is a problem? How does this problem impact different demographics within the community? What is your program's mission as regards this problem within this population?
- Unit Enduring Understandings: The problems we identify often have complex, deeply rooted causes. There should be a logical relationship between the roots of a problem and the diverse strategies employed to address them. Furthermore, there should be a logical relationship between the achievement of specific objectives and the realization of the program's ultimate goal. Data are essential when conceptualizing a problem and its potential solutions. Different populations have unique, specific experiences of a particular problem and must be attended to as such. Objectives should be S.M.A.R.T.
- Unit Knowledge Objectives: Participants will know…
  - The difference between proximate and distal causation.
  - The definitions of and difference between a goal, an objective, and an activity.
- Unit Skill Objectives: Participants will be able to…
  - Construct an Ishikawa diagram to illustrate causality.
  - Employ the 5 Whys method to determine causality.
  - Identify the root causes of a given issue.
  - Identify the ways in which a problem specifically impacts a population of interest.
  - Justify the need for their program with data.
  - Articulate the program's mission in the To-For-By-So That framework.
  - Build S.M.A.R.T. objectives.
Unit 2: Process Planning and Documentation
- Unit Summary: This unit focuses on the documentation and execution of the program's essential processes.
- Unit Essential Questions: What resources are you devoting to the achievement of your objectives and long-term goals? What resources could your program potentially leverage to realize its objectives and long-term goals? What outputs (activities and participation) do you plan to realize as a result of allocating these resources? How will you document the realization of the outputs you expect? How will you assure the quality of the outcomes you expect?
- Unit Enduring Understandings: The resources that one allocates must be sufficient for and relevant to the achievement of goals and objectives. Process documentation is central to quality assurance and evaluation. There must be a logical and justifiable connection between program processes and outputs.
- Unit Knowledge Objectives: Participants will know…
  - The definition of an input/resource.
  - The definition of an output (activity versus participation).
  - How resource allocation relates to outputs.
  - How objectives relate to outputs.
- Unit Skill Objectives: Participants will be able to…
  - Describe their program processes.
  - Adequately document their program processes.
  - Appropriately relate resources to their relevant outputs (activities and participation).
  - Evaluate the feasibility of selected objectives and modify them accordingly.

Unit 3: Outcomes and Planning for Evaluation
- Unit Summary: This unit addresses the processes of evaluation planning and implementation. It is aimed at linking together the constituent aspects of the program planning process and measuring programmatic success in terms of outcomes.
- Unit Essential Questions: What is an outcome, and how is it measured? What outcomes can we expect from the program in question? How do the outcomes relate to the program goals? What are the essential components of an outcome evaluation?
- Unit Enduring Understandings: Planning for evaluation is an essential and primary part of program planning. Evaluation is the tool we use to assess program success.
- Unit Knowledge Objectives: Participants will know…
  - The difference between an output and an outcome.
  - What an indicator is.
  - The difference between short-, medium-, and long-term outcomes.
- Unit Skill Objectives: Participants will be able to…
  - Construct valid indicators for program evaluation.
  - Find and construct tools to measure specific aspects of program success.
  - Construct a complete logic model.
  - Collect and manage data for the evaluation process.

Document C - List of Sessions

- Session I: Icebreaker; R.O.P.E.S.; Project Introduction & Discussion; Closing Journal; Delta/Plus; Homework Assignment
- Session II: Check-in; Homework Review; Presentation; Break-Out Group Discussion; Report Back; Formulating a Problem Statement; Homework Assignment; Delta/Plus
- Session III: Refining a Problem Statement; Workshopping Problem Statements; Report Back & Group Discussion; Problem Statement Synthesis; Delta/Plus; Homework Assignment
- Session IV: Check-in; Problem Statement Review & Discussion; Presentation: Goal Statements & S.M.A.R.T. Objectives; Causation Web and 5 Whys; S.M.A.R.T. Objectives and Report Back; Goal Statement; Delta/Plus & Homework
- Session V: Intro; Project Charter; Problem & Goal Statements; Connection Circles; Practice Wisdom & Literature; Draft S.M.A.R.T. Objectives
- Session VI: Introduction; Select Strategic Areas of Focus with Criteria; How to Write Strategic Objectives (Presentation); Draft Strategic Objectives; Delta/Plus
- Session VIII: Opening; Process Planning Template (PPT); Group Work (completing process planning); Share/Discuss; S.O.P. Template (Example); Group Work (completing one S.O.P. per group); Share/Discuss; Homework; Delta/Plus
- Session IX: Opening; Product Review (objectives and strategic approaches; activities and processes; questions); Workshop Activities; Resource Documentation
- Session X: Update on Progress; Introduction to Evaluation; Program Evaluation Feasibility; Process Objectives; Group Work; Delta/Plus
- Session XI: State of Program Plan; Moving into Evaluation; Presentation: Process & Outcome Evaluation; Activity: Listing Measures
- Session XII: Product Feedback; Presentation: Evaluation; Activity: Evaluation Plan

Document D - Example Detailed Agenda

- Icebreaker: Ball Toss (15 minutes): The facilitator welcomes the group and leads them in an activity aimed at increasing the members' familiarity with one another and eliciting information about their expectations and concerns for the process.
- R.O.P.E.S. (10 minutes): The facilitator leads the group in an interactive process of establishing ground rules and expectations for the project.
- Project Introduction and Discussion (30 minutes): The project team walks through the charter and curriculum with the goals of answering any remaining questions and drawing attention to the intersections of the participants' expectations and what was laid out in the documents.
- Closing Journal (15 minutes): The facilitator leads the group in reflecting on the ways in which they intend to contribute to the unfolding of the program planning process.
- Delta/Plus (15 minutes): The facilitator solicits feedback from the group about the session.
- Homework Assignment (5 minutes): The facilitator provides a topic for the working group to contemplate in preparation for the next meeting.

SELECTION CRITERIA

Community-based, governmental, and other non-profit public service organizations that are implementing strategies to impact community health are targets for this intervention. Once selected for participation, all levels of program staff (e.g., managerial, operational, executive, et cetera) participate in the didactic sessions.
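The curriculum's capstone skill is constructing a complete logic model that links inputs, activities, outputs, and outcomes, with each outcome paired to a measurable indicator. The following sketch illustrates one way those components fit together; the program content (mentoring sessions, attendance indicator) is a hypothetical placeholder, not material from the actual FWEP plan:

```python
from dataclasses import dataclass, field

@dataclass
class Outcome:
    description: str
    indicator: str   # how the outcome will be measured
    horizon: str     # "short", "medium", or "long" term

@dataclass
class LogicModel:
    inputs: list                 # resources allocated to the program
    activities: list             # what the program does with those resources
    outputs: list                # direct products of activities (delivery, participation)
    outcomes: list = field(default_factory=list)

    def is_complete(self) -> bool:
        """A complete logic model has every section filled in and
        every outcome paired with a measurable indicator."""
        sections = [self.inputs, self.activities, self.outputs, self.outcomes]
        return all(sections) and all(o.indicator for o in self.outcomes)

# Hypothetical example, not drawn from the FWEP program plan:
model = LogicModel(
    inputs=["facilitators", "meeting space", "curriculum materials"],
    activities=["weekly mentoring sessions"],
    outputs=["sessions delivered", "participants enrolled"],
    outcomes=[Outcome("improved school attendance",
                      "attendance rate from school records", "medium")],
)
print(model.is_complete())  # True
```

A completeness check of this kind mirrors the review step in Stage III of the project charter, where draft plan components are examined for missing links before finalization.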
TIMEFRAME
The practice is designed to be delivered in twelve two-hour sessions and can be dynamically adapted to the needs and constraints of the particular program or organization.

STAKEHOLDER INVOLVEMENT
The primary involvement of stakeholders is limited to the project planning phase. Input regarding programmatic needs, as well as the specific timing and content of sessions, is mutually determined and agreed upon by the project team and the stakeholder group. The following initiatives represent HHD's traditional role as a convener of community partners and as a participant in local collaboratives aimed at improving health outcomes:
o Affordable Care Act (ACA) – HHD led partners in the Gulf Coast Marketplace Collaborative to provide outreach and enrollment assistance for the Affordable Care Act.
o AIM (Assessment, Intervention, Mobilization) – visits targeted low-income neighborhoods to collect data, provide links to services, and mobilize action.
o Project Saving Smiles – provides dental screenings and sealants for permanent teeth to high-need second graders in Houston-area schools.
o Vision Partnership – provides vision screening events and glasses to low-income students in Houston-area schools.
o 1115 Medicaid Waiver – fifteen HHD initiatives that meet the HHS and CMS “triple aim” criteria of (a) improving population health, (b) improving health care delivery, and (c) reducing health system costs.
o My Brother's Keeper Houston (MBK Houston) – serves as the backbone organization for over 200 partners working to improve outcomes and provide second chances for men and boys of color.
o Community Re-Entry Network – re-integrates parolees with their families and communities and reduces recidivism.
By leveraging these relationships, HHD is strategically positioned to intervene with these and associated organizations through capacity building.
As an example, the capacity building partnership with FWEP extended from the organization's initial engagement as a partner with the MBK program.

ESTIMATE OF START-UP COSTS / BUDGET BREAKDOWN
List of resources:
o People: OPERE team (3 people), FWEP leadership and staff
o Supplies: meeting space, laptop, projector, flip chart, markers, Internet connection, pens, paper
o Expertise: knowledge of FWEP, program planning, evaluation
o Time: planning (3 hours/session), delivery (2 hours/session)

The cost of implementing this model varies depending on the resources available to the program in question. HHD was able to leverage its own facilities and employees, as well as some of the material resources of its organizational partner (e.g., projector, laptop), to implement the model and as such incurred and charged no direct costs. The monetary values in the following budget represent estimates of the cost of the services rendered, not an amount paid by the stakeholders; all services were rendered in-kind. The hours associated with each task were calculated using the methods proposed by Bryan Chapman (2010) in his work “How Long Does It Take to Create Learning?” The percentages in parentheses represent the proportion of total time devoted to each task.

o Needs Assessment: 43.2 hours (3%)
o Prepare Project Plan: 28.8 hours (2%)
o Conduct Course Content/Learning Analysis: 360 hours (25%)
o Develop Prototype Lesson: 345.6 hours (24%)
o Develop Flowcharts: 14.4 hours (1%)
o Develop Script/Storyboards: 100.8 hours (7%)
o Produce/Acquire Media (photos, audio, video): 43.2 hours (3%)
o Author Course: 432 hours (30%)
o Evaluate the Course (In-Process Reviews): 72 hours (5%)
o Total Hours to Create Course: 1,440 hours (100%)
o Instructor Prep Time: 216 hours

Labor Costs
o Instructional Designer Labor Cost: $40,320
o Instructor Labor Cost: $18,144
o Total Labor Cost: $58,464

Miscellaneous Costs
o Equipment: $350
o Outside Vendor / Consultant / Video: —
o Other: $160

Grand Total: $58,845
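The hour breakdown above can be sanity-checked with a short script. The task names and hour figures are taken directly from the budget; the whole-percent rounding used to reproduce the parenthetical percentages is an assumption of this sketch, not something stated in the source.

```python
# Sanity check of the course-development hour breakdown.
# Figures come from the budget above; percent rounding is an assumption.
tasks = {
    "Needs Assessment": 43.2,
    "Prepare Project Plan": 28.8,
    "Conduct Course Content/Learning Analysis": 360.0,
    "Develop Prototype Lesson": 345.6,
    "Develop Flowcharts": 14.4,
    "Develop Script/Storyboards": 100.8,
    "Produce/Acquire Media": 43.2,
    "Author Course": 432.0,
    "Evaluate the Course": 72.0,
}

total_hours = sum(tasks.values())
print(f"Total hours: {total_hours:.1f}")  # the budget states 1,440 hours

# Each task's share of the total, rounded to a whole percent.
for name, hours in tasks.items():
    print(f"{name}: {hours} h ({hours / total_hours:.0%})")
```

Running this confirms that the individual task hours sum to the stated 1,440-hour total and that each task's share matches the percentage given in parentheses.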
The project team conducted qualitative evaluations of intervention processes and outcomes in two ways. First, at the end of each session, they solicited feedback from participants about their experiences that day. Using a facilitated group activity called Δ | + (Delta-Plus), the project team solicited specific feedback from stakeholders about which aspects of the session went well and which aspects needed improvement. Positive feedback that participants consistently offered related to the extent to which the facilitated sessions allowed for productive collaboration. This collaboration was particularly helpful as it provided participants with opportunities to work across levels of experience and seniority in ways that contrasted with their traditional mode of operation. For instance, the facilitators observed a positive change in the frequency and nature of contributions from operational-level staff over time. Initially, managerial- and executive-level staff members contributed more frequently and were more likely to challenge or critique the ideas and contributions of others; this kind of constructive dialogue became more balanced between the groups over successive sessions. The discussion-based nature of the sessions was also very well received, as it allowed participants to exchange and understand their coworkers' ideas. Throughout the process, participants indicated that they were able to clarify their comprehension of key programmatic foundations. One particularly relevant anecdote relates to the formulation of a problem statement for the intervention. Through a group process of idea elicitation, it became clear that each individual had differing ideas about the primary issue their program exists to address. Responses to the question, “What is the problem your program ultimately seeks to address?” ranged from the general (e.g. “poverty”) to the specific (e.g. “parental unemployment”), and from the individual (e.g. “personal hygiene”) to the communal (e.g. “community violence”). When presented with this diversity of ideas, it became apparent to the group that a lack of common understanding of the program's foundations was at the root of many of their planning and implementation challenges. These sorts of realizations (those that result from exposing unspoken assumptions about shared ideas) were observed frequently and were identified as a benefit of engaging in this sort of facilitated group process. A contrasting anecdote relates to the Connection Circles activity and the group's journey toward aligning their practice wisdom with the scientific evidence base. In constructing visual models of the relationships between risk and resilience factors they had observed in practice, and comparing those models to the literature, the group found that many of their experiences and intuitions were validated by research. Whereas the previous anecdote highlighted discordance between ideas, this was an instance of discovering concordance between validated studies and professional experience. Participants indicated that this added value to their journey through the planning process and deepened their understanding of the work they do. A challenge identified early in the process was time management. Participants indicated that they placed high value on starting and ending sessions on time, regardless of progress through the day's lesson plan. As such, the project team adopted new time management strategies that allowed for more flexibility in the distribution of activities between sessions: in the event that a particular lesson plan remained incomplete at the end of the allotted time, activities were rolled over and incorporated into the following session's lesson. At the midpoint of the process, the project team met with program leadership to gauge impressions and expectations vis-à-vis the plan laid out in the project charter and the stakeholders' experiences to date.
In addition to these evaluative activities, the project team engaged in direct observation of stakeholder progress within each session and throughout the process. As part of the team's continuous quality improvement methodology, adjustments were made to various aspects of the process as deemed necessary based on the learning and behavioral expectations set out in the project charter.
Studies show that the sustainability of learning/capacity building in a nonprofit context is driven by: (a) program flexibility – the program can be modified over time; (b) the presence of a “champion”; (c) a program that “fits” its organization's mission and procedures; (d) benefits to staff members and/or clients that are readily perceived; and (e) support from stakeholders in other organizations (Scheirer, 2005). In this project, HHD's role as a facilitator and agent of change embodies all of the factors listed above. First, FWEP and HHD have collaborated as partners for decades; for example, FWEP is housed in one of HHD's facilities. HHD considers this capacity building project an expansion of its partner roles in the community, ensuring that its mission is accomplished. By leveraging this and other existing relationships, HHD is strategically positioned to continue and expand its capacity building efforts with these organizations. Second, all services and tools were donated on an in-kind basis, and thus at no cost to FWEP. Having embraced the process and committed to incorporating both planning and implementation in their program, FWEP administrators can include regular planning and evaluation to assure continuous quality improvement. Third, funding agencies are attracted to evidence-based projects; if implemented as planned, this project should enhance the program's attractiveness to funding agencies, thereby increasing its sustainability. One of the most significant lessons learned throughout this process is that the DPPM has implications beyond the production of a program implementation and evaluation plan. The most significant impacts of this sort of facilitated process relate to the ways that participants grow in their understanding of the significance of their roles and how they fit into a larger ecosystem of actors. 
This growth enables them to perform at a higher level in their professional functions, guided by the planning and evaluation products that they develop throughout the process.
