Program and Service System Evaluation (PSSE)

Our Team:

Joy S. Kaufman, Ph.D. and Cindy A. Crusto, Ph.D., Directors
Amy Griffin, M.A., Senior Evaluation Consultant
Doreen Fulara, M.S.W., Evaluation Consultant
Katina Gionteris, M.A., Statistician/Data Manager
Cindy Medina, B.S., Research Assistant
Jessenia Medina, B.S., Research Assistant
Emily Melnick, M.A., Evaluation Consultant
Diane Purvin, Ph.D., Evaluation Consultant
Joanne Richardson, B.S., Program Coordinator
Charlene Voyce, M.P.H., Evaluation Coordinator/Research Associate
Kathryn Young, M.A., Research Associate II/Data Manager

Evaluation has been an integral part of The Consultation Center since its inception more than 30 years ago. The overall mission of The Center’s evaluation services is to enhance scientific knowledge about a given question, program, or service, and to inform public policy. Evaluations are conducted by Center faculty and professional staff and are tailored to meet the needs of an individual client or project. Both qualitative and quantitative methods are employed, and in many instances they are combined in a single design to understand a particular issue in context. Evaluations are conducted on a local, regional, or national basis and may involve single- or multi-year assessments. Our evaluation activities include needs and resource assessments, process and outcome evaluations, focus group studies, cost-outcome evaluations, service system analyses, and assessments of community coalitions.

Evaluations conducted in collaboration with The Consultation Center faculty and staff combine scientific rigor with the practical realities of implementing feasible evaluations that are responsive to local needs. As a result, evaluations are consistently useful; they inform practice, program planning and management, and policy development. Center staff and faculty provide training and technical assistance to community-based organizations with the goal of enhancing the organizations’ evaluation capacity. In addition to designing evaluations that yield data useful for understanding the processes and outcomes of a given project or organization, we strive to develop evaluation infrastructures that are sustainable beyond our tenure, so that programs and organizations can continue to use data to inform their program and policy planning.

Our team takes a collaborative approach to evaluation in which we join with key stakeholders (e.g., funders, policy makers, program staff, consumers) to develop data collection variables, methods, and outcomes. We first work with stakeholders to create measurable language that helps identify and articulate the goals, objectives, indicators, and outcomes related to the overall vision of their work. We then create structures to collect and report key process and outcome data that track progress toward those outcomes and goals. Data are analyzed and fed back to stakeholders at regular intervals, both through presentations and in written form through mechanisms such as quarterly reports and newsletters. Our presentations and reports are developed to be accessible and relevant to multiple stakeholders, including community members, program staff, and policy makers. Creating structures for ongoing data collection and reporting provides an opportunity for continuous quality improvement and also educates stakeholders on how to interpret and effectively use data. Our ultimate goal in any endeavor is to build the capacity of community-based organizations and their funders to collect and use data to inform program and policy decision making. Our evaluation results have been used to inform funders about structures for program replication, to provide process and outcome data that secure additional funding for sustainability, and to offer lessons learned that shape policies and procedures for programming. Additionally, faculty and staff from the PSSE area partner with our community collaborators to prepare manuscripts highlighting strategies that bridge the gap between science/evaluation and practice.

Examples of current evaluation projects include:

  • Evaluations of statewide children’s mental health systems of care;
  • Evaluations of the implementation of trauma-focused evidence-based clinical interventions for children and their families;
  • National cross-site evaluation of a demonstration initiative to reduce the rate of homicides resulting from domestic violence;
  • Evaluations of statewide public health initiatives;
  • Evaluation consultation and technical assistance to state departments implementing programs for populations at risk for poor health outcomes;
  • Provision of training and technical assistance to community-based organizations regarding the planning and implementation of needs assessments and program evaluations on a fee-for-service basis.

Examples of recent publications include:
  • Case, AD; Byrd, R; Claggett, E; DeVeaux, S; Perkins, R; Huang, C; Sernyak, MJ; … Kaufman, JS. Stakeholders’ Perspectives on Community-Based Participatory Research to Enhance Mental Health Services. American Journal of Community Psychology, in press.
  • Roberts, YH; Huang, CY; Crusto, CA; Kaufman, JS. Health, ED use, and early identification of young children exposed to trauma. The Journal of Emergency Medicine, 2014, DOI: 10.1016.
  • Roberts, YH; Campbell, C; Ferguson, M; Crusto, CA. Role of parenting stress in young children’s mental health functioning after exposure to family violence. Journal of Traumatic Stress, 2013, 26(5), 605-12. doi: 10.1002/jts.21842 NIHMSID: NIHMS584622.
  • Crusto, CA; Whitson, ML; Feinn, R; Gargiulo, J; Holt, C; Paulicin, B; Simmons, W; Lowell, DI. Evaluation of a mental health consultation intervention in preschool settings. Best Practices in Mental Health: An International Journal, 2013, 9(2), 1-21.
  • Schlauch, R; Levitt, A; Connell, CM; Kaufman, JS. The moderating effect of family involvement on substance use risk factors in adolescents with severe emotional and behavioral challenges. Addictive Behaviors, 2012, 38, 2333-2342.
  • Kaufman, JS; Crusto, CA; Quan, M; Ross, E; Friedman, SR; O'Reilly, K; Call, S. Utilizing program evaluation as a strategy to promote community change: Evaluation of a comprehensive community-based family violence initiative. American Journal of Community Psychology, 2006, 38(3-4), 191-200.