Peer Outcomes Protocol Project
POP: Background and Purpose
The Peer Outcomes Protocol (POP) Project involved developing, field-testing, and disseminating an evaluation protocol to measure service and programmatic outcomes for community-based mental health peer support programs. Funded as part of the University of Illinois at Chicago’s (UIC) National Research and Training Center (NRTC) on Psychiatric Disability (1995-2000), the project was directed by Dr. Jean Campbell of the Missouri Institute of Mental Health, University of Missouri (www.cstprogram.org). It was largely designed, directed, and implemented by researchers, advocates, and providers who themselves have a mental illness. As such, the POP reflects specific areas of concern to individuals in recovery who are served in the public system.
During the first project phase, Dr. Campbell designed and administered a telephone survey to 40 peer-run programs across the country assessing their data needs. Her findings are summarized in a report available on this website. This effort represents the first wide-scale survey of peer-run programs' computer and research needs conducted in the United States. Next, Dr. Campbell convened a group of people with and without mental illness to generate potential items for the POP. She led a two-day concept mapping meeting in Missouri in which domains for outcome evaluation were identified, ranked by importance, and analyzed to generate a three-dimensional map showing the relationships among domains. This analysis also helped to generate draft items representing each of the POP's domains. Based on the concept mapping report and input from the project's external Advisory Board, the first draft of the POP was constructed.
Download the report [PDF, 237KB].
During the second project phase, the POP peer research team conducted a preliminary field-test of the protocol with respondents from a self-help mental health center in St. Louis. The field-test elicited respondent feedback about the length of the survey, specific survey items, ease of understanding (for both face-to-face and telephone versions), and reading level (for the mailed version). In addition, the clarity of instructions for interviewers and respondents, interview flow, scaled response set, skip patterns, and the three types of protocol administration were assessed. Based on this field-test and extensive consultation with UIC NRTC collaborators, the POP peer research team revised the instrument.
This second version of the POP was reviewed by an external panel of six leading researchers in mental health services outcome assessment and program evaluation across the country. Several of these reviewers were consumer researchers on faculty at universities or employed as evaluators in service delivery settings. These experts commented on the protocol's overall organization, clarity of items and instructions, respondent burden, client sensitivity, and ease of administration. These comments were carefully reviewed and addressed by both Dr. Campbell's research team and the UIC NRTC staff. The field-testable version of the POP was finalized on the basis of this review.
During the third project phase, Dr. Campbell and her team developed an Interviewers' Training Manual to be used in training peer interviewers. This manual covers topics such as an interviewer's responsibilities in administering a research protocol, procedures to obtain informed consent, strategies for asking questions appropriately and sensitively, methods for tracking completed and outstanding interviews, and techniques for debriefing research participants. This manual was reviewed by researchers at the UIC NRTC and their comments were incorporated into a revised version.
Also during this phase, Dr. Campbell and her colleagues coordinated the psychometric testing of the POP. This involved a two-and-a-half-day interviewer training session with nine peers who were hired to conduct the psychometric testing interviews. During this training, the researchers reviewed the purpose and content of each module of the POP, along with each question and its relevant prompts. The attendees also participated in role-playing and practiced with partners in front of the group to further develop their interviewing skills.
The psychometric testing occurred over a three-month period. A sample of 100 participants in a local community mental health program consented to be interviewed, and a random subsample (n = 41) agreed to be re-interviewed two weeks later to assess test-retest reliability. A number of established scales were used during these interviews to assess the concurrent validity of the POP. After interviewing was completed, a series of statistical analyses explored responses to the various modules. These analyses indicated the need for revisions to some POP items as well as suggestions for re-formatting the instrument. Following these recommendations, the authors developed the final draft of the POP instrument, its administration manual, a question-by-question guide, and a set of response cards. These materials were edited and formatted for public distribution by the University of Illinois at Chicago’s Survey Research Laboratory (www.srl.uic.edu).
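Test-retest reliability of the kind described above is typically summarized with a correlation between scores from the two interview waves. A minimal sketch in Python, assuming hypothetical module scores (the data and variable names below are illustrative, not the project's actual results):

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation between paired scores from two interview waves."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical scores for the same respondents, interviewed two weeks apart
wave1 = [12, 15, 9, 20, 14, 11, 18, 16]
wave2 = [13, 14, 10, 19, 15, 10, 17, 16]
print(round(pearson_r(wave1, wave2), 3))
```

In practice, an intraclass correlation coefficient is often preferred for test-retest designs, but a Pearson correlation conveys the basic idea: scores that are stable across the two-week interval produce a coefficient near 1.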