PLA 2018 | Philadelphia | March 20 (all day) | Project Outcome Training Workshop
FREE to PUBLIC LIBRARIES – www.projectoutcome.org
What is an Outcome? Knowledge | Confidence | Application | Awareness
What is an Outcome Measurement?
Needs Assessment (What does our community need?) | Output (How much did we do?) | Outcome (What good did we do?) | Patron Satisfaction (What should we do better?)
Why Measure Outcomes? To better measure and improve your library’s impact on the community it serves | To support planning and assessment over time | To help better manage services and resources | To demonstrate a need for funding and other support
Examples: Sacramento knitting club; Jacksonville PL used outcome data for funding justification (SRP and story time). Richard Mott said, “Of parents that attended our programs, 96% said that because of program attendance, they felt more confident to help their children learn.” Tells funders that libraries are essential.
Process: 1. Identify Needs 2. Measure Outcomes 3. Review Results 4. Take Action
Data Collection Team: Set up additional accounts to share training resources, and set up a training plan that includes an overview followed by the appropriate level of training for each person’s part (for example, survey administration). Build internal support and get staff buy-in.
Strategies for Building Internal Support:
- Start Engagement Early – Make everyone aware and give folks a chance to voice concerns and see who is interested in the process (Teen services, for example).
- Build Internal Support – Identify library leadership/Board/staff who believe in the value of outcome measurement to help carry the message and make the case.
- Be Upfront with What You Expect to Find Out – Know WHY you are doing this. Be transparent about what kind of information you are trying to capture with outcome data. Staff could see it as threatening and feel apprehensive about the change in process. The goal is to provide the best service possible – find out what is or isn’t working, and change what isn’t working to make it better. Ex: Summer Reading Program. Forward thinking. What ways have libraries found to gather feedback about the internal process? Seek out examples of how to check in with staff during the process.
Feedback from Dan Hensley, Carnegie Library of Pittsburgh, the process gave the library “piles of beautiful data” and served as a “great advocacy tool” to tell stories. (His video is archived at the project web site.)
Q&A: How many libraries think about and document outcomes WHILE planning the program – so you know what success looks like before you start? This project is adaptable, so it could be used to gather data for hyperlocal goals, outside the prescribed Project Outcome goals.
Outcome Measurement Continuum: From Patron-reported learning (immediate survey) to Patron-reported adoption/application of learning (follow-up survey) to Deeper analysis and long-term benefits (outcome measurement guidelines).
Survey Topics: Civic/Community Engagement | Digital Learning | Economic Development | Education/Lifelong Learning | Early Childhood Literacy | Job Skills | Summer Reading
Sample Immediate Survey – easiest, quickest option. Multiple choice with open-ended questions, too. Online or paper, and the survey can be customized. Example: Plano, TX included the survey with STEM kits that could be checked out and discovered a lack of knowledge of library programs, so a schedule/calendar was added to the kit. 90% of brochures were kept by patrons!
Follow-Up Survey – longer (2 pages), with more response space. “Patron-reported adoption” – any change of behavior? Skill used in life or work? Given 4-8 weeks later (or earlier for computer classes). Takes more staff time and is administered differently (ask the patron if it’s OK to contact them, gather contact information, then contact them for an interview).
The Summer Reading Survey is only available as an immediate survey. It includes the question, “What could the library do to help your child continue to learn more?” There are different versions for Caregivers, Teens, and Adults.
Process for Choosing the Right Survey: Identify Community Needs > Identify Library Goals (from Strategic Plan) > Choose Program & Survey Topic (avoid survey fatigue) > Choose Survey Type
Example of Survey Questions for Civic/Community Engagement Immediate Survey. Each Topic has a unique set of questions.
- You are more aware of some issues in your community
- You feel more confident about becoming involved in your community
- You intend to become more engaged in your community
- You are more aware of resources and services provided by the library
- What did you like most about the program?
- What could the library do to better assist you with your involvement in the community?
Follow-up survey questions:
- I became more involved in the community
- I used what I learned to do something new or different in the community
- I discussed or shared with others what I learned or experienced
- I checked out a book, attended another program, or used another library service or resource
- What did you like most about this program or service?
- What could the library do to help you continue to learn more?
The survey creation process is well designed and seems easy to use, but we have to be mindful of what data we want to pull out when creating and naming surveys. Custom questions can be added to the canned/standardized survey questions; however, the standardized questions can’t be edited, and the survey must be given in its entirety if you want your surveys included in the aggregated online project system. This keeps the data clean. Up to 3 open-ended questions can be added per survey, and common questions (for example, “How did you hear about this program?” or zip code) are available in a drop-down menu. Be mindful of survey fatigue and of confidentiality: do not ask for contact information on surveys. They are anonymous. Use a separate process to gather contact information for follow-up surveys.
LUNCH … so I’m going to publish the first part.
Administering the Survey – you can have a PDF paper survey or online survey (English or Spanish), unique for each survey. It’s tablet-friendly, can be emailed, or taken at a kiosk at the library. No translations for other languages, yet. OK to translate if you have a trusted translator (ask in discussion board for Russian).
Survey Best Practices: For the immediate survey, hand out the survey at the end of the program, email/text the link, give clear instructions, have a drop-box for completed surveys, build time into the program to complete the survey, and have a volunteer to help. For follow-up surveys, collect contact information at the end of the program and explain what it will be used for. Send the survey 4-8 weeks after; if calling or interviewing, plan to get help. Push the link to Facebook or send it to participants via Vertical Response.
Survey Schedule: For the year, stagger surveys and audiences. If collecting a baseline, maybe it makes more sense to consistently survey one program all year.
How to Talk to Patrons about Surveys: Strategies to talk to patrons about the value of their feedback. Scripts. “There’s always room to grow. Even if you love the library and the programs, it is always useful to get patron feedback, so we can serve you better.” “We want your honest opinion.” New ideas, help us brainstorm. “This is part of a national outcome measurement initiative managed by PLA.” “The survey is 100% confidential and does not require any contact information.”
From the Web site:
How do I complete the survey?
[For Immediate Surveys] Please read the survey carefully. The surveys measure responses on a 5-point Likert scale, with the additional option of “Not Applicable.” The Likert scale reads from left (Strongly Disagree) to right (Strongly Agree). Please select one response option for each question and make sure to complete the open-ended questions below, which ask you what you liked most about the program or service and suggestions for improvement.
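Results from a 5-point Likert scale like the one above are typically rolled up into a single “percent positive” figure (as in the 96% quoted earlier). A minimal sketch of that rollup, assuming the response labels and the exclusion of “Not Applicable” from the denominator; this is an illustration, not PLA’s official calculation:

```python
# Sketch: summarizing 5-point Likert responses from an immediate survey.
# The labels and the "percent agree" rollup are assumptions for
# illustration, not the official Project Outcome methodology.

SCALE = ["Strongly Disagree", "Disagree", "Neutral", "Agree", "Strongly Agree"]

def percent_agree(responses):
    """Share of valid responses that are Agree or Strongly Agree.
    'Not Applicable' and blank answers are excluded from the denominator."""
    valid = [r for r in responses if r in SCALE]
    if not valid:
        return 0.0
    agree = sum(1 for r in valid if r in ("Agree", "Strongly Agree"))
    return round(100 * agree / len(valid), 1)

answers = ["Agree", "Strongly Agree", "Neutral", "Not Applicable", "Agree"]
print(percent_agree(answers))  # 3 of 4 valid responses -> 75.0
```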
Survey Management Tool – Wow. You can archive older surveys, immediately see responses, you can draft or delete surveys, and you can copy surveys. To enter paper responses, there is a quick and easy button to do it one-by-one online when logged in or you can enter multiple responses through a form without logging in. Works well with volunteers. “The usefulness of your reports and dashboards relies on accurate data entry.” Tip: Mark surveys that have been entered, in case the pile falls on the floor…
You cannot EDIT the responses, you would need to delete and re-enter the responses. So be careful.
Review Results – PDF Summary report, data dashboard, raw survey data, qualitative data analysis, and tips for communicating data accurately
Report builder and step-by-step tools. Training videos are being made now. You can include a custom narrative and logo for board presentations. The PDF report includes general information/canned verbiage about the process and an overview of the survey purpose, then results with graphics, data, and comparisons. Eventually, we will be able to include a few choice open-ended responses. We can include attendance; the response rate is then calculated by the system. More boilerplate text is included at the end of the report – “Implications for community impact.”
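The response-rate figure mentioned above is presumably just responses divided by attendance; a tiny sketch of that arithmetic, with the handling of missing attendance as my own assumption:

```python
# Sketch: the response-rate calculation the reporting system performs
# when attendance is entered. Handling of zero/unknown attendance is
# an assumption for illustration.

def response_rate(responses_entered, attendance):
    """Response rate as a percentage of program attendance.
    Returns None when attendance is unknown or zero."""
    if not attendance:
        return None
    return round(100 * responses_entered / attendance, 1)

print(response_rate(27, 45))  # 27 surveys from 45 attendees -> 60.0
```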
Data Dashboard – set of interactive visualization tools that use the same design elements for consistent presentation. Purple is positive, Green is needs improvement, Grey is neutral.
- Overview – shows aggregate scores, including state and national averages.
- Matrix – a Topic-and-Outcome matrix that can be used to find gaps in service. Filters can be applied to narrow the data.
- Chord Diagram – on hover, draws correlations between topics and outcome indicators. A way to actively manipulate the data and/or show strong connections.
- Detail – breaks down each question with a bar chart and includes state and national scores for comparison.
- Map – plots library locations with outcome and demographic data. Look at geographic areas of service.
- Library Info – pulls from IMLS data (older), bringing general output (statistical) data into a similarly formatted graphic, consistent in look/feel with the other outcome data.
This is research with a little “r” – so we must provide context if the sample size is small, and we may need to provide additional information about why a program had a low response rate. We can access the raw data, do year-to-year comparisons, and access open-ended questions. The dataset also shows a comparison of print vs. online response rates.
Open Responses also available through Detail dashboard with light filtering. Can filter by program name to group multiple programs together to analyze data and open-ended answers. Ex: Early Literacy programs, including all 3 story times. Includes standardized questions and any pre-determined additional questions or unique questions written by library.
Analysis of Qualitative Data – challenging area to approach – you can read through them, but how do you make decisions and identify trends?
- Condense & Categorize Data – group comments according to common topics
- Describe Categories – Describe what people said most often and any smaller categories that you found meaningful. Start with categories that have the most comments.
- Share Findings – “Share internally with staff, discuss results at staff meetings, identify opportunities for change, or plan to use in external advocacy messaging.”
Create a spreadsheet that includes all of the responses and the categories determined in step 1, then score each response. Subjective process that might benefit from working with others. Archived webinar to further explain the process. Determine which categories are most prominent and then “describe what people said most often and any smaller categories that you found particularly meaningful.” Describe the trends that you see – “makes for nice messaging.”
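The condense-and-categorize step above can be prototyped before moving to a spreadsheet. A minimal sketch, where the keyword-to-category map is entirely hypothetical; in practice the categories emerge from reading the comments, and scoring is subjective enough to benefit from multiple coders:

```python
# Sketch of condensing and categorizing open-ended survey comments.
# The keyword-to-category map below is a hypothetical starting point,
# not a prescribed taxonomy; real categories come from the comments.
from collections import Counter

CATEGORIES = {
    "staff": ["librarian", "staff", "presenter", "instructor"],
    "schedule": ["time", "evening", "weekend", "schedule"],
    "content": ["topic", "learned", "hands-on", "materials"],
}

def categorize(comment):
    """Return every category whose keywords appear in the comment."""
    text = comment.lower()
    hits = [cat for cat, words in CATEGORIES.items()
            if any(w in text for w in words)]
    return hits or ["uncategorized"]

comments = [
    "The instructor was wonderful and patient.",
    "Please offer this in the evening.",
    "Loved the hands-on materials!",
]
# Count how often each category appears to find the most prominent ones.
counts = Counter(cat for c in comments for cat in categorize(c))
print(counts.most_common())
```

Starting with the categories that have the most comments, as the workshop suggests, maps directly to reading `counts.most_common()` from the top.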
Communicating Data Accurately | Challenges:
- Results based on number of survey respondents – be clear, “based on survey respondents” and include number of responses and response rate. Don’t try to infer data to a larger group.
- Surveys measure patrons’ perceived change – use results for program improvement and to determine if objectives are met, and back up big decisions with other data collection. Don’t use the data for published research.
- Data is a community snapshot – it only shows the patron’s perspective; triangulate with other data to show a complete experience, “bring in average scores over time for more reliability,” and frame the library as one factor in the outcome. Don’t claim causality.
Using Survey Results – What Is Your Goal? (p. 47 of the workbook)
- General advocacy
- Justify Funding Requests
- Programming Decisions
- Community-Based Partnerships
- Julianne Rist – Jefferson County Public Library (CO) set a community goal of minutes read in response to its Summer Reading surveys, with a donation to an animal shelter when/if the goal was met. Also used Project Outcome to evaluate 1000 Books Before Kindergarten, using the follow-up survey to see changes in behavior and whether readers complete the program. Tracked by zip code.
- Amy Koester – Skokie Public Library (IL), Village of 65,000, 90+ languages, 40% foreign born. Digital Learning Experience – targeted, modified surveys that include questions “Why did you sign up for this class?” (immediate) and “How successful was [class] at helping you achieve your goal?” (follow-up). Needs assessment built into Outcome survey.
- Christa Werle – Sno-Isle Libraries – had to come up with a common vocabulary and used many definitions determined by Project Outcome. Used the civic engagement survey to evaluate “Issues that matter” programs – homelessness last year, mental health this year. Also covers hyper-local or short-duration interests, for example “Living with Bears” or the solar eclipse.
- Interesting Q&A. Good uses of the process. ROI to determine value of programs.
[Insert me trying to close the library early due to weather in the middle of all this.]
Roadmap – the meat of this workshop…a plan of attack!