This first phase of NILPPA’s research provides the foundation for national metrics to assess how library programming affects library services and users. A critical step in this process is finding a way to characterize and categorize the breadth and variety of public programs occurring in libraries of all sizes and types. What topics and formats are in use? How are programs paid for? What audiences are being served? Who are the most valued community partners? How are programs evaluated to ensure quality and meaningful impact? And, ultimately, what outcomes are evidenced through effective public programming?
To avoid redundancy and learn from others, NILPPA researchers began by delving into and leveraging related projects. These include Project Outcome, which was critical for inventorying program descriptions and dimensions, and Measures That Matter, a joint project of IMLS and the Chief Officers of State Library Agencies that has worked to streamline data from the nation’s public libraries since October 2016.
To develop initial categories, NILPPA researched numerous related reports, organizations, and other programming efforts. The team worked closely with Project Outcome staff to define a categorization scheme initially based on content or topic. Programming Librarian, an ALA website, provided its menu options, such as program budget, library type, program topic, program type, and audience. The site was recently redesigned based on how library workers use the information: categories that received few clicks were eliminated, while popular categories were kept. As the site continues to be refined and updated, it serves as a reflection of library programming work in the field.
The University of Washington surveyed visitors’ use of technology in libraries, resulting in a categorization of technology use, relating to education, employment, health and wellness, civic engagement, and more, that could potentially inform library programming. To further examine how programming is visualized and discussed elsewhere, the NILPPA team reviewed efforts such as the Pew Library Typology study, which confirmed the importance of library public programming to the American public. Meanwhile, the Public Libraries Survey, conducted by IMLS, provided NILPPA’s first working definition of program, and WebJunction’s competency index for the library field emphasized new overarching components: 21st-century skills, accountability, and community engagement, all of which are applicable to the goals of public programming.
Drawing from this rich background of research, the NILPPA team undertook a detailed meta-analysis that became the basis for a draft categorization scheme, reviewed and refined by the project advisors. The evidence-based classification comprises four dimensions, each further refined by sub-dimensions, as shown in FIGURE 1.
From October 2017 through February 2018, the researchers and advisors continued to refine the model and ensure its relevance to all program types. One significant change was the addition of a single key question (or sub-dimension) to each dimension. Each question relates to the program’s goals and intentions, an important component of NILPPA’s eventual objective of measuring impact.
- Library Profile: What type of library is it?
- Program Characteristics: What is the most important intended outcome? Categories may include such outcomes as education, recreation, or dialogue.
- Audience Scope: Is the program trying to appeal to the library’s entire audience or to a subset? Subsets might be defined by age group or by special interest, such as bilingual programs.
- Program Administration: How was the program developed? Is this a library-developed program, one developed with a partner, or one adopted from a national organization, such as ALA?
From April through October 2018, the researchers conducted a Library Programming Validity Survey to corroborate the research team’s definitions and preliminary categorization schemes. The survey, which reached a wide field (see FIGURE 2) of library professionals in a variety of library environments, focused on clarifying the anticipated outcomes of programs, which feature prominently in the next phases of the NILPPA initiative. Focusing on outcomes addresses the importance of developing programs intentionally rather than opportunistically. Furthermore, to determine collective, nationwide impacts, libraries must first measure their programs’ outcomes on an individual level.
Researchers also conducted a series of case studies of programs across library types to illustrate intended outcomes and to determine whether new intended outcomes should be added to the framework. Based on these studies, one additional dimension emerged, one sub-dimension was adapted, and one was eliminated. This process produced the following intended outcomes, representing library programs across the country.
- Participants learn new knowledge.
- Participants learn new skills.
- Participants change their attitudes.
- Participants change their behaviors.
- Participants gain awareness of library resources, services, or programs.
- Participants have fun or are inspired.
- Together, libraries and participants build stronger and healthier communities.
Research Findings: What Makes a Library Program?
Ultimately, this extensive testing process resulted in the Library Program Categorization shown in the framework in FIGURE 1.
Identifying the primary dimensions and sub-dimensions of library programs is essential to the subsequent phases of the NILPPA research. The second question of this study explores how best to prepare those library professionals responsible for building and delivering effective programs. Together, these findings provide a more nuanced understanding of the nature of library public programs today.