Holding Yourself to a High Standard of Quality When Using Assessments

Stacie Cherner

There is an unprecedented level of attention being given to the value and applicability of assessment tools, particularly in the field of education. Certainly this positive development is in part a result of the vast amounts of data seemingly at our fingertips. Practitioners, target audiences, funders, local organizations, and other key stakeholders recognize that the programs, initiatives, curricula, and other interventions in question can be measured. And while not every situation lends itself to assessment, the Jim Joseph Foundation has a guiding principle that if the results of an assessment will inform that educational opportunity and others, then, yes, assess!
In too many instances for too many institutions, however, deciding to assess is the end of the conversation. Yet a second, equally important issue needs to be addressed: which assessment tool (or tools) will yield the most useful results? Not every assessment is high quality, and certain assessments are more effective than others for specific classroom settings or other educational environments. Educators and education leaders constantly focus on improvement, whether of learning outcomes or of the learning experience. This same mindset should be applied to assessments, as there are always ways to improve how we measure our educational efforts and interventions.
At the Jim Joseph Foundation, we are funding the development of an assessment of teen Jewish learning and growth outcomes. This work is part of our Cross Community Jewish Teen Education and Engagement Funder Collaborative, a platform for shared learning and collaboration among grantmaking professionals at Jewish foundations and federations. All involved parties plan to invest in, and in many cases already have invested in, community-based Jewish teen education initiatives designed to achieve the group's shared measures of success (for example, engaging Jewish teens and achieving sustainability).
The Foundation funds this assessment because, along with our partner communities, we want to glean as many learnings as possible from the Collaborative’s efforts. Which grantmaking strategies are most effective in which communities? What program characteristics lead to better learning and growth outcomes for Jewish teens? These are complex questions that require time and resources to answer. 
Developing a set of common outcomes for the initiative itself was no small feat. But under the leadership of The Jewish Education Project, the Collaborative came to agreement on what outcomes the various local initiatives would strive to achieve (e.g., Jewish teens establish strong friendships, and Jewish teens feel a sense of pride about being Jewish, to name just two). The evaluation team then developed a teen survey to measure initiatives against those outcomes, using a rigorous process of expert interviews, teen focus groups, and pilot testing to ensure that the survey questions measure the intended constructs.
The survey was piloted in three communities this summer. Now the evaluation team is analyzing the survey results, seeking input from key stakeholders and experts, and conducting another round of cognitive testing, all in order to revise the survey items so they measure the impact of Jewish teen initiatives even more effectively moving forward. Undoubtedly this is a lengthy process. But by “getting it right,” we will improve our assessment ability in this space, benefitting teens and the entire field.
From the Foundation’s perspective, equipping grantees to assess their programs represents sound use of funder assets and grantee time. We welcome the decision of many grantees to contract with independent evaluation firms to help them develop assessment tools tailored to measure their programs and desired outcomes. A truly valuable resource in these efforts is the Jewish Survey Question Bank (JSQB), funded in part by the Foundation, which gathers survey questions used across the Jewish education field and categorizes them by topic. This vast collection is intended to make it easier and more efficient for schools, organizations, and individuals to develop their own surveys to assess their efforts.
As we look to further advance the quality of assessment of Jewish education initiatives, the secular education arena is a good model to reference. There, many longstanding assessment tools exist, designed to be used by a range of education programs. From my past experience in this arena, I am familiar with the key questions to ask before deciding whether to begin an assessment and, if so, which assessment to use. Some useful questions for day schools to keep in mind include:
1) Is the assessment burdensome or relatively easy to administer? For example, some schools have unreliable technology or Internet access, so a web-based assessment tool may be too cumbersome to administer. In other schools, a paper-and-pencil version may be better.

2) Does the timing of the assessment sync with our need for information? For example, some classrooms may benefit from an initial assessment at the beginning of the school year so the results can be used for diagnostic purposes. Other classrooms might benefit more from a mid-course assessment. Either way, both assessments could be informative to the entire school, or even the broader field, and should be leveraged appropriately.

3) Does the assessment measure the learning outcomes we are trying to achieve? Naturally, some assessments are more aligned with the actual curriculum being taught than others. It is well worth the time to review multiple assessment frameworks before selecting the appropriate one. 

4) Are the results easy to understand and act upon? Some assessment reports are so complicated and data-heavy that they are nearly impossible to wade through or to glean best practices from. The best reports offer clear findings and essentially lay out a road map of small tweaks or large-scale changes to improve the education experience being measured.

5) What is the value of having comparable data from previous years? While seeking the best assessment tool is always a worthy endeavor, there are real benefits, too, to comparing current results with past results or with those of a wider pool of respondents. If a program has been assessed a certain way for years, or even decades, the best decision may be to stick with that framework.
Whether in Jewish or secular education, assessment is a best practice, and high-quality assessment is an even better practice. From Jewish camping initiatives to teacher training programs and other grants, we at the Jim Joseph Foundation, along with our more than three dozen major grantees, have used assessment to improve existing efforts and to inform new ones. Its value certainly applies to day schools as well.


Stacie Cherner is a program officer at the Jim Joseph Foundation, which seeks to foster compelling, effective Jewish learning experiences for young Jews in the United States. Established in 2006, the Jim Joseph Foundation has awarded more than $350 million in grants to engage, educate, and inspire young Jewish minds to discover the joy of living vibrant Jewish lives. [email protected]

HaYidion: Taking Measure, Fall 2015