HAYIDION THE RAVSAK JOURNAL
Surveys, Feedback Loops and Continuous Improvement
by Sacha Litman
Issue: Taking Measure
It has become the norm for companies to seek feedback constantly, via email or phone surveys after every purchase, and to track sentiment about themselves (and take corrective action) on social media. Why the obsession? Because companies have figured out that in a competitive marketplace, managing customer sentiment and loyalty is critical to profits, and a channel for honest feedback is essential to good managerial decisions.
There is a growing body of evidence in the nonprofit sector that listening to customer feedback leads to greater impact. Major foundations have initiated programs like the Fund for Shared Insight to encourage nonprofits and themselves to do a better job of listening to the voices of their customers, and creating continuous improvement cycles.
So, too, for schools. When full-tuition families are paying $20,000-$40,000 a year for their children’s education over many years, often the greatest expense in their lives and in some cases more than their annual mortgage, the school had better be listening to them on an ongoing and systematic basis.
Parents’ perceptions of a Jewish day school play a critical role in enrollment. In fact, we’ve tested many other factors, like tuition level and advertising, and none has the systematic effect of parent perceptions. In synagogue and while shopping, at Starbucks and in drop-off lines, at birthday parties and bar mitzvahs, parents exchange their impressions of their children’s school and other parents listen. While Measuring Success has collected feedback from nearly 100,000 parents, students and alumni in schools, we sense that the field still has large pockets of resistance to capturing and using feedback.
So where does the resistance come from? Primarily from board members or administrators with strongly held anecdotes, intuitions and “gut” feelings about the changes the school needs to make. These anecdotes tend to be formed by a family’s own experience with their child, or a complaint from a close friend during kiddush. And as we know, it is often the loudest voice or wealthiest funder at the table, the squeakiest wheel, who gets the grease. The problem is knowing whether that anecdote holds for the larger population in question. We have tested this by asking school leaders to hypothesize, based on their anecdotal knowledge, the answers to questions like, “Which demographic groups in your school community are the happiest or the least happy?” Eighty percent of those anecdotes are not supported when we examine them against representative feedback data. That means potentially 80% of all the time and energy your school spends goes to initiatives that will not make a difference. So how can we get more bang for the buck from our time and resources? By listening to our customers via the data.
There are many ways to collect feedback data for schools. We have outlined a number of them here in a concentric circle model, explaining when each feedback mechanism should be used, what questions it can answer, and how to go about it.
Parent Surveys
Capturing the parent body’s objective feedback on annual surveys—or increasingly “pulse surveys” that are shorter and more frequent throughout the year—is essential. It gives a school the best leading indicator of its likelihood to grow or shrink, and a chance to demonstrate customer responsiveness by quickly addressing issues once they’ve been identified. It’s also a great way to learn what parents are absorbing from the multitude of communications they receive about the school. At Schechter Westchester, we measured parent perception of the school’s quality relative to local private and public schools, and used the insights to direct focus to areas that the data suggested mattered to parents. It was a revelation to see that in some academic areas in which the school objectively was performing well based on test scores, parents still perceived the school as weak. The school was not communicating effectively and needed to improve its marketing efforts in those areas.
Working over the past five years with over 100 Jewish day schools on parent surveys, through a program originally funded by PEJE, we have had the opportunity to see longitudinal improvement in many of the schools that have made these surveys an annual or every-other-year occurrence. For many schools, the benchmarks comparing their scores against other Jewish day schools and the local private and public competition are very useful in year one, but over time, it’s the longitudinal progress that matters most to them. Schechter Westchester’s improvement over time, such as a nearly 10 percentage point increase in likelihood to strongly recommend the school, enabled the professional leadership and board to see that the changes they had put in place were making a difference in the minds of the customer.
Naturally, parent sentiment is important not only for retention of current parents, but also to attract new families. Many schools wonder how they can attract families not yet in the school. The best way to do so is not advertising, but rather ensuring that families currently in the school have a high opinion of the school. Positive perception drives word of mouth, which is the most important marketing. Few new parents will spend tens of thousands of dollars without speaking several times to parents in their social networks who have already experienced the school. It’s a critical reference. In fact, Schechter Westchester’s enrollment ended up exceeding budgeted projections by 35 students due to lateral entry increasing substantially and attrition plummeting 43%.
Student Surveys
For schools with high schools, the customer and buying decision includes not only the parents but the high school students as well. As with parent surveys, it is very important that student surveys be objective and protect anonymity, since students even more than parents fear that honest critical replies will be used against them. Administrators, for their part, are concerned that students may “collude” with their friends on responses in order to negatively bias the results. The Yeshivah of Flatbush solicited student input to make improvements to electives, scheduling, Judaic studies, Hebrew and school culture. It experimented with various structures until arriving at having all students take the survey at the same time during homeroom, under the supervision of teachers. That took care of collusion. But to ensure students felt they could be honest, Flatbush made it clear that only Measuring Success, as the objective third party, would analyze individual responses; the Flatbush administration would see the data only in the aggregate, in a manner that could not be traced to any individual student.
Feedback from Families Who Left the School or Who Applied but Did Not Attend
What we have found most effective are qualitative interviews of as many of those families as are willing to speak with the school. The reason is twofold: there are usually only a dozen or two of these families (depending on the size of the school) each year, making it difficult to interpret quantitative results for lack of sufficient sample size, and these families are also less inclined to answer surveys because they opted out. Another important time to use qualitative feedback is to generate hypotheses to test in a quantitative survey, or to use focus groups to help interpret confusing survey results. The challenge with qualitative feedback is to ensure that it is representative of the populations in question.
Alumni Surveys
Alumni surveys tend to serve three different purposes: demonstrating the school’s impact on alumni to prospective families (marketing), giving the administration feedback on how to improve impact, and fundraising. Alumni have a perspective different from current students and parents in that they can assess how effectively the school prepared them—academically, ethically, socio-emotionally and Jewishly—for high school, college and career relative to their peers. As living proof of your school’s impact, alumni are a critical source of feedback as to whether your school has added value. For example, a large independent school in Nashville is capturing alumni feedback via surveys for each of the next five years to generate marketing content for prospective families.
But while young alumni are critical for measuring impact, older alumni become critical sources of fundraising, as they have sufficient earning power to make significant contributions. Many schools make the mistake of waiting until an alum is over 40 and has landed in the news for business success to start cultivating a relationship. By then, it is often too late, as the alum has loyalties to several other alma maters like their college and graduate school, as well as other charitable causes which have actively cultivated a relationship with them. But it’s never too late to catch up.
Charles E. Smith Jewish Day School ran an alumni survey this year in anticipation of the school’s 50th anniversary. Asking alumni for their feedback is often the first step in rebuilding trust and a relationship. The survey results were also uploaded into the school’s database for improved knowledge of their alumni. The survey data can be analyzed to predict which alumni have the best likelihood to make a significant gift. This approach has helped federations to identify which midlevel donors have the greatest propensity to become major donors of the future. Alumni offices can receive ranked lists to use their limited time and energy on those alumni with the greatest likelihood of success. Universities also use these predictive models frequently.
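The ranked-list idea above can be sketched in a few lines of code. This is a minimal, hypothetical illustration, not the model Measuring Success actually uses: the feature names and weights here are invented for the example, and a real propensity model would be fit to historical giving data (for instance, with logistic regression).

```python
# Hypothetical sketch: ranking alumni by a giving-propensity score
# derived from survey responses. Features and weights are illustrative
# only; a real model would be fit to historical giving data.

def propensity_score(alum, weights):
    """Weighted sum of survey-derived features, each scaled 0-1."""
    return sum(weights[f] * alum.get(f, 0.0) for f in weights)

def ranked_prospects(alumni, weights):
    """Return alumni sorted from most to least likely to give."""
    return sorted(alumni, key=lambda a: propensity_score(a, weights),
                  reverse=True)

# Illustrative weights (would come from model fitting in practice).
weights = {
    "recommend": 0.4,        # would strongly recommend the school
    "attends_events": 0.3,   # shows up to alumni events
    "years_since_grad": 0.2, # proxy for earning power
    "prior_small_gift": 0.1, # has given before, at any level
}

alumni = [
    {"name": "A", "recommend": 0.9, "attends_events": 1.0,
     "years_since_grad": 0.5, "prior_small_gift": 1.0},
    {"name": "B", "recommend": 0.3, "attends_events": 0.0,
     "years_since_grad": 0.2, "prior_small_gift": 0.0},
]
top = ranked_prospects(alumni, weights)[0]["name"]
```

The alumni office would then start its outreach at the top of the ranked list, spending its limited time where the likelihood of success is greatest.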
Community Surveys
A survey of this final concentric circle is increasingly in demand because it identifies the potential size of the marketplace interested in Jewish day schools, as well as the individual prospective families that would be the best fit for the school. Charles E. Smith, for instance, sent a survey seeking to understand the educational perceptions of Jewish parents to the databases of 27 community organizations, including synagogues, JCCs, PJ Library and other early childhood programs. We were able to identify the groups of parents most likely to attend the school, as well as a list of best-fit prospective families that the admission office could cultivate.
While algorithms and analyses undergird much of the work identified above, the most important lessons we have learned are about trust and intention. Feedback is a powerful trust-building tool that tells your stakeholders that you are listening (especially when you take action on the feedback). Transparency about survey results, especially in areas where customers are critical of the school’s performance, also builds trust with your customers because you are acknowledging their feedback. The other lesson about intention, or kavannah, speaks to the importance of focus and accountability. Schools that are committed to listening to stakeholder feedback, setting a measurable goal, and acting on it consistently see statistically significant improvements in those areas they focused on.
One common question that comes up is how often a given stakeholder group should be surveyed for feedback. Some schools run surveys only as a requirement of their accreditation process every seven years, clearly too infrequent for a regular feedback loop. Others would argue that parent surveys run every year or every other year are still too infrequent to capture feedback quickly enough to make a difference. On the other hand, some worry that surveying too often will cause survey fatigue among respondents, lowering response rates and stressing a school that has not yet had time to implement changes from the last round of feedback.
At one extreme is the argument that feedback must be real time and immediate. When I tried to lose weight for my reunion, for the first week I got on the scale each day, but my weight didn’t change at all despite my hoping it would! I had to track my calories during the day as I was consuming food. This monitoring of my behavior on a real-time basis was the true leading indicator, which quickly allowed me to modulate my food intake and lose the weight.
What would it mean to track data close to real time in schools?
I spent a few days this past summer with the leadership of Gann Academy in Waltham, Massachusetts, helping to think through how to track and measure the student experience. We discussed approaches to collecting data daily, such as asking students at the end of each day to complete a two-question survey on their mobile phone about how much the school had impacted or inspired them that day, to capture the sentiment at the moment and be able to chart the trends, along with a more comprehensive set of feedback every month. We also imagined asking faculty to notate on a phone app the mood in each interaction with each student. (Given that we estimated 20-25 unique faculty members having substantive interactions with a student each day, this could be quite an undertaking). Such a frequency of feedback no doubt would present a serious cultural change; could schools do this in a way that stays true to their educational mission? The quality of human interaction is so essential to outstanding teaching and learning; how might this activity be an enhancement rather than a distraction?
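The daily two-question pulse survey imagined above could be aggregated very simply. The sketch below is a hypothetical illustration under stated assumptions: each response is a (date, impact, inspiration) tuple on a 1–5 scale, and the school charts a rolling average to separate trends from day-to-day noise. None of these names or scales come from Gann Academy's actual system.

```python
# Hypothetical sketch of aggregating a daily two-question pulse survey
# (impact and inspiration, each rated 1-5) into a trend line.
from collections import defaultdict
from statistics import mean

def daily_averages(responses):
    """responses: list of (date_str, impact, inspiration) tuples.
    Returns {date: mean of per-student averages} in date order."""
    by_day = defaultdict(list)
    for day, impact, inspiration in responses:
        by_day[day].append((impact + inspiration) / 2)
    return {day: mean(scores) for day, scores in sorted(by_day.items())}

def rolling(avgs, window=5):
    """Smooth the daily averages with a trailing window to reveal trends."""
    days = list(avgs)
    return {days[i]: mean(avgs[d] for d in days[max(0, i - window + 1):i + 1])
            for i in range(len(days))}

# Two students on day one, one on day two (illustrative data).
sample = [("2024-09-01", 4, 5), ("2024-09-01", 2, 3), ("2024-09-02", 5, 5)]
```

Charting `rolling(daily_averages(sample))` over a semester would give leadership the kind of near-real-time leading indicator the weight-loss analogy describes.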
Perhaps this frequency of feedback is too much for your school, but make no mistake that this is part of a larger trend. Many foundations that we have worked with are investing in public schools tracking students’ “individual learning journeys,” where the school’s job is to ensure that each student maximizes his or her potential (and thus maximizes the value-add of the school). For the past year, we’ve been working with Southern Methodist University on individualized student journeys within West Dallas public schools. The tracking system links together islands of data from schools along with data from community programs in which students and their families participate, like afterschool, summer and social service programs. The combined datasets provide an eagle-eye view of a student’s academic as well as socio-emotional trajectory, helping educators and counselors perform more effective interventions when students are not achieving their potential.
Critics might say that this is what the smaller, more intimate environments of independent schools offer innately—individualized attention from faculty who provide academic and emotional support. Increasingly, though, schools are finding that this is occurring haphazardly, or that the 20% of kids with the greatest needs are getting 80% of the faculty’s attention.
Database Marketing through Targeted Communications
We can apply the same logic to our communications with our various stakeholders. Rather than sending out one-size-fits-all messages to our parents, students, alumni and donors, what if our schools automatically integrated a customer’s preferences, interests, prior feedback and social media activity into creating a customized set of messages aligned with that customer’s interests? For example, Procter & Gamble takes daily data feeds from 86 brands covering 60 million customers and applies the information to customize the communications each customer receives.
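In miniature, the customization described above amounts to selecting a message template by a family's recorded interests. The sketch below is purely illustrative: the interest tags, templates, and `message_for` function are invented for this example, not drawn from any school's or P&G's actual system.

```python
# Hypothetical sketch of interest-targeted school communications.
# Tags and templates are illustrative placeholders.
TEMPLATES = {
    "athletics": "This week in sports: ...",
    "judaics": "Highlights from Judaic studies: ...",
    "arts": "From the art studio: ...",
}
DEFAULT = "School-wide news: ..."

def message_for(family):
    """Pick the template matching a family's first known interest,
    falling back to the general newsletter."""
    for tag in family.get("interests", []):
        if tag in TEMPLATES:
            return TEMPLATES[tag]
    return DEFAULT
```

A real system would draw the interest tags from survey responses, event attendance, and email click history rather than a hand-entered list.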
And that’s the takeaway message about feedback. No matter at what level of intensity, or with which stakeholders your school chooses to get feedback, the key is how your school uses the feedback to make improvements that enable each family to feel that their child is maximizing his or her potential at your school. And where you are failing to do so, the feedback lets you know, so your school is empowered to fix it. Feedback loops, whether run once every few years or every day, are critical tools for continuous improvement, allowing our schools to put the “custom” back in “customer.”
Sacha Litman is the managing director at Measuring Success, whose mission is to enable schools to harness the power of data analytics to increase enrollment, impact, and fundraising. Sacha and his team have worked with over 330 Jewish day schools and 400 independent schools, along with several dozen public and charter schools. firstname.lastname@example.org