November 18, 2011
The 2011 Annual Report for NSSE, the National Survey of Student Engagement, has now been released. It's entitled "Fostering Student Engagement Campuswide." As a member of the NSSE National Advisory Board, I had the chance to write the foreword to this annual report. It deals with topics that appear frequently on this blog: learning, assessment, and accountability. Here's the text:
I have had the privilege of being involved with NSSE—the National Survey of Student Engagement—nearly from the beginning. And my involvement has been in two roles: one as the president of a college that has regularly used NSSE, and the other as a member of the NSSE National Advisory Board.
I was drawn to NSSE by a simple, important question: Am I helping my students learn? For me, that has to be the most important question to ask, and ask again, and ask again, for anyone in higher education.
Educated as a political scientist, I was oriented to seek evidence or questions that could be answered empirically. As a young professor, I found that I could talk myself into anything I wanted to believe (depending on my mood) about whether students in my classes were learning. As a provost and later a president, I found myself frequently giving speeches to audiences of parents and prospective students in which I made forceful claims about the education we were offering. I believed what I said, but I went home at night with the sound of those claims still ringing in my ears, wondering why I was so sure and whether I had any warrant to be. The question, “Am I helping my students learn?” became a more insistent one. When I first heard about the Pew-funded project that would become the National Survey of Student Engagement in 1998, I was intrigued and sought to learn more. Earlham College was a very early adopter of NSSE. Through periodic use, NSSE became a key element in the college’s approach to assessment of its educational effectiveness.
While Earlham’s NSSE results provided evidence that the college was succeeding in ways we hoped it would, those results also pointed to some weaknesses and thus spurred efforts at the college to strengthen student learning. Our results underscored, for example, how unusual a college we were in giving students some international experiences (study abroad, second language learning), but also that we were less unusual than we liked to think in inducing close student-faculty interaction. Our data also showed us that we were a more ordinary college for our first-years than for our seniors; we seemed to hold the best of Earlham experiences until the end. The data spoke insistently.
I became a member of NSSE’s National Advisory Board in 2000, just after the survey had been developed and was beginning to be broadly available to colleges and universities as a valuable assessment instrument. From that vantage point, I’ve marveled at the speed at which NSSE has been adopted and embraced, and marveled, too, at the speed at which the superb NSSE staff has ramped up its capabilities—both to serve more institutions and to serve them better.
I quickly came to think of NSSE as a higher education utility. Most of the institutions that make up the higher education landscape are either colleges and universities themselves or membership organizations that gather colleges and universities for shared purposes. Utilities are a third kind of entity: operating organizations that provide valuable, trustworthy services for higher education institutions. And they are rare. Most colleges and universities prefer to do mission-related activities for themselves while they contract with for-profit firms for non-mission-related goods (equipment, supplies) and services (construction, food, cleaning). Utilities, in the way I'm using this term, provide mission-related services; they are not-for-profit organizations that are governed in ways that keep them faithful to the special missions of higher education institutions. NSSE is such a utility. It provides assessment services to colleges and universities and is steered by a National Advisory Board composed of teacher-scholars who are deeply committed to education and the assessment of educational effectiveness. The National Advisory Board meets twice each year. In my 11 years, we have considered dozens of things, but three large issues have regularly drawn our attention: disclosure, use of NSSE data, and improving NSSE.
Disclosing results. NSSE was created with an explicit intention to change the discussion about quality, both within and beyond the academy. We wanted to redirect the focus away from rankings and prestige and toward considerations of learning and teaching. That meant, certainly, that we wanted to encourage not only the use of NSSE, but also the disclosure of NSSE results. So should NSSE itself make public the results of each institution that participates? We’ve discussed that many times and always come to the conclusion that it is the colleges and universities that should make the judgment about whether, when, and how to make their NSSE results public. To facilitate disclosure, NSSE’s staff has worked very hard to make public presentation easier and more comprehensible to a range of publics.
Using NSSE data. When NSSE began, our focus was on promoting adoption of the instrument. As colleges and universities embraced it, we quickly realized that an equally big challenge would be to help institutions make use of their data to improve the quality of undergraduate education. So NSSE has devoted a great deal of attention to improving how the data are reported and to sponsoring workshops and presentations to help faculty members and administrators make sense of their NSSE results and connect their findings to what they are learning from other sources.
Improving NSSE. NSSE is an instrument that opens a window on teaching and learning, but it is even more an initiative to improve learning. The NSSE instrument emerged out of decades of prior research about the contexts and activities that lead to learning. Right from the beginning, we knew that NSSE itself would have to learn and improve. So another frequent focus of National Advisory Board meetings has been how to make NSSE better. We have made aggregate data available to researchers and encouraged them to use it. We have listened to criticism, tried to learn from it when that has seemed appropriate, and tried to voice our disagreement when that has seemed warranted. Next year, we'll see a new, improved NSSE, one that reflects learning from the experience and discussions of the first decade.
For me, NSSE has modeled the best values and practices of the academy.
Douglas C. Bennett, President Emeritus, Earlham College