The A Word

The halls of public universities are buzzing about the push for accountability, especially as Texas works to catch up with states that have already taken up the mantle — and dealt with some of the inherent difficulties — of a data-driven examination of higher education.

For years, as the careers of public educators have increasingly been tied to precise measurements of student performance, the inhabitants of the higher education realm have continued to live in what Kay McClenney, a senior lecturer at the University of Texas, calls “a world of anecdote.” 

But recently, she says, a “sea change” seems to be emerging, as public institutions of higher learning increasingly find themselves evaluated less on long-standing reputation and more on what the numbers actually say. Higher ed is abuzz with a push for “accountability” that has taken multiple forms of late. Examples include a document developed by the Texas A&M University System comparing every faculty member’s generated revenue to his or her salary and a recently enacted law requiring that professors post syllabi and curricula vitae online for student scrutiny. Those relatively small steps likely signal the start of a much more comprehensive effort, especially as Texas catches up with states that have already taken up the mantle of a data-driven examination of higher education and encountered some of its inherent difficulties.

That mindset has especially begun to take hold at the community college level. McClenney is the director of the Community College Survey of Student Engagement, or CCSSE, which has gathered and publicized the results of extensive student surveys at colleges nationwide since 2001. She views the trend toward using data as a means to begin a new conversation in higher education, one that does not assume an institution’s quality based on its revenue or the percentage of students it turns away. Nor should the conversation be driven by an enrollment-based financing system that’s solely focused on getting students in the door. “We believe a quality education is what happens when the student is actually on campus,” she says.

To McClenney and the CCSSE, that means well-publicized evaluations of a school's academic challenge, support services provided, student and faculty interaction, and level of student engagement.

Unlike the CCSSE’s, the results of its “sister survey,” the National Survey of Student Engagement, which queries students at four-year institutions, never see the light of day. According to McClenney, that's largely because the many private universities involved do not want to pay to participate in a survey whose results might present them poorly to the public. McClenney calls it an “act of courage” to agree to the public reporting of data that may not reflect well.

Measures are being taken to bolster the data collection systems around the state, fueled by money from the U.S. Department of Education, the Lumina Foundation and the Bill & Melinda Gates Foundation. Mark Milliron, the deputy director of postsecondary improvement for the Gates Foundation, says a favorite phrase in his organization is that trying to lead without data is like trying to bowl in the dark. The foundation's goals go beyond providing data to policymakers and institutions. "We are investing in the efforts to get data down as close to the learning moment as possible," he says. "If Facebook helps students choose better friends and iTunes helps them choose music, why don't we use data to help them pick the right learning object or understand the next best step on the learning journey they are on?"

Meanwhile, the Texas Higher Education Coordinating Board is overhauling its data collection system. By 2011, the coordinating board will have the capacity to collect student transcripts for the first time, allowing it to track student movement through the education pipeline in ways previously impossible.

House Higher Education Chairman Dan Branch, R-Dallas, says improving the quality of the data isn’t enough when the turnaround time on the information can be months or even years. He has been pushing the coordinating board to provide real-time data projections. “Not only do you have to have good data,” he says, “you’ve got to have timely data.” This was driven home to him in May, when he went to give a commencement speech and couldn’t get a figure for the number of graduates expected in the state that spring. “The problem we’ve had is that, even when we were getting data, it was too late,” he says. “A lot of the data out there has got ’06, ’07, ’08 type numbers, and that’s unacceptable.”

Just as accountability measures in public education initially drew fierce protests from educators, higher-education data-mining can irk college faculties. Much of the information already causing a stir has been publicly available for some time — for example, all the salary and revenue data in the A&M System’s much-discussed accountability document. But no school had previously thought to present it the way the A&M System has, catching many faculty members off guard.

“Our approach to the use of data has been too passive,” Higher Education Commissioner Raymund Paredes told legislators in August at a joint hearing of the House and Senate higher education committees, each of which was given interim charges to examine the coordinating board’s handling of data. “We tell institutions, ‘It’s there for you to use.’ But it’s clear that institutions haven’t used it as extensively as they might. So we’re going to take steps to put it in their hands.”

The coordinating board recently unveiled a website with detailed report cards for each public institution of higher learning in the state. Two are provided for each school: one for members of the public, another for legislators and policymakers. The former includes information on graduation rates, annual costs, average financial aid and class sizes. The latter puts a heightened emphasis on financing and student success. Each public institution now must provide a link to its profile, generally at the bottom of its homepage.

It’s a step in the right direction, say those who have tired of chasing down bits of information posted in different sections of a complex website. Drew Scheberle, senior vice president of education and talent development at the Greater Austin Chamber of Commerce, says essential higher education data, aside from athletics, has never been readily available. Scheberle is one of the driving forces behind a unique annual progress report on Austin Community College — one pushed by the business community, with the school signing on reluctantly.

He contrasts the coordinating board’s model — simply making information available — with the way the Austin Community College Progress Report is handled, saying, “We actually push this out to the community. We’re actively trying to push a vision in concert with ACC.”

Even with the school on board to co-author the report, the partnership between ACC and the chamber has at times been tense. While the most recent progress report was wrapping up, ACC President Stephen Kinslow, noting his staff had spent 200 hours on the project, fired off a note to the leaders of the project task force arguing that the school “cannot justify the excessive amount of staff time and diversion of resources from other priorities.” He made a point of noting that the school was “not averse to being evaluated.”

In Scheberle's experience, the "a" word tends to make educational institutions recoil. “They only look at it from the standpoint of, ‘How is this going to hurt me? If this is being used for accountability, it’d better be watertight,’” he says. Scheberle says he would prefer the report be viewed more as a “performance management tool” than as an accountability measure. “In accountability,” he says, “you only get beaten up with the information.”

According to Gary Johnstone, the coordinating board’s deputy assistant commissioner for planning and accountability, the board hopes to make its data navigable enough that interested parties in the business community can use the system on their own, without colleges having to spend staff time compiling reports.

One of the risks for institutions releasing data: Anyone can use the information however they want. As committed as McClenney and the CCSSE team are to publicly reporting their survey results, they oppose their use in any kind of ranking. “They are silly,” McClenney says of rankings, arguing that they oversimplify the complex universe of education. “There is no significant difference between No. 1 and No. 15.” But Washington Monthly magazine has repeatedly used CCSSE data to rank the top community colleges in the country. At the very least, McClenney says, it muddles the conversation she wants the data to promote. In an open response to the magazine issued in August, she wrote, “We are confident that few, if any, would attest that their current performance — however good — is as good as it needs to be.”
