The Career College Association issued a statement responding to my paper on gainful employment, and since I have also seen a few critiques and questions from others, I thought it was worth giving a somewhat detailed response.
The statement asserts that my paper contains "wildly lower guesstimates" of the impact of new Department of Education regulations. I'd just note that these estimates are not all that different from the ones produced by the U.S. Department of Education. While my analysis does estimate the percentage of ineligible programs to be one percentage point lower than the department's projections, the overall picture for the sector is a bit worse. My analysis suggests 15 percent of programs would be restricted, nearly double the department's projection. Moreover, the analysis also predicts that a much larger share of programs would end up having to issue high-debt warnings than would be fully eligible.
I will readily admit that the report would have been much stronger had it used actual Social Security Administration earnings data. Those data, of course, are not publicly available to anyone, something the report acknowledges on page 6. Instead, I used a national data set run by the Department of Labor that produces reliable earnings estimates from very large samples of individuals' earnings across the country. Is it as good as actual earnings information? No, but since that isn't available, it is a reasonable substitute.
The CCA statement also criticizes the use of program cost information because it does not portray the total cost of student borrowing. I decided to use program cost information because I felt that existing measures of student debt were inadequate. First, they average borrowing across all students, while the gainful employment proposal considers only graduates of for-profit programs. Because the significant dropout rates at most schools mean a large percentage of students never pay for their complete education, the overall average significantly understates graduates' debt levels. Second, average borrowing figures do not include reliable estimates of private student loan borrowing, a large source of student debt that the industry relies upon once federal student loan limits are exhausted.
So instead I turned to program cost in order to gauge estimated debt levels. A student who graduates from a program would, at a minimum, have to find some way to cover the total charged tuition, plus fees, books, and supplies. I did not include estimates of living expenses because so many of the sector's students are working and have families, so attributing all of those household expenses to student loan borrowing seemed unreasonable.
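The cost-based proxy described above amounts to a simple sum. As a rough sketch, with a hypothetical function name and purely illustrative figures (none drawn from any actual program):

```python
def estimated_debt_floor(tuition, fees, books_and_supplies):
    """Minimum amount a graduate must cover, excluding living expenses.

    A simplified proxy for borrowing: actual debt could be lower
    (out-of-pocket payments, grant aid) or higher (living expenses,
    accrued interest), which is why it is treated as a floor.
    """
    return tuition + fees + books_and_supplies

# Illustrative numbers only, not taken from any real program.
print(estimated_debt_floor(tuition=18000, fees=1500, books_and_supplies=1200))
# prints 20700
```

The point of treating this as a floor rather than an exact figure is that it sidesteps the unreliable private-loan data noted above while still capturing the unavoidable cost a graduate must finance.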
But if large numbers of students are borrowing far more than the program's cost, a possibility that I note in two separate instances in the report, then who is to blame? It's not clear that schools are innocent in this practice. A recent investigation by ABC News found that students at a branch of the University of Phoenix were being told to take out the maximum because it was easier to do so and the student could keep the extra money with no questions asked. If overenthusiastic encouragement on the part of the school is behind excessive borrowing, then it hardly seems right to claim that taking on too much debt is out of the school's control.
In short, I recognize that, as with any model, I had to make certain assumptions and rely on the data that are available. That said, I think the methodology employed was quite fair in balancing the issues of borrowing related to cost and income, and it did so while relying only on data reported either by the federal government or by the schools themselves.
But that doesn’t mean I’m not open to better data. If the CCA would like to provide more specific data on exact levels of private and federal student loan borrowing or earnings information for each of its 1,400 member institutions by program, then I would gladly accept that data and run the analysis.