Online course enrollment looks different depending on which set of recent higher education data you consult.
Through its Integrated Postsecondary Education Data System (IPEDS), the National Center for Education Statistics released data this month on the number of students taking distance education courses. While the center didn't produce a report based on the information, education technology consultant Phil Hill analyzed the data for fall 2012. Based on his analysis, Hill pegged the number of students who took at least one online course at 5.5 million.
Shortly after the IPEDS data was released, the Babson Survey Research Group produced a report about online learning in conjunction with Pearson and the Sloan Consortium. This report said 7.1 million students took at least one online class, which is an increase of more than 411,000 from the previous year.
These two numbers represent a difference of 1.6 million, with the IPEDS data coming in much lower. The Babson team plans to look at the data and figure out why there's a discrepancy, said Todd Hitchcock, senior vice president of Pearson Online Learning Services.
But education leaders have some ideas as to why the numbers are so different. For example, the two data sets could have included different definitions of distance and online education, as Hill pointed out in a blog post. That would have caused colleges and universities to count student enrollment differently from survey to survey.
Several other issues could have contributed to the discrepancy as well. Higher education institutions are required to fill out the IPEDS survey if they want federal financial aid, but they don't have to participate in the Babson survey.
With their financial aid hanging on the IPEDS survey information, institutions are more likely to use exact numbers, said Russell Poulin, deputy director of research and analysis for the WICHE Cooperative for Educational Technologies based in Boulder, Colo. Survey respondents may have estimated numbers more frequently when completing the Babson survey.
In addition, on-campus and continuing education numbers for each university could pose problems. Suppose the person who filled out the survey was involved with online classes for on-campus students. That person may have counted only online enrollments from on-campus students while leaving out students enrolled through the continuing education program. Or the student information systems of the separate on-campus and continuing education units may not match up well.
With the potential for such reporting problems, sample size becomes a key factor. The IPEDS data likely included every institution, while the Babson survey included as many universities as it could recruit.
"If there were errors in the data they collected and then they extrapolated out, then what could have happened was that the extrapolations magnified the differences," Poulin said.
Both the IPEDS and Babson data sets are useful for different purposes, and Poulin hopes that both groups continue collecting data. For more than a decade, budget cuts prevented IPEDS from collecting data on distance education enrollments, and Babson stepped in to fill the gap. Now that IPEDS is back in the business of collecting that information, the Babson survey can focus more on the other areas it's known for, including trends and strategies in online learning.
"There's all sorts of misinformation or misconceptions out in the community about online ed and such," Poulin said, "and so it would be good to have data that's complete and from all the institutions that could be analyzed."