Abstract: America’s higher education system is in dire need of reform. The average college student leaves school with more than $23,000 in debt, and total student loan debt in the United States now exceeds $1 trillion. Furthermore, too many students are leaving college without the skills needed to be successful in the workforce. And yet, despite the dire state of today’s higher education system, there is hope on the horizon: By favoring knowledge and skill acquisition over seat time, online options and competency-based learning are disrupting the traditional higher education market and perhaps have laid the foundation for a revitalization of American education. Despite the promise presented by these innovations, however, the antiquated higher education accreditation process remains a considerable obstacle to reform.
America’s system of higher education is on the verge of dramatic change. After years of debate, enterprising academics may have resolved higher education’s most frustrating dilemma: that although a college degree or an equivalent set of skills is essential for a good job and the chance of upward economic mobility, a traditional college education has become unaffordable for many Americans—unless they are willing to incur enormous debt. In fact, over half of all graduates with bachelor’s degrees incur an average of $23,000 in debt, and cumulative student loan debt now exceeds credit card debt.
Entrepreneurial educators are attempting to resolve this dilemma by using new business models and new ways of learning, such as through online courses, to slash the cost of a college-level education. These innovations offer the prospect of a fundamental restructuring of higher education with a sharp reduction in costs—a revolution that would be a boon to students seeking to acquire the skills they need in today’s economy.
Despite the promise presented by these innovations, a considerable obstacle remains: accreditation. A feature of the traditional education system, accreditation is a “seal of approval” granted to institutions of higher education and is intended to assure students that colleges and universities meet certain standards of quality. As a system of quality measurement, however, accreditation is riddled with problems. For example, it favors existing expensive business models for higher education, thereby making it difficult for new models to emerge. Additionally, accreditation rates entire institutions—rather than specific courses—and, as a result, is a poor indicator of the skills acquired by students.
Accreditation also narrows the number of educational opportunities available to students: In order to receive federal student aid, students must attend an accredited school. While accreditation is technically voluntary, students at an unaccredited college are not eligible for federal student loans and grants. Consequently, as federal student aid and subsidies have become an increasingly large share of university budgets over the past four decades, most institutions have had little choice but to seek accreditation.
Without question, America’s system of higher education needs dramatic and lasting reform, but accreditation continues to impede such a transformation. If higher education is to keep pace with the demands of future economies, the metrics used to value an education must place a greater emphasis on rating and credentialing specific courses and acquired skills—not institutions.
This reform can and should be driven by the private sector so that the skills students receive are the same tools valued by employers. Policymakers, lawmakers, and business leaders need to resist the efforts of existing institutions of higher education to thwart this necessary change.
The Accreditation System: Antiquated and Self-Serving
The primary purpose of accreditation, in general, is to assure customers that an institution meets certain standards of quality. Hospitals are accredited, for instance, so that patients can feel confident that the staff have appropriate training and experience and that the quality of the treatment meets appropriate professional standards.
The same theory applies to the accreditation of colleges and universities: that because the institution is accredited, students can be confident that the university is operated professionally and that the courses they take will be of an appropriate standard.
But the accreditation process, as applied to higher education, raises two important questions:
First, does it indeed follow that because an institution receives accreditation, the student knows the courses he or she takes constitute a good education?
Second, is this the best way to assure quality? Is it better to accredit a university in general rather than to assure students that specific courses are of a certain standard?
In both cases, the answer is no—a troubling response, given that so much depends on accreditation.
Authorizing Accrediting Institutions. In the United States, accreditation is a complicated, expensive, and time-consuming process. First, it must be determined which organization will perform the accreditation. The U.S. Department of Education (DOE) authorizes a limited number of such accrediting bodies, which then have the authority to accredit colleges and universities.
The DOE and the Secretary of Education wield significant power in determining which institutions are allowed to accredit colleges and universities. Specifically, the Secretary:
- Determines which accrediting agencies are reliable judges of the quality of a particular college, university, or educational program;
- Publishes a list of these approved accrediting institutions; and
- Appoints six of the 18 members of the National Advisory Committee on Institutional Quality and Integrity (NACIQI), which provides recommendations for approval of accrediting institutions.[1]
In order to be recognized as an approved accreditor, a prospective accrediting agency must complete a grueling review process overseen by DOE and the National Advisory Committee. First, a prospective agency must “have had at least two years’ experience functioning as accrediting agency—establishing standards, evaluating institutions or programs for compliance with those standards, and making accrediting decisions based on those standards—before it submits its application for recognition.”[2] Next, a new applicant must provide a narrative statement to the Secretary of Education describing “in depth the processes the agency uses to review and update its criteria and standards, the tests it uses to determine their adequacy and relevance in evaluating educational quality, as well as the results of those tests, and how it determines they are relevant to the needs of affected students.”[3]
Once these steps have been completed, a prospective agency submits an application to the DOE to become an approved accrediting agency. The application is then analyzed by DOE staff, who make announced and unannounced site visits to the prospective accrediting agency. Once the DOE staff analysis is complete and the department is satisfied that the application meets its requirements, the application and supporting materials are sent to the National Advisory Committee on Institutional Quality and Integrity.
The NACIQI places the agency’s application on the meeting agenda (meetings occur only twice per year) for consideration. The intent to become an accrediting agency is published in the Federal Register and is open to a public comment period. After hearing presentations from the prospective accrediting agency and considering the application and supporting materials, the NACIQI then recommends to the DOE whether to approve, deny, or limit a request. Finally, the department issues a decision about whether to approve or deny a new accrediting agency.[4]
Accrediting Colleges and Universities. Once a prospective accreditor becomes a U.S. Department of Education/National Advisory Committee on Institutional Quality and Integrity–approved accrediting institution, it then wields the power to grant accreditation status to a college or university. Colleges and universities that request evaluation by an accrediting agency must meet agency-developed criteria in order to become “accredited” by the agency. Just as becoming a government-approved accrediting agency takes tremendous time and effort, in order to become accredited, colleges must be willing to invest considerable cost and time.
There are two categories of accreditation: institutional and programmatic/specialized. Institutional accreditation applies to the entire college or university and, by extension, any program offered at the school. Programmatic or specialized accreditation applies to schools, departments, or programs within a university. However, specialized accreditation of a department or program typically occurs within a university that already has institutional accreditation.[5] For example, the Wharton Business School at the University of Pennsylvania is accredited by the Association to Advance Collegiate Schools of Business,[6] the University of Virginia’s Master of Architecture Degree is accredited by the National Architectural Accrediting Board,[7] and Marymount University’s teacher preparation program is accredited by the National Council for Accreditation of Teacher Education.[8] In addition, the DOE reports that “a number of specialized accrediting agencies accredit educational programs within non-educational settings, such as hospitals.”[9]
A program or school within an accredited university might seek programmatic accreditation (even though the university in which it resides is already institutionally accredited) in order to ensure that graduates of a particular program are eligible to sit for a credentialing exam.[10] The additional accreditation is designed to be a quality control measure for specific programs or departments.
There are 10 national accrediting agencies (including four faith-based agencies) and six regional accrediting agencies. In addition, there are several specialized and programmatic accrediting agencies.[11] Most traditional four-year colleges and universities are regionally accredited by one of the six regional accrediting agencies. By contrast, most for-profit and technical schools are accredited by national accrediting agencies.
When reviewing the labyrinthine world of accrediting agencies, it is easy to lose sight of a critically important question: What does it take to become accredited by a national or regional accrediting agency? Accrediting agencies set standards for accreditation to which the college or university seeking accreditation must adhere. (These are the same standards that were approved by the U.S. Department of Education during the approval process to become an accrediting agency.)
In order to demonstrate that it meets these standards, a college seeking accreditation must prepare “an in-depth self-evaluation study that measures its performance against the standards established by the accrediting agency.”[12] In addition to this in-depth review, colleges and universities must also allow for on-site evaluations. Once a university is accredited, the accrediting agency continues to monitor the school to ensure that it still meets those standards.
Impeding Education: The Perils of the Accreditation System
Accreditation, professional licensing, and other tools that regulate the provision of services have always been something of a double-edged sword. On the one hand, they can protect consumers from charlatans and low-quality providers, and in technical areas where consumers often do not feel able to judge quality accurately—areas such as emergency medical care—they can provide an assurance of excellence. On the other hand, however, these tools can also become a barrier to entry in a market, enabling existing providers to use licensing to thwart competition. When that happens, licensing does not assure quality, but instead protects inefficient and inferior products.
With regard to colleges and universities, accreditation has become, first and foremost, a barrier to entry. Indeed, the accreditation system has morphed into a powerful and rigid system whereby a few large regional and national accrediting agencies have a tremendous amount of power over higher education. This system, in turn, creates massive and expensive headaches for existing colleges and universities; crowds out new higher education start-ups; and creates an inflexible and questionable college experience for students who, in order to be eligible for federal student aid, have little choice but to attend accredited institutions.
An Onerous Requirement Rather than a Measure of Quality. Once a voluntary decision on the part of universities, accreditation is now a de facto requirement for institutions to be eligible even to open their doors or for their students to receive federal aid. Today, accrediting agencies act as “regional monopolies,” setting the parameters under which almost every institution of higher education in the United States operates. As the American Council of Trustees and Alumni notes:
America’s accreditation system emerged in the late 19th century as a voluntary system for serious educational institutions to differentiate themselves from institutions that were “colleges” in name only. There was a competition among the private accrediting organizations that enabled market forces to maintain a necessary level of quality. The knowledge that institutions could drop accreditation kept associations from becoming dictatorial or attempting inappropriately to influence the content of education.[13]
The nature of accreditation shifted, ACTA notes, when, in 1952, the G.I. Bill conditioned eligibility for federal student aid on institutional accreditation. The 1952 G.I. Bill “marked the beginning of accreditation’s partnership with the federal government in monitoring institutional quality, with the accreditors acting as the gatekeepers to federal funds.”[14] Accreditation and federal student aid were further linked with the Higher Education Act of 1965, which coupled accreditation with significant amounts of new federal student aid. Access to federal student aid was now conditioned on approval from the “new gatekeepers: the accreditors.”[15]
Prior to 1952, colleges sought accreditation only “if the benefits (e.g. signaling quality and/or helping the institution) outweighed the costs.”[16] Now accreditation “is a near necessity, regardless of its benefits in these dimensions.”[17] The newly fortified link between accreditation and federal aid for all college-bound students—in combination with universities’ growing appetite for federal subsidies—has changed accreditation “from a voluntary service to a nearly universal obligatory review.”[18]
Measuring Inputs, Not the Quality of Outcomes. Even though it is a de facto requirement for colleges, accreditation does not guarantee academic quality. Indeed, it is granted largely on the basis of the inputs a college reports to the accrediting agency. Such inputs—for example, the number of books in the university library, the school’s disciplinary code, and its mission statement—are among the criteria used by accrediting agencies to grant accreditation status to a college.
Despite having a dubious link to student performance, skill acquisition, and employability, these criteria continue to be used by accrediting agencies; measurable student learning gains or instructional quality have little impact. As ACTA states, “If the accrediting process were applied to automobile inspection, cars would ‘pass’ as long as they had tires, doors, and an engine—without anyone ever turning the key to see if the car actually operated.”[19]
Moreover, ACTA notes, while almost all colleges are accredited, it is largely agreed that academic quality has declined in recent decades. As economist Richard Vedder states:
Prior to the establishment of federal financial aid programs, accreditation was completely voluntary…. Because it was not universal, having accreditation meant something…. Once the federal financial aid programs became established fixtures of the educational landscape, however, accreditation’s performance deteriorated. The primary reason is that because accreditation is now so important to an institution’s financial survival, it has become near universal. In other words, “once a badge of distinction, accreditation has now become so commonplace as to be of negligible benefit to either educational consumers or the institutions themselves”.
For example, Harvard has the same accreditor as Central Connecticut State University, though one suspects that there is a large difference between those two schools (as suggested by the more prominent college rankings guides which consistently place Harvard near or at the top but do not even rank Central Connecticut State).[20]
Clearly, the quality of the education received by students has little—if anything—to do with the accreditation process.
The Fox Guarding the Henhouse. Alarmingly, and in a manner that parallels the history of many licensing systems, accreditation now suffers from numerous conflicts of interest. For instance, regional accrediting agencies are financed in part by college and university membership in the associations. Colleges are dues-paying members of accrediting associations that determine their accreditation. Consequently, accreditors are more reluctant to deny accreditation renewal, an action that would result in the loss of dues-paying members of the association. “The desire to maintain collegiality and not to lose paying association members raises conflict of interest issues that make the regional accreditors questionable gatekeepers for eligibility for federal funds.”[21]
Moreover, removing a college’s accreditation could mean that a region loses students to colleges in other parts of the country—colleges accredited by other regional agencies. This reality creates further perverse incentives to accredit institutions of questionable quality.
Ultimately, these conflicts of interest have created a system whereby accreditation agencies are inclined to protect the interests of existing colleges and universities.
Credit for Courses of Dubious Academic Value. In 2010, the University of South Carolina (USC) made headlines for offering a credit-bearing course entitled “Lady Gaga and the Sociology of Fame.” The objective of the course was to “unravel some of the sociologically relevant dimensions of the fame of Lady Gaga.”[22] Because USC is an accredited institution, any course offered at the school is thereby also accredited.
USC is not alone in offering college credit for courses of questionable academic rigor and value. Indeed, courses such as “The Science of Superheroes,”[23] “Gay and Lesbian Caribbean Literature,”[24] and “Cyberfeminism” are offered at UC Irvine, Syracuse, and Cornell, respectively—all of which are accredited universities.[25] And at Bowdoin College, students can take a “Women’s Studies” course (for credit, of course) that asks: “Is Beethoven’s Ninth Symphony a marvel of abstract architecture culminating in a gender-free paean to human solidarity, or does it model the process of rape?”[26]
In what one writer deemed the equivalent of “academic snake oil,”[27] offering courses of dubious value is yet another pitfall of institutional accreditation and the current makeup of U.S. accreditation as a whole. To be sure, some courses with colorful titles are simply examples of professors using their flair for marketing in order to attract students to solid courses, but the general problem is that once an institution is accredited, its courses are as well—no matter whether the content or quality of a specific course reaches the standard implied by accreditation.
Colleges Insulated from Competition from Higher Education Start-ups. Part of the reason colleges can offer courses of dubious academic rigor or educational value is that the current accreditation process (along with other factors such as easy access to federal student aid) insulates them from the competitive pressures of the market. As George Leef of the John William Pope Center for Higher Education Policy notes:
The accreditation process does nothing to enhance the market’s requirement that schools be good enough to meet the competition. Accreditors base their decisions not on educational results, but on institutional inputs, whether schools do things “the right way”—“enough” books in the library, faculty members with “proper” credentials, “adequate” financial support, and so on. Conforming to those criteria does not ensure that their students will in fact gain any educational benefits.[28]
Moreover, the requirement that prospective colleges allow for on-site evaluations (meaning that they effectively have to be already in operation with professors teaching courses and students taking classes) creates a “built-in Catch-22 for innovators and entrepreneurs—you can’t be accredited (get access to public money) until you have proved yourself in advance. You can’t prove yourself in advance—prospectively—unless you are accredited.”[29]
Because accreditation is such a barrier to entry, it has a high value to any institution: One estimate of the market value of accreditation is $10 million. No wonder that some for-profit colleges decided to purchase nonprofits in order to inherit their accreditation and avoid the costly headache of seeking first-time regional accreditation.[30]
Hindering Innovation. In addition to insulating colleges from normal market forces, the existing accreditation system reduces the incentive for colleges to revise their existing business models and make the reforms needed to spark innovation. For instance, one regional accreditation agency—the Western Association of Schools and Colleges—requires any university that wants to make a substantive change (defined broadly as anything that may affect the school’s quality or objectives) to submit a detailed report to the agency at least four months before the implementation date of the proposed change. Since almost any and every change a college would contemplate meets the definition of “substantive change” per the accreditation agency, almost any modification would take many months to be approved, making “a quick response to changing market conditions impossible.”[31]
Regrettably, the rigid regulations of the Western Association of Schools and Colleges, while numbing for any administrator with an entrepreneurial streak, are not unique. As for-profit higher education researchers John Sperling and Robert W. Tucker note:
The Southern Association of Colleges and Schools requires that before any significant changes are made in purpose, programs, scope, location, ownership, level of operation, or instructional delivery systems, the institution must notify the Executive Director of the Commission on Colleges in writing “at least one year in advance of the proposed change.”… When businesses have a need for educating their employees, their planning time frames are in weeks or months, while those of higher education are in years.[32]
Moreover, if a college or university attempts to fast-track a potential change and implement it before accreditation agency approval, the school can be placed on probation and even lose accreditation.[33] Given that the digital era has accelerated so many administrative tasks—both in higher education and in the broader business community—such inflexibility is particularly detrimental.
Creating a Limited Vision of What Higher Education Can Be. Perhaps most frustrating of all, however, is the manner in which the existing accreditation regime limits the vision of higher education and creates an inflexible college experience for students.
While the 1952 G.I. Bill and the Higher Education Act of 1965 first coupled accreditation with federal funding, federal regulations in the 1992 reauthorization of the Higher Education Act (HEA) cemented the idea of higher education as “youth-centered and campus-based,” with the student population “assumed to be homogenous with regard to age and work status.”[34]
Instead of being able to shop around for individual classes that might meet their professional or academic needs, under the existing accreditation regime in the U.S., students are largely consigned to an off-the-shelf college experience at a government-accredited institution. This “one-size-fits-all” college experience pigeonholes the typical student as someone who will require four years of undergraduate work to complete training in a given field, no matter what area of study the student has chosen to pursue. The current regime also disregards the flexibility and access to content that online learning has produced over the past several decades.
Accreditation thus has become a poor gauge of college quality. Colleges rarely lose accreditation once it is granted, despite widespread recognition that the quality of higher education has been on the decline for decades. At the same time, colleges slog through the bureaucratic and time-consuming accreditation process in order to access federal subsidies, which constitute an increasingly large share of college budgets. This accreditation system hinders innovation, creates an inflexible college experience for students, and results in accredited courses that are of questionable academic value.
The inflexible, bureaucratic club that is the American college accreditation system is antithetical to reform. Higher education will remain impervious to change if the perverse incentives maintained by the accreditation regime are allowed to stay in place. However, a combination of increased access to online learning, portability of student loans, and a market-based, private accreditation system could produce dramatic changes in the higher education structure and ultimately drive down costs while improving quality.[35]
The Changing Landscape of Higher Education
The rapidly changing nature of higher education stands in sharp contrast to the rigid, protective accreditation process. Students today have access to more information than at any other time in history. Technology has created a world in which information about almost any topic is readily available to anyone with an Internet connection and a computer.
Yet this information surge has failed to penetrate the center of the higher education bubble, largely due to the insulating effects of state and federal subsidies and the accreditation system. Moreover, prices at U.S. higher education institutions continue to inflate despite the fact that “the cost of basic knowledge is lower than ever before.”[36]
Traditional higher education, however, may no longer be able to ignore the revolution at its doorstep. Dramatic changes are on the horizon as entrepreneurial educators experiment with radically different business models and approaches to learning. For instance, high-quality open-source courses, taught by professors from some of the most elite institutions in the country, are beginning to transform higher education by democratizing access to content. Meanwhile, creative new approaches to organizing courses and teaching hold the prospect of sharply reduced costs for campus-based and “hybrid” institutions that combine bricks-and-mortar with online information.
The MOOCs (Massive Open Online Courses) Revolution. Online courses have become a growing feature of education around the world. As a means of delivering academic content, online learning has been deemed the “single biggest change in education since the printing press,”[37] and free online courses from top-level academics have become one of the most interesting examples of the radical rethinking of higher education.
Udacity. More than 160,000 students representing every country in the world (except North Korea) enrolled in Udacity’s very first course: “Introduction to Artificial Intelligence.”[38] Created by Stanford University professor Sebastian Thrun, Udacity delivers free courses to students virtually. Students who complete a course and pass an online exam receive a certificate of completion from Udacity. Online learning, Thrun says, will “exceed the best education today…. If this works, we can rapidly accelerate the progress of society and the world.”[39] Udacity now offers 11 courses, including introductions to physics, web design, and statistics.
Udemy. Similar to Udacity, Udemy offers some 6,000 courses online, from language courses to game theory and everything in between. While many courses are offered free of charge, others are offered for fees ranging from as little as $6 to as much as $300. Anyone can take a course through Udemy, and professors and other experts from around the world teach courses. Udemy strives to “disrupt and democratize education by enabling anyone to learn from the world’s experts.”[40] Prestigious universities such as Yale, Harvard, and MIT offer many of the courses.
StraighterLine. StraighterLine provides in-house course advisers and subscription-based pricing for students enrolled in courses through its online model. StraighterLine’s courses are tutor-supported rather than instructor-led and are self-paced with on-demand tutoring. StraighterLine has articulation agreements with some 30 colleges that honor courses taken through the company and award credit to students who need them to fulfill remedial requirements, obtain prerequisites, or take a course not offered at their enrolling institution. StraighterLine also works with employers, encouraging companies to include its courses in their tuition assistance programs.
Coursera. Coursera hosts virtual courses offered by elite universities such as the University of Pennsylvania, Princeton, Berkeley, and Stanford. These free courses are taught by the universities’ professors, and the technology company hopes that this will enable the “best professors to teach tens or hundreds of thousands of students.”[41] While, unlike Udacity, Coursera does not currently offer certification for successful course completion, it will provide information about student performance to third parties upon a student’s request. Course offerings, taught by noted professors who provide interactive lessons with frequent feedback and assessments, include statistics, computer science, biology, economics, and many other subjects.
edX. Through its MITx program, the Massachusetts Institute of Technology became one of the first elite universities to begin offering massively open online courses. In May 2012, MIT announced that it had partnered with Harvard to form edX, a platform for the two prestigious schools to jointly offer their MOOCs to interested students from anywhere in the world—for free. Harvard professors will teach the online version of the courses they teach at the college to traditional on-campus students. Students who complete an edX course will receive a certificate of mastery to demonstrate content knowledge. According to the Chronicle of Higher Education, the edX platform will be open-source “so it can be used by other universities and organizations who wish to host the platform themselves.” While edX will initially play host to adapted versions of courses from MIT and Harvard, the institutions expect it to become a clearinghouse for open courses offered by various institutions.[42]
Whether it is Udacity, Udemy, Coursera, or edX—or a limitless world of other open-source content delivered through platforms yet to be imagined—the knowledge base for MOOCs will still need to come from those who have expertise in their fields of study. As online learning pioneer John Chubb explains:
In this blended educational world, the Harvards and MITs will not be stuck charging tuition for on-campus education while they give away course materials online. They and other elite institutions employ world-renowned leaders in every discipline. They have inherent advantages in the creation of high-quality online content—which hundreds of other colleges and universities would be willing to pay for.[43]
This is a concept that Harvard and MIT seem to grasp and one that is likely to spread throughout higher education, particularly if accreditation is reformed. Chubb notes that during the announcement of edX, Harvard and MIT boasted that they would be able to reach millions of new students around the world with online learning.
It is no surprise that Harvard and MIT are on to something. Through edX, they are credentialing content knowledge, laying the groundwork for a higher education network through which students can attain various certificates for knowledge mastery from a wide variety of colleges, course providers, and delivery mechanisms.
Different Business Models: Credentialing Skills, not Seat Time. As MOOCs begin to disrupt the antiquated and ineffective model of higher education, the market is beginning to address another critical piece of the higher education puzzle: institutions that move beyond the traditional concept of the bachelor’s degree. These cutting-edge institutions are leading the higher education reform revolution, pressing for a system that is cost-efficient, customizable, and of value to students, taxpayers, and employers alike.
For too many students, the bachelor’s degree is little more than a pricey piece of paper of questionable value. While it typically signals to prospective employers a student’s persistence in degree acquisition, it does not always indicate that a student has obtained mastery of the particular concepts or skills that would be desirable in the workforce. A more effective approach would be to certify or credential skill and concept attainment—an emerging strategy of innovative institutions.
Western Governors University. Founded by 19 governors, Western Governors University provides online competency-based degrees and is fully accredited. Students learn “independent of time and place” through online courses, and content mastery is then assessed to provide “degrees and other credentials that are credible to both academic institutions and employers.”[44] Because it is competency-based, WGU allows students to advance as soon as they are able to demonstrate content mastery on assessments.
Western Governors University’s pricing model is nearly unique, shared by only a few other institutions such as University Now’s New Charter University: Students pay tuition every six months as a flat-rate fee, paying only for the amount of time it takes to complete a particular program. Students finishing in less time save money; the faster a student progresses, WGU notes, the more money the student saves. The university does not “rely on classes in the traditional sense.”[45] Instead of accumulating credit hours based on the amount of time spent in a particular course, students complete assessments measuring skills in a given subject area.
Students who pass a given assessment are awarded competency units instead of credit hours and can earn as many such units as they are able in a six-month period. To help guide their progress toward a competency-based degree, students are paired with a mentor. While WGU was the first fully online university to receive accreditation, this distinction (such as it is) required a long, drawn-out process. The years-long bureaucratic slog ultimately culminated in WGU’s earning accreditation from four agencies.[46]
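To see why flat-rate, per-term pricing rewards faster progress, consider the rough sketch below, which compares it with conventional per-credit-hour pricing. Every dollar amount, unit count, and name in the code is a hypothetical assumption chosen for illustration; none are WGU’s actual tuition figures.

```python
# Hypothetical comparison of flat-rate, per-term pricing (the WGU-style model
# described above) with conventional per-credit-hour pricing. Every figure here
# is an assumption chosen only to illustrate the incentive, not an actual price.

FLAT_RATE_PER_TERM = 3_000   # assumed cost of one six-month term
PER_CREDIT_HOUR = 250        # assumed cost per credit hour
UNITS_FOR_DEGREE = 120       # assumed competency units (or credit hours) required

def flat_rate_total(units_per_term: int) -> int:
    """Total tuition when students pay per term and earn as many units as they can."""
    terms_needed = -(-UNITS_FOR_DEGREE // units_per_term)  # ceiling division
    return terms_needed * FLAT_RATE_PER_TERM

def per_credit_total() -> int:
    """Total tuition when students pay for each credit hour, regardless of pace."""
    return UNITS_FOR_DEGREE * PER_CREDIT_HOUR

for pace in (10, 15, 20, 30):  # competency units completed per six-month term
    print(f"{pace} units/term: flat-rate total = ${flat_rate_total(pace):,}, "
          f"per-credit total = ${per_credit_total():,}")
```

At these assumed rates, a student completing 30 units per term pays a third of what a student completing 10 units per term pays, while the per-credit total never changes with pace. That difference is the incentive the flat-rate model is designed to create.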
City and Guilds. The U.K.-based City and Guilds has developed standards and qualifications in numerous job sectors, which are offered in some 10,000 training centers around the world. City and Guilds has developed “relevant qualifications that are recognized and respected by employers all over the world.”[47]
City and Guilds is not itself a teaching institution. Instead, the organization provides certificates, courses, and assessments that are useful for companies looking to improve the skills of employees. Students can take these courses and sit for exams at thousands of centers across the world. While the model helps companies to develop a tailored workforce, prospective employees also benefit from having access to employer-approved courses, thereby boosting their chances of landing a job.
City and Guilds offers National Vocational Qualifications (NVQ) that assess skills or test an individual’s ability to do a particular job. Certificates and diplomas are also offered, as are apprenticeships and single-subject qualifications. City and Guilds also provides apprenticeship qualifications in engineering, construction, and manufacturing, as well as credentials in business skills, logistics, and information technology (IT), along with a host of other qualifications. Consumers can also obtain functional skills qualifications in English and math.
University of Wisconsin. More traditional institutions are also moving in this direction. For example, in June 2012, the University of Wisconsin System announced that it would move toward a competency-based degree model that would allow students to start classes at any time during the school year and to receive credit for skills and concepts mastered outside of the UW classroom. Students will be able to demonstrate competencies through assessments and, by doing so, to earn credit. Known as the Flexible Degree Program, the University of Wisconsin’s model changes learning metrics from being defined through seat time to being assessed based on competency; students graduate as soon as they can demonstrate content mastery.
Brigham Young University–Idaho. Students at BYU–Idaho can obtain technical certifications in core courses while working toward a bachelor’s degree. These official qualifications mean that students who fail to complete their entire degree program still leave the university having obtained valuable credentials. Students in some disciplines who live off-campus can earn a bachelor’s degree for less than $8,000 for their entire four years of undergraduate work.[48]
Microsoft Certification. Companies such as Microsoft have long offered certification in computer skills and experience. IT experts who have the knowledge to design technology-based solutions can become Microsoft Certified Solutions Experts (MCSE). Individuals who are able to implement and administer Microsoft SQL server databases can obtain the Microsoft Certified Database Administrator (MCDBA) certificate. Certificates are also available for developers, and Microsoft provides advanced certifications in other IT areas.[49]
Between the online courses offered at for-profit institutions such as the University of Phoenix, the increasing number of online courses offered at traditional institutions, and the Massive Open Online Courses being expanded by some of the most elite institutions in the U.S., students today have a rapidly expanding universe of course content from which to choose.
Instead of being limited to one institution, students could thus have a completely customized college experience, choosing from among those many online options, courses taught in a traditional classroom setting, and hands-on technical or internship experience—all from a variety of different providers—and piecing them together into a “degree” or certificate of competency to present to prospective employers. While this new higher learning experience sounds like a potential solution to many of America’s higher education problems, one important question remains: Who would assess the value of a competency credit?
Contemplating the Future of Higher Education
With regard to the future of higher education, one thing is certain: Tomorrow’s model is going to look very different from the current paradigm.[50] Higher education appears to be on the verge of the same kind of massive transformation—or “disruptive innovation”—that has changed the news/newspaper industry so dramatically. The expensive bricks-and-mortar, “sage on a stage” model of college, largely unchanged for centuries, is being challenged by radically different visions of education.
In addition to the innovations discussed above—the MOOCs, online education, and new business models such as WGU—this impending transformation is also being driven by new teaching approaches. Such new approaches are already appearing at the K–12 level, pioneered by entrepreneurial ventures like the Khan Academy.[51] Khan Academy and similar approaches have “flipped” the sequence of school education, with “homework” becoming the acquisition of online information and the schoolroom becoming the place where teachers work through customized problem sets and projects with groups of students who are working at their own pace and level. Such customization can—and should—be a driving force at the college level.
So what could the future college experience look like? Admittedly, when disruptive innovation is occurring, it is hard to predict how an industry will evolve, but some things seem increasingly plausible.
First, the use of online information and online classes to transmit core information will become far more prevalent. Consequently, students will be able to learn at their own pace from world-ranked experts at the time that is most convenient to them and at a fraction of today’s cost. In turn, as a result of this new method of dissemination, college faculty will function less as lecturers and far more as coaches, teachers, and mentors.
Second, greater convenience and a huge reduction in costs mean that lower-income students will be able not only to obtain the skills they will need to do well in the future economy, but also to do so without incurring crippling debt. Given the key importance of college-level or equivalent skills to future income, the transformation of higher education will likely lead to an enormous boost in the economic mobility of Americans who are now on the lowest rungs of our society.
Third, in the future, there may well be more students who study from their own apartments, homes, or neighborhoods—a phenomenon similar to the growth of homeschooling at the K–12 level.[52] Such a development may cause well-educated parents to take the lead in mentoring their children’s higher education. It could also lead to small “nodes” in neighborhoods and rural towns, perhaps in the evening or weekend at a local high school or business office, where groups of students and education leaders meet for regular seminars and exams based on an online curriculum and course content. For students seeking solid credentials at a reduced price, this localized approach could be the way to obtain strong skills and employability without incurring heavy debt.
These and other features of disruptive innovation occurring in higher education will likely change the very concept of “college” or “university.” Today’s colleges provide, generally at a high price, a wide range of education and other services that are “bundled” together. These services range from lectures to library facilities and from sports to social networks and parties. But in the future, these elements may not all be provided within the same “college.” Indeed, the college may be an institution—and not necessarily even a bricks-and-mortar institution—that assembles elements from different sources to provide a more customized experience.
Homeschooling gives a clue as to how this different form of college might function. A feature of the homeschooling movement, for instance, is that students pursue sports and enrichment activities and develop social networks outside of the home. In some states, homeschooled students participate in regular public school sports teams. A similar scenario could define the “college” of the future, with the strictly academic features of online classes and local nodes supplemented by social, sports, and cultural services that are supplied in other ways. In essence, higher education is likely to become “unbundled,” with separate features provided through different suppliers and the “college” functioning as an enterprise that assembles the various components in a customized package to suit different students.
Another version of this general pattern is “blended learning,” already growing at the K–12 level, in which existing or new bricks-and-mortar colleges—or perhaps even smaller, local institutions—combine online courses and high-quality, customized teaching in a variety of ways. This blended learning approach is particularly attractive because existing colleges can provide umbrella accreditation that encourages the development of an innovative online program.
An example of this trend is Southern New Hampshire University, which originated in the 1930s and functioned for many years as a small New England college. Southern New Hampshire now offers three ways to obtain a degree: through a traditional campus experience, through regional centers affiliated with the campus, and entirely online. Also likely are college degrees that combine periods of home-based online coursework with periods of campus-based courses and “traditional” college life.
The more significant the possible change, however, the more suffocating the current accreditation system will be. From the cost and ponderous nature of the process to the tendency of the system to protect existing institutions and thwart new approaches and competitors, accreditation is and will remain the enemy of innovation. Until they can be assured of quality in ways other than accreditation, young Americans will never have the access they need to an improved and less costly higher education experience.
Credentialing Skill Attainment, Not Institutions
Under the current system, in order to be competitive in the job market, students attend college for four years to earn a bachelor’s degree. This system, stagnant and financially untenable, should be replaced with a new model in which students earn credits for concept mastery, the value of which would be determined through a system of independent accreditors in competition with one another to demonstrate that their “stamp of approval” is the most rigorous or most accurate in judging competencies valued by employers.
By focusing on the skills actually acquired in particular courses—rather than accrediting institutions—independent credentialing agencies would have little in common with the current government-sanctioned accreditation system. Rather, these new agencies would be similar to a Good Housekeeping seal of approval, an “Underwriters Laboratories” stamp of excellence, a City and Guilds certification, or the independent evaluations offered by groups and publications like J. D. Power and Consumer Reports. As economist Richard Vedder writes:
Americans spend vast amounts of money buying houses, cars, and major appliances—yet none of these things are “accredited.” We have developed other means of providing information. For example, Consumer Reports, J.D. Powers and Associates, and Underwriters Laboratories all give consumers information [on] the products they are purchasing, and private home inspections by disinterested third parties help assure that real estate transactions truly represent what buyers and sellers expect.[53]
In such an environment, absent federal intervention, market forces would no doubt produce many accrediting entities, and competition between these institutions would be fierce, with each vying to prove that its seal of approval on a course, internship, or other competency is the most rigorous and useful in determining a student’s content mastery and abilities. In addition, over time, consumers—and employers—would come to recognize which seals of approval provide the most qualified candidates for their industries.
Similarly, assessments for a particular course could be managed through groups like the College Board, ACT, or the Lumina Foundation. Such an approach already exists for students of accounting, who take a CPA exam to demonstrate proficiency in the field.[54] Outcomes, then, not time spent in class, become the focus of the curriculum, creating “a more appropriate measure for judging students and institutions.”[55]
In the same way a Michelin star is universally respected as a distinction of excellence in the restaurant industry, so too could independent accreditors provide valuable information to prospective employers as well as parents and students.
While the first step toward reforming higher education has to include a reconfiguration of college accreditation and the unleashing of new higher education business models, employers also must recognize the benefits of credentials that certify concept and skill attainment rather than time spent and courses taken in educational institutions. “It’s the dying companies that value college degrees,” says Udemy founder Eren Bali. “You have to think beyond that piece of paper.”[56]
Sparking the Revolution: What Needs to Be Done
Federal Policymakers. Federal policymakers should work to limit Washington’s intervention in higher education—particularly through accreditation—so that reform can take place. Specifically:
End government sanctioning of accrediting agencies and allow any institution to accredit courses. At the same time, accreditation should be voluntary, and accrediting entities’ reputations should rest with market forces, not government institutions. The abundance of online information, coupled with the self-interest of students to be competitive in the job market, “reduces the problem of fraudulently low-quality education to one of de minimis proportions.”[57]
Avoid federal “scorecards.” A seductive idea, even among some critics of today’s accreditation system, is to have the federal government replace or supplement federally driven accreditation with a scorecard that seeks to measure the output of colleges by criteria such as graduation rates, employability of graduates, and value for money. Such federal intervention would be a mistake: Existing institutions that are comfortable within the cocoon of protectionist accreditation would lobby hard, and no doubt effectively, for output measures that define success in their own terms. Moreover, a competing range of such private outcomes-based scorecards already exists, sponsored by such bodies as U.S. News & World Report, Forbes, ACTA, and Kiplinger’s.
Decouple accreditation and federal funding. ACTA notes that once accreditation agencies became the gatekeepers for federal funding, “accreditors essentially gained regulatory control over colleges.”[58] Federal policymakers should therefore decouple accreditation and federal funding through amendments to the Higher Education Act, eliminating the necessity that colleges get accredited by the government-sanctioned system. This reform would allow independent accrediting institutions to enter the market, thereby providing students with numerous options for creating their “degree” and shaping their college experience.
State Leaders. With regard to reforming the accreditation system, state leaders also have an important role to play:
Encourage investment in 529 college savings accounts. 529 college savings plans are tax-advantaged accounts that offer an attractive vehicle for families to save for future higher education expenses. Interest earned on money invested in a 529 account is allowed to accrue free from federal income tax obligations.[59]
While this is codified in federal law, most states offer either tax credits or deductions to encourage saving in a 529 college savings plan. Many states allow college savings to accrue in 529 accounts without requiring investors to pay state taxes on interest earned and permit families to withdraw money tax-free to pay for tuition, books, and other education-related expenses. Today, at least nine states still subject 529 earnings to state taxes. To provide students with increased flexibility in their higher education financing, those states should allow interest earned on 529 college savings accounts to accrue free from state income tax liability.[60]
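As a rough numerical illustration of why tax-free accrual matters, the sketch below compares the growth of a single contribution when yearly earnings compound untaxed (the 529 treatment described above) with growth when a state income tax is applied to each year’s earnings. The contribution, return, tax rate, and time horizon are hypothetical assumptions, not figures drawn from this report or from any state’s tax code.

```python
# Hypothetical illustration of tax-free versus taxed accrual for college savings.
# All inputs below are assumptions for the sake of the example.

PRINCIPAL = 10_000            # one-time contribution
ANNUAL_RETURN = 0.05          # assumed annual investment return
STATE_TAX_ON_EARNINGS = 0.06  # assumed state income tax applied to each year's earnings
YEARS = 18                    # assumed savings horizon

def grow(principal: float, years: int, tax_on_earnings: float) -> float:
    """Compound the balance, taxing each year's earnings at the given rate."""
    balance = principal
    for _ in range(years):
        earnings = balance * ANNUAL_RETURN
        balance += earnings * (1 - tax_on_earnings)
    return balance

tax_free = grow(PRINCIPAL, YEARS, 0.0)                  # 529-style tax-free accrual
taxed = grow(PRINCIPAL, YEARS, STATE_TAX_ON_EARNINGS)   # earnings taxed every year
print(f"Tax-free accrual after {YEARS} years: ${tax_free:,.0f}")
print(f"Taxed accrual after {YEARS} years:    ${taxed:,.0f}")
print(f"Difference:                           ${tax_free - taxed:,.0f}")
```

Even at these modest assumed rates, the untaxed account ends the period roughly $1,200 ahead, and the gap widens with larger contributions, higher returns, or higher state tax rates.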
Shift state schools to a competency-based model. Governors and state higher education system leaders should follow the lead of Wisconsin and move state colleges and universities to competency-based degree models. Degrees should be awarded for competency in a given subject, not for the number of hours spent in the classroom. Such a shift would, in turn, expedite degree completion and save money for students and taxpayers alike. Governors should take the lead in encouraging state school trustees to embrace competency-based degrees.
Offer dual enrollment options. States should offer and expand dual enrollment programs that give advanced high school students the opportunity to take college-level courses while in high school and receive college credit for successfully passing those courses.[61]
The Business Community. To be successful, many of these reforms require support from the business community. The business community can help to enhance competition and accelerate reform in two important ways:
Discourage government from using accreditation as a barrier to new higher education ventures. Successful American businesses understand the value of competition and the need to prevent government-backed regulation or “standards” from blocking new entrants to a market. As competition increases, existing colleges and universities will attempt to use accreditation to obstruct new business models and to restrict aid to students attending traditional colleges and universities. Recognizing the dangers of anti-competitive practices, business leaders need to get off the proverbial sidelines and engage in the battle to open up competition in higher education.
Establish credential approval seals. Limiting Washington’s intervention in higher education and accreditation will provide opportunities for the business community to establish metrics, standards, and, ultimately, credentials for the coursework that students take at various institutions, as well as other “real world” or internship experience. In order to provide independent assessments of and credentials for course work and other skills, businesses, nonprofits, and other non-governmental entities should work to create “an educational analog of Underwriters Laboratories.”[62] By doing so, employers can help to assure future students that if they succeed in employer-credentialed courses, they will have a far greater chance of finding a job after graduation.
Laying a Foundation for Lasting Reform
Despite living in an era where information is more accessible than at any other time in human history, families are struggling to afford the cost of college tuition. The average college student leaves school with more than $23,000 in debt, and total student loan debt in the United States now exceeds $1 trillion.
In addition to being burdened with crushing debt, too many students are leaving college without the skills needed to be successful in the workforce. These young people are sold a bill of goods about the importance of a bachelor’s degree—that such a degree is the gateway to future success, a piece of paper without which they are doomed to a life lacking professional fulfillment and financial security. Unfortunately, upon graduation, the utility of such a degree often fails to meet expectations.
And yet, despite the dire state of today’s higher education system, there is hope on the horizon: By favoring knowledge and skill acquisition over seat time, online options and competency-based learning are disrupting the traditional higher education market and perhaps laying the foundation for a revitalization of American education.
Policymakers are in a unique position to hasten such reform by supporting the customization of higher education for students. In particular, policymakers should back the decoupling of accreditation from federal financial aid subsidies, a reform that would provide independent entities the opportunity to credential courses and skills.
As former college president Robert Dickeson observes, “The standards for accreditation…are based on an institution’s self-study of the extent to which the institution feels it has met its own purposes.”[63] Without accreditation, adds George Leef, higher education institutions “would be compelled to examine their operations anyway by a force much more powerful than accreditation—the force of competition.”[64]
Such a transformation would likely burst the higher education price bubble, increasing access to course content and customizing students’ learning experiences. In short, costs would decrease and quality would increase—a testament to the power of innovation and competition. Coupled with the end of the current cozy accreditation regime, the continued proliferation of online learning and more accurate measurements of attained skills would offer future college students the prospect of a better education, increased employability, and lower education costs.
—Lindsey M. Burke is Will Skillman Fellow in Education in the Domestic Policy Studies Department and Stuart M. Butler, PhD, is Director of the Center for Policy Innovation at The Heritage Foundation.