Education and training programs are falling far short of their potential. A competition among states to provide workers with better information may point the way forward.
In the United States, education and training programs have long been key in helping people get better jobs and achieve higher living standards. With the earnings divide between skilled and unskilled workers at a historic high, the imperative for increasing skill levels is great. Training programs offer opportunities for low-income individuals to qualify for jobs that enable them to enter the middle class and for displaced workers to regain a significant portion of their lost earnings. Improving workforce skills also enhances the nation’s competitiveness and fosters economic growth.
Evidence shows that many career and technical training programs lead to high-paying jobs and stable careers. In fact, the earnings of young adults with two-year degrees in technical and industrial fields or with certificates requiring at least a year’s worth of credits in similar fields often are comparable to those of workers with more traditional four-year degrees. Moreover, there is sound evidence that these benefits can accrue to workers young and old and to those with strong or weak academic backgrounds. For example, 65% of individuals who received training certificates in high-return fields had grade point averages (GPAs) of C or lower in high school, compared to only 15% of individuals who earned four-year degrees in high-return fields. This difference suggests that poor academic preparation need not be a barrier to obtaining a high-return credential.
Even with the best of intentions, however, many people seeking career advancement ultimately choose training programs that do not suit their needs or the needs of local employers, and many do not complete the training programs. Some, uncertain of the outcomes, hesitate to invest time and money in training at all. These poor choices represent a troubling loss of economic gains for workers and for taxpayers.
A main reason for such poor results is that many people interested in pursuing new or upgraded skills cannot obtain reliable information about what training programs are available, how completing programs affects earnings, and whether they have the attributes needed to complete the high-return programs. The good news is that these information deficits can be resolved.
Toward this goal, we offer a plan that is built on providing prospective trainees with the information they need in ways that make it easy to use and understand. The plan, which will be organized on a state-by-state basis, will feature a mix of online information systems coupled with assistance from career counselors. The online systems could be accessed at workers’ homes, public libraries, campus career centers, and public One-Stop Career Centers supported by the Workforce Investment Act. The help from career counselors could be integrated into support programs at training institutions and at One-Stop Centers.
To drive the plan forward, we propose a competition among states for grants to enhance information collection and dissemination and expand the availability of staff-provided career guidance. Statistical analyses and educated opinion suggest that our plan will work. The competition will definitively assess its overall success, identify which of its components are most effective in terms of performance and cost, and provide a blueprint for maximizing returns for workers and the nation alike.
Our hope is that the competition will be jointly run by the U.S. Departments of Education and Labor. To be effective, the competition will require $15 million over five years to fund at least two state proposals.
Big payoff, but poor choices
Evidence shows that many career and technical training programs lead to high-paying jobs and stable careers. In fact, young adults with a two-year degree in a technical or industrial field or a certificate requiring at least a year’s worth of credits in similar high-return fields often increase their earnings by 33% or more, to levels comparable to those of workers with more traditional four-year degrees. In contrast, students with two-year degrees in low-return fields seldom enhance their earnings at all.
Unfortunately, a recent study in Florida showed that three out of four young community college students either completed low-return programs or did not complete enough high-return courses for them to have much of an effect on their earnings. Similar results have been observed in studies in the state of Washington and in Pittsburgh, covering unemployment insurance claimants of all ages.
To better understand why students often make poor choices, we interviewed staff who advise students and other participants at community college and workforce training programs. The close-to-unanimous conclusion from our discussions is that the prospective trainees face systematic information deficits that hinder decision-making. For example:
- They are not aware of the wide range of programs available at local community colleges and for-profit training institutions. In particular, although they are familiar with academic programs leading to two- and four-year degrees, they often are unfamiliar with the wide range of available high-return certificate and career-oriented two-year programs.
- They have limited information about how returns vary among programs. They overestimate the returns from academic programs and underestimate the returns from career-oriented programs, especially those in building trades and protective services. They fail to recognize that some high-return programs can be completed quickly, whereas others take years to complete. They also fail to recognize that demand for some skills is widely distributed across the country, whereas other skills are in high demand only in some locations.
- They have difficulty assessing whether their schooling and experience are adequate for them to complete programs. On the one hand, they underestimate the importance of academic preparation in certain high-return fields—such as those related to science, technology, engineering, and mathematics (STEM)—and they fail to recognize when their STEM skills are not strong enough to complete certain high-return programs. On the other hand, they fail to recognize that they have skills needed to obtain high-return certificates in areas such as health care, protective services, auto mechanics, plumbing, and heating and air conditioning repair and installation.
- They have great difficulty obtaining effective career counseling. They may have few friends or relatives to turn to who are knowledgeable about training options. This is often true for low-income workers, blue-collar displaced workers, and children of immigrants who may be the first in their families to pursue further career training.
- They are not able to adequately compare the net returns across similar programs at different institutions. By not factoring in differences in costs, they sometimes select high-return, high-cost, for-profit programs from which, after repaying loans, the net benefits are no higher than from lower-return but much less expensive community college programs. Although many for-profit programs offer high-return training that more than offsets their high costs, some advertise misleading statistics about benefits and costs. In addition, for-profits spend far more on advertising than community colleges do. The advisers we surveyed said that potential trainees too readily accept advertising claims without assessing their accuracy or carefully weighing the benefits and costs of alternatives.
One factor contributing to such awareness gaps is that community colleges spend billions of dollars on instruction but only tiny amounts on support services. The counseling that takes place is aimed toward helping students select the courses they need to complete a program—after they have already selected a program of study. There are few organized efforts to help prospective trainees make sound choices of programs that further their goals and complement their skills. At most community colleges, the ratio of students to career counselors is greater than 1,000 to 1.
There is some evidence that counseling students does help and can be a key element of successful dropout prevention programs. Community college counselors reported that their services frequently prevented prospective trainees from enrolling in programs that they were unlikely to complete, that were unlikely to improve their career prospects, or that were inconsistent with their interests and constraints.
There is also evidence that counseling can be done at low cost. One-Stop Career Centers provide these services to all applicants for training vouchers to help them select an appropriate program. The services, which include individual and group counseling with well-trained staff, cost less than $400 per person. At the conclusion, each individual has completed a form, similar to a college application, based on his or her own research and the information obtained from counseling. The form describes the likely outcomes of the best available training options, the requirements to complete those options, the extent to which the individual meets those requirements, the direct and indirect costs of the training, and how those costs will be met.
Although there is unambiguous evidence that many career-oriented training programs are capable of increasing the earnings of workers with diverse backgrounds, the evidence is less strong for how better information and improved assessment and counseling would affect students’ selection and completion of high-return programs. This is precisely the knowledge gap that our proposed competition is intended to fill.
Shape of the competition
Most states, aided by substantial data development funding by federal and state governments, already have a wealth of basic data on various aspects of workforce training. Several states are using the data to produce relevant performance measures, and a few states even make that information available online. But overall, the systems have not produced much improvement in the completion of career and technical training programs that offer high payoffs for participants.
In part, this is because users are unaware of or lack the means to access the systems. A more central problem is that the information is often not presented in ways that make it useful. Potential trainees who currently make the poorest training choices often have the least experience and preparation in using data to make complex decisions. This is especially true for individuals who did not do well academically in high school and have had little, if any, postsecondary education.
The competition we propose will offer grants to states to use their own existing longitudinal data systems to fill major information gaps and then deliver the information to prospective trainees in a meaningful way. The grants would support efforts in four areas. These are: (1) assembling the data needed to make sound decisions; (2) organizing the data to produce relevant measures, including those that accurately reflect the payoffs from training programs and are needed to identify high-return fields and programs; (3) disseminating the information using systems that combine use of computers and staff in a way that improves training choices; and (4) sustaining systems that prove to be cost-effective. Although the primary focus would be on helping prospective trainees make the best possible choices, the competition would also create incentives for administrators and policymakers to respond to changes in those choices by moving resources from low-return programs where few workers end up with better jobs to high-return programs where many workers end up with better jobs.
The following section examines each of the program components in turn.
Assembling data. States would assemble longitudinal data that link completion of specific education and training courses to labor-market outcomes. Included would be data on completion rates and post-program earnings of participants in specific programs; on program attributes such as field of study, duration, and cost; and on participants’ backgrounds, including age, gender, years of education, high-school grades, number of years of high-school math and science courses, and pre-program earnings.
Much of this information is already available. Although some of the data were collected in response to the No Child Left Behind Act, the major impetus came from the Department of Education when it made more than $600 million available through its Statewide Longitudinal Data Systems (SLDS) program, and from the Department of Labor when it made about $30 million available to add workforce data to the education data. Today, at least one-third of the states have a full system in place that includes secondary, postsecondary, and earnings data, and most of the remaining states lack only the inclusion of wage record data, which are not especially difficult or costly to include. Some states have data going back 10 or 20 years, which are very useful for assessing the long-term effects of training and how the effects vary under different economic conditions.
For this component of the competition, states would be required to identify the sources of data they intend to use and how they would be matched at the individual level, including safeguards to protect personal privacy.
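The individual-level matching and privacy safeguards described above can be sketched in a few lines of code. This is a minimal illustration only: the record layouts, field names, and salted-hash scheme are assumptions for demonstration, not the design of any actual state longitudinal data system.

```python
import hashlib

def pseudonym(identifier: str, salt: str = "state-held-secret") -> str:
    """Replace a direct identifier with a salted one-way hash before matching,
    so records can be linked without circulating the raw identifier."""
    return hashlib.sha256((salt + identifier).encode()).hexdigest()

# Hypothetical education records: program enrollment and completion.
education = [
    {"id": "111-22-3333", "program": "IT certificate", "completed": True},
    {"id": "444-55-6666", "program": "Nursing AAS", "completed": False},
]

# Hypothetical wage records from the unemployment-insurance system.
wages = [
    {"id": "111-22-3333", "earnings": 45000},
    {"id": "444-55-6666", "earnings": 21000},
]

# Link the two sources on the pseudonymous key, never on the raw identifier.
earnings_by_key = {pseudonym(w["id"]): w["earnings"] for w in wages}
linked = [
    {"program": e["program"], "completed": e["completed"],
     "earnings": earnings_by_key.get(pseudonym(e["id"]))}
    for e in education
]
```

The point of the sketch is that the linked analysis file carries program and outcome information but no direct identifier, which is the kind of safeguard a state proposal would need to spell out.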
Organizing data. States would use the data to demonstrate how post-program earnings and completion rates vary depending on the characteristics of the programs, characteristics of the participants, and characteristics of the local labor markets. Whereas nearly all states have the data required to estimate expected completion rates and earnings, or could assemble these data relatively easily, only a few states have organized the data to provide the information required to help actual and potential trainees improve their choices. These states have used the data mostly to produce basic tabulations of the number of students in a training program, number completing the program, basic characteristics of the average student, number employed, and earnings over different periods.
A key goal of the competition is to encourage states to produce statistics that potential trainees can use to assess their probability of completing different programs, estimate the boost in earnings they could achieve after completing the programs, and develop realistic estimates of direct and indirect program costs. Without having all three types of information, potential trainees might enter high-return programs that they would be unlikely to complete or that would cost more than students’ expected increases in earnings.
Constructing simple measures of who participates in the programs and how they fare in the job market can yield valuable information on program completion rates and subsequent earnings by field of study. Analysis can also show the importance of program length and intensity, trainee characteristics such as academic preparation that affect outcomes, and labor- market characteristics relating to local demand for workers in different fields.
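The simple measures described above amount to grouping linked records by field of study and summarizing completion and earnings. The sketch below shows the shape of that computation; the records and figures are invented for illustration, not drawn from any state data system.

```python
from collections import defaultdict
from statistics import median

# Hypothetical linked trainee records (field of study, completion, earnings).
records = [
    {"field": "IT", "completed": True,  "earnings": 46000},
    {"field": "IT", "completed": True,  "earnings": 44000},
    {"field": "IT", "completed": False, "earnings": 28000},
    {"field": "Cosmetology", "completed": True, "earnings": 22000},
    {"field": "Cosmetology", "completed": True, "earnings": 24000},
]

# Group records by field of study.
by_field = defaultdict(list)
for r in records:
    by_field[r["field"]].append(r)

# Summarize each field: completion rate and median earnings of completers.
summary = {}
for field, rows in by_field.items():
    summary[field] = {
        "completion_rate": sum(r["completed"] for r in rows) / len(rows),
        "median_completer_earnings": median(
            r["earnings"] for r in rows if r["completed"]),
    }
```

The same grouping, extended to program length, trainee academic preparation, and local labor-market variables, yields the richer breakdowns the article describes.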
More advanced measures can tailor the information to applicant and program characteristics. For example, prospective trainees can be advised that information technology (IT) specialists earn about $45,000 three years after completing training. However, 90% of those completing IT programs had high-school GPAs of 3.0 or better and completed at least three years of high-school math courses. Trainees also could be informed that IT graduates living in cities with substantial high-tech employment earned about $15,000 more than IT graduates living in small cities and rural areas far from high-tech centers. Further, prospective trainees who lack the academic preparation that makes completion of IT programs likely could be given comparative information about health care, protective services, and other training programs that offer high wages and high probabilities of completion to trainees with lower levels of academic preparation.
Growing recognition of the importance of providing reliable information about the returns and costs of training is demonstrated by the Department of Education’s Gainful Employment regulations, which require certain education institutions, including for-profit groups that offer courses that do not lead to a degree, to disclose a range of information to prospective students. The disclosure requirements, however, assume that the information would be understood sufficiently to reduce unreasonable risk-taking by prospective trainees, especially those who might incur large debts to obtain high-cost training from for-profit institutions. Our proposed competition would complement Gainful Employment by identifying which statistics mislead prospective trainees and which set of well-rounded statistics, together with assistance in interpreting them, would achieve the regulations’ underlying goals.
In the end, these calculations should provide easy-to-understand metrics to help stakeholders navigate the world of workforce development programs. For this component of the competition, states would be required to describe what statistics would be produced, how the data collected would be used to create those statistics, and what group or body would be charged with the task.
Disseminating information. States would develop ways to display the information and provide staff assistance when necessary, so that stakeholders with different levels of experience in using data to make complex decisions end up making sound decisions. States would be free to propose creating and testing a range of systems to display and disseminate information as well as provide staff assistance. Whatever systems the states implement, they will need to develop rigorous methods to measure their overall effectiveness and how different elements affect users with different characteristics. Particular attention should be given to devising systems that combine online and staff assistance, so that individuals who often fail to make sound decisions get the help they need. This group includes individuals with the poorest academic preparation and least experience in using data to make decisions.
One option that states might choose for displaying information could be posting at-a-glance report cards on the Web that describe training programs and their performance. Report cards might be developed and tested with three (or more) different levels of complexity.
States might start by developing a basic report card that draws from the SLDS program. The report card could have a menu-based system similar to the Web-based systems commonly used to find, for example, the lowest airfare. At the most basic level, a potential trainee could specify program characteristics, and the Web-based system would display a menu of appropriate programs along with their completion rates and earnings. For example, potential trainees could enter items such as the field of interest, the location of interest, cost, duration of the program, whether full-time attendance is required, high-school GPAs of completers, expected earnings gains of completers, and percentage typically completing.
The search engine would select the specific programs that meet the user’s criteria in an order specified by the user, such as from highest to lowest expected returns. The user would review the information and alter the search criteria to narrow the search to the most relevant options. As the search is narrowed, the user could request that the system present a screen that directly compares one program with other programs.
Florida and other states have systems that use the SLDS to place this type of information on the Web, along with information about the cost of the programs, entrance requirements, and duration. For example, a prospective trainee could inquire about registered nursing programs and certified nursing assistant programs in Miami and then compare differences in earnings, completion rates, cost, and duration across these two types of programs and within each type.
To build on the basic report cards, states could create an intermediate-level system where trainees would enter personal characteristics to obtain more tailored choices. The list of programs to be considered could be narrowed by putting in personal characteristics such as highest level of education, GPA, number of math courses completed, grades in those courses, and characteristics of programs of interest, such as cost, duration, flexibility of when and where courses are offered, and fields of study. By providing much more accurate information about the individual’s probability of program completion, the intermediate system would quickly narrow consideration to programs that have a high potential for completion and for generating high returns for the individual user. Thus, the intermediate system would reduce the burden placed on users of determining the extent to which general statistics provided by the basic system would apply to them.
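Tailoring completion estimates to a user’s background, as the intermediate system would do, can be as simple as looking up historical completion rates for the subgroup the user belongs to. The subgroup rates and field names below are invented for illustration; a real system would estimate them from the state’s linked longitudinal data.

```python
# Historical completion rates by (field, GPA band, years of high-school math).
# All values are hypothetical.
subgroup_rates = {
    ("IT", "3.0+", 3): 0.75,
    ("IT", "3.0+", 2): 0.55,
    ("IT", "<3.0", 2): 0.30,
    ("Health care", "<3.0", 2): 0.65,
}

def personal_estimate(field: str, gpa: float, math_years: int) -> float:
    """Look up the completion rate for the user's subgroup, falling back
    to a conservative default when the subgroup is unobserved."""
    band = "3.0+" if gpa >= 3.0 else "<3.0"
    return subgroup_rates.get((field, band, min(math_years, 3)), 0.40)

# A user with a 2.5 GPA and two years of math sees a low IT estimate but a
# much higher health-care estimate, steering the search toward programs
# with a realistic chance of completion.
it_estimate = personal_estimate("IT", 2.5, 2)
health_estimate = personal_estimate("Health care", 2.5, 2)
```

A production system would use a fitted statistical model rather than a lookup table, but the effect on the user is the same: general report-card statistics are replaced by estimates that apply to people like them.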
An advanced system using report cards would include all the features of the intermediate system, along with a Web-based assessment of the potential trainee’s interests and of the attributes that affect the probability of completion. For example, there are Web-based versions of both the Armed Services Vocational Aptitude Battery and tests offered by ACT that potential trainees could take to assess their level of academic and work-related skills as well as the careers the test-taker would be best qualified for and find most interesting. When entering such test results into the state system, prospective trainees could widen the range of programs to be considered or narrow choices to programs with the best salaries and highest levels of personal satisfaction.
Prospective trainees also could use community college entrance exams and other tests to determine whether they qualify for specific programs. An important adjunct to these tests would be an online tutorial system that would help potential trainees brush up on skills they once mastered or even develop new skills needed to gain entrance to programs. Pueblo Community College in Colorado, in association with the local One-Stop Center, has developed this type of system. It has helped many trainees enter high-return training credit programs without having to first complete time-consuming and costly remedial courses. Such a system would be especially beneficial to workers who are unaware that community colleges require passing entrance exams to enter many programs, have rusty skills because they have been out of school for many years, or cannot afford the time or expense of completing remedial courses.
In addition, the same systems that collect data and create the report cards for use by prospective trainees can produce metrics that are useful to decisionmakers who want to understand how to better serve program participants. In particular, administrators could use information on labor-market returns to adjust course offerings—dedicating more resources to programs that meet trainees’ needs and cutting back on programs that are mismatched to local employer demand. Greater transparency regarding performance will also exert competitive pressures on programs to improve outcomes.
Sustaining systems. States would need to identify information systems that are unambiguously cost-effective and propose ways to permanently fund effective systems after grant startup funds are exhausted. Sustainability is a relevant component of this competition because precedence in awarding funds will go to states with realistic plans to implement cost-effective systems. It also will give states opportunities to think about ways to create incentives for trainees to use the systems to achieve their own goals and for program administrators to use the systems to increase the returns from taxpayers’ investments.
For example, states could require recipients of student financial aid to use the systems to develop a realistic plan to achieve their goals, with the expectation that simply reviewing their options will improve their choices without any compulsion to alter decisions. Similarly, states could require community colleges to put in place performance-management systems to assess labor-market effects of career-oriented programs and make resource allocation decisions that increase the number of high-return slots at the expense of low-return ones.
In explaining their proposals, states could demonstrate that their legislatures would approve the funds needed for continued operation. Or they could describe ways to secure funding that would not require new appropriations. For example, they could propose making small reductions spread across many students in state-funded scholarships to cover ongoing costs, or they might propose reducing community college career and technical education programs with low enrollment and using those savings to fund the Web-based systems and provide more career counselors. Such mechanisms might also induce state education agencies and legislative bodies to reallocate resources from low-return to high-return programs based on evidence of their cost-effectiveness.
Close-up view of proposals
As with the Department of Education’s Race to the Top competition, a key to obtaining high-quality proposals is to clearly define the goals of the competition (including the theory of action underlying those goals) and to provide a clear understanding of what is required to win an award. We want the proposed competition to focus attention on the need to create an integrated system where the whole is greater than the sum of its parts: assembling relevant data, creating useful measures from those data, testing alternative ways to disseminate the information so that it improves trainee choices, assessing the cost-effectiveness of alternatives, and putting in place permanent systems once they are proven to be cost-effective.
A second key element of the competition is creating an effective scoring system that ensures that the funded proposals are those with the greatest potential to provide clear evidence of the effectiveness of systems for helping a range of users. As the primary determinant of an award, we recommend using a combination of the expected benefits of the proposed system relative to its cost and the feasibility of creating the system. The goal is to fund innovative proposals that go beyond systems already in place but are still feasible to construct with available technologies.
The first part of each proposal would describe the system, provide a convincing explanation of how it would increase earnings and returns on investment, and explain how those benefits relate to costs. The centerpiece of the description would be an analysis of how the system would alter the choices made by prospective trainees and how those changes would affect earnings and investment returns.
The second part of the proposal would detail how the system would be created within the proposed time and budget constraints. It would include a thorough description of what data would be used, how the data would be organized to estimate completion and earnings, how the estimates of completion and earnings would be developed, how those estimates and other information would be accessible on a Web site, and how end users would extract information from the system. A key component of this discussion would be describing prior experience in performing each task and the qualifications of the team members on the project.
The third part of the proposal would describe how the proposed system would be tested, the types of training programs and users that would be included in the test, and how the benefits of the system would be measured. The more inclusive the test and the greater the rigor of the test, the more points would be awarded.
Although the estimate of the expected value of the proposed system would be the primary basis for making awards, no award would be made unless the proposed tests were sufficient to determine whether the system was cost-effective for at least some users and some types of programs. However, the review panel could work with a state with a promising proposal to refine the tests to the point where reasonable accuracy would be achieved.
The fourth part of the proposal would describe the funding mechanisms that would be used to sustain systems once they are demonstrated to be unambiguously cost-effective. This section also could describe how states would ensure that trainees and program managers use the systems effectively and thereby generate the savings that could be used to sustain the systems.
For example, trainees receiving state grants could be required to produce coherent plans that examine local demand for the skills being pursued, the probability of program completion, direct and indirect costs, and how the out-of-pocket expenses would be covered before registering for programs. Community colleges could be required to conduct orientations for first-time students before they register for classes or after registering but before the start of classes. That orientation would include group sessions providing an overview of how to use the information systems to identify and select effective career training plans, as well as group and individual sessions to review plans with career counselors. Similarly, program managers could be required to oversee the development of information on workplace outcomes by program and student characteristics, and then use the information to provide feedback to various departments or to make decisions on resource allocations.
Our view is that awards should go to states whose proposals describe systems with the best chances of being highly cost-effective, as long as they provide the means to determine their effectiveness. However, if there are proposals that are roughly equivalent in terms of expected value and clarity of evaluation, preference should go to proposals that offer the system with the greatest promise of being sustained.
Returns on investments
We expect that new systems developed under the competition grants will lead to at least a 10% shift in outcomes among students who currently fail to make sound choices when seeking career-enhancing education and training. This shift would substantially reduce the percentage of students who leave college after a year or so with no career-enhancing credential, or leave with a low-return two-year degree. If 10% of students end up making better choices by obtaining a high-return certificate or two-year degree, the social return on the investments in training programs also would increase significantly. And perhaps an even larger positive effect could come from improved information and counseling that induces workers to seek training, particularly those who do not enter training programs because they are uncertain or even skeptical of their benefits and costs.
We are optimistic that the demonstrations will capture the best available information and make it available in practical and innovative ways that will yield many rewards. We are also optimistic that demonstrating in a few states precisely how much difference improved information and better decision-making help would make for prospective trainees will lead to widespread adoption of effective systems.
Given the problems of current education and training efforts, as well as the tightening budgets of the many people still out of work, we think the time is right to set the wheels of competition—a historic driver of the nation’s progress— in motion.
Louis S. Jacobson is president of New Horizons Research in McLean, Virginia. Robert J. LaLonde is a professor in the Harris School of Public Policy at the University of Chicago. This article is adapted from Using Data to Improve the Performance of Workforce Training, a white paper they prepared for the Brookings Institution’s Hamilton Project.