By Theodore J. Marchese
Social mobility -- the intrinsic opportunity Americans possess to advance in their lives -- is an underlying value in United States society. Nowhere, perhaps, is this more evident than in the acquisition of knowledge. This snapshot of post-secondary education in the second half of the 20th century, written by a veteran observer, clearly reflects the system's evolution and expansion, as well as the manner in which colleges and universities respond and adapt to changing social and economic needs. Following this article is a brief consideration of The Essence of the Educated Person. A third piece briefly outlines The Accreditation Process, carried out by non-governmental peer bodies for higher education and by state and local authorities for primary and secondary education.
Roughly a half-century ago, at the beginning of the post-World
War II era, the United States
already claimed a well-developed system of higher education, with
1.5 million students enrolled
on some 1,700 campuses across the country.
It was a system with its own history. It encompassed
universities that combined, under one roof,
English-style undergraduate colleges with German-style graduate
and research faculties. It had
ended the hegemony of the classics by admitting practical studies
such as agriculture and
engineering to the curriculum in the 1860s, and coursework in
business, health, and numerous
other fields during the first four decades of this century. It
had invented the course and credit-hour system in the 1890s as a
means of encouraging transfer between institutions, and
accrediting
associations -- run by the colleges themselves -- to assure
quality. And by the early decades of
this century, the system had developed faculties to rival in
intellect the best in the older European
universities.
It should be noted that like their older counterparts, the
newer American universities, even in
1945, were elitist, male, white and relatively aloof from
society. Yet however uneven or
parochial they were, these were the institutions that had
educated for the nation the likes of
Presidents Thomas Jefferson, John and John Quincy Adams, Theodore
and Franklin D.
Roosevelt; poet Walt Whitman and novelist Henry James; pioneer
education theoretician John
Dewey and social activist Jane Addams; and the Rev. Martin Luther
King, Jr., leader of the
African American civil rights movement.
In the five decades since World War II, America rebuilt and
greatly expanded participation in its
system of higher education, by a stunning factor of 10, in an
effort to make educational
opportunity more open and accessible, fairer and more relevant.
Government and industry came
to see higher education as an investment in an educated workforce
that would propel the nation
to new levels of economic well-being and national security.
Individuals came to see higher
education as an indispensable investment in their own futures, as
a route to social mobility and
personal fulfillment. The combined result was a system that in 50 years ballooned from 1.5 million to 15 million students, becoming the world's first example of "mass" higher education. In the process, to accommodate this great change in scale, entirely new ways had to be found to govern colleges, finance campuses and students, and assure quality and accountability.
Postwar Foundations
Three events around the end of World War II set the stage for
this growth.
In 1944, the U.S. (federal) Government enacted the G.I. Bill,
which promised the nation's
military men and women that at war's end, Washington would pay
for them to attend college or a
trade school. Millions of returning veterans, few of whom would otherwise have aspired to college, chose to pursue higher education, flooding enrollments well into
the 1950s. These mature,
motivated young adults did well on campus, graduated into a host
of occupations and
professions, and became such a model of success that the very
idea of college attendance, and
its benefits, took on new salience for Americans.
The second milestone development came in 1947. Noting the
impressive academic
achievements of the veterans already enrolled, a presidentially
appointed commission proposed to President Harry S. Truman that not
one-tenth but fully one-third
of all youth should attend college -- and that it would be in
the nation's best economic and
social interests to provide the necessary opportunities. In the
years just after the war, therefore,
the experiential and conceptual underpinnings for expansion were
set firmly in place.
The third significant item was a widely-read report issued in
1945 by Vannevar Bush, head of the
respected U.S. Office of Scientific Research and Development.
Bush, an engineer and former dean at the Massachusetts Institute of Technology, had mobilized wartime
efforts to bring to battle radar,
penicillin and a host of new weapons systems -- most notably the
atomic bomb. Acknowledging
that so many of these successes derived from a foundation of
basic research, Bush created the
vision of science, in his words, as an "endless frontier" for the
nation, investment in which would
bring untold dividends in national security and social
advancement.
Out of the war effort came a whole generation of top
scientists committed to national security
work, men (and some women) who moved back and forth between
government service, national
laboratories and the campus. In 1950, the U.S. Congress
chartered the National Science
Foundation and charged it with promoting research and development
and the education of
scientists. Fed by Cold War fears, U.S. Government-sponsored university research soon became a major enterprise, with federal expenditures exceeding $1 billion by 1950. In subsequent years, public and private expenditures for university-based contract research -- peer reviewed and competitively awarded, as Vannevar Bush had urged -- rose to $21 billion by 1996. This doubtless brought many benefits to the United States, but it also greatly advanced science itself within the university and benefited the roughly 100 institutions of higher learning that garnered 95 percent of these federal funds.
The Booming Sixties
In the 1960s, the higher education system underwent intense
expansion and development. The
immediate cause was the arrival at college doors of the "baby
boom" generation -- the
heightened numbers of 18-to-22-year-olds born in the aftermath of
World War II. These young men
and women, often the sons and daughters of G.I. Bill
beneficiaries, had lofty educational
aspirations. As a result, between 1960 and 1970, college
enrollments jumped from 3.6 to 8
million students, with aggregate expenditures rising from $5.8 to
$21.5 billion. To accommodate
this enrollment rise, existing universities and four-year
colleges grew in size, helped by federal
construction loans and high capital investment by the sponsoring
states.
The most noteworthy development of the decade, however, was
the emergence of a distinctive
new institutional form, the comprehensive community college.
"Junior" colleges, offering the
first two years of instruction for students intending to transfer
to baccalaureate (four-year)
institutions, had been a fixture since the early 1900s. In the
early 1960s, a new vision for such an
institution -- explicit community-relatedness, open-door
admissions, and high status for
vocational-technical studies -- emerged. Soon, every community
wanted its own "democracy's
college," as community colleges came to be known. In the course
of the decade, new community
colleges opened for business at the rate of one per week -- a
total of 500 in ten years.
Enrollments soared from 453,000 students to 2.2 million. Equally important, although two-thirds of the earlier enrollments had been in programs designed for transfer to a four-year institution, by the 1970s, 80 percent of all community college students were in shorter-term programs -- preparing to be engineering technicians, health care workers, or law enforcement officers, among dozens of other occupations. If a community needed
trained workers for a new plant,
adult basic education, certificate programs for day-care workers,
or English-language training for
recent immigrants, its community college was there to respond.
The great expansion of existing institutions, combined with
the creation of new ones, raised
needs at the state level for new mechanisms of planning,
governance and finance. As a result, in
the 1960s, many states created high-level boards to govern or
coordinate their burgeoning
systems, with the role of overseeing a planned growth of
public-sector higher education. A
notable and influential model was California's 1960 Master Plan,
designed by Clark Kerr [president emeritus, University of
California]. It
specified that the top 12 percent of all California secondary
school graduates would be
guaranteed admission to the prestigious University of California
-- which expanded to nine campuses statewide to accommodate these numbers. The plan also
stipulated that the top 30
percent would be guaranteed admission to a campus of the
California State University, which
eventually grew to 22 campuses; and that every secondary school
graduate would be guaranteed
enrollment at a local community college (106 of which were
eventually developed). To assure
access for all students, public-college fees were to be kept as
low as possible, with the state
funding most of the costs of four-year university attendance.
Community colleges, with their
more frugal budgets, were financed one-third by the state,
one-third by the sponsoring
community and one-third from student fees.
Students and Markets
Through the mid-1940s, Washington's role in higher education
was restricted mostly to data
gathering. Education at all levels, many believed, was a matter
reserved to the states by the
Constitution; federal support would bring unwanted "intrusion" if
not "control." But after
World War II, with national security interests coming to the
foreground, support for university-level research increased. In
the late 1950s, after the Soviet Union launched its Sputnik satellite, national defense was invoked as a reason to support the training of engineers, scientists and foreign-language specialists, as well as various campus building programs.
In the 1960s and 1970s, a consensus emerged that
special-purpose federal programs should be
curtailed in favor of federal aid to students themselves, in
support of the national commitment to
equality of access without regard to accidents of birth. The
Higher Education Act of 1965 and
supplementary Education Amendments of 1972 created today's
system of student financial aid.
It combines grants, work opportunities and loans to help
full-time students meet the tuition and
living costs of college attendance.
Significantly, the amount of aid for which a student qualified was determined both by family income and by the costs of the college of the student's choice.
In other words, a young man or
woman from a lower-income household might receive $2,000 to
attend a public institution (a
state university), and $10,000 to enroll in a private
institution. The aim of this provision was to
"level the playing field" between public and private higher
education and provide every student
with access both to the system in general and specifically to the
college of one's choice.
Additional broadening enactments in the late 1960s and 1970s
forbade any college receiving U.S.
Government allocations to discriminate on the basis of race,
gender, religion, national origin or
handicap.
The 1972 amendments went one step further and placed a distinctive mark on American higher education: they awarded financial aid to the student, not the institution. In effect, all colleges,
public or private, would have to compete for their enrollments.
The hope was that a new,
student-driven "market" for higher education services would
compel schools to focus more on
student needs. As enrollment growth continued through the 1970s,
the effects of this change
were initially small. But in the 1980s, as the size of the
college-age population began to decline
(the beginning of the post-baby-boom era), all types of colleges
had to learn how to market
themselves, simply to maintain their enrollments. In the years since, and continuing today, highly qualified students may receive literally hundreds of recruitment contacts as enrollment season begins, or even earlier.
Over time, then, the "postsecondary" marketplace was remade.
A relative handful of prestigious
universities and colleges, many of them private, assumed
commanding niches that allowed them
to admit just a small fraction (10-20 percent) of the able
students applying. As one measure of this resulting stratification, the top 30 colleges in the United States today enroll 80 percent of the minority students in the country with Scholastic Aptitude Test (SAT) scores of 1,200 or above (out of a possible 1,600). At the opposite end of the market, marked by
lower tuitions and near-open-door
admission, a score of aggressive, entrepreneurial universities
have sprung up. The newest, the
University of Phoenix (Arizona), enrolls an astonishing total of
40,000 students at some two
dozen sites across the American West.
Taking Stock
To foreign visitors, a remarkable feature of American higher
education is the degree to which it
is market-driven and free of central direction. Indeed, all
institutions, large or small, public or
private, compete with one another for faculty and administrative
talent, research and foundation
grants, legislative appropriations and alumni support, and --
overall -- public approbation and
approval. In addition, larger institutions compete with one
another through high-visibility
athletic programs.
At the same time, U.S. institutions tend to be relatively
dynamic and responsive to economic and
social changes, at least as reflected by markets. Even without
centralized manpower training, the
supply of and demand for such trained professionals as engineers, physicians and educators have remained roughly in balance. On the disappointing side, most
observers sense that an enhanced
responsiveness to students hasn't translated into fundamental
improvements in the quality of
undergraduate education itself. Nor will markets by themselves
always reward a college's
attention to deeper values, such as a broad-based curriculum or
foreign language study.
That said, the system's overall record of accomplishment
remains impressive. If an original goal
was to provide equal access, it has been substantially achieved.
Women, one-fourth of all
enrollments in 1950, represent 55 percent of the student
population at present. Today's
enrollment of African-Americans (10 percent), Hispanic-Americans
(7 percent),
Asian-Americans (6 percent), and
Native Americans (1 percent) approaches their percentages in the
general
population. Overall, 77
percent of all 18-year-olds finish high school, and two-thirds of
this group proceed to college.
Additionally, among two- and four-year public campuses, adult,
part-time enrollments are now in
the millions and account for over 40 percent of total
enrollments. On many campuses, the
median age is 25 to 30. Today, thanks in part to
degree-completion programs for people in their
30s and 40s, fully a fourth of the U.S. adult population holds a
college degree.
Perspectives
Visitors to these shores notice several other distinctive aspects of U.S. higher education.
One is its sheer scale and cost. Fifteen million students
attend some 3,700 postsecondary
institutions, ranging in size from a few hundred students to 50,000 at large state universities in Ohio and Minnesota. Aggregate expenditures for
higher education now exceed
$200 billion a year -- a mind-boggling 2.4 percent of the gross
domestic product (GDP),
compared, for example, with the 0.9 percent of GDP expended in
the United Kingdom. Of the
$200 billion, about half represents the allocation of public
money, the other half income from
tuition and fees, sale of services, endowments, and voluntary
giving (this last category alone
totaled $14 billion in 1995-96). A typical large
research university (with a teaching
hospital attached) has an annual operating budget of $2.5
billion, enrolls 35,000 students,
employs 4,000 faculty members and 10,000 support staff, raises
$150 million a year from private
donors and boasts an endowment of over $1 billion. The average
public four-year campus
charges $3,000 a year in tuition and the average private college
charges $13,000 a year -- with
roughly half the students at each type of institution receiving
financial aid.
The huge tuition differential between public and independent
higher education particularly
puzzles foreign visitors. How can the more expensive private
sector institution survive? The
tuition difference arises, of course, because the private college
does not enjoy direct public
support. In fact, however, to the student from a middle- or
low-income family, the tuition
differential may all but disappear through financial aid (with
perhaps a larger loan to repay
upon graduation). Still, the bargain of public higher education,
and its availability in virtually
every community, has brought to it most of the enrollment gains
of the past decades: public colleges enrolled half of all students in 1945; today they enroll 78 percent.
Is the private sector doomed? Not at all. Private colleges
prosper by offering distinctive
curricula, more inviting campus environments, and (often) the
prestige of their degrees. Many
continue to appeal to their founding constituencies, religious or ethnic. Private higher education
tends to be more innovative, entrepreneurial and values-driven,
and serves as a creative balance
to state-controlled higher education.
Another uncommon feature of the American college, both private
and public, is the character of
undergraduate studies. In the course of four years of study, the
average student will devote about
a third of the time to studies in a "major" (economics, physics, business, and so on); a third to elective and supporting courses; and -- mostly in the first two years -- a third to general education. The last of these represents the
university's historic commitment
to produce graduates who will study broadly; appreciate science,
philosophy and the arts;
learn the habits of democracy; and develop higher abilities to
write, find and use information,
think critically and work with others.
Since much of a college's general education program is
prescribed for the student, it inevitably
raises questions of values. Faculties engage in endless debate
as to what should be taught and
learned. Some hold out for a canon of Western classics; others
argue for the inclusion of
multicultural topics and voices. All the while, with the flood
of new, often less-well-prepared
entrants into college and increased student job-mindedness, it
becomes more of a challenge for
schools to maintain student (and sometimes faculty) enthusiasm
for a broad base of electives.
And so various innovations have been employed in teaching,
curricula and technology to engage
students and help them succeed.
Statistically, as the century winds down, U.S. institutions of
higher learning annually award about
540,000 associate (two-year) degrees, 1.1 million bachelor's
(four-year) degrees, 400,000
master's degrees, 76,000 professional degrees (in law, medicine,
and other fields) and 45,000
doctorates. Among Ph.D.s, the biological and physical sciences,
mathematics and engineering
predominate; at leading universities, as many as 50
percent of candidates for those
degrees are from outside the United States. Within graduate
schools overall, the growth area is at
the master's level. A constant heightening of labor-market and
student expectations has led to
significant increases in master's-level studies in business,
education and the health professions.
Across all levels of American higher education, the 1990s have
witnessed an explosion in
deployment of information technologies. Most campuses, and
indeed several entire state
systems, are "wired up." Entire libraries are on-line;
technology expenditures totaled $16 billion
in the 1980s, and the figure is expected to have doubled in this
decade. On dozens of campuses,
every student and faculty member now has his or her own computer
(and often a web page); in
35 percent of all classes, professors and students communicate by
electronic mail (via computer).
The effects of all this on modes of instruction and on the
character of student and faculty work are
being intensely scrutinized.
One final development is a consequence of this technological
revolution: a huge growth in
distance education. Ninety percent of all U.S. institutions with
enrollments of 10,000 or more
now offer courses on-line. Coming on-line, too, are a number of
brand new "virtual"
universities, the best-known of which, the Western Governors
University, begins operations
across 11 states in January 1998. Technology, even as it remakes
the classroom, seems poised to
remake the postsecondary marketplace, too.
----------
Theodore J. Marchese is vice president of the American Association for Higher Education and executive editor of Change magazine. E-mail address: [email protected].
U.S. Society & Values, U.S.I.A. Electronic Journal, Vol. 2, No. 4, December 1997