
Policy and Program Studies Service

Review of the Fund for the


Improvement of Postsecondary
Education (FIPSE)
Comprehensive Program

2004

U.S. DEPARTMENT OF EDUCATION OFFICE OF THE DEPUTY SECRETARY


DOC. # 2004-16
Review of the Fund for the
Improvement of Postsecondary
Education (FIPSE)
Comprehensive Program

Prepared for:

U.S. Department of Education


Office of the Deputy Secretary

Prepared by:
Andrea R. Berger
Rita J. Kirshstein

American Institutes for Research


Washington, D.C.

2004
This report was prepared for the U.S. Department of Education under Contract Number ED-01-CO-0026/0015 with
the American Institutes for Research. The project monitor was Daniel Goldenberg in the Policy and Program
Studies Service. The views expressed herein do not necessarily represent the positions or policies of the Department
of Education. No official endorsement by the U.S. Department of Education is intended or should be inferred.

U.S. Department of Education


Rod Paige
Secretary

Office of the Deputy Secretary


Eugene W. Hickok
Deputy Secretary

Policy and Program Studies Service


Alan Ginsburg
Director

Program and Analytic Studies Division


David Goodwin
Director

September 2004

This report is in the public domain. Authorization to reproduce it in whole or in part is granted. While permission
to reprint this publication is not necessary, the suggested citation is: U.S. Department of Education, Office of the
Deputy Secretary, Policy and Program Studies Service, Review of the Fund for the Improvement of Postsecondary
Education (FIPSE) Comprehensive Program, Washington, D.C., 2004.

To order copies of this report, write:

ED Pubs
Education Publications Center
U.S. Department of Education
P.O. Box 1398
Jessup, MD 20794-1398;

via fax, dial (301) 470-1244;

or via electronic mail, send your request to: edpubs@inet.gov.

You may also call toll-free: 1-877-433-7827 (1-877-4-ED-PUBS). If 877 service is not yet available in your area,
call 1-800-872-5327 (1-800-USA-LEARN). Those who use a telecommunications device for the deaf (TDD) or a
teletypewriter (TTY) should call 1-800-437-0833.

To order online, point your Internet browser to: www.edpubs.org.

This report is also available on the Department’s Web site at www.ed.gov/about/offices/list/ods/ppss/index.html.

On request, this publication is available in alternate formats, such as Braille, large print, audiotape, or computer
diskette. For more information, please contact the Department’s Alternate Format Center at
(202) 260-9895 or (202) 205-8113.
CONTENTS

CONTENTS...........................................................................III
EXHIBITS............................................................................VII
ACKNOWLEDGMENTS................................................................VII
EXECUTIVE SUMMARY...............................................................IX
Introduction.................................................................................................................................ix
Methodology...............................................................................................................................ix
Findings......................................................................................................................................ix
To What Extent Did the Funded Projects Fit within FIPSE’s Operating Principles?............ix
To What Extent Have FIPSE Projects Been Institutionalized?................................................x
To What Extent Have FIPSE Projects Disseminated Information?.........................................x
To What Extent Have FIPSE Projects Been Replicated Elsewhere?.......................................x
To What Extent Have FIPSE Projects Produced Convincing Evidence of Effectiveness?.....xi
Conclusion..................................................................................................................................xi
CHAPTER 1
INTRODUCTION.......................................................................1
Overview of FIPSE......................................................................................................................1
Previous Studies...........................................................................................................................2
Current Review............................................................................................................................3
Methodology............................................................................................................................3
CHAPTER 2
FIPSE’S OPERATING PRINCIPLES.................................................7
Principle #1:
Projects Address Widely Felt Issues and Problems.....................................................................7
Expert Reviewers.....................................................................................................................8
Principle #2:
Projects Are Responsive to Local Initiatives.............................................................................10
Problem Identification...........................................................................................................11

Solution Development............................................................................................................11
Principle #3:
Projects Are Comprehensive with Respect to the
Variety of Problems Addressed and the
Range of Institutions and Learners Served................................................................................12
Variety of Problems................................................................................................................12
Variety of Institutions.............................................................................................................14
Range of Learners.................................................................................................................16
Principle #4:
Projects Are Action Oriented.....................................................................................................18
Principle #5:
FIPSE Is Risk Taking.................................................................................................................19
New Versus Established Projects...........................................................................................19
Unproven Ideas......................................................................................................................19
Summary....................................................................................................................................21
CHAPTER 3
THE IMPLEMENTATION AND
INSTITUTIONALIZATION OF PROJECTS..............................................23
Institutionalization Levels .........................................................................................................23
Barriers to Implementation........................................................................................................24
Barriers to Institutionalization...................................................................................................25
Supports for Implementation.....................................................................................................26
FIPSE Funding......................................................................................................................26
Nonfinancial Support from FIPSE ........................................................................................27
Institutional Administrative Support......................................................................................28
Institutional Funding.............................................................................................................28
Supports for Institutionalization................................................................................................29
Administrative Support..........................................................................................................29
Internal Funding....................................................................................................................29
External Funding...................................................................................................................30
Summary....................................................................................................................................30

CHAPTER 4
THE DISSEMINATION OF FIPSE PROJECTS......................................33
Dissemination Activities............................................................................................................33
Dissemination Activities and Institutionalization .....................................................................35
Dissemination Leading to Replication.......................................................................................35
Summary....................................................................................................................................36
CHAPTER 5
THE REPLICATION OF FIPSE PROJECTS ........................................37
Replication Levels.....................................................................................................................37
Barriers to Replication...............................................................................................................39
Administrative Barriers.........................................................................................................39
Faculty Barriers....................................................................................................................40
Change to Original Project Design.......................................................................................40
General Supports for Replication..............................................................................................40
Financial Support..................................................................................................................41
Support of Key Officials........................................................................................................41
Support of Faculty.................................................................................................................41
Support from the Original Projects............................................................................................41
Consultation...........................................................................................................................42
Collaboration.........................................................................................................................43
Effectiveness of Support Provided by Original Project Staff.................................................43
Role of FIPSE Support in Replication.......................................................................................44
Replication in Projects without FIPSE Dissemination Funds...............................................45
Replicating Sites and Dissemination.........................................................................................46
Summary....................................................................................................................................46
CHAPTER 6
THE EVALUATION OF FIPSE PROJECTS..........................................47
Evaluation Guidelines................................................................................................................47
A Review of FIPSE Project Evaluations....................................................................................48
Availability of Evaluations.....................................................................................................48

Evaluators..............................................................................................................................48
Evaluation Methodologies.....................................................................................................48
Evaluation Focus...................................................................................................................49
Evaluation Findings..............................................................................................................50
External Review of Evaluations................................................................................................50
Conclusion.................................................................................................................................51
CHAPTER 7
CONCLUSION.......................................................................53
REFERENCES.......................................................................55

EXHIBITS

EXHIBIT 1
A COMPARISON OF SAMPLED PROJECTS WITH THE POPULATION OF PROJECTS
FUNDED IN 1996, 1997 AND 1998...........................................4
EXHIBIT 2
PERCENTAGE DISTRIBUTION OF GRANTEES
IN THE EVALUATION SAMPLE BY TYPE............................................15

EXHIBIT 3
PERCENTAGE DISTRIBUTION OF GRANTEES
IN THE EVALUATION SAMPLE BY REGION..........................................16

EXHIBIT 4
PERCENTAGE DISTRIBUTION OF PROJECTS BY LEVEL OF STUDENTS IMPACTED....17
EXHIBIT 5
PERCENTAGE OF GRANTS GOING TO MINORITY-SERVING INSTITUTIONS*.......18
EXHIBIT 6
LEVEL OF PROJECT INSTITUTIONALIZATION........................................24
EXHIBIT 7
REPORTED DISSEMINATION ACTIVITIES*..........................................34
EXHIBIT 8
LEVEL OF PROJECT REPLICATION..................................................38

ACKNOWLEDGMENTS

Many people donated their time and expertise to this report. First, we would like to thank the
many people in the higher education community who, despite their busy schedules, took time to
communicate with us about their experiences. Second, we would like to thank the staff at the Fund
for the Improvement of Postsecondary Education (FIPSE). They provided us with data we required
and answered our many questions about the program as a whole and about individual projects. In
particular, Joan Krejci Griggs helped us tremendously as our primary point of contact at FIPSE. We
are grateful for everyone’s enthusiastic cooperation.

The Policy and Program Studies Service of the U.S. Department of Education sponsored this
review of FIPSE. Daniel Goldenberg served as the project officer and provided valuable technical
guidance.

The American Institutes for Research conducted this study with Andrea Berger serving as the
project director and Rita Kirshstein serving as the principal investigator. Other team members
included Elana Benatar, Hilary Cederquist, Alice Davidson, Amy Goodman, and Laura Walton. In
addition, David Rhodes and Holly Baker provided extremely valuable support.

EXECUTIVE SUMMARY

INTRODUCTION
The Fund for the Improvement of Postsecondary Education, referred to hereafter as FIPSE,
was established in 1972 to award grants to postsecondary institutions and organizations for the
purpose of “improving postsecondary educational opportunities.” Although early studies of the
FIPSE program were quite favorable, FIPSE’s Comprehensive Program has not been reviewed in
over two decades. This review focused on five basic research questions:
1. To what extent did the funded projects fit within FIPSE’s operating principles?
2. To what extent have FIPSE projects been institutionalized?
3. To what extent have FIPSE projects disseminated information?
4. To what extent have FIPSE projects been replicated elsewhere?
5. To what extent have FIPSE projects produced convincing evidence of effectiveness?

METHODOLOGY
In addressing the study questions, this review uses a descriptive, qualitative approach. The
review is based on a sample of 60 randomly selected projects funded from 1996 through 1998.
Information came from the project annual and final reports, FIPSE’s online database, project Web
sites, and discussions with project staff and FIPSE program officers. Experts in relevant fields
reviewed a small subset of the projects that were purposefully chosen.

FINDINGS
To What Extent Did the Funded Projects Fit within FIPSE’s Operating
Principles?
FIPSE established a set of five principles to guide funding decisions. In general, the projects
reviewed reflect a balance of these principles. Expert reviewers generally found that projects
addressed widely felt issues. Grantees noted that local issues motivated their projects and that these
issues mirrored those faced at the national level. Thus, even though projects were
responsive to local initiatives, they were applicable to the broader postsecondary community. A
variety of institutions and organizations administered programs for many different types of learners
and problems. These projects were action oriented in that they all focused on implementing a

solution rather than studying the problem. Finally, FIPSE’s reputation for funding innovative
projects was supported by the experiences of the grantees and the evaluations of the expert reviewers.

To What Extent Have FIPSE Projects Been Institutionalized?


Most projects continued in some form after FIPSE funding ceased. About two-thirds of the
projects continued full force, or even grew. About one-fifth of the projects continued with some
reduction of scope. Fewer than 10 percent of the projects no longer existed at the time of the review.
FIPSE staff provided support by helping shape projects, connecting project staff to others with
similar interests, and helping solve problems. In fact, the most consistent complaint about FIPSE
staff was the desire for more of their time. Projects that were institutionalized had sufficient funding
from either internal or external sources and the support of the relevant administration.

To What Extent Have FIPSE Projects Disseminated Information?


Almost all the projects reported engaging in some dissemination activities. Most frequently,
these activities included presentations, but Web sites and published articles were also common.
There was also a prevalence of informal, colleague-to-colleague dissemination.

To What Extent Have FIPSE Projects Been Replicated Elsewhere?


More than one-half of the reviewed projects appeared to have been replicated to some extent.
In getting started, replicating projects experienced many of the same types of issues as the original
project: support of administrative officials was crucial, as were sufficient resources. Getting faculty
and staff buy-in was a particular concern for replicating sites. Replicating sites were usually
supported to some degree by the original project site. This support ranged from e-mail exchanges to
regular visits between institutions. Not all support was unidirectional. In some instances, replicating
institutions had an impact on the original project as well. Eleven of the reviewed projects were
dissemination grants; an additional 10 projects received subsequent dissemination grants, meaning
that about one-third of the projects received funding from FIPSE specifically for replication work.
Although some projects replicated using grants from FIPSE, the evidence suggests that these projects
continued replication work after FIPSE funding ceased and that projects also replicated without FIPSE
funding.

To What Extent Have FIPSE Projects Produced Convincing Evidence of
Effectiveness?
It appears that there was an inconsistent emphasis on and support for project evaluation work
in 1996, 1997, and 1998. Projects were not required to submit evaluations and many projects
provided incomplete reports of their evaluation activities. When evaluation methods could be
determined, the quality varied dramatically. Commonly, projects collected self-report data through
surveys and focus groups. Some projects supplemented the self-report data with examinations of
student outcomes such as retention and course grades. A few projects used random assignment and
blind coders for student assessments.

CONCLUSION
The findings from this review suggest that FIPSE’s Comprehensive Program was
successfully meeting its legislative mandates. Although all projects had their strengths and
weaknesses, the expert reviewers, by and large, believed that the particular projects they reviewed fit
within the operating principles. In general, projects continued after the end of FIPSE funding. Most
projects shared their work with others in the higher education community. Many of these projects
influenced other postsecondary settings. FIPSE’s lack of emphasis on evaluation appeared to result
in some poorly conceived and reported evaluations. Overall, given the challenge of supporting
innovation, FIPSE succeeded in affecting postsecondary education.

CHAPTER 1
INTRODUCTION

OVERVIEW OF FIPSE
The Fund for the Improvement of Postsecondary Education, referred to hereafter as FIPSE, is
a grant program under the auspices of the Office of Postsecondary Education within the U.S.
Department of Education. By federal government standards, FIPSE is a relatively small program.
From 1996 to 1998, the period under review, FIPSE awarded approximately 90 grants per year with
each grant receiving, on average, approximately $70,000 per year (and ranging from $45,000 to
$450,000 over the three years of the grant).
FIPSE was originally established by Congress in 1972, and the authorizing legislation has
changed little in subsequent years. The most recent legislation, reauthorized in 1998, allows FIPSE
to award grants to nonprofit institutions or organizations for a variety of purposes, including
improving educational access for all, linking education to career paths, integrating technology,
reviewing institutional missions, implementing cost-reduction procedures, developing individualized
educational experiences, improving graduate education, and reforming the credentialing process for
new institutions and programs. Clearly these mandates allow a very diverse set of funded projects.
Although FIPSE has focused its grants on different priorities at different times, it has always funded
a wide range of activities.
The FIPSE application process consists of two stages. First, applicants submit a
preapplication, which is a five-page narrative of the proposed project. External reviewers with
expertise in postsecondary issues (including faculty, administrators, and other professionals) evaluate
these short summaries. FIPSE staff use the preapplication summary and the external reviews to
choose a small subset of applicants to continue to the second stage. In the second stage, applicants
must submit a more exhaustive (25-page) proposal. FIPSE program officers answer applicants’
questions and suggest ways to improve the proposals. External reviewers, with expertise in the
proposed project areas, review these proposals and make suggestions. After the expert review,
proposals are read and scored by FIPSE staff. Final funding decisions are made by the FIPSE
director based on the final scores. In the end, fewer than 5 percent of applicants submitting
preapplications receive FIPSE funding (FIPSE, 1996).

FIPSE has several formal grant programs: the Comprehensive Program, the European
Community-United States of America Cooperation Program in Higher Education and Vocational
Education and Training, the Program for North American Mobility in Higher Education, and the
Cooperation and Student Mobility in Higher Education between the United States and Brazil
Program. In addition to these formal programs, FIPSE also supports many congressionally directed
grants. The largest of its formal grant programs, the Comprehensive Program, has changing funding
priorities and can sponsor any project that addresses “important improvements in postsecondary
education” (FIPSE, 2001).
This breadth of funded projects contributes to FIPSE’s visibility. Although small, FIPSE
enjoys wide name recognition in the higher education community. This high degree of visibility
results from the long tenure of the program. In 2002, Change, a magazine focusing on higher
education issues, devoted an entire issue to examples of project work supported by FIPSE over the
previous 30 years (Cambridge, Ewell, Fields, and Miller, 2002). Its visibility is also increased by its
reach. In the last 10 years, FIPSE has funded more than 500 institutions and organizations.

PREVIOUS STUDIES
In 1980, an extensive external evaluation examined the outcomes of projects that FIPSE
funded during its first five years. Prepared by the NTS Research Corporation, this research
concluded:
[The] results of our evaluation of [FIPSE] are largely positive. …Central to this evaluation
was the finding that the agency has in the largest sense accomplished its congressionally
mandated mission to encourage improvement in postsecondary education. This finding was
consistently substantiated by the large amounts of data. (NTS Research Corporation, 1980,
I-9)
On the basis of this evaluation’s generally positive findings, the National Center for Public Policy in
Higher Education (NCPPHE) more recently examined FIPSE’s early years. Guided by the question
“What special qualities made FIPSE so successful and memorable?” this 2002 exploration focused
on structural and operational features of FIPSE and some of the early successful projects.
Both reports provide a wealth of information on the activities and impacts of FIPSE and
FIPSE-funded projects from 1973 through 1978. However, a national study of the FIPSE program
has not occurred in over two decades.1

1 Independent researchers have reviewed some programs more recently, including the Learning Anytime Anywhere
Partnerships and the EC-U.S. Cooperation Program.

CURRENT REVIEW
The U.S. Department of Education contracted with the American Institutes for Research
(AIR) to review FIPSE projects. This review focused on five basic research questions:
1. To what extent did the funded projects fit within FIPSE’s operating principles?
2. To what extent have FIPSE projects been institutionalized?
3. To what extent have FIPSE projects disseminated information?
4. To what extent have FIPSE projects been replicated elsewhere?
5. To what extent have FIPSE projects produced convincing evidence of effectiveness?
As evidenced by the research questions, this review did not attempt to evaluate the impact of
the FIPSE program or its funded projects. Instead, this review describes the degree to which FIPSE
met its objectives. Using more rigorous methods (e.g., random assignment, quantitative outcome
assessments) to evaluate individual FIPSE projects is often feasible. However, applying these
methods to an evaluation of the FIPSE program is both difficult and beyond the scope of this review.

Methodology
AIR’s review of FIPSE focused on Comprehensive Program projects funded during 1996,
1997, and 1998. The Comprehensive Program could sponsor any project that addressed “important
improvements in postsecondary education” (FIPSE, 2001). Within the Comprehensive Program,
some projects received grants specifically to share their existing expertise with others: FIPSE funded
these dissemination grants to assist with replicating the project elsewhere as well as to work on
disseminating information about the project.2
AIR randomly selected 60 projects to review, most of which were funded for three years.
The first 10 projects were selected from a group of 22 dissemination grants funded specifically to
replicate existing projects in new environments. An additional 50 projects were randomly selected
from the remaining projects funded during this period. The final review sample included 24 (out of
74) projects funded in 1996, 17 (out of 67) projects funded in 1997, and 19 (out of 62) projects
funded in 1998. During the review process, AIR identified an additional dissemination project in the
sample. Therefore, the final review sample consisted of 11 dissemination projects. The reviewed

2 One funding program, Disseminating Proven Reforms, was run occasionally to provide grants for the replication of
existing projects. When run, it was a very small portion of each year’s grants (e.g., it funded only 10 grants in
1998). Most of these projects were originally developed with the assistance of a Comprehensive Program grant.
Owing to this close relationship to the Comprehensive Program, and the focus of this review on replication work,
the 1998 funding year for the Disseminating Proven Reforms program was also included in the population.

dissemination projects represented 45 percent of all dissemination grants funded from 1996 through
1998. Because these projects were chosen randomly, they represented the types of projects
commonly funded by FIPSE. Exhibit 1 compares the sample of selected projects with the full
population of FIPSE projects from those grant years on several common topic areas.
EXHIBIT 1
A Comparison of Sampled Projects with the Population of Projects
Funded in 1996, 1997, and 1998

Topic Area                                    Sample %a (n = 60)   Population %
Access, Retention, and Completion                    22                 21
Assessment                                            8                 11
Cost Control/Organizational Restructuring             8                  8
Curriculum Reform                                    22                 26
Faculty Development                                  22                 20
Teacher Education                                    10                 10
Technology                                           33                 33

a Each project could be classified in multiple topic areas. Therefore, the “Sample %” and “Population %” columns
sum to greater than 100.

Although 60 projects is a substantial percentage of the projects funded from 1996 through
1998, it is still too small a sample to make reliable subgroup comparisons (e.g., comparing projects in
different types of institutions). Therefore, most findings are reported for the entire sample.
Data collection occurred in two phases. The first phase examined outcomes for the 60
randomly selected projects. In gathering data for this phase, AIR used many different sources. The
initial information sources were the final reports that grantees submitted to the FIPSE office.
However, these documents were sometimes not available and frequently did not adequately address
all aspects of the project relevant to the specified research questions. When additional information
was needed, AIR staff gathered information from the following sources:

• Additional project documentation submitted to FIPSE (e.g., annual reports and evaluation
reports).
• The FIPSE project database on the Internet.3
• Individual project Web sites.
• Discussions with FIPSE program officers.
• Discussions with project staff members (including project evaluators).
• Discussions with staff members at replicating institutions.
The second phase of the review involved outside experts’ evaluations for a subsample of 16
projects. These projects were not randomly selected. The selected projects came from the most
commonly funded topic areas: (a) technology; (b) curriculum reform; and (c) access, retention, or
completion. In the end, eight projects were selected from curriculum reform and eight projects were
selected from access, retention, or completion. Within these two content areas, some projects were
selected specifically because they used technology in their approach. In addition to considering
content area, AIR also selected projects that had received dissemination grants. Once the 16 projects
were selected, AIR recruited individuals with expertise specific to each project to examine the
projects in terms of the five review questions.4 The reviewers based their feedback on the data AIR
collected from the previously listed sources.
The remainder of this report examines the five research questions that guide this study. By
describing the types of projects funded and their implementation progress, this review provides
insight into the direction and life of Comprehensive Program grants.

3 Available at http://www.fipse.aed.org (Jan. 20, 2004).
4 There was only one reviewer for each of the 16 projects. However, some reviewers assessed more than one
project.

CHAPTER 2
FIPSE’S OPERATING PRINCIPLES

FIPSE’s legislative mandate is particularly broad, aiming to “improve postsecondary
education opportunities.”5 FIPSE implements this broad mandate through a set of principles that
guide funding decisions. These principles are as follows:
1. [Projects] focus on widely felt issues and problems in postsecondary education, rather
than on special interest groups or prescribed solutions.
2. FIPSE programs are responsive to local initiative, leaving to the applicants the tasks of
identifying specific local problems and proposing solutions. Responses to local
problems, however, should have the potential for wider influence.
3. FIPSE programs are comprehensive with respect to the variety of problems addressed
and the range of institutions and learners served.
4. FIPSE projects are action-oriented, usually involving direct implementation of new
ideas or approaches rather than basic research.
5. FIPSE is risk-taking in its willingness to support new and unproven ideas as well as
proven ones. (FIPSE, 2004)
This chapter examines the degree to which the projects in this selective review reflect these
operating principles.

PRINCIPLE #1:
PROJECTS ADDRESS WIDELY FELT ISSUES AND PROBLEMS
Problems that inspire applications to FIPSE may come from a call to action at the national
level or a response to a persistent problem within a particular academic community. Some widely
felt problems are unique to a specific niche of the postsecondary community. For example, one
project was responding to “a call for teachers of practical and professional ethics in the U.S. [that] led
to the realization that there was very little training available for people interested in this field.”
Although the field of practical ethics is small, this training situation was a national problem. Other
examples of widely felt issues follow:
• Closing the distance between engineering education and the engineering work
environment.
• Accommodating older students.
• Maintaining a competitive institution.
• Preparing high school students for college-level work.

5 Title VII, Section 741 of the Higher Education Act of 1965, as amended.

Expert Reviewers
All 60 projects defined a particular problem that they wished to address. To determine the
significance of these problems, the experts reviewed the subsample of 16 projects. All the projects
reviewed by experts focused on either (a) curriculum reform or (b) improvement of student access,
retention, and completion. In conducting this review, outside experts considered the “need for the
project” (i.e., whether the project addressed a widely felt problem). The reviewers had access to the
final reports prepared by the projects but not to the applications. Many final reports appeared to be
written with the FIPSE program officer (who had a great deal of background knowledge) in mind.
Therefore, the reviewers did not have the benefit of the full rationale for funding the project that was
available to the FIPSE officers. To supplement the expert reviewers’ assessments, FIPSE program
officers provided their view of the importance of the issues addressed by these 16 projects.
Expert reviewers determined that 13 of the 16 projects addressed a problem in postsecondary
education at a national level.

Curriculum Reform
Within the area of curriculum reform, the problems addressed fell into three categories:
• Making a curriculum topic area more accessible to students.
Example: One project attempted to help non-mathematics majors to see the relevance of
mathematics to their learning.
• Reforming the instructional approach within professional education.
Example: One project attempted to address the need for professionals with international
experience as the field shifted to a more global work environment.
• Integrating technology to improve learning.
Example: One project addressed concerns that the traditional scholarly journals were not
meeting the needs of the academic community by attempting a technology-based
solution.
Reviewers assessed the problems addressed by the eight curriculum reform projects to be
significant. One reviewer stated: “There is little doubt that it is difficult to convince students who are
not especially interested in mathematics that it is worth their time and effort to learn mathematical
content.” In assessing a project for students who were learning disabled, one reviewer stated:
[S]ince students with learning disabilities comprise roughly six of the nine percent of
students with disabilities in postsecondary settings, faculty members will work with this
group of students more often than any other disability group. Understanding effective
instructional techniques to use with this student group is important nationwide.
In discussing one of the projects using a new technology, both the reviewer and the FIPSE
program officer concurred that the problem existed when the project was funded but that the situation
had changed. The reviewer stated: “Technology-mediated pedagogy is of undoubted significance,
but the field is evolving rapidly.” The FIPSE program officer’s sentiments were similar:
[I]n 1996 or 1997, there were a lot of people beginning to use the Internet, in particular for
classes. There were also a lot of situations where people wanted to do things remotely.
There was a sense that the learning wasn’t sufficiently collaborative. At that point we didn’t
have commercial tools that enabled people in remote places to work together.

Access, Retention, and Completion


Within the area of access, retention, and completion, three types of problems were evident:
• Facilitating access to postsecondary education for underserved populations.
Example: One project focused on the disconnect between high school and college
instruction and the poor quality of writing skills for many low-income, college-bound
students.
• Stemming attrition.
Example: Four projects addressed a problem many institutions were facing—helping
students make the transition to and stay enrolled in college, particularly during the first
year. Of those four projects, two assisted underserved populations; the other two were
geared toward all first-year students.
• Using technology to support students.
Example: One project was funded to fill the need for student services in a distance
education environment.
Reviewers also deemed these problem areas to have a broad reach. In a strongly worded
statement, one reviewer verified the importance of facilitating access to college:
Virtually everyone who works on the problem of college access understands the issues the
author outlined. There is a firewall between high schools and colleges. Teachers do not
speak with one another and students are frequently not prepared in high school for college-
level work. …The problem remains as significant today as it was when the author wrote the
proposal.
Two reviewers stated that the attrition rate for first-year students was then, as it is now, a
major problem facing many institutions. Reviewers also noted that attrition was a particular problem
for first-generation college students, students with limited English proficiency, and students who
were members of ethnic or racial minorities. One reviewer described the problem as follows:
The typical first year of college is fraught with difficulties and challenges for the curriculum,
students, and faculty. The curriculum in the first year of college is fragmented; it is a series
of unrelated courses with “no attempt at horizontal integration.” First-year students are
particularly challenged by faculty and the curriculum to move to a more sophisticated,
higher order level of thinking. They learn in isolation from each other and as a result are
denied both the academic and social benefits of peer interaction.6
Finally, one reviewer noted the need for online student services:
Student services has emerged as the critical area of development in the evolution of new
instructional models, at least as important to overall student performance and satisfaction
and often more complex, more costly and more time-consuming to develop than the
instruction itself.
However, for three of the eight projects in this content area, the reviewers did not find the
problem to be particularly significant. One project addressed student expectations for college-level
work. The reviewer felt that the real issue at the national level was faculty expectations for students.
However, the FIPSE program officer felt that this focus on student expectations was key:
This project…[examined] student expectations, and specifically looked at what college
students think they ought to do versus what faculty think college students ought to do. The
disconnect between student and faculty expectations had been talked about a lot in the past,
but not much had been done.
Another project attempted to develop a tool for measuring students’ long-term intellectual
growth. The reviewer felt that this project was too removed from the more pressing concern of
student retention and completion. The FIPSE program officer reported that this project was
intriguing and groundbreaking. However, as the reviewer pointed out, a groundbreaking project does
not necessarily address a problem in postsecondary education.
The final project was difficult to interpret owing to the quality of the information provided by
the project staff. The reviewer felt that the problem was never fully articulated. The FIPSE program
officer, who had access to more background information, felt that the project did a very good job of
articulating the problem and the potential solution.

PRINCIPLE #2:
PROJECTS ARE RESPONSIVE TO LOCAL INITIATIVES
Initially, it might appear that responding to local initiatives is contrary to responding to
widely felt issues. However, FIPSE’s guidelines indicate that even projects that are responsive to a
local initiative should have the “potential for wider influence.” Almost all sampled projects appeared
responsive to a local need.

6 The reviewer included citations, not reproduced here, to substantiate these statements.

Problem Identification
Of the 60 sampled projects, many cited as their impetus an important need within the
institution or community. Three themes emerged in problem identification:
• A local persistent problem.
  – One project identified adult postsecondary education as a critical problem. The rural
    state had large distances between most of the population and the flagship public
    institution.
  – A four-year institution felt that despite a large number of community colleges in the
    area, too few students transferred from the community colleges to the four-year
    institution.
  – A rural two-year institution found that its first-year dropout rate was 50 percent.
• A change in the local environment.
  – One public four-year institution was experiencing massive growth. Staff were
    concerned that first-year students were feeling lost and that faculty were out of
    contact with others outside their discipline.
  – A project addressed the growing shortage of secondary school science teachers.
• An opportunity to capitalize on the local environment.
  – A business school’s students needed real-world experiences and the local education
    community needed technical training.
  – An institution addressed the low college-going rate for local students by developing a
    program combining the strengths of one of the university’s departments and
    connections to local industries.
Several issues that were locally initiated were similar to those that were widely felt. Locally
initiated projects were also attempting to facilitate access to postsecondary education for underserved
populations, stem attrition, reform the instructional approach within professional education, and
integrate technology to improve learning.

Solution Development
Most new grant recipients implemented solutions developed locally. The exceptions were
grants awarded to national organizations or dissemination grants. National organizations usually
developed projects within the national organization and then implemented the projects in partnership
with institutions. Some dissemination projects exported their solution to other locations.

For FIPSE grants, solutions frequently developed from the bottom up. Below are several
examples of projects that faculty and staff developed:
• One project director, motivated by her experiences working with students on probation,
developed an intervention program.
• Frustrations with large lecture classes prompted one professor to develop a course in
which students could learn on their own and receive support when necessary.
• Given the decreasing size of many German departments, one professor found new ways
to bring students into the department.
Prior to receiving FIPSE funding, institutions supported many of these projects, funding basic
research into the nature of the problem as well as implementation and evaluation.
For example, one project received a President’s Initiative grant to develop a software prototype.
Another project received a grant from the Office of Sponsored Research to pilot test the program.
Sometimes, administrative offices initiated a local response. In these situations, the
administration usually set up a committee to investigate the issue and suggest solutions. For
example, to improve retention rates, one project director noted that the institution held a “series of
meetings over a period of years where different faculty gathered to brainstorm about ways to improve
the curriculum.” Another institution, dealing with the high failure rates in mathematics classes, set up
a committee to “do something about the poor state of college algebra.”

PRINCIPLE #3:
PROJECTS ARE COMPREHENSIVE WITH RESPECT TO THE
VARIETY OF PROBLEMS ADDRESSED AND THE
RANGE OF INSTITUTIONS AND LEARNERS SERVED
In the 1970s, the original FIPSE project officers took their mandate for comprehensiveness
from the legislation, which did not restrict the types of applicants or issues funded (NCPPHE,
2002). FIPSE continues to strive to provide grants to a variety of institutions and organizations
serving a variety of learners who have a variety of problems.

Variety of Problems
FIPSE’s current legislation authorizes grants for:
• Improving educational access for all.
• Creating paths to professional training and integrating experiential learning.
• Establishing programs that use technology for communication.
• Redesigning and clarifying institutional priorities and purposes.
• Implementing cost-reduction procedures.
• Expanding access for individuals entering, or reentering, institutions and providing
courses of study tailored to individual needs.
• Improving graduate education.
• Examining and implementing reforms in credentialing new institutions or programs.
The legislation also gives the director of FIPSE leeway to award grants “for innovative projects
concerning one or more areas of particular national need identified.” These areas of national need
include but are not limited to the following:
• Institutional restructuring.
• Improved articulation between two-year and four-year institutions.
• Evaluation and dissemination of model programs.
• International cooperation, and student exchange, between postsecondary institutions.
When making funding decisions, the FIPSE program officers select projects that reflect these
legislative priorities.
The 60 projects in the review sample reflect all the areas listed above. Two projects
specifically addressed redesigning and clarifying institutional priorities and purposes. At least 29
projects worked to improve educational access for all.7
As noted above, the legislation allows a great deal of variety, and the issues covered by
grantees illustrate the variety of problems that FIPSE addresses. On its Web site, FIPSE classified
grants into 40 content areas. Although no “typical” problems were addressed by projects, broad
content areas described the work of many grantees. The most commonly funded grants during the
1996 through 1998 funding periods were in the following areas:
• Access, Retention, and Completion.
• Assessment.
• Cost Control and Organizational Restructuring.
• Curriculum Reform.
• Faculty Development.
• Teacher Education.

• Technology.

7 Given the broad definition for this area, only projects specifically targeting underserved populations were counted. However, many more projects could be considered within this area.
Within each area, however, a variety of issues emerged. For example, within curriculum reform,
some grantees focused on the need for:
• Calculus and statistics courses for students in the arts and humanities.
• Courses that address students’ lack of interest in humanistic texts and lack of analytical
and communication skills and faculty members’ lack of pedagogical innovation.
• Adaptable instructional practices to improve language instruction for students who are
learning disabled.
• A course of study to improve students’ ability to listen and speak effectively.
Grantees addressing access, retention, or completion problems focused on:
• Low completion rates of distance education students owing to the large geographic area
served.
• The lack of academic and study skills on the part of many high school seniors planning to
attend community college.
• The fact that far more Japanese students studied in the United States than vice versa.
• The need to improve the writing of science students.
Although this review examined only a sample of projects funded within a three-year period,
the 60 reviewed projects included 36 of the 40 content areas. Even within this restricted time frame
and sample, FIPSE appeared to cover a broad range of issues.

Variety of Institutions
For the 60 projects reviewed, grants were awarded to a variety of institutions and
organizations (see Exhibit 2). During the review period, public two-year institutions accounted for 8
percent of the postsecondary institutions funded. Further analyses of the entire applicant pool
suggest that two-year institutions were somewhat less likely to submit an application than four-year
institutions.8 Two-year institutions represented approximately 21 percent of the applicant pool but
approximately 25 percent of the population of colleges and universities (NCES, 2003).9 Two-year
institutions were also less likely to receive a grant (13 percent of grantees) than would be expected
given their prevalence in the postsecondary population.

8 FIPSE provided the applicant pool data, which included the total population of Comprehensive Program applicants in 1996–1998.
9 Data are for four-year and two-year degree-granting institutions that were participating in Title IV federal financial aid programs.

EXHIBIT 2
Percentage Distribution of Grantees
in the Evaluation Sample by Type

                                                    Percentage Distribution
Type of Grantee                                     of Sample Grantees (n = 60)

Public two-year institutions(a)                     8%
Public four-year institutions                       47%
Private, not-for-profit, four-year institutions     35%
Organizations                                       10%

a For-profit institutions were not eligible to receive FIPSE funds. Two-year, not-for-profit institutions did not receive funding in the review sample.

Exhibit 2 shows the distribution of grantees but not the distribution of institutions actually
participating in projects and therefore underestimates the number of institutions affected by FIPSE
funding. The number of institutions participating in the funded projects far exceeds 60. First, many
grantees worked with multiple institutions. For example, Rutgers University received a grant to
collaborate with three community colleges, and the Colorado Electronic Community College
comprised a consortium of community colleges. Second, all of the grantee organizations worked
with postsecondary institutions. The American Association for Colleges of Pharmacy worked with
many four-year postsecondary institutions. The Higher Education Information Center worked with
four-year postsecondary institutions and high schools.
Within the review sample, the grantees were dispersed throughout the country, including the
U.S. territory of Puerto Rico (see Exhibit 3). As noted above, this distribution reflects only the
location of the grant recipients, not all the institutions affiliated with projects. Within the reviewed
projects, there were grantees in every region of the contiguous United States.

EXHIBIT 3
Percentage Distribution of Grantees
in the Evaluation Sample by Region

                        Percentage Distribution
Region of Grantee       of Sample Grantees (n = 60)

New England(a)          20%
Mid-Atlantic(b)         23%
South(c)                12%
Midwest(d)              22%
Southwest(e)            5%
West(f)                 15%
Hawaii or Alaska        0%
Puerto Rico             3%

a Connecticut, Maine, Massachusetts, New Hampshire, Rhode Island, Vermont
b Delaware, Maryland, New Jersey, New York, Pennsylvania, Washington, D.C.
c Alabama, Arkansas, Florida, Georgia, Kentucky, Louisiana, Mississippi, North Carolina, South Carolina, Tennessee, Virginia, West Virginia
d Illinois, Indiana, Iowa, Kansas, Michigan, Minnesota, Missouri, Nebraska, North Dakota, Ohio, South Dakota, Wisconsin
e Arizona, New Mexico, Oklahoma, Texas
f California, Colorado, Idaho, Montana, Nevada, Oregon, Utah, Washington, Wyoming

Range of Learners
Of the 60 projects sampled, three-fourths of the projects focused on undergraduate students
(see Exhibit 4). These projects were usually not centered on a specific college year. All five projects
that focused on a specific year in college focused on first-year students. In addition, two projects
were directed at high school students, and four projects were directed at graduate students. The
remaining projects covered at least two educational levels.

EXHIBIT 4
Percentage Distribution of Projects by Level of Student Impacted

                               Number of     Percentage Distribution
Level of Student Impacted      Projects      of Projects

High School                    2             3%
Undergraduate                  45            75%
Graduate                       4             7%
Covered multiple levels        9             15%

Eighteen of the projects in the review sample (30 percent) were geared specifically toward
assisting underserved populations. For example, projects:
• Assisted minority and low-income students in their comprehension of “academic
language.”
• Helped first-generation and low-income students, either entering or transferring to an
institution, with their transition to a university setting.
• Addressed the time and distance barriers faced by older students attending college.
• Developed a mentoring program for minority students who were “falling through the
cracks.”
Rather than single out a population for an intervention, some projects reached underserved
populations because the institution itself served these groups. Within the review sample, Historically
Black Colleges and Universities (HBCUs) received two grants, with one of those going to a
consortium of HBCUs. Hispanic-Serving Institutions (HSIs) received two grants. None of the
grantees in the review sample was a tribal college. However, three projects worked with tribal
colleges.
Because these types of institutions represent such a small proportion of postsecondary
institutions, the scope was broadened to look at all projects funded by the Comprehensive Program
from 1996 through 1998. Exhibit 5 shows the percentage of grants each year awarded to HBCUs and
HSIs. Given that these types of institutions represent approximately 3 percent and 5 percent of the
population, respectively,10 the percentage of grants going to these institutions seems proportionate.

10 The percentages of HBCUs and HSIs in the postsecondary population are based on figures provided by FIPSE.

EXHIBIT 5
Percentage of Grants Going to Minority-Serving Institutions*

          Historically Black              Hispanic-Serving
Year      Colleges and Universities       Institutions

1996      7%                              2%
1997      4%                              4%
1998      4%                              6%

* Data based on all Comprehensive Program funded projects between 1996 and 1998. Data were not available for tribal colleges.

PRINCIPLE #4:
PROJECTS ARE ACTION ORIENTED
Within the review sample, all but a few projects were clearly action oriented. Action-oriented
projects implemented solutions rather than researched problems or solutions. Possible exceptions
included a project that attempted to create a computer-based version of an interview assessment and
one that worked with several states to investigate implementing accountability policies. However,
even these two examples could arguably be considered action oriented—one was developing a new
product, the other was developing new policies. Neither project was strictly research. Therefore, at
least within the sample, all projects could be classified as action oriented.
It is interesting to compare this finding with that of the 1980 large-scale study. In reviewing
398 projects, the study found that approximately 94 percent of funded projects were primarily action
oriented (NTS Research Corporation, 1980). Thus, FIPSE’s early commitment to funding action-
oriented projects appears to have continued.

PRINCIPLE #5:
FIPSE IS RISK TAKING
New Versus Established Projects
Approximately 60 percent of the 1996 through 1998 grants were awarded to projects
implemented as a result of FIPSE funding (i.e., they did not exist fully prior to the FIPSE grant).11
The remaining projects either continued the development of an existing project (30 percent) or
disseminated an existing project (10 percent). This split of approximately 60-40 between new and
established projects reflects FIPSE’s aim to support both unproven and proven ideas.

Unproven Ideas
FIPSE’s emphasis on innovation illustrates its willingness to take risks. In fact, the
application materials included a section on the importance of innovation, stating that projects are
innovative “if they have not been tried before—or if there is a significant challenge in adapting them
to new settings or new populations” (FIPSE, 2001, p. 5).
In discussions, project directors (and other project staff) spontaneously commented on
FIPSE’s willingness to fund innovation, even when others would not. Following are a few examples
of these comments:
• One project director stated that he knew that the project “was a risky venture.” But one of
the reasons he approached FIPSE was that “their point is to fund unproven things; things
that have never been done before.”
• One project initially applied for funding from the National Science Foundation (NSF).
NSF was unwilling to fund the project at first and “FIPSE really got [the project] going
because NSF was too conservative.” As the project continued, it received more funding
from NSF than from FIPSE, but FIPSE provided the start-up funds.
• One project director knew that the approach was “not really what everyone else was
doing.” As a result, he felt it was “very close to the innovative concept of FIPSE.”
• A project director stated that his project was “a high-risk, high-benefit project. The risks
were high; but if it succeeded, it would open the door for all sorts of experimentation. …
From FIPSE’s point of view, I think the risk was well worth the money. Had it worked, it
would have been remarkable.”

11 The estimates for the distribution of new, expansion, and dissemination grants were based on a combination of (a) the known number of dissemination grants in the population and (b) the estimates of the distribution of new and expansion grants within the non-dissemination grant review sample.

FIPSE’s reputation for funding innovation is not new. In discussions with some of the
original FIPSE grantees, one of the pervading themes was FIPSE’s willingness to take risks
(NCPPHE, 2002).

External Reviewers’ Assessments


The expert reviewers assessed the degree of innovation in the subsample of 16 projects.
These experts’ assessments come with two caveats. First, the expert reviewers had the benefit of
hindsight. They evaluated the approach the project actually took, whereas initial grant decisions
were based on the proposed approach. Second, only one expert assessed each project for this review,
whereas multiple individuals reviewed each proposal.
Two projects were difficult to assess because of the quality of the reporting. The AIR team
found that many of the final reports lacked sufficient detail, requiring follow-up with project staff.
This lack of detail may have occurred because project staff assumed that the FIPSE
program officer was the sole audience for the report. But even with this background knowledge, a
FIPSE program officer mentioned that the low quality of the reporting for one of these two projects
was a real deficiency.
The reviewers commented that six of the projects were innovative or were new applications
of proven ideas.12 Following are three examples of what the reviewers believed made a project
innovative:
• Most applied mathematics courses are at the level of calculus or above, where
appropriate applications are fairly easily found and adapted. It is a greater challenge…
to find authentic applications suitable for lower-level courses where students have fewer
mathematical skills on which to draw.
• The crisis in scholarly publishing is severe enough to justify experiments on all fronts.
The [project] is a very worthy experiment taking one approach.
• This program is an innovative way for students to be involved with children/families and
also learn how to problem solve in a supervised situation. The case studies model
professional behavior, which is a very creative and innovative approach and has not been
done before.
One project was strongly based on proven ideas but combined these ideas in an innovative way. This
combination contributed to the reviewer’s reaction:
Very often universities and colleges seek grants in attempts to improve student achievement
or performance with the assurance that federal or nonfederal funds will make project goals
achievable. In most cases (almost always) those goals are never achieved simply because
the programmatic design, implementation, and assessment of all project objectives and
activities have not been well-thought out. Such is not the case for this project. Faculty,
administrators and staff…should be highly commended for the model that they have
developed…[including an] array of theoretical and conceptual frameworks.

12 Reviewers were not directed to comment specifically about each principle and the degree to which the project met the principle. Instead, the reviewers were provided with the principles and asked to take those into consideration when assessing the quality of the project.
For three projects, expert reviewers did not feel that the project represented innovation or a
new expression of proven ideas. One of these projects appeared to have developed only a
questionnaire (this particular project had a weak final report). The reviewer stated: “Just shortening
the length of the questionnaire does not constitute a promising new strategy that added to or provided
an alternative to an existing approach.” The FIPSE program officer noted that because this particular
project involved a consortium of institutions all taking different approaches, “the project resembled
managing a program with five parts, each of which does not overlap.” The questionnaire was the one
outcome that crossed institutions.
Another criticism focused on the use of technology. For one project, the reviewer noted that
much was made of an impressive new software program, but there was little indication of how this
would address a problem in postsecondary education. The FIPSE program officer also noted that the
project was somewhat removed from the field of education but he thought it to be cutting edge: “This
is cutting edge, as close to the edge of education you can get.” Technology projects also risk being
dated even when they started out as innovative. One project undertook a massive effort to determine
available technology. The reviewer summarized the outcome as follows:
The information gathered in this part of the project represented a considerable amount of
time, energy, and labor. It was, at the moment of its creation, a snapshot of the current state
of the field in this area. As the report notes, however, the highly volatile nature of software
and technology development meant that the durability of the data was very limited, and
within six months the value of the information was minimal.
Although the original project may have been innovative, reviewers felt that two of the
dissemination projects did not have replication plans based on proven ideas. One project opened a
center instead of working directly with a few specific institutions. The reviewer felt that this
approach led to greater awareness of the issue but to few exact program replications. For another
project, the reviewer stated that the original reform was based on proven ideas and was successfully
implemented; however, the original institution did not insist that replicating institutions adopt the
entire program. The resulting piecemeal programs were less successful.

SUMMARY
This chapter reviewed each of the five FIPSE program principles to determine whether the
selected projects represented these principles. The picture was overwhelmingly positive. In large

part, the FIPSE grants in the review sample supported projects that addressed widely felt issues,
developed solutions responsive to local needs, comprehensively covered the postsecondary education
community, undertook action rather than research, and used innovative approaches.
Some qualifications are in order, however. First, the review of some projects was
undermined by the poor quality of reporting. It is not clear that this correlated with projects of poor
quality, but it does raise that question. Second, problems or solutions involving technology were
sometimes obsolete by the time the project ended. Either the technology was outmoded or another
product filled the need. Finally, given the distribution of two-year and four-year institutions in the
nation, it appears that a disproportionate number of grants went to four-year institutions.

CHAPTER 3
THE IMPLEMENTATION AND
INSTITUTIONALIZATION OF PROJECTS

FIPSE grants are “intended to be used as seed capital for the initial development or expansion
of innovative projects, not for ongoing support of program operations” (FIPSE, 2001, p. 20).
Although FIPSE funds may be used to cover significant start-up costs (such as development time and
expenses), successful projects should continue without FIPSE funding after the grant ends. To
institutionalize, projects need continued funding from either the institution (or organization) or a
third party.

INSTITUTIONALIZATION LEVELS
This review defined institutionalization as the long-term stability of the project after FIPSE
funding ceased.13 All available sources of information were used to determine levels of
institutionalization, including final reports (many of which were written a year after the grant
completion), Web sites, and e-mail and telephone correspondence. For collaborative projects,
institutionalization was based on the level of institutionalization at the majority of collaborating
institutions. For projects with dissemination grants, institutionalization was based on the success of
the replicating institution(s) in sustaining the project after FIPSE funding ceased. For the projects in
the review sample, the grant completion dates ranged from September 1998 to September 2002.
Exhibit 6 displays the level of projects’ institutionalization in spring 2003. The primary
message from this exhibit is that most projects continued after the end of the FIPSE grant. Almost
two-thirds of the projects, 64 percent, had been sustained or expanded since the end of their FIPSE grant
period. Only 7 percent of all reviewed projects had no aspects of the project remaining. The 1980
FIPSE study surveyed 271 projects to determine project continuation and found that approximately
70 percent of projects continued after FIPSE funding ended (NTS Research Corporation, 1980).
Although based on a much smaller sample, the percentage of institutionalized projects sampled from
1996, 1997, and 1998 is close (64 percent).14

13 Not every grant was a program implemented at an institution. For other types of grants (e.g., developing a
software product), institutionalization refers to the degree to which the goals of the project were met and sustained
(e.g., a software product was still being used and supported).
14 The methods used by these two studies to categorize projects’ institutionalization levels are very similar but not
identical. The 1980 evaluation also considered staff members’ optimism about continuing for at least five more years.

EXHIBIT 6
Level of Project Institutionalization

Level of Institutionalization                                  Number of Projects   Percentage Distribution of Projects

No aspects of the project remain.                                      4                        7%
A few aspects of the project remain.                                   7                       12%
Much of the project remains, but at least a few key
aspects have ended.                                                   10                       17%
All key aspects of the project remain.                                38                       64%
Total                                                                 59a                     100%

a Not enough information was available to determine the level of institutionalization for one project.

BARRIERS TO IMPLEMENTATION
All four projects that were not institutionalized experienced difficulties during
implementation. For one project, a change in the political landscape doomed the program within the
first year. During the planning stages, the project staff received the necessary approval from a
relevant state agency. A change in the political leadership in the state resulted in the agency revoking
that permission, effectively stopping the project. The project director said, “We had to end after one
year. During our second year of funding, it was clear we couldn’t continue the project. FIPSE was
nice to give us money to do an orderly wrap up.”
A second project had trouble during the implementation stage, but it persisted until the
administrative leadership changed. During the early years, the project ran into difficulty working
across academic departments, such as not being able to access resources during class time. The
project director had difficulty convincing department chairs to share resources for this
interdisciplinary project. An academic dean who arrived after the project started terminated the
project. The institution reorganized once again and replaced this academic dean. The project
director has participated in initial discussions to revive the project.
The third project had tremendous difficulties working with multiple institutions.
Instructional staff remained committed to the project, but the support of administrators waned. A

cross-site leadership team met only once in the second year of the project and never in the final year.
A lack of clarity about the program led to different departments laying “claim” to the project. The
project director left the project before the grant ended, and the project ceased at that point.
The fourth project faced administrative, logistical, and motivational barriers. The institution
was unable to provide needed data during the grant period, and an outside vendor was unable to
finish work within the necessary time frame. The project staff had difficulty scheduling important
aspects of the program. Finally, students could participate in the program, but they were not required
to do so. The administration believed that students would be motivated to use the program, but few
did so. Soon after the end of the grant, the “programmatic essence” was gone.
Even those projects that successfully institutionalized faced barriers during implementation.
Common implementation barriers follow:
• Institutional policies at odds with the project.
• Resistance to change within the administration.
• Resistance to change within the faculty or staff.
• Lack of sufficient resources (e.g., time, facilities, equipment, materials, or funding).
• Turnover of key project staff.
• Challenges using technology.
Several of these implementation barriers stood out as particularly prevalent in projects that never
fully institutionalized (i.e., those projects that remained but with a reduction in scope). These
projects mentioned lack of sufficient resources, turnover of key project staff, and challenges using
technology with disproportionate frequency.

BARRIERS TO INSTITUTIONALIZATION
In discussing their experiences sustaining a program, project staff mentioned few barriers. It
seemed that if a project became well established during the three-year period with FIPSE support,
project staff were usually able to maintain it for at least the period observed in this review (two to
five years after the end of FIPSE support). Of course, different projects encountered different
barriers, such as technology becoming obsolete, a key project staff member leaving, or turnover
occurring within the administration. However, by far the most common, and difficult to overcome,
barrier to institutionalization was a lack of sufficient resources.
For projects that were not fully institutionalized, the lack of sufficient resources was a
primary contributing factor. Some projects were affected by larger budget crises experienced by

many institutions. One project director expressed this difficulty: “The huge downturn in the
economy is being felt all over. So that put things on hold. Now, we are just trying to keep the
education programs afloat.” Several projects continued to receive some funding but at amounts that
required reducing the scope. A project director stated that plans for the program had been “squashed
by the budget crises. …In this kind of climate, doing [this program] was not cost effective.” Finally,
one project director noted that in the face of a statewide 10 percent budget cut, the project would not
be funded. As a result, the project team had to drop some aspects of the project.

SUPPORTS FOR IMPLEMENTATION


The FIPSE program itself assisted project implementation. In addition, the same factors that
hindered projects when absent facilitated them when present: a supportive administration
and adequate resources typically accompanied successful implementation.

FIPSE Funding
Of course, one major support that all projects had in common was funding from FIPSE.
Given that all the projects in the review received FIPSE funding, it is difficult to know the impact of
FIPSE funds on project implementation. Some anecdotal evidence suggested that FIPSE funds
provided an important support for project implementation. Several projects mentioned that they did
not think the project could have been implemented without FIPSE support. One project director
stated that the university was unlikely to have contributed the necessary resources without the FIPSE
grant. Another project director stated, “FIPSE was the linchpin for the project. FIPSE made the final
connection between the two institutions. Neither school had a grant like FIPSE in the past.”
Many projects received multiple FIPSE grants. Of the 60 projects in the sample, 15 received
funding from FIPSE during earlier grant cycles. In fact, four projects previously received multiple
grants. Projects that had previously received FIPSE funding were no more likely to be
institutionalized at the time of this review than other projects; rather, the distribution of
institutionalization levels was quite similar to that of the not previously funded projects. Although it
would be logical for long-funded projects to have more success at institutionalization, none of the
projects received a subsequent FIPSE grant simply to continue existing work. All grants supported new
work. For seven projects, this new work involved dissemination work. For the remaining projects,
the new work involved expanding or reworking the original project.

Nonfinancial Support from FIPSE
FIPSE did more than provide funding. FIPSE staff also provided technical assistance to grant
recipients. This technical assistance occurred through “project monitoring…on a collegial basis” and
an annual Project Directors’ Meeting, which was required for grant recipients (FIPSE, 2004).
Grantees frequently commented on the support that FIPSE program staff provided. FIPSE grant
recipients found support through:
• The connection to others in the postsecondary community.
• The direct assistance from the FIPSE staff.
• The flexibility of FIPSE staff.
By far the most common comment about the FIPSE staff concerned their helpfulness. They were
described as “a phenomenal resource.” FIPSE staff helped some projects plan their approach.
Projects saw the FIPSE staff as encouraging and enthusiastic. Following are some examples of
project directors’ statements:
• For some bureaucratic reasons, matching funds were not there on time. We needed
letters from FIPSE to speed up the process. Finally we got things going thanks to the
interventions from FIPSE, especially [the program officer].
• [The FIPSE program officer’s] door was always open. …We could call or e-mail and
contact her anytime. And she would respond in a reasonable time span. I think that’s so
important. …If she had not encouraged us, I don’t know where we would be.
• I’ve worked with a number of grant agencies. …They give you money and say, “I don’t
want to hear from you again. Just do it and write us a report.” What you do the first
time only matters if you go back for more money. …FIPSE is really good. They have
effective interactions, the annual meetings are helpful, and they focus on the evaluation.
Those are the right things. They are involved in people’s projects and that is positive
involvement; it’s not bureaucratic.
• We’ve been funded by a variety of agencies. I can speak highly of all of them, but FIPSE
pretty much stands at the top in terms of being helpful. Their program officers seem to
be really committed and they’re interested in what you’re doing.
Grantees viewed the FIPSE officers so positively that the number one criticism was the desire
for more time with the officers. Grantees wanted more communication in general and more face-to-
face interaction in particular. One project director from the West Coast “envied the people at the
University of Maryland” for their geographic proximity to FIPSE’s Washington, D.C., office. A
project director felt that if the FIPSE officer had not been so busy, maybe the officer could have
helped prevent an administrator from terminating the project. In general, the sentiment was that it

was “important to have face-to-face meetings. We’re all sensitive to the cost of doing that. But if it
could be done early, it really sets up good communication.”
FIPSE’s support of projects appears to be a long-standing part of its culture. Project directors
from the first five years of FIPSE funding echoed the supports listed above (NCPPHE, 2002). Then,
as now, the grantees’ experience with FIPSE was overwhelmingly positive.

Institutional Administrative Support


Administrative support came in the form of general, across-the-board backing (e.g., “All
layers of the administration supported the project”) and in the form of key officials “adopting” the
project. As an illustration, one project director said, “All the administrators were openly enthusiastic
about the project. The administrators were happy to serve on an advisory board, to consult, and share
their expertise.” Another project director noted that the “administrative team paid for lunches and the
little extras that made it possible to do.” As a final example, one project director listed the ways in
which administrators facilitated its implementation: “They provided the political leverage necessary
to secure changes in internal financial procedures, course listings, payroll, and materials concerning
the registrar’s office.”
Many projects noted that an administrator’s support assisted in smoothing the way. One
project director noted that two key administrative officials were willing to try the experimental
project: “They had a thorough understanding of the idea and were truly willing to take the risk. They
got close enough to it to absorb some of the risk, had it failed. It is very important for the top
administration to stick their necks out. If it flopped, they would share in the publicity.” Another
project director stated that an administrator was behind them, protecting their course assignments.
Finally, for many projects, key administrative officials served as project directors. One
project director, who was chair of the department, said that the project would have happened if he
had not been in that position, but being the department chair made it easier. An academic dean
served as another project director, which helped to ensure that the project was an academic priority.

Institutional Funding
Just as a lack of funding inhibited a project, adequate funding helped project implementation.
Depending on the type of project, the source of the support varied. In awarding grants, FIPSE
considers the level of institutional or organizational contributions. One project director noted that
years before receiving the FIPSE grant, the institution supported the pilot project. During the grant,

some institutions contributed more than was required. One project director noted that the institution
established a computer facility for the project. Another project director noted that the project had not
planned to pay participating high school teachers. When it wanted to pay these teachers, the
institution contributed the extra funds.

SUPPORTS FOR INSTITUTIONALIZATION


Projects that were successfully institutionalized repeatedly mentioned three factors that aided
institutionalization:
• Administrative support.
• Internal funding.
• External funding.

Administrative Support
After the grant period ended, successful projects continued to be supported by an institution’s
administration. Many project directors noted that the institution was “strongly committed” to
continuing with the project. As noted above, projects that were not institutionalized lost the support
of administrative officials. In particular, many of these projects lost support when budgets tightened.
One project that was successfully institutionalized continued to receive administrative support, in
part because of its cost efficiency.
When key project staff left, administrative officials who supported the project could replace
the staff and continue the reform. One project director started a sabbatical soon after the end of the
FIPSE grant, but the institution hired two people to run the program in her absence. New directors
carried on several other projects because the institutions remained committed to supporting the
project and found staff to ensure the project’s continuation.

Internal Funding
Institutions completely sustained many of the successful projects. In these situations, the
institution covered costs previously paid by the FIPSE grant. One project director stated, “Support is
100 percent from the college. The college had total responsibility for funding once the FIPSE grant
ran out.” Many successfully institutionalized projects had institutions that supported staff salaries,
scholarships, facilities, materials, and equipment. In addition to funding the project’s continuation,
some institutions invested in the project’s expansion.

External Funding
Although a few projects initially started with external funding (i.e., funding not from the
institution or from FIPSE), many projects acquired external funding near the end of or after FIPSE
funding. Throughout its history, FIPSE has encouraged grantees to seek additional external funding.
FIPSE hopes that its grants will be catalysts for projects to receive other funds. In many cases, this
extra funding is necessary because FIPSE grants are small to begin with and are not meant to sustain
a project. External support came from a myriad of sources, including other federal agencies,
associations, corporations, and foundations. A partial list of third-party funders follows:
• American Association for Higher Education
• Davis Educational Foundation
• Federal Bureau of Investigation
• German Academic Exchange Service
• Institute of Museum and Library Services
• National Aeronautics and Space Administration
• National Endowment for the Humanities
• National Institutes of Health
• National Science Foundation
• Nellie Mae Education Foundation
• Public Broadcasting Service
• Smithsonian Institution
• W.K. Kellogg Foundation

SUMMARY
Despite the barriers—some preventable, many not—about two-thirds of all projects in the
sample were institutionalized. Three key factors greatly affected project success at implementation
and institutionalization:
• The importance of the project to administrative officials.
• The adequacy of non-FIPSE resources (whether from internal or external sources).
• Continuity in project leadership.

Of course, projects experienced many other barriers and supports. However, these three areas were
most often mentioned as “making” or “breaking” a project.
Regardless of sustainability, FIPSE grantees, by and large, felt that the FIPSE program staff
were extremely helpful. This support occurred before and during the grant period and took many
different forms: logistic, programmatic, professional, and emotional. Project directors who desired
even more support felt that FIPSE should hire more program officers so that they would have more
time available to communicate and visit with all projects, regardless of location.

CHAPTER 4
THE DISSEMINATION OF FIPSE PROJECTS

FIPSE projects should convey what they learn to the greater postsecondary community. Because
they receive seed grants, projects should spread their knowledge elsewhere. This chapter focuses on
projects’ efforts to share the knowledge they gained. In addition to this description of dissemination
activities, this chapter includes a discussion of how several replicating sites learned about projects.
This chapter does not discuss the success of dissemination activities (i.e., the degree to which the
word “got out” to the relevant postsecondary community). A national survey beyond the scope of
this review would be required to determine the reach of projects’ dissemination work.

DISSEMINATION ACTIVITIES
Dissemination activities were geared toward general information sharing. Dissemination
could have led to replication. However, for the purposes of this review, dissemination did not include
activities focused on implementation at other locations. Those activities are considered to be
examples of replication and are described in the next chapter. The vast majority of reviewed FIPSE
projects (90 percent) shared information with the postsecondary community. All 11 dissemination
grants reported some dissemination activities. In the 1980 study, 87 percent of projects engaged in at
least some dissemination work (NTS Research Corporation, 1980).
Exhibit 7 displays the frequency with which projects reported various dissemination
activities. Projects usually shared information through conference presentations. More than one-
third of the projects reported setting up Web sites15 and organizing workshops. Almost one-third of
the projects reported publishing articles or materials. In addition to these formal dissemination
avenues, 17 percent of projects reported other activities; most of these projects involved individuals
outside the project disseminating information. Rather than rely on one mode of dissemination, 70
percent of projects engaged in at least two different activities (e.g., setting up a Web site and
publishing articles).
Some examples of projects’ dissemination activities follow:
• Presentations—A team of instructors and faculty from one project gave a conference
presentation.
15 A number of projects that reported having a Web site failed to report the Web address. In addition, quite a few of
the reported addresses were no longer active. As with publications and presentations, the evaluation team took the
word of the project staff concerning the existence of a Web site.

• Web sites—One Web site for a new type of course included a course description,
requirements, grading structure, readings, processes, and a 28-minute video that included
clips from a class.
• Workshops—One project’s staff provided evaluation materials, papers about the project,
and design information.
• Materials—A publisher distributed a series of materials based on a project. As new
work is produced, the publisher adds it to the series.
• Articles—“There were three major articles in the [journal]. …That journal, every year,
gives an award to the best paper. Of all the papers published in 2000, [the year] which
corresponds to our first two publications, we won that award.”
• Other—One project was chosen to be a case study site by researchers outside the project.
The resulting publications gave the project high visibility.

EXHIBIT 7
Reported Dissemination Activities*

Dissemination Activity                         Number of Projects   Percentage of Projects

At least one activity                                 54                     90%
At least two different activities                     42                     70%
At least three different activities                   23                     38%
Presentations                                         35                     58%
Web sites                                             24                     40%
Workshops                                             23                     38%
Materials                                             18                     30%
Articles                                              17                     28%
Other                                                 10                     17%

* These figures do not reflect the amount of dissemination within each category (e.g., the number of articles published or
the number of presentations given).

DISSEMINATION ACTIVITIES AND INSTITUTIONALIZATION
Difficulties institutionalizing a project did not necessarily hamper all dissemination efforts.
Even projects that experienced difficulty with institutionalization engaged in some dissemination
work. One project director stated that although the project had not continued, the staff worked on
disseminating lessons learned from the project; they gave numerous presentations and assembled
articles for publication. One project funded in 1996 originally planned to publish materials. In 2003,
the project supported an active Web site, but the materials were not published. The project director
stated that without funding, project staff were not able to find time to publish the materials, but that
they knew people visited the Web site. One project’s staff found that they had gained expertise that
interested others—they primarily heard from others looking for advice about applying for a FIPSE
grant. However, project difficulties did hamper some dissemination. One project’s staff had
purposefully done little dissemination because they thought their product was not reliable enough to
share with others.
Successful institutionalization did not guarantee strong dissemination. One project director
stated: “I’ve never thought of it as being something that was worth crowing about at meetings and
things. We wrote a number of informal articles and did a presentation or two.” Another project
director said that staff shared information inside the state, but, he added, “[T]here haven’t been any
efforts to get the word out to the broader community.”

DISSEMINATION LEADING TO REPLICATION


The main purpose of dissemination is to have an impact on others in the postsecondary
community.16 Although many projects knew that they had released information into the community,
they often did not know its impact. For example, a project director felt that many presentations had
generated a great deal of interest but did not know whether others had “taken a piece of this or a part
of that.” In fact, sometimes projects had effects of which they were unaware. One project
director learned that the project had been replicated in another state only when a new staff member
shared the news.
As part of the review, a small number of individuals who were replicating FIPSE projects
discussed how they learned about their projects. These individuals provide a somewhat biased
picture of the impact of dissemination because the original project staff knew of their work.

16 As an anecdotal example, four expert reviewers noted that they were aware of a project before being asked to
comment on it.

Individuals at replicating institutions usually learned of a project through personal
relationships with FIPSE project team members. Frequently, the original project staff approached an
individual at the replicating institution, rather than vice versa. For instance, before applying for a
dissemination grant, a FIPSE project director contacted colleagues about replicating the project at
their institution.
Conference presentations provided another common communication venue. In fact, two
individuals from replication projects specifically mentioned learning about the original project at the
FIPSE-sponsored annual conference (when they directed different projects). In addition, a few
individuals mentioned that they attended workshops or conferences focused specifically on the
project.
Interestingly, very few of the replicating respondents learned of the project initially from
published materials or Web sites, despite how commonly projects used these venues to share ideas.
Two people mentioned learning about the project from publications; however, each case was atypical.
One person received a project’s materials after complaining to a textbook sales representative about
textbook quality. Another person learned of a project by reviewing an article prior to its publication.
Only one person learned of a project after discovering the Web site during an Internet search.

SUMMARY
It is clear that FIPSE projects disseminated information. Most projects formally shared
information through presentations, but most replicating institutions reported that they learned about
the FIPSE projects through informal channels. As one project director stated: “We have not formally
applied for any dissemination grants, but we have shared information on a colleague-to-colleague
basis.” Although grantees took steps to share project details with the broader community, these data
do not measure the success of these steps. Yet, anecdotal evidence from the replicating sites indicates
that dissemination efforts reached others in the postsecondary community.

CHAPTER 5
THE REPLICATION OF FIPSE PROJECTS

Measuring replication is difficult. The data for this review came primarily from reports
prepared by and conversations with FIPSE project directors. As noted in the last chapter, many
directors shared information with others, but they were not always aware of resulting activities at
other institutions. Just as this review does not address the reach of dissemination work, it also does
not address the replication of FIPSE projects within the full higher education community. Therefore,
this review likely underestimates the replication of FIPSE projects.

REPLICATION LEVELS
This review defined replication as the success of the FIPSE grantee in establishing the project
at locations not originally affiliated with the sampled FIPSE grant.17 For collaborative projects,
implementation at institutions outside the collaboration determined replication levels. For
dissemination projects, work done outside the auspices of the dissemination grant determined
replication levels; in other words, only activities outside the reviewed FIPSE grant counted toward
replication. In some cases, FIPSE funded the replication activities: if a project in the sample
subsequently received a dissemination grant, this review counted the work funded by that later
grant as replication activity for the original project.
Exhibit 8 displays the extent of successful replications of FIPSE projects at other institutions.
About one-third of projects (34 percent) knew that external sites had replicated the project; 21
percent of projects were seriously working with other institutions or knew that other institutions had
adopted aspects of the project. Taken together, these percentages mean that more than one-half (55
percent) of the sampled projects knew that others had adopted at least some aspects of their work.
An additional 28 percent of projects knew about other institutions interested in the project. But 17
percent of projects were not aware of any replication efforts, or potential for replication, elsewhere.
It is important to remember that the definition of replication used in this report includes only
activities not funded as part of the grants sampled for this review.

17 As mentioned earlier, not every grant was a “project.” For other types of grants (e.g., developing a software
product), replication refers to activities that reflect growth (e.g., an increase in the usage of the software).

EXHIBIT 8
Level of Project Replication

                                                                      Number of    Percentage Distribution
Level of Replication (a)                                              Projects     of Projects

No project replication was reported.                                      10            17%
There was potential for replication (i.e., project staff were             16            28%
  contacted by other institutions).
Serious project discussions were held with other institutions,            12            21%
  or a project started but did not look like the original project.
Project knew that others had set up similar projects at their             20            34%
  institutions (i.e., successfully replicated elsewhere).
Total projects                                                            58 (b)       100%

(a) Activities were considered replication only to the extent that they were initiated after the sampled grant. This definition
differentiates between institutionalization and replication activities. For example, if a dissemination grant implemented
a program at four institutions, the replication level would reflect work done with additional institutions, not the four
affiliated with the FIPSE dissemination grant. Activities undertaken as part of subsequent dissemination grants were
considered replication by the original grant (e.g., a project in the sample subsequently received a dissemination grant).
(b) Not enough information was gathered about two projects to determine the degree of replication.

Following are examples of a project at each replication level:


• No Replication Reported—This particular project set up a new model for developing
internship opportunities for students. The project was terminated at the institution. The
project director gave conference presentations but said that there were no further plans to
disseminate the project within the postsecondary community and that he was not aware of
others replicating it.
• Potential Replication—A project set up a virtual laboratory for science experiments.
The project director published articles about the project. When contacted, he provided
the project report and references and exchanged e-mails. He did not know whether any
of these institutions established similar virtual labs.
• Aspects of Project Started—One project involved collaboration between college and
high school writing instructors. Multiple institutions participated in the original project.
The instructors then assisted in dissemination mostly through informal channels. For
example, one project team member shared details with siblings who then adopted some of
the project ideas. Other team members shared the project idea with peers, and those
peers adopted some aspects of the project.

• Successful Replication Elsewhere—One project created a new high school that
provided technical education for college-bound and non-college-bound students. Once
the first school was successful, the state legislature appropriated funds to create three
more schools, each with a different technical focus.

BARRIERS TO REPLICATION
Not surprisingly, the barriers that impeded the original projects were often the same ones that
impeded replication at other institutions. Replicating projects faced:
• Institutional policies at odds with the project.
• Resistance to change within the administration.
• Resistance to change within the faculty or staff.
• Lack of sufficient resources (e.g., time, facilities, equipment, materials, or funding).
• Turnover of key project staff.
• Difficulties using technology.
Lack of resources, the major barrier for the original projects, also commonly impeded
replicating projects. However, replicating institutions experienced greater challenges in dealing
with administrations and faculties than the original projects did. In addition, some replication
sites compromised the projects by changing the original design.

Administrative Barriers
At some replicating institutions, administration officials were not interested in the project.
One original project director hypothesized that the lack of support at replicating institutions was due
to the low level of financial support attached to the project: “Each institution involved only received
a minimal level of financial support. This project was often not a priority and competed with other
grant-supported projects that received more money.” One project director felt that a replicating
institution was not very committed to the project because of other competing concerns (e.g.,
enrollment changes). Another original project director felt that other institutions were resistant to the
project because they “did not want to take risks.” Owing to little administrative support, some
individuals at replicating institutions felt that they were, in the words of one staff member, “working
on this endeavor alone.”
Collaborative projects experienced problems that were particular to working across many
institutions. For example, scheduling meetings became a major challenge for some projects. One
original project director found the entire project challenged because of the particular Institutional
Review Board (IRB) requirements at each institution: “When dealing with IRBs at multiple
institutions, we never got the same answer. Each institution was following a different protocol. …It
was incredibly difficult to manage.” At other times, institutions were reluctant to share potentially
unfavorable information with other institutions. One project found that “only a few institutions
allowed their data to be shared publicly.”

Faculty Barriers
Many projects found that faculty at other institutions were more resistant to the reform than
their own faculty had been. One workshop at a replicating institution turned into a faculty “griping
session”; faculty voiced concerns about the pace of the reform work. In working with faculty,
projects found that “some teachers are traditionalists and not as interested in new things.” Several
project directors mentioned that selling the idea to faculty was their biggest barrier. One project
director noted that he was able to use faculty members’ initial resistance as a selling point: “It is a
useful barrier because people can say that at first they thought it wasn’t a great idea and then were
won over.”

Change to Original Project Design


As perceived by the original project directors, some of the replication projects were not
successful because key elements were changed. One institution changed a full course to a tutoring
program. One institution integrated a project with an existing program, with much less time spent on
it. One project provided supplementary materials, but the faculty did not require students to use
these materials. In these situations, the original project directors felt that the changes rendered the
projects much less effective.

GENERAL SUPPORTS FOR REPLICATION


Factors that made the implementation and institutionalization of the original project
successful also supported implementation at additional locations. Reports mentioned the important
role of administrative support as well as internal and external funding. Additionally, some replication
success built on the support of key officials and faculty at replicating sites.

Financial Support
Funding for the replicating sites came from the site itself and from external funders,
including FIPSE. Institutions frequently funded project staff time. For low-cost projects,
implementation required little beyond staff time to work on the project. Several institutions provided
projects with technical support from librarians, institutional researchers, and information technology
staff. External funding sources included private corporations and foundations, the National Science
Foundation, and state government agencies. Some high schools used Title I money to purchase
materials. However, the most common supporter of replicating sites was FIPSE itself through
dissemination grants. FIPSE grants assisted many of the projects by funding site visits, phone calls,
materials, conferences, compensation for students and faculty, and time.

Support of Key Officials


Some projects found that administrative officials sought FIPSE projects to address concerns
at their institutions. One project director noted that the staff had meetings with vice presidents and
deans, who invited the project to their campuses. A provost put together a committee to see how
another project could be integrated into colleges throughout the system. One dean even took on
directing the project at her campus.

Support of Faculty
The blessing of upper-level administrators could smooth the adoption of a project, but it
certainly did not substitute for the commitment from those closer to the students—the faculty.
Before working with other institutions, one project insisted that the faculty at each institution give
full approval first. Another project was able to recruit faculty from 16 different departments to
participate, representing general support for the project across the institution.

SUPPORT FROM THE ORIGINAL PROJECTS


Once a project can demonstrate success, others will be more willing to take the somewhat
smaller step of replicating the project. In disseminating their work, FIPSE projects served as models
for others. However, these projects also actively supported replication.
Replication activities, for the most part, consisted of consultation and collaboration.
Consultation involved one institution providing guidance, advice, and materials to support
implementation at other institutions. Collaboration, in contrast, involved a more mutual relationship.
It could include the original and replicating institutions working together, but it frequently involved a
consortium of replicating institutions. The focus was less on exactly replicating the original project
and more on sharing knowledge to be modified for each unique environment. Of course, these two
activities were not mutually exclusive; many successful replications involved aspects of both
collaboration and consultation.

Consultation
Projects most commonly supported replication by consulting from a distance. This usually
involved communicating by e-mail and phone and sharing materials. One project used
videoconferencing to bolster the support it was providing. Some projects relied on consulting from a
distance exclusively. One project director found that the project was “simple enough that people do
not need pages of detail. They just need the concept and they can go back and work on something
that fits their own situation.” Another director at a replication site found that the instructions and
accompanying materials he received were sufficient: “Everything that was sent to me was more than
adequate.” After expressing interest in a project, a director at a replicating site received a 20-page
paper describing the program and evaluation methods. That director found the original institution
to be “very generous and nurturing.”
Some projects found time and resources for site visits. These visits involved the replicating
team visiting the original institution or vice versa. More typically, the original project director visited
the replicating institution. Site visits frequently included presentations to faculty, discussions with
project staff about administrative issues, demonstrations of products or techniques, and evaluations
of the replicating site. One project director stated that she engaged in “a mix of strategic
conversations and hands-on direct training with faculty.” Many projects used a site visit to kick off
the project and then continued to be available from a distance. One project that had FIPSE funding
visited replicating institutions once a year. The director noted that going to instruct other sites had a
benefit because “one of the best ways to learn something is to teach it.” He thought that his
institution benefited as much as the replicating institution.
When it was instructive to see a project in action, representatives from replicating institutions
visited the original institution. One project’s staff “visited classes, met with faculty, had dinner with
teachers and then came back to [their] institution and initiated the process.” Another director went to
learn about a software package and how to use it. When multiple institutions made a site visit, the
format often included a conference, with speakers sharing their expertise. For example, one project
disseminating an interdisciplinary language program held an annual conference. While at the
conference, attendees heard about program variations and suggestions for implementation. They
were also introduced to the existing program at the host campus. Although not often done, some
projects integrated site visits in both directions. These usually started with the originating institution
hosting a visit, which was followed with visits to replicating institutions during the implementation
period.

Collaboration
Collaboration occurred when a project continued to develop on the basis of the input of many
implementers. Usually in these situations, the director from the original project coordinated meetings
and worked with the other institutions individually as the project’s development continued at all
institutions (original and replicating). Staff from a consortium of institutions would meet face-to-
face and online (usually through e-mail). In many cases, the consortium developed into “a network
of people who are aware of each other, and willing to share.” One consortium member noted, “The
opportunity to share ideas, to examine, and to critique is great. [The project director] brings her
leadership, but the others do as well.” One particularly large consortium was able to assign subgroups
of individuals to tackle specific problems facing the community. One member said that the support
she received from the original project director was great, but “the most interesting things about the
project were the occasional meetings…with the other collaborating institutions. …It was useful and
enlightening to hear how the other institutions were designing their projects and what their
experiences were.”
Several directors noted that they sought out collaboration to improve the project. One project
collaborated with other colleges within the university to improve a software product. Another
project’s staff found that they did not have the means to continue work on the project alone. They
enlisted another institution and both moved forward to develop the project. Finally, one project
partnered with high school instructors to create a course, based on the original postsecondary project,
suitable for secondary students.

Effectiveness of Support Provided by Original Project Staff


Among the various ways that project staff supported replication, no clear link emerged
between a particular type of activity and successful replication. Different types of projects required
very different types of support. In conversations with replicating projects, most staff reported that
they received sufficient support that met their expectations. Following are some examples of staff
responses:
• [Staff at the original project] have been available for informal discussions and have
provided conferences. Whenever I have questions, [the project director] has been very
receptive. I attended workshops last year about how to set up funding sources. It
brought in a lot of people who had set up similar programs. [The original project staff]
have been instrumental in bringing people together involved with similar programs
across the nation…[The project director] has been a critical facilitator.
• We communicate via e-mail and phone. [The original project staff] have always been a
great support. They keep me in the pipeline for new information. In addition to all this,
they have provided a couple of dollars for me to go to meetings. …Overall, they were a
great support.
• Whenever I have a new class, [the original project directors] have been willing to send
materials for free. …This has been a great support because it is very expensive to
purchase.
• The support has gone beyond my expectations. …[The original project team] has always
graded the pre- and post-tests for me. This has been a tremendous help.
• The support met our expectations. A certain amount is helpful, but then you really have
to pick it up and figure out how to fit it to your own institution.
The projects that expressed some dissatisfaction with the level of support provided by the
original project were primarily those working with technological innovations. During the time that one
person was using a software package, the original project director “was not under the auspices of the
FIPSE grant. As a result, he wasn’t able to provide all of the support that would have been helpful.”
One individual was forwarded to several different staff members for help but never received a
complete answer. Finally, one project’s software applications were not running. It turned out to be
an issue with a vendor, but the individual felt that the project Web site should have provided an
explanation of how to avoid the issue.

ROLE OF FIPSE SUPPORT IN REPLICATION


Evidence suggests that projects that received dissemination grants continued to grow after the
grants ended. For example, staff affiliated with projects continued to hold conferences and add new
staff; projects involving new curricula continued to hold training. Clearly, not all projects with
dissemination grants continued to replicate. However, the FIPSE funding appears to have helped many get
systems and supports in place that allowed for ongoing dissemination after the end of the
dissemination grant.

Overall, replication levels reported in this chapter were based on:
• Work done after the grant period by projects with dissemination grants.
• Work done after the grant period by projects without dissemination grants.
• Work done after the grant period by projects with subsequent dissemination grants.
Most of the projects fell into the first two categories. Fewer than 10 of the projects in the sample
received subsequent dissemination grants. Therefore, FIPSE funding did not support most of the
work reflected in the replication levels (although FIPSE may have supported earlier replication
work).

Replication in Projects without FIPSE Dissemination Funds


Of the 20 projects that were successfully replicated, eight never received dissemination funding
from FIPSE. These cases demonstrate that projects could spread without FIPSE support.
Many of the replicating projects received funding from another source. One project director’s travel
was funded by a subsequent NSF grant. One project was replicated when the state funded additional
sites. One institution was pleased with the project’s outcomes and funded a publication about the
project. The publication generated interest, and the project staff began offering summer workshops
to train other instructors.
Three projects set up distance education programs (two of which involved a consortium of
institutions). Each project director noted that the project served as a model for the feasibility of the
idea. All the project directors shared the ideas at conferences, but the ideas were “simple enough that
people just needed the concept” to be able to replicate. One of these three project directors also
worked as a consultant with several new consortia, training faculty and helping them develop
appropriate distance education programs.
Two projects were able to replicate by sharing information with others. One project was
shared at conferences. Interested individuals received additional materials and project staff were
available to consult by e-mail and telephone. For a different project, a retired professor donated his
time to support others interested in adopting the software.
These few examples demonstrate that projects replicated when the idea was simple and either
did not require support or had additional funds to support it. It is not clear from the FIPSE
documentation how much replication is expected from the pool of projects funded. However, this
review found that the leaders of most projects who knew their work had been replicated elsewhere
had received support for these endeavors from FIPSE at some point.

REPLICATING SITES AND DISSEMINATION
FIPSE grantees were not the only advocates for disseminating a project. In conversations
with staff at replicating sites, examples of replicating sites assisting in dissemination efforts emerged.
Replicating sites published articles, gave presentations, led workshops both within and outside their
institutions, and talked about the project with their colleagues. One project director at a replication
site said, “We did a poster presentation in March directly about [this project]. Word got out that way.
We have also passed on [the original project director’s] information about 10 times.” When
replicating sites start to act as disseminators, it is the fruition of the “seed” grant goal: one small
project growing at other sites that in turn assist in growing it at even more sites.

SUMMARY
Although this selective review likely underestimates the replication of FIPSE projects,
evidence suggests that about one-third of the sampled projects had been replicated and more than
one-half had been adopted at least in part. The challenge of successfully navigating administration
officials, faculty, resources, and technology made replication difficult. Yet, when supportive, all
these individuals and materials facilitated replication. In assisting replication sites, original projects
provided consulting services both on-site and from a distance. Some projects collaborated with
replicating sites to allow them to assist one another and improve the project overall. FIPSE funding
was instrumental for many replication efforts. However, some projects replicated without FIPSE
funding, and other projects continued to replicate after the FIPSE dissemination funding ended.

CHAPTER 6
THE EVALUATION OF FIPSE PROJECTS

The sustainability of many projects beyond the FIPSE funding period and the replication of
them on other campuses raise the question of how the impact of these efforts is being determined.
National studies of FIPSE have not evaluated the effects of individual projects on the targeted
populations (e.g., the effect of an intervention program on student retention). Instead, this
information must be obtained from evaluations conducted by specific projects.
This chapter presents a review of the evaluations that FIPSE grantees arranged for their own
projects. This review includes a brief overview of the features of the evaluations as well as
assessments of the evaluations by the outside experts. Although FIPSE did not require grantees to
submit evaluation reports, the lack of details provided by more than half of the projects makes any
conclusions about the evaluations themselves tentative.

EVALUATION GUIDELINES
During the review period, FIPSE provided minimal guidelines for project evaluations.
Instructions for the 1996 grant year specified that the final application proposal should include a
section on the evaluation design (FIPSE, 1995). Applicants were instructed to consider what others
would deem to be solid evidence of project success. Other than providing a bibliography of
evaluation references, FIPSE provided guidance through discussions with the program officer. Thus,
the reviewed evaluations need to be considered in this context.
In recent years, evaluations of projects funded by federal programs, including FIPSE, have
increased in importance owing, in part, to the passage of the Government Performance and Results
Act (GPRA) in 1993. This legislation holds agencies accountable for program performance by
requiring that they measure and report on goals annually. As a result, federal agencies now place
increased importance on project-level evaluations. In the U.S. Department of Education, compliance
with GPRA and the use of performance data have continued to evolve. FIPSE now provides more
specific guidelines for evaluation in the application materials (FIPSE, 2001) and has placed
information about designing and conducting evaluations on its Web site.

A REVIEW OF FIPSE PROJECT EVALUATIONS
Availability of Evaluations
Only about 40 percent of the projects reviewed submitted a complete evaluation report to
FIPSE. A little more than half of the projects included results from their evaluation in their final
project reports. For this latter group, it was typically difficult to determine the quality of the
evaluation, its methodology and focus, and, at times, the results themselves. Thus, the review of the
project-level evaluations is limited because many grantees did not submit complete
evaluation reports; in a few cases, even evaluation summaries were not available. However, it is
important to note that during the review period, FIPSE required projects to submit only annual and
final reports, not evaluation reports.

Evaluators
Ideally, experienced evaluators who are not involved with either the project or the institution
in which the project is operating should conduct the evaluations. About 40 percent of projects hired
outside evaluators. These individuals tended to be from other colleges and universities located near
the grantee institution or from research and evaluation firms. One evaluation of a science teacher
professional development program assembled a group of 17 educators. These individuals, however,
did not design the evaluation; instead, the project director set up the format for eliciting feedback.
Several projects appeared to use combinations of external and internal evaluators or had the
external evaluators work as part of the project team. For example, in one project, staff collected
data but turned the data over to the evaluator for analysis.
In several projects, when the original evaluators either left or did not work out, other
evaluators were hired, project staff themselves attempted to evaluate the project, or no attempt to
evaluate the effort occurred.

Evaluation Methodologies
The vast majority of FIPSE projects that submitted evaluation reports for which
methodologies could be determined used some type of survey (approximately 80 percent). Many of
these surveys focused on participant satisfaction with their involvement in the project or their
perceptions of the project’s impact. Although these types of surveys, if developed and administered
appropriately, can inform an evaluation, by themselves, they rarely provide enough information to
determine whether the project itself made a difference. It appears that approximately 25 percent of
the projects reviewed used only these types of surveys to evaluate their efforts. Some evaluators also
used interviews and focus groups, which also rely on participants’ self-reporting of satisfaction and
impact.
About 20 percent of evaluations employed some type of observational method. In one
project, the evaluator observed classes and noted the types of interactions between faculty and
students. Another used cognitive labs in which students “thought aloud” as they worked through
newly developed software.
Another 20 percent of evaluations reviewed project work or participant outcomes. Student
journals and course work were the most common types of products assessed. Other evaluations
looked at course grades, grade point averages, and retention rates to determine a project’s impact. In
these cases, evaluators compared changes in these measures before and after participation in the
project.
A number of the projects attempted to use some type of comparison group. Comparison
groups included students enrolled in a different section of the same class but not exposed to the
project treatment and students from previous semesters who did not have the opportunity to
participate in the project.
Although uncommon, a few evaluations used random assignment to determine whether any
observed changes could be attributed to the FIPSE project. In one case, students were randomly
assigned to one of two classes that varied in their approach to teaching chemistry. Students could not
choose which variant of the curriculum to participate in. Outside experts, blind to the students’
class assignments, evaluated their work. Results generally favored the treatment
group. For another project, students were randomly assigned to two versions of a first-year advising
program. Results indicated greater satisfaction, better retention rates, and higher grades for those in
the treatment group.

Evaluation Focus
For the projects that provided information, most evaluations (about 60 percent) focused on
the impact of the project rather than on the implementation. Some evaluations provided feedback on
both implementation and outcomes (about 20 percent). The remaining 20 percent focused strictly on
implementation.

Evaluation Findings
Regardless of the evaluation design or methods used, findings that were reported tended to
indicate that the project’s impact was positive. Of the projects that reported outcomes, about 80
percent presented almost exclusively positive results. Often, however, the evidence supporting these
conclusions was quite weak. One project, for example, indicated that students improved their skills
between the beginning and end of a nontraditional course. However, students enrolled in a more
traditional course covering the same subject matter showed similar improvement.
Some evaluations did report mixed results or indicated little or no measured impact. These
findings were often reported along with other positive aspects of the project (e.g., “the students were
satisfied”) or explained away (e.g., too small a sample, a problem with the data). The desire or
tendency to report positive outcomes to the funding agency, although only natural, limits what
reviewers can learn from a program that funds innovation. Not all innovation can be successful.

EXTERNAL REVIEW OF EVALUATIONS


The outside experts who were asked to review 16 projects were also asked to assess the
evaluations conducted as part of the grant. The reviewers’ assessments varied by project. On the less
positive side, these individuals noted the following:
• The evaluation aspect was not well-defined in the project and consisted predominantly of
providing descriptions of the accomplishments of the individual [replicating sites].
• The investigators note that the planned evaluation was not conducted. In its place, they
provide information on the number of articles written, presentations made, anecdotal
comments made to them. …Whatever the reason, there is no excuse for totally excluding
a formal evaluation of a federally funded project.
• It is somewhat difficult to assess whether the evaluation design was appropriate or not.
…While the design of the software program and its different components were driven by
specific theoretical frameworks and methodological approaches, there was no unbiased,
rigorous data produced that could verify the effectiveness of the project or its goals.
• There is a total lack of evaluation with regard to assessing the effectiveness of the
strategies in different settings. …Satisfaction surveys of students at the different
campuses do not constitute an outcomes assessment.
On the more positive side, the outside experts made these statements:
• The evaluation…is superlative. It is clear, thorough, and provides concrete information
rather than anecdotal or impressionistic evidence.
• The evaluation methods were appropriate to the purpose of the program.

As reflected by these selected samples, the experts’ comments were generally more negative than
positive.

CONCLUSION
Drawing substantive conclusions about the evaluations for the FIPSE projects funded in
1996, 1997, and 1998 is problematic. Fewer than half the projects reviewed submitted thorough
evaluation reports. Of the projects that did provide details of the evaluation, many based their
evaluations solely on self-reports and did not include comparison or control groups. However,
several strong examples demonstrated that it is possible for these small grantees to conduct
convincing evaluations. One project director, despite initial reservations, felt strongly about the
evaluation’s contribution:
Our gut feeling is that something strong happened and it has some good effects. The data in
the report is giving us a pretty clear sense that something is happening that doesn’t happen to
a control group. …When I heard the cost, I was appalled. …But it produced useful stuff. I
think [evaluation] is a waste of money unless you do it well.
The weak point in this selected review of FIPSE projects is certainly the evaluations. Given
the extent to which these projects appear to disseminate their products and practices, strong
evaluations would inform grantees as well as those attempting to replicate the effort. Solid project-
level evaluations would also fill some of the gaps resulting from infrequent programwide studies of
FIPSE. There is evidence that FIPSE is putting more emphasis on the evaluations: More recent
applications clearly state that independent evaluations are ideal and that these evaluations should
focus on outcomes and should not be based on self-reports (FIPSE, 2001).

CHAPTER 7
CONCLUSION

FIPSE’s Comprehensive Program has not been studied since the early years of the program.
That study, conducted in 1980, presented a largely positive picture of FIPSE (NTS Research
Corporation, 1980). The current review, much smaller in scope, covered only a sample of projects
from three grant years and relied on self-reports of project work and outcomes. However, this review
reaches many of the same conclusions as the earlier study. The current review finds that projects, for
the most part, continued after FIPSE funding, shared their ideas and lessons, and assisted others with
implementing ideas.
Some FIPSE grantees were attempting cutting-edge work at the time of their funding; other
projects attempted smaller innovations. According to Rogers (1995), innovation is situational: If the
project is “perceived as new” within the local context, it can be considered innovation. Therefore, a
range of activities can be innovative: conducting basic research in unexplored areas, moving an
established approach to a more challenging environment, or melding several established
approaches. FIPSE projects covered this
range. One project contributed to the educational body of knowledge by attempting a new approach
to student assessment. Another project implemented an established curriculum at Native
American-serving educational institutions. In both cases, the grantees embarked on a new and
challenging reform.
A program that funds innovative work should not expect 100 percent of the funded projects
to flourish (in fact, that rate might indicate that the work was not particularly innovative). More than
80 percent of grantees had maintained at least some aspects of their project as much as four years
later. As mentioned previously, many grantees mentioned FIPSE’s flexibility as particularly
supportive. It is possible that this flexibility enhanced the likelihood of success with an innovation:
Projects were allowed to change their approach as they met barriers or discovered new avenues. The
fact that approximately three-fifths of the projects were institutionalized, and another one-fifth of the
projects institutionalized some work, demonstrates a strong track record in both FIPSE’s selection of
and support for grantees.
Within the sampled projects, some were easier to replicate than others. However, the level of
difficulty did not appear to be the main determinant of whether a project was replicated by other
institutions. Instead, it appeared to be the interaction of difficulty and funding. Projects that
produced materials (e.g., textbooks) that could be distributed easily once developed did not require
additional funds for replication work. However, replication of some of the most substantial projects
(particularly those requiring technical support) was greatly assisted by additional funding (either
from FIPSE or from other sources).
During the review period, FIPSE did not emphasize the evaluations of individual projects.
Many evaluations failed to demonstrate much beyond participants’ fondness for the project.
However, greater emphasis is currently being placed on the quality of the evaluations. Since the
review period, the federal government has placed more emphasis on evaluating outcomes (first with
the Government Performance and Results Act [GPRA], now with the Program Assessment Rating Tool). The FIPSE office responded by making
project evaluation a more explicit criterion in grant making and by providing more support to
projects as they plan evaluations.
This review did not address many important FIPSE outcomes owing to the methodological
approach. First, the data collection did not include any independent assessment of actual project
implementation. Instead, projects provided all data concerning the degree to which a project was
implemented (and sustained). Second, although the review examined the individual project
evaluation methodologies, it did not systematically examine the project outcomes. One could
imagine a broad evaluation that would independently assess the outcomes for many different
projects, but that work was beyond the scope of this review. The review also did not independently
investigate the influence of FIPSE-funded projects on other projects. Rather than survey the
postsecondary community for evidence of influence, this review relied on grantees’ reports of
dissemination and replication and on grantees’ referrals for replication. As a result, this review likely
underestimated the scope of the influence because grantees may not have been aware of the full
extent of their influence.
Despite these limitations, this review provides a promising glimpse into the post-grant lives
of FIPSE grantees in the Comprehensive Program. The findings reported here support the findings
from the original 1980 study (NTS Research Corporation, 1980). In addition, they support the
generally positive reputation that FIPSE enjoys within the postsecondary community; as one project
director stated, FIPSE is “a place where people with good ideas can get funded.”

REFERENCES

Cambridge, B., Ewell, P., Fields, C., and Miller, M. (2002). FIPSE: 30 years of making a difference
[special issue]. Change, 34(5).
Fund for the Improvement of Postsecondary Education (FIPSE). (1995). The Comprehensive
Program application for grants: Fiscal Year 1996. Washington, D.C.: Author.
Fund for the Improvement of Postsecondary Education (FIPSE). (2001). The Comprehensive
Program FY2002 information and application materials. Washington, D.C.: Author.
Fund for the Improvement of Postsecondary Education (FIPSE). (2004). Operating principles—
Fund for the Improvement of Postsecondary Education. Retrieved Jan. 18, 2004, from
http://www.ed.gov/about/offices/list/ope/fipse/princp.html.
National Center for Education Statistics (NCES). (2003). The digest of education statistics, 2002
(Table 243). Retrieved Dec. 24, 2003, from
http://www.nces.ed.gov/programs/digest/d02/tables/dt243.asp.
National Center for Public Policy and Higher Education (NCPPHE). (2002). Fund for the
Improvement of Postsecondary Education: The early years (#02-5). Washington, D.C.:
Author.
NTS Research Corporation. (1980). An evaluation of the Fund for the Improvement of
Postsecondary Education. Volume II: Final report. Durham, N.C.: Author.
Rogers, E. M. (1995). Diffusion of innovations (4th ed.). New York: Free Press.
