
Study of Student Achievement

of Outcomes Using Rubrics

Drs. Bala, Clough, Jinkins, Kile


Industrial Engineering Program
UW - Platteville

[Figure: Deming (P.D.C.A.) wheel rolling up an Improvement-vs-Time slope against inertia and complexity.]
IE Program Continuous Improvement Process¹

[Figure: flow diagram. Constituents state expectations/requirements, which shape objectives & outcomes; curriculum design/revision/updates via faculty, department, and college turn input into outcomes (product/service); assessment data collection and analysis feed administration/faculty evaluation, which plans & updates tools, outcomes & objectives over time, closing the loop back to the constituents.]
ABET Criteria & PDCA² Assist in Continuous Improvement of the IE Program

[Figure: Deming's wheel (P.D.C.A.: Plan, Do, Check, Act) rolling up an Improvement-vs-Time slope toward continuous improvement, with the ABET criteria acting as a wedge against resistance/inertia.]

UW-Platteville IE Program Assessment & Evaluation Loops

[Figure: two linked loops. Objectives loop: university mission, college mission and strategic plan, and program mission → program educational objectives → assessment (tools, data, analysis) → evaluation, feedback, actions → constituents. Outcomes loop: program learning outcomes → educational strategies, timeline & C/E diagram → assessment (tools, evidence collection & analysis) → evaluation (interpretation of data & actions) → feedback for continuous improvement. Rubrics support the assessment steps.]

Self-Assessment of Assessment and Evaluation of IE Program Objectives and Outcomes³

Rating scale: 0 - not in place; 1 - beginning stage of development; 2 - beginning stage of implementation; 3 - in place and implemented; 4 - implemented and evaluated for effectiveness; 5 - implemented, evaluated, and at least one cycle of improvement.

Each category below is rated on this scale:

- Stakeholder Involvement (those who have a vested interest in program success): stakeholders are identified; primary stakeholders are involved in identifying educational objectives; primary stakeholders are involved in periodic evaluation of educational objectives; sustained partnerships with stakeholders are developed.

- Performance Objectives (graduates' performance 3-5 years after completing the program): objectives are defined; stakeholders provide input to development of objectives; the number of objectives is manageable; objectives are aligned with mission; objectives are periodically assessed; objectives are periodically evaluated for relevancy.

- Learning Outcomes (desired knowledge, skills, attitudes, and behaviors at graduation): outcomes are identified; the number of outcomes is manageable; outcomes are publicly documented; outcomes are linked to performance objectives; outcomes are defined by a manageable number of measurable performance indicators; outcomes are aligned with mission.

- Outcomes Aligned with Educational Practice: desired outcomes are mapped to educational practices and/or strategies; outcomes are mapped to both curricular and co-curricular activities; practices/strategies are systematically evaluated using assessment data; educational practices are modified based on evaluation of assessment data.

- Program and/or Institutional Assessment: assessment is systematic at the program/institutional level; assessment data are systematically reviewed; multiple methods are used to measure each outcome; both direct and indirect measures of student learning are used to measure outcomes; assessment processes are reviewed for effectiveness and efficiency; assessment methods are modified based on evaluation processes.

- Evaluation: evaluation of results is done by those who can effect change; evaluation of assessment data is linked to practices; evaluation leads to action.

MATRIX LINKING UNIVERSITY'S MISSION & IE PROGRAM OBJECTIVES

University mission elements: M1 Broad Education; M2 BS Programs in Arts, Sci., etc.; M3 BS Programs in Educ., Eng., etc.; M4 Grad. Programs; M5 Dist. Educ.; M6 Farm; M7 Scholarly Activities; M8 Student Services; M9 Resource to WI.

IE PROGRAM OBJECTIVES                            M1   M2   M3   M4   M5   M6   M7   M8   M9
1 Eng. Skills & Practice                         Yes  N/A  Yes  N/A  N/A  N/A  Yes  Yes  Yes
2 Team Skills                                    Yes  N/A  Yes  N/A  N/A  N/A  Yes  Yes  Yes
3 Ethical, Professional, Social, Global Issues   Yes  N/A  Yes  N/A  N/A  N/A  Yes  Yes  Yes
4 Solve Problems                                 Yes  N/A  Yes  N/A  N/A  N/A  Yes  Yes  Yes
5 Professional Growth                            Yes  N/A  Yes  N/A  N/A  N/A  Yes  Yes  Yes

MATRIX LINKING COLLEGE'S STRATEGIC PLAN & IE PROGRAM OBJECTIVES

College of EMS strategic plan themes: T1 Quality Education; T2 Culture; T3 Assessment; T4 Outreach; T5 Faculty; T6 Funds.

IE PROGRAM OBJECTIVES                            T1   T2   T3   T4   T5   T6
1 Eng. Skills & Practice                         Yes  N/A  Yes  N/A  Yes  Yes
2 Team Skills                                    Yes  N/A  Yes  N/A  Yes  Yes
3 Ethical, Professional, Social, Global Issues   Yes  N/A  Yes  N/A  Yes  Yes
4 Solve Problems                                 Yes  N/A  Yes  N/A  Yes  Yes
5 Professional Growth                            Yes  N/A  Yes  N/A  Yes  Yes

MATRIX LINKING IE PROGRAM OBJECTIVES & OUTCOMES

[Table: rows are the IE program objectives (1 Eng. Skills & Practice; 2 Team Skills; 3 Ethical, Professional, Social, Global Issues; 4 Solve Problems; 5 Professional Growth); columns are the IE program outcomes (1 Foundation; 2 Communication; 3 Responsibility; 4 Problem Solving; 5 Growth); an X marks each supporting link.]

Relationship Between IE Program Outcomes and ABET/EAC Outcomes

[Table: ABET/EAC outcomes/graduate expectations (a) through (k) mapped to the IE program outcomes: 1 Foundation; 2 Communication; 3 Responsibility; 4 Problem Solving; 5 Growth.]

Cause and Effect Diagram for Not Achieving Outcome 3(f)

Targets: raise the average score to 3 and the fraction satisfied to 0.6.

[Figure: cause-effect diagram for not achieving EAC/ABET Outcome 3(f).]

Outcomes in IE courses: I-Introductory, E-Emphasis, and R-Reinforcement

[Table: required courses (IE 2130, IE 3430, IE 3530, IE 3630, IE 4030, IE 4230, IE 4430, IE 4730, IE 4930, ME 3040) marked I, E, or R against the IE program outcomes 1 Foundation, 2 Communication, 3 Responsibility, 4 Design, and 5 Growth.]

Outcomes in IE courses: I-Introductory, E-Emphasis, and R-Reinforcement

[Table: elective courses (IE 4130, IE 4330, IE 4630, IE 4750, IE 4780, IE 4830, ME 4230) marked I, E, or R against the same five IE program outcomes.]

DIRECT ASSESSMENT TOOLS

- Evaluations of students on co-op (Objectives, Outcomes)
- Student Portfolios (Outcomes)
- Statistical Data (Objectives, Outcomes)
- FE Examination Results (Outcomes)
- Placement Rate of Graduates (Objectives, Outcomes)
- Participation in Co-op Program (Objectives, Outcomes)
- IE 3530/3630 Mid-Program Evaluation (Outcomes)
- Face-to-Face Meetings/Assessment in Faculty Offices
- Future: Simulations, Oral Examinations, Focus Groups
- Course Materials & Rubrics (Outcomes)

INDIRECT ASSESSMENT TOOLS

- Alumni Survey (Objectives)
- Employer Survey (Objectives)
- Entrance Survey
- Exit Survey (Objectives, Outcomes)
- Recruiter Survey (Objectives, Outcomes)
- Project Sponsor Survey (Outcomes)
- Self-Studies: Internal & External Reviews (Objectives, Outcomes)

Matrix Relating IE Program Outcomes and Assessment Tools

[Table: assessment tools (Alumni Survey; Employer Questionnaire; IE Graduate Exit Questionnaire; Industrial Project Sponsor Survey; Employer Assessment of Academic Preparation; Student Portfolio; Direct Assessment of Course Activities by Students; Indirect Assessment of Course Activities by Faculty) mapped, where applicable, to the IE program outcomes 1 Foundation, 2 Communication, 3 Responsibility, 4 Problem Solving, and 5 Growth.]

Assessment Tools

ASSESSMENT TOOL | RESPONSIBILITY FOR ADMINISTRATION | SCHEDULE/FREQUENCY | ASSESSMENT DATA PREPARATION & EVALUATION PROCESS

1. Alumni Survey | Department Chair / IE Program Coordinator | January of each year (poll two-year and five-year alumni); summer of each year. | Summarize assessment data in a spreadsheet. Conduct a hypothesis test on the fraction satisfied with the objectives and on the mean score for achieving the objectives (see the sketch after this table). Note extreme comments. Present findings, identify potential causes, and suggest actions to faculty; apply the cause-and-effect diagrams to discuss the action plan.

…

11. College of EMS Advisory Board & Alumni Board | Department Chair / IE Program Coordinator / IE faculty volunteer | Once every year, but consolidated every three years into action items. | Major revisions and improvements are presented, and the feedback is used by faculty.
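The hypothesis tests named in the alumni-survey row can be sketched as follows. This is a minimal illustration assuming a target fraction satisfied of 0.6 and a target mean score of 3 (the targets quoted with the cause-and-effect diagram); the responses are invented, and the program's actual spreadsheet procedure may differ.

```python
# Minimal sketch of the alumni-survey hypothesis tests (illustrative only;
# the data are invented and the program's exact formulation may differ).
from scipy import stats

satisfied = [1, 1, 0, 1, 1, 1, 0, 1, 1, 1]  # 1 = satisfied with an objective
scores = [3, 4, 4, 2, 5, 4, 3, 4, 4, 3]     # e.g., 1-5 agreement scores

# Fraction satisfied: H0: p = 0.6 vs. H1: p > 0.6 (exact binomial test)
binom = stats.binomtest(sum(satisfied), n=len(satisfied), p=0.6,
                        alternative="greater")
print(f"fraction satisfied p-value = {binom.pvalue:.3f}")

# Mean score for achieving the objective: H0: mu = 3 vs. H1: mu > 3
res = stats.ttest_1samp(scores, popmean=3.0, alternative="greater")
print(f"mean score t = {res.statistic:.2f}, p-value = {res.pvalue:.3f}")
```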


Rubrics

- Systematic scoring methods that use predetermined performance criteria to collect direct assessment data.
- Performance criteria or metrics permit numerical measurement of a learning outcome on dimensions that are considered important for that outcome.
- A way to inform students what is expected and to score student work using a fixed framework, objectively and consistently.

Rubrics: IE Program at UW-P

- Short, simple, and measurable metrics on a single page.
- Emphasis is on different skills, abilities, capabilities, and understandings.
- Being reevaluated and revised for use from 2007.
- Provide valuable feedback to students by identifying the degree to which a metric for an outcome was not demonstrated.

Rubrics: IE Program at UW-P (continued)

- Common to all courses in the program. Used by all faculty members; saves time for all parties.
- See the rubrics for outcomes (a) through (k) provided as a handout.
- See also References.

RUBRICS: ABET outcomes (a) through (k)

- Measurable attributes (performance criteria or metrics) for each rubric.
- The rubric for outcome (i) (life-long learning) has four performance criteria or metrics, each with its own weight (W):
  - Ability to Locate and Use Resources on the Web (W=1)
  - Ability to Use Reference Books, Books, Periodicals, Archives, & Inter-Library Loans in Libraries (W=1)
  - Ability to Locate & Learn from Recent Publications in IE (W=1)
  - Familiarity with Services Provided by Professional Societies (W=1)

RUBRICS: ABET outcomes (a) through (k)

- Each metric is scored on a descriptive, graduated rating scale with a range. For each metric, observable levels of performance are assigned a numerical score (S) from 0 to 3: Unacceptable (S=0), Marginal (S=1), Acceptable (S=2), Exceptional (S=3).
- Points for each metric or performance criterion: P = W*S.
- Total points for an outcome: TP = ΣP = Σ(W*S). The TP is used to categorize the achievement of that outcome as unacceptable, marginal, acceptable, or exceptional (see the sketch below).
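As a concrete illustration of this rule, the sketch below (hypothetical helper code, not the program's official tooling) applies TP = Σ(W*S) with the weights and TP cut-offs of the outcome (a) rubric from the handout:

```python
# Sketch of the rubric scoring rule TP = sum(W*S) for outcome (a):
# three metrics with weights 1, 1, 2 and cut-offs 0-3 / 4-6 / 7-11 / 12.
WEIGHTS = {
    "apply_math_and_basic_science": 1,   # metric names are illustrative
    "apply_general_eng_knowledge": 1,
    "apply_ie_fundamental_concepts": 2,
}

def total_points(scores):
    """scores maps metric -> S in {0, 1, 2, 3}; returns TP = sum(W*S)."""
    return sum(WEIGHTS[metric] * s for metric, s in scores.items())

def category(tp):
    """Categorize achievement of the outcome from its total points."""
    if tp <= 3:
        return "Unacceptable"
    if tp <= 6:
        return "Marginal"
    if tp <= 11:
        return "Acceptable"
    return "Exceptional"

tp = total_points({"apply_math_and_basic_science": 2,
                   "apply_general_eng_knowledge": 3,
                   "apply_ie_fundamental_concepts": 2})
print(tp, category(tp))  # 9 Acceptable, since 7 <= 9 <= 11
```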


USE OF COURSE GRADE OR SCORE FOR COURSEWORK

- The grade assigned to a student for a course or coursework quantifies the degree to which the student has successfully met a faculty member's requirements and expectations. Many factors determine this grade.
- Course grades therefore cannot be used across the curriculum to measure students' progress in achieving outcomes.

[Figure: performance-level scale from unacceptable performance to acceptable performance.]

COURSE GRADE OR SCORE FOR COURSEWORK

- The logic is similar to acceptance sampling: a sentencing of student performance.
- Students need to meet with faculty for feedback, or extensive written feedback must be provided when student work is graded. Rubrics simplify academic life for both faculty and students.

Rubrics: Advantages

- Serve as scoring or grading criteria for evaluating coursework
- Inform students how coursework will be scored, and emphasize skill levels for a few performance measures for each outcome
- Provide informative feedback to students
- Make scoring coursework consistent and reliable
- How can both performers and judges focus their preparation on excellence?

Advantages of using rubrics

- allow assessment to be more objective and consistent
- focus the teacher to clarify his/her criteria in specific terms
- clearly show students how their work will be evaluated and what is expected
- promote student awareness about the criteria to use in assessing peer performance
- provide useful feedback regarding the effectiveness of the instruction
- provide benchmarks against which to measure and document progress
- See also References

Advantage of using a rubric to assess an outcome

[Figure: scores on Metric 1, Metric 2, and Metric 3 plotted against the acceptable performance level, showing both the performance level achieved and the feedback given.]

Simple Use of Assessment Data from Rubrics in 2006

[Figure: % acceptable & exceptional, or average score, for outcome (g) in IE 2130 - Intro to IE and IE 4930 - Capstone Design, with performance categories unacceptable, marginal, acceptable, and exceptional and a 60% reference line on a 100% scale.]

Simple Use of Assessment Data from Rubrics after 2007

[Figure: % acceptable & exceptional for Metric 1, Metric 2, and Metric 3 of outcome (g) in IE 2130 - Intro to IE, with a 60% reference line.]

Planned Use of Assessment Data from Rubrics after 2007: Track the Performance of a Target Group

[Figure: % acceptable & exceptional for Metric 1, Metric 2, and Metric 3, for outcomes (a) and (g), tracked from IE 2130 - Intro to IE to IE 4930 - Capstone Design, with a 60% reference line.]
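The "% acceptable & exceptional" statistic in these figures is the share of rubric sheets scoring S >= 2 (Acceptable or Exceptional) on a metric, compared against the 60% reference line; a minimal sketch with invented scores:

```python
# Sketch of the "% acceptable & exceptional" statistic for one metric.
scores_metric_1 = [2, 3, 1, 2, 2, 0, 3, 2, 2, 1]  # invented S values (0-3)

pct = 100.0 * sum(s >= 2 for s in scores_metric_1) / len(scores_metric_1)
print(f"% acceptable & exceptional = {pct:.0f}% (reference line: 60%)")
# -> 70% for this sample, which would plot above the 60% line.
```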

Frequency of Assessment Data: ABET/EAC Outcomes

[Table: each ABET/EAC outcome/graduate expectation, (a) through (k), scheduled against the assessment data collection periods 2006-2007, 2007-2008, 2008-2009, 2009-2010, and 2010-2011.]

Collecting Assessment Data from Course Materials Using Rubrics

- Student work should be organized according to outcomes (a) through (k). In addition, course materials may be arranged by course.
- Display materials should be organized to illustrate compliance with the criteria elements.
- Display materials need to be interpreted: show the evaluator how each criterion is being satisfied.

EAC/ABET Outcome (a): Ability to Apply Knowledge.
Direct assessment of the outcome using rubrics; tally of data from the rubric for (a) in the course materials displayed.
FACULTY CONCLUSION: outcome achieved.

Total Points   Frequency f   Midpoint m   f*m
[0, 4]         2             2            4
[5, 6]         5             5.5          27.5
[7, 11]        41            9            369
12             20            12           240
Total          68                         640.5

Average = 640.5 / 68 = 9.42; an average in the range 7 to 12 is acceptable.
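A minimal sketch of the weighted-average calculation behind this tally (bin midpoints m weighted by frequencies f); the numbers reproduce the table above:

```python
# Reproduces the outcome (a) tally: average TP = sum(f*m) / sum(f).
bins = [((0, 4), 2), ((5, 6), 5), ((7, 11), 41), ((12, 12), 20)]  # (TP range, f)

total_f = sum(f for _, f in bins)                          # 68 rubric sheets
total_fm = sum(f * (lo + hi) / 2 for (lo, hi), f in bins)  # 640.5
print(f"average TP = {total_fm / total_f:.2f}")            # 9.42, within [7, 12]
```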


Assessment data from displayed course materials & rubric for outcome (a)

[Figure: frequency histogram for EAC/ABET outcome (a); categories 1 = Unacceptable, 2 = Marginal, 3 = Acceptable, 4 = Exceptional; frequency axis 0-50.]

Examples of Assessment Data Summary, Analysis, and Interpretation

- See the list of direct and indirect assessment tools.
- The following slides illustrate some possible methods of analysis.
- Not all methods may be applicable to all data sets; applicability will depend on the type of data and the underlying probability distributions.

[Figures: examples of assessment data summaries and analyses.]

Evaluation of Assessment Data from Employer Survey: Outlier Responses & Actions

2000
Finding: One employer disagreed that an alumnus had the ability to function on multidisciplinary teams. The null hypothesis of not achieving this outcome was rejected; the response could have been disregarded as an outlier, but the faculty decided to address it.
Potential causes: see the cause-and-effect diagram.
Actions: Use the rubrics to provide effective feedback to students. Use the capstone design course to emphasize this skill. Cumulative assessment data do not show that this continues to be a problem.

2000 & 2001
Finding: One employer disagreed that an alumnus had effective oral and written communication skills. The null hypothesis of not achieving this outcome was rejected; the response could have been disregarded as an outlier, but the faculty decided to address it.
Potential causes: see the cause-and-effect diagram.
Actions: Use the rubrics to provide effective feedback to students. Use the senior-level and capstone design courses to emphasize this skill. Cumulative assessment data do not show that this continues to be a problem.

References

1. ISO 9000:2005, Quality management systems - Fundamentals and vocabulary, International Organization for Standardization (ISO), 1 ch. de la Voie-Creuse, Case postale 56, CH-1211 Geneva 20, Switzerland.
2. Montgomery, D., Introduction to Statistical Quality Control, John Wiley, 2005.
3. Rogers, G., Assessment Planning Flow Chart, 5233 Wagon Shed Circle, Owings Mills, MD 21117, email: grogers@abet.org.

References - Continued

4. Criteria for Accrediting Engineering Technology Programs, Effective for Evaluations During the 2007-2008 Accreditation Cycle. Incorporates all changes approved by the ABET Board of Directors as of October 28, 2006. Engineering Accreditation Commission, ABET, Inc., 111 Market Place, Suite 1050, Baltimore, MD 21202.

References - Continued

5. Criteria for Accrediting Engineering Programs, Effective for Evaluations During the 2007-2008 Accreditation Cycle. Incorporates all changes approved by the ABET Board of Directors as of October 28, 2006. Engineering Accreditation Commission, ABET, Inc., 111 Market Place, Suite 1050, Baltimore, MD 21202.

References - Continued

6. Rogers, Gloria, "Rubrics: What Are They Good for Anyway? Part I," Assessment 101 column in ABET's Community Matters newsletter, September 2006.
7. Rogers, Gloria, "Rubrics: What Are They Good for Anyway? Part II," Assessment 101 column in ABET's Community Matters newsletter, October 2006.
8. Rogers, Gloria, "Rubrics: What Are They Good for Anyway? Part III," Assessment 101 column in ABET's Community Matters newsletter, November 2006.
9. Rogers, Gloria, Assessment Planning, http://www.abet.org/assessment.shtml#Assessment%20matrix

Department of Industrial and Mechanical Engineering


Industrial Engineering (IE) Program

EAC/ABET Criterion, Outcome 3(a): An ability to apply knowledge of


mathematics, science, and engineering.
Rubrics/Metrics/Performance Criteria/Standards
Metric &
Weight (W)

Apply
Mathematics &
Basic Science
(W=1)

Apply General
Engineering
Knowledge
(W=1)

Apply IE
Fundamental
Concepts
(W=2)

Unacceptable
(Score, S=0)

Marginal
(Score, S=1)

Acceptable
(Score, S=2)

Exceptional
(Score, S=3)

Needs assistance
realizing that systems &
processes to be designed
or improved require a
sound foundation in
mathematics, physics,
chemistry, and biology.

Appreciates that systems


& processes to be
designed or improved
require a sound
foundation in
mathematics, physics,
chemistry, and biology.

Can apply mathematical


and/or scientific
principles to design or
improve systems and
processes.

Excels in applying
mathematical and/or
scientific principles to
design or improve
systems and processes.

Mathematical and
scientific terms are not
interpreted.

Mathematical and
scientific terms are
interpreted incorrectly.

Mathematical and
scientific terms are
interpreted correctly.

Excellent interpretation
of mathematical,
statistical, and scientific
terms.

Needs assistance to apply


mathematical, statistical,
or scientific theories and
concepts to solve
problems.

Can apply mathematical,


statistical, or scientific
theories and concepts to
solve problems, but
errors are made.

Can apply mathematical,


statistical, or scientific
theories and concepts to
solve problems, but with
a few errors.

Excels in using
mathematical, statistical,
or scientific theories and
concepts to solve
problems.

Modeling and
calculations done
incorrectly.

Modeling and
calculations have 3 or
more errors.

Modeling and
calculations have very
few errors.

Modeling and
calculations are done
correctly.

Modeling, graphics, and


calculations done
incorrectly.

Modeling, graphics, and


calculations have 3 or
more errors.

Modeling, graphics, and


calculations have very
few errors.

Modeling, graphics, and


calculations are done
correctly using a variety
of software.

Needs assistance to
translate theories and
make realistic
assumptions to develop
models of systems and
processes.

Makes unrealistic
assumptions to develop
models of systems and
processes.

Can translate theories or


make realistic
assumptions to develop
models of systems and
processes.

Excels in using theories,


making realistic
assumptions and
developing good models
of systems and processes.

Assumes that computer


models of systems and
processes are valid.

Knows the difference


between a system and a
model of that system, but
cannot validate models.

Knows the difference


between a system and a
model of that system, but
is not very good at model
validation approaches.

Accepts limitations of IE
& mathematical models
of systems and processes
& establishes validity of
models before using them
to make decisions.

Needs assistance to apply


statistical techniques to
model, study, analyze,
design, or improve
systems.

Can apply statistical


techniques to model,
study, analyze, design, or
improve systems, but
makes many errors.

Can apply statistical


techniques to model,
study, analyze, design, or
improve systems with
very few errors.

Excels in applying
statistical techniques to
model, study, analyze,
design, or improve
systems.

Points (P): P = W*S. Total Points: TP = ΣP.

Overall Performance Criterion: TP ≥ 7.
Unacceptable: 0 ≤ TP ≤ 3; Marginal: 4 ≤ TP ≤ 6; Acceptable: 7 ≤ TP ≤ 11; Exceptional: TP = 12.

The following details may be used for tracking student/team performance over time.
Course # & Title:          Reviewer/Assessor:
Name - Student/Team:       Date:

Department of Industrial and Mechanical Engineering


Industrial Engineering (IE) Program

EAC/ABET Criterion, Outcome 3(b): An ability to design and conduct


experiments, as well as to analyze and interpret data.
Rubrics/Metrics/Performance Criteria/Standards
Metric &
Weight (W)

Problem Recognition
and Statement
(W=1)

Selection of Response
Variable(s), Factors,
Levels, And Ranges.
(W=1)

Choice of DOE Model.


(W=2)

Perform Experiments
(W=2)

Unacceptable
(Score, S=0)

Marginal
(Score, S=1)

Acceptable
(Score, S=2)

Does not understand the


system, inputs, outputs,
and noise factors.

Understands the system,


inputs, outputs, and noise
factors.

Understands the system,


inputs, outputs, and noise
factors.

Excellent knowledge of
system, inputs, outputs,
noise factors, etc.

Does not develop


problem statement.

Can develop problem


statement, but critical
information is left out.

Can develop problem


statement satisfactorily.

Uses that knowledge to


define problem clearly.

Has severe difficulty


identifying response
variables, factors and
their levels, precision in
setting inputs, method of
measurement, etc.

Marginal: Can identify one response variable and factors. Has difficulty in selecting factor levels. Has difficulty setting inputs or method of measurement of response.
Acceptable: Can identify one response variable and factors. Can choose factor levels to use. Knows about setting inputs or method of measurement of response.
Exceptional: Can identify multiple responses. Understands the system & can rank factors. Understands sequential experimentation and screening experiments to set factor levels. Can define accuracy and precision for all inputs & outputs.

Needs assistance to
choose the model to use.
Needs assistance to
determine the need for
blocking.

Can choose the model,


but needs reassurance
from a mentor.
Can recognize
controllable noise factors
and use blocking.

Can choose model


correctly and confidently.
Applies blocking where
necessary.

Not only chooses models


correctly, but also knows
how to improve the
model through sequential
experiments.

Does not distinguish


between repetition and
replication.

Knows the difference


between repetition and
replication, but needs
reassurance.

Determines the need for


repetition or replication,
calculates sample size,
conducts experiments
confidently, and collects
data in an organized
manner.

Excellent knowledge of
repetition or replication.

Needs assistance to
determine sample size.
Needs assistance to plan
experiments & collect
data.

Can determine sample


size, but needs
reassurance and help with
experiments and
collection of data.

Exceptional
(Score, S=3)

Knows many methods to


calculate sample size.
Plans, organizes, &
conducts experiments
well. Uses data collection
forms.

Check Validity of
Model & Apply
Statistical Tools to
Analyze Data (W=2)

Not familiar with model


validation concepts and
statistical tools for
analysis.

Familiar with a few


model validation methods
and statistical tools for
analysis, but makes
many errors.

Familiar with many


model validation methods
and statistical tools for
analysis, but makes a few
errors.

Excellent with many


model validation methods
and statistical tools for
analysis. Makes no
errors.

Make Statistical
Inferences about
Product or Process
Design or
Improvement. (W=2)

Makes very little or no


attempt to interpret data.

Makes many errors and


omissions in tests of
hypothesis or confidence
interval estimation.

Makes a few errors in


tests of hypothesis or
confidence interval
estimation.

Excels in tests of
hypothesis or confidence
interval estimation to
make good improvements
in product or process.

Interprets more than what


the data implies.

Points (P): P = W*S. Total Points: TP = ΣP.

Overall Performance Criterion: TP ≥ 19.
Unacceptable: 0 ≤ TP ≤ 9; Marginal: 10 ≤ TP ≤ 18; Acceptable: 19 ≤ TP ≤ 26; Exceptional: 27 ≤ TP ≤ 30.

The following details may be used for tracking student/team performance over time.
Course # & Title:          Reviewer/Assessor:
Name - Student/Team:       Date:

Department of Industrial and Mechanical Engineering


Industrial Engineering (IE) Program

EAC/ABET Criterion, Outcome 3(c): An ability to design a system,


component, or process to meet desired needs.
Rubrics/Metrics/Performance Criteria/Standards
Metric &
Weight (W)

Unacceptable
(Score, S=0)
No design strategy.
Haphazard approach.

Marginal
(Score, S=1)

Marginal: Recognizes the need for a design strategy, but needs an example and guidance.
Acceptable: Develops a design strategy independently with a few errors.

Design Strategy
(W=1)

System, Components,
and Processes. (W=1)

Creativity &
Innovation
(W=2)

Documentation &
Resources Used (W=2)

Acceptable
(Score, S=2)

May need correction or


some aspects need
embellishments.
Does not define and
represent how
components and
processes are integrated
to achieve specific
objectives of the system.

Exceptional
(Score, S=3)
Develops a design
strategy, including a plan
of attack, decomposition
of work into subtasks,
and development of a
timeline using Gantt
chart. No changes needed
to the developed strategy.

Needs an example and


guidance to define and
represent how
components and
processes are integrated
to achieve specific
objectives of the system.

Can define and represent


how components and
processes are integrated
to achieve specific
objectives of the system,
but some additional help
may be required from a
mentor.

Excels in defining and


representing how
components and
processes are integrated
to achieve specific
objectives of the system.

Unacceptable: Tries to take existing designs and adapt them to serve the new system, component, or process. Cannot appreciate the need for alternative designs.
Marginal: Needs help to develop a good design of system, component, or process.

Develops a good design


of system, component, or
process and assumes that
to be adequate.

Suggests new approaches


and improves on past
designs of system,
component, or process.

Requires help in
developing alternative
designs.

Generates alternative
designs, evaluates these,
and selects the optimal
design.

Does not know and is not


interested in using
computer software and
engineering resources.

Good at using computer


software and engineering
resources.

Excels in using computer


software and engineering
resources effectively

Knows and learns to use


computer software and
engineering resources.

Design procedure is not


documented and
references are rare.

Marginal: Design procedure requires corrections and references are inadequate.
Acceptable: Design procedure requires additions and references are incomplete.

Supports design
procedure with
documentation and
references.

Applying Engineering
and Science
Knowledge (W=2)

Poor in applying
engineering and/or
scientific principles
correctly to design
practical components,
processes, or systems.

Fair in applying
engineering and/or
scientific principles
correctly to design
practical components,
processes, or systems.

Good at applying
engineering and/or
scientific principles
correctly to design
practical components,
processes, or systems.

Excellent in applying
engineering and/or
scientific principles
correctly to design
practical components,
processes, or systems.

Constraints Identified
and taken into
Account (W=2)

Does not take into


account economic, safety,
environmental, and other
constraints to generate
realistic designs that
customers will prefer.

Realizes that there are


economic, safety,
environmental, and other
constraints, but needs
help to generate realistic
designs that customers
will prefer.

Some constraints are not


taken into account in
designing components,
processes, or systems.

Takes into account


economic, safety,
environmental, and other
constraints to generate
realistic designs that
customers will prefer.

Points (P): P = W*S. Total Points: TP = ΣP.

Overall Performance Criterion: TP ≥ 19.
Unacceptable: 0 ≤ TP ≤ 9; Marginal: 10 ≤ TP ≤ 18; Acceptable: 19 ≤ TP ≤ 26; Exceptional: 27 ≤ TP ≤ 30.

The following details may be used for tracking student/team performance over time.
Course # & Title:          Reviewer/Assessor:
Name - Student/Team:       Date:

Department of Industrial and Mechanical Engineering


Industrial Engineering (IE) Program

EAC/ABET Criterion, Outcome 3(d): An ability to function on multi-disciplinary teams.

Rubrics/Metrics/Performance Criteria/Standards

Metric &
Weight (W)

Take Responsibility
(W=1)

Unacceptable
(Score, S=0)

Marginal
(Score, S=1)

Even after prompting


from team leader to be
organized and keep track
of assigned work and due
dates, does not take
responsibility for most
work. Always tardy.

Needs prompting from


team leader to be
organized and keep track
of assigned work and due
dates. May be tardy
sometimes.

Unacceptable: Rarely shares information or experience. Does not do assigned work; others on the team will have to do the work after the due date. Will not do research and gather information and data even if reminded often.
Marginal: Shares information & experience sometimes. Must be assigned work and a due date, but will not share work and responsibility.

Contribution to Team
Effort & Work (W=1)

Will not provide


innovative ideas, generate
creative solutions, and
generate good alternative
solutions. Expects others
to do more work because
of other commitments.
Discourteous to others.
Criticizes others' work in
a negative way.

Respect, Civility, Communication (W=1)
Unacceptable: Does not value others' viewpoints. Has poor listening skills and does not pay attention when others talk.

Knowledge of other
Disciplines (W=1)

Needs specific directions


to apply knowledge of
technical skills, issues,
and approaches germane
to disciplines outside of
IE.

Acceptable
(Score, S=2)

Exceptional
(Score, S=3)

Organized.

Well organized.

Will assume team


member roles most of the
time. Tardy a few times.

Has many abilities and


volunteers to do team
work. Never tardy.

Shares information &


experience most of the
time.

Shares information &


experience always.
Writes assignments and
deadlines.

Remembers work to be
done and due dates, but
may forget a few.

Delivers work on time.

Will do research and


gather information and
data if reminded often.

Will do research and


gather information and
data if requested.

Will provide innovative


ideas, generate creative
solutions, and generate
good alternative solutions
if reminded often.

Will provide innovative


ideas, generate creative
solutions, and generate
good alternative solutions
if prompted.
Always prepared for team
meetings.
Not prepared for team
meetings once or twice.

Not prepared for team


meetings sometimes.
Courteous to all
sometimes.

Usually courteous to all.

Provides positive
feedback sometimes.
Values others' viewpoints sometimes.

Has the initiative to do


research, provides
innovative ideas,
generates creative
solutions, and generates
good alternative
solutions.

Always courteous and nonjudgmental.

Provides positive
feedback when necessary.
Values others' viewpoints almost always.

Listening skills need


improvement.

Has good listening skills; attention fades occasionally.

Does not have


knowledge of technical
skills, issues, and
approaches germane to
disciplines outside of IE,
but will acquire them
when needed.

Has elementary
knowledge of technical
skills, issues, and
approaches germane to
disciplines outside of IE,
but will augment when
needed.

Participates in
discussions, respects
colleagues, makes
significant contributions
while discussing others' work, values others'
viewpoints, & functions
effectively as a team
member.
Has very good
knowledge of technical
skills, issues, and
approaches germane to
disciplines outside of IE.

Points (P): P = W*S. Total Points: TP = ΣP.

Overall Performance Criterion: TP ≥ 7.
Unacceptable: 0 ≤ TP ≤ 3; Marginal: 4 ≤ TP ≤ 6; Acceptable: 7 ≤ TP ≤ 11; Exceptional: TP = 12.

The following details may be used for tracking student/team performance over time.
Course # & Title:          Reviewer/Assessor:
Name - Student/Team:       Date:

Department of Industrial and Mechanical Engineering


Industrial Engineering (IE) Program

EAC/ABET Criterion, Outcome 3(e): An ability to identify, formulate,


and solve engineering problems.
Rubrics/Metrics/Performance Criteria/Standards
Metric &
Weight (W)

Unacceptable
(Score, S=0)

Marginal
(Score, S=1)

Strategy
(W=3)

Uses very few resources.


Requires guidance in
integrating knowledge
and experience.
Has no strategy to
identify, formulate, and
solve eng. problems.

Fair in locating resources,


integrating knowledge
and experience, and
formulating a good
strategy to solve
engineering problems.

Good at locating
resources, integrating
knowledge and
experience, and
formulating a good
strategy to solve
engineering problems.

Excellent in locating
resources, integrating
knowledge and
experience, and
formulating a good
strategy to solve
engineering problems.

Can use limited number


of tools and software to
solve problems related to
existing systems.

Fair use of multiple tools,


techniques, and software
for analyzing existing
systems and solving
problems.

Good at multiple tools,


techniques, and software
for analyzing existing
systems and solving
problems.

Excels in multiple tools,


techniques, and software
for analyzing existing
systems and solving
problems.

Fair in applying multiple


tools to solve problems
related to synthesis of
new systems.

Good in applying
multiple tools to solve
problems related to
synthesis of new systems.

Excels at applying
multiple tools to solve
problems related to
synthesis of new systems.

Needs assistance to break


large problems into
smaller sub-problems.

Can break down complex problems into sub-problems and apply
theoretical concepts, but
makes many errors.

Can break down complex problems into sub-problems and apply
theoretical concepts, but
makes a few errors.

Can break down complex problems into sub-problems and apply
theoretical concepts.

Needs significant
assistance to apply theory
and identify multiple
constraints.

Needs to be shown how


various pieces of the
large problem relate to
each other and the whole
system.

Knows how various


pieces of the large
problem relate to each
other and the whole
system, but makes a few
errors.

Understands how various


pieces of the large
problem relate to each
other and the whole
system.

Needs significant
assistance in generating
alternative solutions and
comparing them.

Needs help in taking into


account practical
constraints (social,
environmental, and
other). Can generate one
alternative only or needs
some help.

Good at taking into


account practical
constraints (social,
environmental, and
other). Generates very
few alternatives and
compares them.

Excels in taking into


account practical
constraints (social,
environmental, and
other). Generates many
alternatives and compares
them.

Does not know what


assumptions to make.

Fair at stating
assumptions.

Good at stating
assumptions.

Excels in stating
assumptions.

Problem statement and


analyses are incomplete.

Problem statement and


analyses require revision.

Problem statement and


analyses require minor
revisions.

Problem statement and


analyses are precise.

References are not


current.

References are
incomplete.

Report does not follow


professional standards.

Report requires a lot of


editing and revisions.

Acceptable: References are relevant, but not current or complete. Report requires minor revisions.
Exceptional: References are relevant, current, and complete. Report is written in a professional manner.

Tools Used (W=2)

Solution Approach
(W=2)

Documentation
(W=3)

Does not know synthesis of solutions to sub-problems.

Acceptable
(Score, S=2)

Exceptional
(Score, S=3)

Points (P): P = W*S. Total Points: TP = ΣP.

Overall Performance Criterion: TP ≥ 19.
Unacceptable: 0 ≤ TP ≤ 9; Marginal: 10 ≤ TP ≤ 18; Acceptable: 19 ≤ TP ≤ 26; Exceptional: 27 ≤ TP ≤ 30.

The following details may be used for tracking student/team performance over time.
Course # & Title:          Reviewer/Assessor:
Name - Student/Team:       Date:

Department of Industrial and Mechanical Engineering


Industrial Engineering (IE) Program

EAC/ABET Criterion, Outcome 3(f): An understanding of professional


and ethical responsibility.
Rubrics/Metrics/Performance Criteria/Standards
Metric &
Weight (W)

Unacceptable
(Score, S=0)

Marginal
(Score, S=1)

Acceptable
(Score, S=2)

Exceptional
(Score, S=3)

Needs assistance to
locate the code of
ethics of a
professional society.

Knows about the


code of ethics of a
society, and will
access and use them
when ethical
problems are faced.

Knows where to
access code of
ethics of at least 1
professional society.

Knows where to
access code of
ethics of 2 or more
professional
societies.

Has read and


demonstrated
adequate knowledge
of at least one
professional code of
ethics.

Has read and


demonstrated
excellent knowledge
of at least one
professional code of
ethics.

Knowledge of Professional Code of Ethics (W=1)
Has read, but does not remember, a professional code of ethics.
Knowledge of
Theories of
Ethics (W=1)

No evidence of
valuing ethical
theories.

Ability to Recognize Ethical Dilemmas (W=1)
Unacceptable: Needs assistance to identify ethical dilemmas and to apply the code of ethics from professional societies and/or ethical theories.
Analyze Ethical Problems in IE Work and Make Decisions (W=1)
Unacceptable: Needs assistance to analyze ethical problems in case studies.

Marginal: Knows one theory of ethics that will be useful personally.
Acceptable: Remembers a few theories of ethics.

Excellent
knowledge of many
theories of ethics.

Will learn to apply


the code of ethics
from professional
societies and/or
ethical theories to
recognize ethical
dilemmas when
necessary.

Can apply at least 1


code of ethics from
a professional
society and/or
ethical theory to
recognize ethical
dilemmas and
analyze them.

Can apply the code


of ethics from
professional
societies and/or
ethical theories to
recognize ethical
dilemmas and
analyze them in
many ways.

Has ability to
analyze ethical
problems in IE work
through case
studies, but is not
interested.

Has demonstrated
good ability to
analyze ethical
problems in IE work
through case
studies.

Has demonstrated
excellent ability to
analyze ethical
problems in IE work
through case
studies.

Has generated good


solutions and made
good decisions in
the IE field.

Has generated
excellent solutions
and made sound
decisions in the IE
field.

Unacceptable: Needs assistance in generating solutions in case studies.
Marginal: Has generated fair solutions and made fair decisions in the IE field.

Points (P): P = W*S. Total Points: TP = ΣP.

Overall Performance Criterion: TP ≥ 7.
Unacceptable: 0 ≤ TP ≤ 3; Marginal: 4 ≤ TP ≤ 6; Acceptable: 7 ≤ TP ≤ 11; Exceptional: TP = 12.

The following details may be used for tracking student/team performance over time.
Course # & Title:          Reviewer/Assessor:
Name - Student/Team:       Date:

Department of Industrial and Mechanical Engineering


Industrial Engineering (IE) Program

EAC/ABET Criterion, Outcome 3(g): An ability to communicate


effectively (Oral).
Rubrics/Metrics/Performance Criteria/Standards
Metric &
Weight (W)
Organization
&
Structure
(W=2)
Content
&
Knowledge
(W=3)
Visual Aids
&
Neatness
(W=2)

Delivery
&
Speaking Skills
(W=2)

Personal
Appearance &
Rapport with
Audience (W=1)

Unacceptable
(Score, S=0)

Marginal
(Score, S=1)

Acceptable
(Score, S=2)

Exceptional
(Score, S=3)

Unacceptable: Presentation lacks structure.
Marginal: Difficult to follow the presentation due to erratic topical shifts and jumps.
Acceptable: Most information is presented in logical order and is easy to follow.
Exceptional: All information is presented in a logical, interesting, and novel sequence and is easy to follow.

No grasp of content.
Needs assistance
answering questions
about subject.

Uncomfortable with
content. Capable of only
answering rudimentary
questions.

At ease with content and


able to elaborate and
explain to some degree.

Demonstration of full
knowledge of the subject
with explanations and
elaboration.

No visual aids or
inadequate slides.

Poor quality of visual


aids or visual aids do not
support the text or
presentation.

Acceptable: Visual aids are adequate and related to the text and presentation.
Exceptional: Text and presentation are reinforced by the use of attractive visual aids.

Several spelling and/or


grammatical errors in
slides.

Minor misspellings
and/or grammatical
errors.

No spelling or grammatical errors.

Voice is clear and at a


proper level. Most words
pronounced correctly.

Clear voice and correct


pronunciation of terms.

Mumbling or incorrect
pronunciation of terms.
Voice level too low or
too high. Does not use
appropriate vocabulary.

Occasional
mispronunciation of
terms. Uses appropriate
vocabulary.

Monotonous, no eye
contact, rate of speech
too fast or too slow.

Marginal: Little eye contact, uneven rate, or only little expression.
Acceptable: Some eye contact, steady rate, and adequately rehearsed.

Good eye contact, steady


rate, enthusiasm, or
confidence.

Unacceptable: Inappropriate appearance.
Marginal: Appearance is marginally acceptable.

Appearance is good.

Appearance is
professional.

Needs assistance to
respond to questions and
comments.

Responds to questions
and comments, but is not
at ease or confident.

Responds to questions
and comments well.

Responds to questions
and comments
confidently.

Length is inappropriate.

Length is adequate.

Length is acceptable.

Length is appropriate.

Points (P): P = W*S. Total Points: TP = ΣP.

Overall Performance Criterion: TP ≥ 19.
Unacceptable: 0 ≤ TP ≤ 9; Marginal: 10 ≤ TP ≤ 18; Acceptable: 19 ≤ TP ≤ 26; Exceptional: 27 ≤ TP ≤ 30.

The following details may be used for tracking student/team performance over time.
Course # & Title:          Reviewer/Assessor:
Name - Student/Team:       Date:

Department of Industrial and Mechanical Engineering


Industrial Engineering (IE) Program

EAC/ABET Criterion, Outcome 3(g): An ability to communicate


effectively (Written).
Rubrics/Metrics/Performance Criteria/Standards
Metric &
Weight (W)
Organization
&
Style

Unacceptable
(Score, S=0)

Marginal
(Score, S=1)

Acceptable
(Score, S=2)

Exceptional
(Score, S=3)

Unacceptable: Sequence of information is difficult to follow. No apparent structure or continuity. Purpose of work is not clearly stated.
Marginal: Work is hard to follow, as there is very little continuity. Purpose of work is stated, but does not assist in following the work.

Information is
presented in a logical
manner, which is
easily followed.

Information is
presented in a logical,
interesting way, which
is easy to follow.

Purpose of work is
clearly stated and
assists the structure of
work.

Purpose is clearly
stated and explains the
structure of work.

Subject matter not


clearly explained.

Uncomfortable with
content of report.

(W=3)

No questions are
answered. No
interpretation made.

Only basic concepts


are demonstrated and
interpreted.

At ease with content


and able to elaborate
and explain to some
degree.

Demonstration of full
knowledge of the
subject with
explanations and
elaboration.

Format
&
Aesthetics

Work is illegible,
format changes
throughout, e.g. font
type, size etc.

Mostly consistent
format.

Format is generally
consistent including
heading styles and
captions.

Format is consistent
throughout including
heading styles and
captions.

Figures and tables are


sloppy and fail to
provide intended
information.

Figures and tables are


legible, but not
convincing.

Figures and tables are


neatly done and
provide intended
information.

Figures and tables are


presented logically
and reinforce the text.

Spelling
&
Grammar
(W=1)

Numerous spelling
and grammatical
errors.

Several spelling and


grammatical errors.

Minor misspellings
and/or grammatical
errors.

Negligible
misspellings and/or
grammatical errors.

References

Unacceptable: No list of references. Material used in the text is not referenced.
Marginal: Inadequate list of references or referencing in the text.

Reference section is
not in correct format,
but is sufficient.

Reference section is in
correct format and
comprehensive.

(W=2)

Content
&
Knowledge

(W=1)

(W=2)

Points (P): P = W*S. Total Points: TP = ΣP.

Overall Performance Criterion: TP ≥ 14.
Unacceptable: 0 ≤ TP ≤ 8; Marginal: 9 ≤ TP ≤ 13; Acceptable: 14 ≤ TP ≤ 20; Exceptional: 21 ≤ TP ≤ 27.

The following details may be used for tracking student/team performance over time.
Course # & Title:          Reviewer/Assessor:
Name - Student/Team:       Date:

Department of Industrial and Mechanical Engineering


Industrial Engineering (IE) Program
EAC/ABET Criterion, Outcome 3(h): The broad education necessary to
understand the impact of engineering solutions in a global and societal
context.
Rubrics/Metrics/Performance Criteria/Standards
Metric &
Weight (W)

Unacceptable
(Score, S=0)

Marginal
(Score, S=1)

Acceptable
(Score, S=2)

Exceptional
(Score, S=3)

Familiarity with
Applications of IE
Tools, Methods &
Techniques in
Global and
Societal Context
(W=2)

Needs assistance in
locating resources
(libraries, websites,
journals, magazines,
etc) on 2 or more
applications.

Will locate resources


(libraries, websites,
journals, magazines,
etc) on 2 or more
applications when
necessary.

Good at locating
resources (libraries,
websites, journals,
magazines, etc) on 2
or more applications.

Excels in locating
resources (libraries,
websites, journals,
magazines, etc) on 2
or more applications.

No evidence of
reading papers on
these applications.

Will read papers from


one of the above
resources only when a
need arises.

Has acquired and read


one paper from one of
the above resources.

Has acquired and read


more than one paper
from one of the above
resources.

Breadth and Depth of the Impact of Engineering Solutions in Global and Societal Context (W=3)

Needs significant
assistance in applying
IE methods to analyze
global and social
issues.

Needs examples and


instructions for
applying IE methods
to analyze global and
social issues.

Is familiar with at least


one specific IE method
applied to analyze
global and social
issues.

Is familiar with at least


two specific IE
methods applied to
analyze global and
social issues.

Needs assistance to
review and write a
report on specific IE
methods applied to
analyze global and
social issues.

Will review and write


a report on specific IE
methods applied to
analyze global and
social issues when
required.

Has reviewed and


written a report on one
specific IE method
applied to analyze
global and social
issues.

Has reviewed and


written a report on two
or more specific IE
methods applied to
analyze global and
social issues.

Understanding of
Impact of
Engineering
Solutions in
Global and
Societal Context
(W=1)

Is familiar with one


international standard
that can alleviate the
adverse impact of
engineering solutions
in global and societal
context.

Has a good knowledge


of 1 or more
international standards
that can alleviate the
adverse impact of
engineering solutions
in global and societal
context.

Has a very good


knowledge of 2 or
more international
standards that can
alleviate the adverse
impact of engineering
solutions in global and
societal context.

Has excellent
knowledge of 3 or
more international
standards that can
alleviate the adverse
impact of engineering
solutions in global and
societal context.

Needs assistance to
use a strategy to
harmonize standards
and management
systems for quality,
environment, social
responsibility, etc.

Will use a strategy for


harmonizing standards
and management
systems for quality,
environment, social
responsibility, etc., if
details are provided.

Understands the need


for a strategy for
harmonizing standards
and management
systems for quality,
environment, social
responsibility, etc.

Can implement a
strategy for
harmonizing standards
and management
systems for quality,
environment, social
responsibility, etc.

Points (P): P = W*S. Total Points: TP = ΣP.

Overall Performance Criterion: TP ≥ 11.
Unacceptable: 0 ≤ TP ≤ 5; Marginal: 6 ≤ TP ≤ 10; Acceptable: 11 ≤ TP ≤ 15; Exceptional: 16 ≤ TP ≤ 18.

The following details may be used for tracking student/team performance over time.
Course # & Title:          Reviewer/Assessor:
Name - Student/Team:       Date:

Department of Industrial and Mechanical Engineering


Industrial Engineering (IE) Program

EAC/ABET Criterion, Outcome 3(i): A recognition of the need for, and


an ability to engage in life-long learning.
Rubrics/Metrics/Performance Criteria/Standards
Metric &
Weight (W)

Unacceptable
(Score, S=0)

Marginal
(Score, S=1)

Acceptable
(Score, S=2)

Exceptional
(Score, S=3)

Ability to Locate
and Use
Resources on the
Web (W=1)

Does not use resources


at websites dealing
with IE topics even
when the URLs are
provided.

Uses limited resources


at websites dealing
with IE topics when
the URLs are
provided.

Uses all web resources


when URLs are
provided and attempts
to locate and use a few
additional web
resources.

Uses websites listed


by instructor and does
extensive search to
locate and use more
than 5 other sources.

Ability to Use
Reference Books,
Books,
Periodicals, and
Archives, & Inter-Library Loans in
Libraries (W=1)

Has demonstrated very


poor ability to acquire
books and journal
articles, understand,
interpret, and apply
current, new, or
innovative concepts in
IE and related fields.

Has demonstrated fair


ability to acquire
books and journal
articles, understand,
interpret, and apply
current, new, or
innovative concepts in
IE and related fields.

Has demonstrated very


good ability to acquire
books and journal
articles, understand,
interpret, and apply
current, new, or
innovative concepts in
IE and related fields.

Has demonstrated
excellent ability to
acquire books and
journal articles,
understand, interpret,
and apply current,
new, or innovative
concepts in IE and
related fields.

Ability to Locate
& Learn from
Recent
Publications in IE
(W=1)

Has a poor plan for


life-long learning.

Has fair plan and


demonstrated ability
for life-long learning.

Has good plan and


demonstrated ability
for life-long learning.

Has excellent plan and


demonstrated ability
for life-long learning.

Has not demonstrated


ability to think, learn
from mistakes, and
apply new concepts.

Has fair ability to


think, learn from
mistakes, and apply
new concepts.

Has good ability to


think, learn from
mistakes, and apply
new concepts.

Has excellent ability to


think, learn from
mistakes, and apply
new concepts.

Familiarity with
Services Provided
by Professional
Societies (W=1)

No evidence that
membership in
professional societies
is valued.

Plans to be a member
in 1 professional
society.

Member of 1
professional society.

Member of 2 or more
professional societies.

Does not plan to have


a leadership role in
professional and other
societies on campus.

Does not have


leadership role in
professional or other
societies on campus.

Has leadership role in


1 professional society
on campus.

Has a leadership role


in 2 or more
professional or other
societies on campus.

Not interested in
courses or resources
available at the
website for the
society.

Will seek courses or


resources available
from societies when
needed.

Knows that the


website for the society
lists courses on current
topics and resources
available.

Aware of courses on
the current topics and
resources available at
the website for the
society.

Does not plan to use


service provided by
the society.

Marginal: May use some services provided by the society in the future.
Acceptable: Has used a magazine or book from a professional society.


Has used 2 or more


services provided by a
professional society.

Points (P): P = W*S. Total Points: TP = ΣP.

Overall Performance Criterion: TP ≥ 7.
Unacceptable: 0 ≤ TP ≤ 3; Marginal: 4 ≤ TP ≤ 6; Acceptable: 7 ≤ TP ≤ 11; Exceptional: TP = 12.

The following details may be used for tracking student/team performance over time.
Course # & Title:          Reviewer/Assessor:
Name - Student/Team:       Date:

Department of Industrial and Mechanical Engineering


Industrial Engineering (IE) Program

EAC/ABET Criterion, Outcome 3(j): A knowledge of


contemporary issues.
Rubrics/Metrics/Performance Criteria/Standards
Metric &
Weight (W)

Unacceptable
(Score, S=0)

Knowledge of SOX and Its Impact on the IE Profession (W=1)
Unacceptable: Poor knowledge of the Sarbanes-Oxley Act.
Marginal
(Score, S=1)
Fair knowledge of
Sarbanes-Oxley Act,
but is not aware of its
impact on the IE
profession.

Acceptable
(Score, S=2)

Exceptional
(Score, S=3)

Very good knowledge


of Sarbanes-Oxley Act
and is aware of its
impact on the IE
profession.

Excellent knowledge
of Sarbanes-Oxley Act
and is well aware of its
impact on the IE
profession.

Knowledge of Job Market (W=1)
Unacceptable: Relies primarily on the Placement Services.
Marginal: Poor knowledge of the job market; relies on the Placement Services.
Acceptable: Good knowledge of the job market and is building a network to seek information.

Excellent knowledge
of job market and has
an excellent network
to seek information.

Preparation for Interviews (W=1)
Unacceptable: Is not prepared.

Poor preparation.

Good preparation
through books in
library & IE Office on
interviews, talks with
alumni, etc.

Excellent preparation
through books in
library & IE Office on
interviews, talks with
alumni, etc.

Planning Budget, Insurance, & Investment (W=1)
Unacceptable: Very poorly prepared.

Has some idea of


personal budget,
savings & investment,
but is not concerned.

Has a good basis for


preparing yearly
personal budget
showing all costs,
savings, and
investment.

Has sound basis for


preparing yearly
personal budget
showing all costs,
savings, and
investment.

Knowledge of
Graduate School
& Related Topics
(W=1)

No evidence of
knowledge about
graduate school
programs.

Plans to get data about


graduate programs in
IE and related fields
when needed.

Aware of graduate
programs in IE and
related fields.

Excellent knowledge
of graduate programs
in IE and related
fields.

Services to
Profession and
Society (W=1)

No evidence that
service to the
profession is valuable.

Marginal: Not a member, but will become a member and try to be active.
Acceptable: Member now and may be active in the future.

Has demonstrated
dedicated leadership
roles on campus and
may continue in
future.

Ability to Engage
in Conversation
about Political,
Economic,
National,
Regional, and
international
Events or Issues
(W=1)

No evidence of
interest in newspaper
or magazines.

Plans to read
newspaper or
magazines in future.

Reads newspaper or
magazines randomly.

Reads newspaper and


current magazines on a
regular basis.

Uses the Web to play


games, sports, and not
for general news.

Plans to use the Web


to keep current.

Uses the Web to keep


current randomly.

Uses the Web to keep


current daily.

Does not contribute to


discussions.

Makes minor
contributions to
discussions.

Does contribute to
discussions.

Can make substantial


contribution to
discussions.

Points (P): P = W*S. Total Points: TP = ΣP.

Overall Performance Criterion: TP ≥ 12.
Unacceptable: 0 ≤ TP ≤ 6; Marginal: 7 ≤ TP ≤ 11; Acceptable: 12 ≤ TP ≤ 17; Exceptional: 18 ≤ TP ≤ 21.

The following details may be used for tracking student/team performance over time.
Course # & Title:          Reviewer/Assessor:
Name - Student/Team:       Date:


Department of Industrial and Mechanical Engineering
Industrial Engineering (IE) Program
EAC/ABET Criterion, Outcome 3(k): An ability to use the techniques, skills, and modern engineering tools necessary for engineering practice.
Rubrics/Metrics/Performance Criteria/Standards

Scoring: each metric is scored Unacceptable (S=0), Marginal (S=1), Acceptable (S=2), or Exceptional (S=3); Points P = W*S; Total Points TP = sum of P.

Metric 1: Apply Human Factors Engineering Skills (W=1)
- Unacceptable (S=0): Fair in limited areas of ergonomics and human factors engineering.
- Marginal (S=1): Good at a few areas of ergonomics and human factors engineering.
- Acceptable (S=2): Very good at most areas of ergonomics and human factors engineering.
- Exceptional (S=3): Excellent in all areas of ergonomics and human factors engineering.

Metric 2: Apply Operations Research Models and Techniques (W=1)
- Unacceptable: Poor at applying deterministic models; poor at applying Markov processes and queuing models.
- Marginal: Good at applying deterministic models; good at applying Markov processes and queuing models.
- Acceptable: Very good at applying deterministic models; very good at applying Markov processes and queuing models.
- Exceptional: Excellent in applying deterministic models; excellent in applying Markov processes and queuing models.

Metric 3: Apply Work Measurement Techniques (W=1)
- Unacceptable: Fair in applying work measurement techniques.
- Marginal: Good at applying work measurement techniques.
- Acceptable: Very good at applying work measurement techniques.
- Exceptional: Excellent in applying work measurement techniques.

Metric 4: Quality Engineering & Management (W=1)
- Unacceptable: Poor in applying SPC, design of experiments, acceptance sampling, and standards.
- Marginal: Good at applying SPC, design of experiments, acceptance sampling, and standards.
- Acceptable: Very good at applying SPC, design of experiments, acceptance sampling, and standards.
- Exceptional: Excels in applying SPC, design of experiments, acceptance sampling, and standards.

Metric 5: Lean Manufacturing (W=1)
- Unacceptable: Fair knowledge of lean manufacturing.
- Marginal: Good knowledge of lean manufacturing.
- Acceptable: Very good knowledge of lean manufacturing.
- Exceptional: Excellent knowledge of lean manufacturing.

Metric 6: Supply Chain Management (W=1)
- Unacceptable: Fair knowledge of supply chain management.
- Marginal: Good knowledge of supply chain management.
- Acceptable: Very good knowledge of supply chain management.
- Exceptional: Excellent knowledge of supply chain management.

Metric 7: Facilities Design (W=1)
- Unacceptable: Fair knowledge of facility design.
- Marginal: Good knowledge of facility design.
- Acceptable: Very good knowledge of facility design.
- Exceptional: Excellent knowledge of facility design.

Metric 8: Problem Definition, Solution Strategy, & Research (W=1)
- Unacceptable: Fair in problem definition, solution, strategy, and research.
- Marginal: Good in problem definition, solution, strategy, and research.
- Acceptable: Very good in problem definition, solution, strategy, and research.
- Exceptional: Excellent in problem definition, solution, strategy, and research.

Metric 9: Team Skills, Presentation, & Communication (W=1)
- Unacceptable: Fair team and communication skills.
- Marginal: Good team and communication skills.
- Acceptable: Very good team and communication skills.
- Exceptional: Excellent team and communication skills.

Total Points (TP = sum of P)

Overall Performance Criterion: TP ≥ 13
Unacceptable: 0 ≤ TP ≤ 6
Marginal: 7 ≤ TP ≤ 12
Acceptable: 13 ≤ TP ≤ 17
Exceptional: 18 ≤ TP ≤ 24

The following details may be used for tracking student/team performance over time.
Course # & Title:
Name - Student/Team:
Reviewer/Assessor:
Date:

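The weighted scoring used by these rubrics (P = W*S per metric, TP as the sum of P, and a performance band determined by TP) is simple enough to automate. Below is a minimal sketch in Python; the scores are hypothetical placeholders, and the bands are those shown for the Outcome 3(k) rubric.

```python
# Illustrative sketch of the rubric arithmetic above: P = W*S per metric,
# TP = sum of P, and a performance band from TP. Scores are hypothetical;
# the bands are those shown for the Outcome 3(k) rubric.

BANDS_3K = [(0, 6, "Unacceptable"), (7, 12, "Marginal"),
            (13, 17, "Acceptable"), (18, 24, "Exceptional")]

def total_points(weights, scores):
    # P = W*S for each metric; TP is their sum.
    return sum(w * s for w, s in zip(weights, scores))

def band(tp, bands):
    for lo, hi, label in bands:
        if lo <= tp <= hi:
            return label
    raise ValueError(f"TP={tp} outside defined bands")

# Nine metrics, each W=1; hypothetical scores S in {0, 1, 2, 3}.
weights = [1] * 9
scores = [2, 3, 2, 1, 2, 2, 3, 2, 2]
tp = total_points(weights, scores)
print(tp, band(tp, BANDS_3K))  # -> 19 Exceptional
```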
INDUSTRIAL ENGINEERING PROGRAM


CONTINUOUS IMPROVEMENT PLAN,
ASSESSMENT MANUAL
AND
EVALUATION PROCESS
(Abridged Version for BAP IX)

IE Faculty
Dr. S. Balachandran
Dr. Jill Clough
Dr. Patricia Jinkins
Dr. Justin Kile

University of Wisconsin Platteville


Department of Industrial and Mechanical Engineering
1 University Plaza, Platteville, WI 53818
February 2007

TABLE OF CONTENTS
1. BACKGROUND ..........................................................................................................1
2. OVERVIEW ................................................................................................................2
3. MISSION STATEMENTS ...........................................................................................9
3.1 University Mission Statement ....................................................................................9
3.2 Mission Statement - College of Engineering Mathematics and Science (EMS) ......10
3.3 Mission Statement - IE Program ..............................................................................10
4. IE PROGRAM OBJECTIVES ...................................................................................10
5. OUTCOMES ..............................................................................................................11
5.1 IE Program Outcomes ..............................................................................................11
5.2 ABET, Inc./EAC Outcomes .....................................................................................12
5.3 Matrix Relating IE Outcomes and ABET, Inc./EAC Outcomes ..............................13
5.4 Matrix Relating IE Outcomes and IE Courses .........................................................13
6. IE PROGRAM CONSTITUENCIES .........................................................................15
7. IE PROGRAM ASSESSMENT TOOLS ...................................................................15
7.1 Tools to Collect Assessment Data about IE Program Objectives ............................21
7.1.1 Alumni Survey ......................................................................................................21
7.1.2 Employer Survey ...................................................................................................22
7.2 Tools to Collect Assessment Data about IE Program Outcomes .............................23
7.2.1 IE Graduate Exit Questionnaire ............................................................................23
7.2.2 Statistical Data ......................................................................................................24
7.2.3 Industrial Project Sponsor Survey ........................................................................25
7.2.4 Employer Assessment of Academic Preparation ..................................................26
7.2.5 Student Portfolio ...................................................................................................27
7.3 Tools Used by College and University .....................................................................27
7.4 Rubrics/Performance Criteria for Assessing & Evaluating Outcomes ....................28
8. ASSESSMENT DATA COLLECTION TIMELINE AND ANALYSIS ..................30
8.1 Assessment Data from Alumni Questionnaire & Analysis ......................................31
8.2 Assessment Data from Employer Questionnaire & Analysis ..................................35
8.3 Assessment Data from Employer Assessment of Academic Preparation & Analysis ...38
8.4 Assessment Data from Industrial Project Sponsor Survey & Analysis ...................41
8.5 Assessment Data from Graduate Exit Survey & Analysis .......................................44
8.6 Cause-Effect Diagram ..............................................................................................49
8.7 FE Examination Results ...........................................................................................49
8.8 Past Assessment Reports ..........................................................................................49
8.9 Additional Assessment Evaluations that are Informal .............................................49
9. EVALUATION PROCESS AND PROGRAM IMPROVEMENT ...........................49
REFERENCES ...............................................................................................................54

APPENDICES
Appendix A - INDUSTRIAL ENGINEERING ASSESSMENT MANUAL (Oct. 2000)
Appendix B - COLLEGE OF EMS STRATEGIC PLAN
Appendix C - COLLEGE OF EMS ADVISORY BOARD
Appendix D - COLLEGE OF EMS ALUMNI BOARD
Appendix E - IE PROGRAM - ALUMNI SURVEY FORM
Appendix F - IE PROGRAM - EMPLOYER QUESTIONNAIRE
Appendix G - IE GRADUATE EXIT QUESTIONNAIRE
Appendix H - INDUSTRIAL PROJECT SPONSOR SURVEY
Appendix I - EMPLOYER ASSESSMENT OF ACADEMIC PREPARATION
Appendix J - IE STUDENT PORTFOLIO
Appendix K - COLLEGE OF EMS FACULTY TEACHING EVALUATION FORM
Appendix L - UW-PLATTEVILLE ACADEMIC PLANNING COUNCIL FIVE-YEAR STUDY REPORT PROCEDURE
Appendix M - CURRENT ANALYTIC RUBRICS TO ASSESS OUTCOMES
Appendix N - PREVIOUS WHOLISTIC RUBRICS
Appendix O - MATRICES THAT MAP ABET OUTCOMES TO ACTIVITIES AND WORK IN EACH IE COURSE
Appendix P - MATRICES THAT MAP EACH ABET OUTCOME TO ACTIVITIES IN IE COURSES
Appendix Q - ASSESSMENT DATA - ALUMNI QUESTIONNAIRE ANALYSIS & INTERPRETATION
Appendix R - ASSESSMENT DATA - EMPLOYER QUESTIONNAIRE ANALYSIS & INTERPRETATION
Appendix S - ASSESSMENT DATA - EMPLOYER ASSESSMENT OF ACADEMIC PREPARATION
Appendix T - ASSESSMENT DATA - INDUSTRIAL PROJECT SPONSOR SURVEY ANALYSIS & INTERPRETATION
Appendix U - ASSESSMENT DATA - GRADUATE EXIT SURVEY ANALYSIS & INTERPRETATION
Appendix V - CAUSE AND EFFECT DIAGRAM FOR NOT ACHIEVING PROGRAM OBJECTIVES OR OUTCOMES
Appendix W - FUNDAMENTALS OF ENGINEERING EXAMINATION DATA
Appendix X - ASSESSMENT REPORTS FOR PAST YEARS
Appendix Y - SUMMARY, ANALYSIS, AND INTERPRETATION OF ASSESSMENT DATA COLLECTED FROM COURSE MATERIALS USING RUBRICS
Appendix Z - MATRICES LINKING PROGRAM OBJECTIVES, UNIVERSITY MISSION, COLLEGE STRATEGIC PLAN AND PROGRAM OUTCOMES

1. BACKGROUND
Even before its first ABET/EAC accreditation in 1987, the industrial engineering
(IE) program at UW-Platteville (UW-P) had collected assessment information and data
from alumni, the College Industrial Advisory Board (IAB) and employers. IE faculty
evaluated and analyzed that assessment data annually at faculty meetings to improve
individual courses and also the entire curriculum. Changes in course titles, catalog
descriptions of courses, laboratory projects in the courses, topics covered in courses, etc.,
were driven by data from the assessment process. The evaluation process interpreted
assessment data to determine the extent to which educational objectives of the IE
program were being achieved and resulted in faculty decisions and actions to improve IE
courses and curriculum.
When the ABET/EAC Engineering Criteria 2000 were published, IE faculty
created a continuous improvement plan (CIP) that was summarized in the May 2000 Self-Study Report of the IE program. The CIP1, shown in Figure 1, is based on ISO 9000
principles, mandates and ensures systematic review and improvement of every aspect of
the undergraduate education program, and also ensures institutional memory of
improvement activities through documentation requirements. The program implemented
the CIP in 1998 and documented its plan in the 1998 Industrial Engineering Assessment
Manual which was revised in October 2000. This manual outlined both the assessment
and evaluation processes. It identified multiple assessment tools to collect and prepare
data to evaluate the achievement of program outcomes and educational objectives. It
established a formal assessment plan to collect assessment data from students, alumni and
employers. This manual listed the assessment data collection methods, their frequency,
methods of analysis and interpretation of collected data to determine the degree to which
objectives and outcomes are being achieved. The May 2000 and May 2006 Self-Study
Reports of the IE program summarize how these assessment and evaluation processes
resulted in course and curriculum improvements. Fall 2000 and 2006 ABET/EAC
reviews of the IE program found assessment and evaluation processes to be satisfactory.
IE faculty implemented a CIP for the curriculum two years before the fall 2000
ABET/EAC general review. This CIP was originally outlined in the May 2000 Self-Study
Report of the IE program. The current Self-Study Report also identifies the following
elements of the CIP: constituencies, program objectives, program outcomes, multiple
assessment measures/metrics, frequency of assessment activities, expected
performance levels, evaluation process, and curriculum improvement. This document
embellishes the CIP by providing additional details.
The key concept in the IE CIP is the process approach which facilitates attainment
of desired objectives and outcomes by managing activities and related resources as a
process. The "process approach" is a generic management principle, which can enhance
an organization's effectiveness and efficiency in achieving defined objectives and
outcomes. The continuous improvement process implemented by IE faculty is
characterized via the PDCA (Plan-Do-Check-Act) cycle2, 10, 11 for continuous assessment,
evaluation and improvement of the program.

The PDCA cycle is an established, logical method that can be used to improve a
process. It requires:
(P) planning (what to do and how to do it),
(D) executing the plan (do what was planned),
(C) checking the results (did things happen according to plan), and
(A) acting to improve the process (how to improve next time).
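As an illustration only, one PDCA iteration can be sketched in code; every name, score, and target below is a hypothetical placeholder, not part of the IE program's actual process.

```python
# Minimal, illustrative sketch of one PDCA iteration applied to curriculum
# improvement. All names, data, and targets are hypothetical placeholders.

def plan(target):
    # P: decide what to measure and the target performance level.
    return {"metric": "mean exit-survey score", "target": target}

def do(the_plan):
    # D: administer the assessment tool; stand-in scores shown here.
    return [4.2, 3.8, 4.5, 4.0]

def check(the_plan, scores):
    # C: did the results meet the target set in the plan?
    return sum(scores) / len(scores) >= the_plan["target"]

def act(target_met):
    # A: hold the gains, or revise courses and re-enter the cycle.
    return "hold gains" if target_met else "revise courses and reassess"

p = plan(target=4.0)
print(act(check(p, do(p))))  # -> hold gains (mean 4.125 >= 4.0)
```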
The PDCA cycle can be applied within an individual process, or across a group of
processes. This IE Assessment and Evaluation Manual provides details about assessment,
evaluation, and improvement of IE program using this cycle. Specifically, this manual
shows how each and every one of the multiple assessment tools was designed (P-Plan),
and administered (D-Do). In addition, the manual shows how the collected assessment
data will be interpreted (C-Check) and may lead to decisions (A-Act) by IE faculty to
improve courses and curriculum.
The ABET/EAC EC 2000 criteria and the PDCA cycle may be viewed together as
depicted in Figure 2. In every system and process there is sufficient inertia to
let a status quo prevail and complacency to creep in. However, the PDCA cycle forces IE
faculty to strive for continuous improvement of the curriculum instead of attempting to
ride on past successes. The ABET/EAC EC 2000 criteria assist the faculty in holding on to the
gains in achieving the desired objectives and outcomes and also in attempting to make
significant gains in the future.
The CIP, as envisaged and implemented by IE faculty, applies the PDCA cycle to
the process of curriculum improvement. It emphasizes periodic assessment and
evaluation of IE program constituencies, objectives, and outcomes. It also encompasses,
as outlined in the IE Program Self-Study Report, planning, collecting data using multiple
assessment tools, and summarizing assessment data so that assessment measures may be
compared with the target performance levels predefined by IE faculty. This manual
presents more details about the above parts of the CIP and also outlines the
mechanism that determines whether program changes are needed, and the process to
implement the necessary improvements. See Table 1 for the current status of the IE CIP.

2. OVERVIEW
The industrial engineering program assessment plan and evaluation procedures
were developed by industrial engineering faculty with input from alumni, employers,
national and local conferences, colleagues at other universities, published literature,
program evaluators and commissioners of ABET, Inc., and other interested parties. They
provide a basis for obtaining feedback on the program, its outcomes, and its objectives,
and using that feedback for making improvements. Figure 3 presents these two feedback
loops and loop interconnections in the IE continuous improvement plan. This document
describes the program assessment plan. It includes the objectives and outcomes for the
program, specifies assessment tools, and gives a timeline for assessing the program
throughout the academic year. The plan is designed to obtain input from all constituents
and to provide a structure for continuous improvement of both the program and the plan.

[Figure 1: Overview of IE CIP. Flow: Input -> Assessment Data Collection and Analysis -> Curriculum Design/Revision/Updates via Faculty, Dept., College Exec. Council & UUCC -> Outcomes (Product/Service).]

[Figure 2: ABET/EAC EC 2000 & PDCA assist in continuous improvement. Deming's wheel (P.D.C.A.: Plan, Do, Check, Act) drives continuous improvement over time against resistance/inertia, with the EC 2000 criteria holding the gains.]

[Figure 3: Two Assessment and Evaluation Loops. Objectives loop: constituents' expectations/requirements and the UW-P Mission, College Mission and Strategic Plan, and Program Mission feed the Program Educational (EAC/ABET) Objectives; tools, outcomes & objectives are planned and updated; administration/faculty evaluation returns feedback to constituents. Outcomes loop: Program & EAC/ABET Learning Outcomes drive educational strategies, timeline & C/E diagram; rubrics support assessment (evidence collection & analysis) and evaluation (evidence interpretation), giving feedback for continuous improvement.]

Table 1: Self-Assessment3: Quality Assurance of Institutional/Program-Level Assessment of Student Learning*
Rating scale: 0 - not in place; 1 - beginning stage of development; 2 - beginning stage of implementation; 3 - in place and implemented; 4 - implemented and evaluated for effectiveness; 5 - implemented, evaluated and at least one cycle of improvement. Each item below is rated on this scale.

Stakeholder Involvement (those who have a vested interest in the outcome of the program):
- Stakeholders are identified
- Primary stakeholders are involved in identifying educational objectives
- Primary stakeholders are involved in periodic evaluation of educational objectives
- Sustained partnerships with stakeholders are developed

Performance Objectives (graduates' performance 3-5 years after completing program):
- Objectives are defined
- Stakeholders provide input to development of objectives
- Number of objectives is manageable
- Objectives are aligned with mission
- Objectives are periodically assessed
- Objectives are periodically evaluated for relevancy

Learning Outcomes (desired knowledge, skills, attitudes, behaviors at time of completing program):
- Outcomes are identified
- Number of outcomes is manageable
- Outcomes are publicly documented
- Outcomes are linked to performance objectives
- Outcomes are defined by a manageable number of measurable performance indicators
- Outcomes are aligned with mission

Outcomes Aligned with Educational Practice:
- Desired outcomes are mapped to educational practices and/or strategies
- Outcomes are mapped to both curricular and co-curricular activities
- Practices/strategies are systematically evaluated using assessment data
- Educational practices are modified based on evaluation of assessment data

Program and/or Institutional Assessment:
- Assessment is systematic at the program/institutional level
- Multiple methods are used to measure each outcome
- Both direct and indirect measures of student learning are used to measure outcomes
- Assessment processes are reviewed for effectiveness and efficiency
- Assessment methods are modified based on evaluation processes

Evaluation:
- Assessment data are systematically reviewed
- Evaluation of results is done by those who can effect change
- Evaluation of assessment data is linked to practices
- Evaluation leads to action

* © 2004 Gloria M. Rogers, Rose-Hulman Institute of Technology (gloria.rogers@rose-hulman.edu)

During the past two decades IE faculty members have arrived at the consensus
that the primary constituents of the program are alumni, current students, potential future
students and industry. While the program is designed for students to transition smoothly
into service and manufacturing industries and hit the job running, graduate schools and
programs are also considered as constituents in the next lower tier. Minor constituents
include faculty members who teach courses related to the curriculum, families of
students, the college and the university, taxpayers of the State of Wisconsin, and the
Wisconsin State Legislature.
As outlined in the IE Program Self-Study Report, the program educational
objectives reflect the expected accomplishments of graduates during the first few years
after graduation from the program. These objectives are consistent with the mission of the
university, college, and the program. The next section lists these mission statements.

3. MISSION STATEMENTS
This section lists the mission statements for the University, the College, and the
Industrial Engineering Program.

3.1 University Mission Statement


The fundamental mission of UW-Platteville and the entire UW System is to serve
the people of Wisconsin. This basic goal is expressed in detail in the mission statement
adopted in 2002. In this statement, UW-Platteville pledges itself to:
1. Enable each student to become broader in perspective, more literate, intellectually
more astute, ethically more sensitive, and to participate wisely in society as a
competent professional and knowledgeable citizen.
2. Provide baccalaureate degree programs which meet primarily regional needs in arts
and sciences, teacher education, business, and information technology.
3. Provide baccalaureate degree programs and specialized programs in middle school
education, engineering, technology management, agriculture, and criminal justice
which have been identified as institutional areas of emphasis.
4. Provide graduate programs in areas clearly associated with its undergraduate
emphasis in education, agriculture, technology management, engineering, and
criminal justice.
5. Provide undergraduate distance learning programs in business administration and
graduate online programs in project management, criminal justice, and engineering.
6. Provide agricultural systems research programs utilizing the Pioneer Farm in
partnership with businesses, universities and agencies.
7. Expect scholarly activity, including applied research, scholarship and creative
endeavor, that supports its programs at the baccalaureate degree level, its selected
graduate programs, and its special mission.
8. Serve the needs of all students and in particular the needs of women, minority,
disadvantaged, and nontraditional students. Furthermore, to seek diversification of
the student body, faculty and staff.
9. Serve as an educational, cultural, and economic development resource to
southwestern Wisconsin.
These statements, along with the UW System and University Cluster mission statements,
provide a guide to UW-Platteville in what it attempts and does not attempt to accomplish
as an institution of higher education.

3.2 Mission Statement - College of Engineering, Mathematics and Science (EMS)


The College's objective is to ensure that its students gain the knowledge and
develop the mental skills, attitudes, and personal characteristics necessary to become
successful citizens and professionals who can meet the present needs of business,
industry, government, and society, and the more demanding requirements of the future.

3.3 Mission Statement - IE Program


The mission of the Industrial Engineering Program is to provide a quality
industrial engineering education with management and production emphases and with
significant hands-on and laboratory experience that will enable our graduates to practice
their profession with proficiency and integrity in manufacturing or service industries and
businesses.

4. IE PROGRAM OBJECTIVES
After discussion among the faculty and input from the College's Advisory Board
and students, the Industrial Engineering Program established its educational objectives.
The educational objectives of the Industrial Engineering Program are consistent with the
Mission of UW-Platteville, the College's Strategic Plan, and ABET's Engineering
Criteria 2000. IE program objectives are listed below. Matrices may be used to cross-link
the university's mission and college strategic plan to the IE program objectives.
1. To provide graduates with a strong foundation in engineering, mathematics, science,
and current industrial engineering practices, accompanied by experiences solving
structured and unstructured problems using conventional and innovative solutions.
2. To enhance graduates' communication and interpersonal skills through a variety of
individual and team-related activities, both multi-functional and intra-disciplinary.
3. To provide graduates with an understanding of the ethical and professional
responsibilities of an engineer and the impact of engineering solutions on society and
the global environment.
4. To prepare graduates to effectively describe the problem, analyze the data, develop
potential solutions, evaluate these solutions, and present the results, using their oral,
written, and electronic media skills.
5. To make graduates aware of the need for continued professional growth through the
understanding of contemporary developments in industrial engineering.


These objectives are published in several places including the University Catalog,
fact sheets, and the Industrial Engineering curriculum sheets. The objectives are included
on course syllabi for Industrial Engineering courses. Instructions provided to students on
preparation of a student portfolio also include the educational objectives. Most
importantly, these objectives are covered in the survey forms used to solicit feedback
from the program's various constituencies.
The educational objectives of the Industrial Engineering Program are reviewed
annually by the faculty. In addition, input is requested from the College Industrial
Advisory Board every three years. Additional input is gathered through comments made
on the Alumni Survey and Employer Survey which are sent to students two and five
years after graduation. Graduating seniors complete an Exit Questionnaire which
includes an opportunity for students to cite specific examples of activities and
experiences which demonstrate achievement of program outcomes that correlate well
with these objectives. Faculty members fully support the educational objectives and are
acutely aware of the importance of achieving the objectives. The curriculum is built
around achieving these objectives, and embellishing them when feedback from
constituents convinces faculty that objectives require revision. As an example, the list of
required courses was expanded to include a course on engineering management and a
course on engineering materials so that graduates will be well prepared to demonstrate
that these objectives are achieved within a few years after graduation from the program.

5. OUTCOMES
The outcomes are abilities, skills, awareness, knowledge, and understandings that
must be inculcated in students in various courses in the curriculum. IE faculty members
design the courses and course activities to foster the achievement of these outcomes so that
graduates of the program will be able to demonstrate this achievement via
accumulated course activities.

5.1 IE Program Outcomes


IE faculty compared the program objectives listed in the above section and the
ABET/EAC Criterion 3 outcomes listed in the following section. This comparison and
discussions during the past few years concluded that the program objectives do cover the
outcomes defined by ABET/EAC. IE faculty decided to define the program outcomes so
that there is a close linkage between the program outcomes and objectives. There is a
one-to-one correspondence between program outcomes and program educational
objectives. These IE Program outcomes are listed below.
1. Foundation. Graduates will have a strong foundation in mathematics, engineering,
science, and current industrial engineering practices and will have experience solving
structured and unstructured problems using conventional and innovative solutions.


2. Communication. Graduates will have developed their communication and
interpersonal skills through a variety of individual and team-related activities, both
multi-functional and intra-disciplinary.
3. Responsibility. Graduates will have an understanding of the ethical and professional
responsibilities of an engineer and the impact of engineering solutions on society and the
global environment.
4. Problem Solving. Graduates will be able to effectively describe the problem, analyze
the data, develop potential solutions, evaluate these solutions, and present the results
using their oral, written and electronic media skills.
5. Growth. Graduates will be aware of the need for continued professional growth
through the understanding of contemporary developments in industrial engineering.

5.2 ABET/EAC Outcomes


The ABET/EAC Criteria for Accrediting Engineering Programs (Dated Nov. 1,
2004 and Effective for Evaluations During the 2005-2006 Accreditation Cycle) list the
outcomes under Criterion 3. The engineering programs must demonstrate that the
graduates attain:
a. an ability to apply knowledge of mathematics, science, and engineering
b. an ability to design and conduct experiments, as well as analyze and interpret data
c. an ability to design a system, component, or process to meet desired needs within
realistic constraints such as economic, environmental, social, political, ethical, health
and safety, manufacturability, and sustainability
d. an ability to function on multi-disciplinary teams
e. an ability to identify, formulate, and solve engineering problems
f. an understanding of professional and ethical responsibility
g. an ability to communicate effectively
h. the broad education necessary to understand the impact of engineering solutions in a
global, economic, environmental, and societal context
i. a recognition of the need for, and an ability to engage in life-long learning
j. a knowledge of contemporary issues
k. an ability to use the techniques, skills, and modern engineering tools necessary for
engineering practice
IE faculty members demonstrate that the above outcomes are inculcated in
students before they graduate by collecting course materials and organizing them in
binders as per the above outcomes. In addition, to give a holistic view of each course,
course materials are available separately for each IE course. Assessment data collected
using rubrics for each outcome are also summarized and interpreted.


5.3 Matrix Relating IE Outcomes and ABET/EAC Outcomes


IE faculty created the following matrix that correlates the program objectives and
the outcomes defined by ABET/EAC. Students in the program are continuously informed
in the course assignments about the program outcomes and the outcomes defined by
ABET/EAC. In a subsequent section, sample direct assessment forms used by faculty will
be provided to show that the following table forms an integral part of assessment tools
such as the student portfolio and the forms to obtain direct feedback from students about
achievement of outcomes in specific coursework.
Table 2: Relationship Between IE Program Outcomes and ABET/EAC Outcomes
[Matrix: rows are the ABET/EAC outcomes/graduate expectations (a) through (k); columns are the IE Program Outcomes 1 Foundation, 2 Communication, 3 Responsibility, 4 Problem Solving, and 5 Growth; cells mark which program outcome corresponds to each ABET/EAC outcome.]
5.4 Matrix Relating IE Outcomes and IE Courses


The table below indicates which outcomes are achieved as the student progresses
through each course in the curriculum. The type and depth of coverage of program
outcomes in IE courses may be classified as introductory (I), emphasis (E), or
reinforcement (R). In the following table, I, E, and R are used to characterize the scope
and depth of coverage of program outcomes in respective IE courses. The introductory
coverage assumes that students do not have the ability, skill, understanding, or
knowledge of the topic or its importance. Instruction is focused on introducing students to
the respective outcome, providing meaningful course activities to help them recognize its
importance, and motivating them to enhance the ability or understanding in the next
upper level course. Emphasis on program outcomes in some courses is via open ended
course activities, literature search, team projects, case studies, group discussions, etc., that
are designed to provide opportunities for students to explore and enhance their
competencies. Reinforcement of program outcomes is mostly achieved in upper level IE
courses. Students are assumed to possess reasonable knowledge, understanding, skill, or
ability to apply their competency to analyze a problem, case study, situation, or industrial
project. Instructional activity continues to build upon previous competency and
reinforces content/skill competency. The following table is a dynamic entity and will be
revised by faculty assigned to teach specific courses each semester. The binders for
course materials and ABET outcomes contain additional matrices that link course
activities to ABET and program outcomes.
A matrix may be developed for each ABET outcome (a) through (k) to portray
courses and coursework that cover each outcome. In addition, course matrices may
illustrate the outcomes achieved by each coursework.
Table 3: Outcomes in IE courses: I-Introductory, E-Emphasis, and R-Reinforcement
[Matrix: rows are IE courses (Required: IE 2130, IE 3430, IE 3530, IE 3630, IE 4030, IE 4130, IE 4230, IE 4330, IE 4430, IE 4730, IE 4930, ME 3040; Elective: IE 4630, IE 4750, IE 4780, IE 4830, ME 4230); columns are the program outcomes 1 Foundation, 2 Communication, 3 Responsibility, 4 Design, and 5 Growth; cells carry I, E, or R for each course.]

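Because this matrix is revised by faculty each semester, it can help to keep it in a machine-readable form. Below is a minimal sketch; the I/E/R assignments are hypothetical placeholders, not the program's actual matrix.

```python
# Illustrative sketch: the Table 3 matrix as a data structure that faculty
# could revise each semester. The I/E/R assignments below are hypothetical
# placeholders, not the program's actual matrix.

coverage = {
    "IE 2130": {"Foundation": "I", "Communication": "I"},
    "IE 3430": {"Foundation": "E", "Responsibility": "I"},
    "IE 4930": {"Foundation": "R", "Communication": "R", "Design": "R"},
}

def courses_covering(outcome, level):
    # List courses covering a given outcome at depth I, E, or R.
    return [c for c, m in coverage.items() if m.get(outcome) == level]

print(courses_covering("Foundation", "R"))  # -> ['IE 4930']
```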
6. IE PROGRAM CONSTITUENCIES
During the past two decades IE faculty members have arrived at the consensus
that the primary constituents of the program are students, alumni, and employers. While
the program is designed for students to transition smoothly into service and
manufacturing industries and hit the job running, graduate schools and programs are
also considered as constituents in the next lower tier. Minor constituents include faculty
members who teach courses related to the curriculum, families of students, the college
advisory board (AB), the college and the university, taxpayers of the State of Wisconsin,
Wisconsin State Legislature, and ABET/EAC.

7. IE PROGRAM ASSESSMENT TOOLS


IE faculty members consider assessment to be a collection of processes that are
employed to identify, collect, use, and prepare data and evidence that can assist in
evaluating whether program objectives and outcomes are being achieved. The assessment
process supplies organized data to the evaluation process that focuses on determining the
value of findings and actions to be taken to improve the program. Therefore, multiple
assessment tools are used in the program.
Extensive web search and use of the assessment-related materials from the ABET,
Inc. website led to the classification of assessment tools into two categories: indirect
and direct assessment tools. Direct assessment tools, methods, or measures use direct
examination or observation of student knowledge or skills against measurable learning
objectives/outcomes; these may be Standardized Exams, Locally Developed Exams,
Simulations, Performance Appraisals, External Examinations, Oral Exams, or Behavioral
Observations. Indirect assessment tools are opinions or self-reports of the extent or value of
learning experiences; these may be Surveys/Questionnaires (Alumni, Employers,
Recruiters, etc.), Exit & Other Interviews, Archival Records, Focus Groups, or Co-op
Education Assessment.
Direct assessment tools being used in the IE program now are Evaluations of
students on co-op (1 month, mid-term, and final), Student Portfolios, Statistical Data, FE
Examination Results, Placement Rate of Graduates, Participation in Co-op Program,
IE 3530/3630 Mid-Program Evaluation, and Face-to-Face Meetings/Assessment in faculty
offices. Indirect assessment tools being used currently in the IE program are Alumni &
Employer Survey, Entrance Survey, Exit Survey, Recruiter Survey, and Project Sponsor
Survey. Direct assessment tools, methods, or measures are considered to be robust
compared to indirect assessment tools, methods, or measures. IE faculty plan to
implement a few more direct and indirect assessment methods in 2007 and 2008. One
tool being considered may be simulations or oral examinations with graduating seniors.
The other tool may be focus groups.
The following table summarizes the timeline and schedule for the cyclic
application of the assessment tools in the IE program. The design of some of these tools
has changed over the last few years and this manual provides the current assessment tools
being used by the program. The assessment tools are now structured so that responses
from constituents may be converted to numerical scores and test of hypothesis may be
used to determine whether program objectives and outcomes are satisfied. Two
categories of assessment tools are used and this section specifies the details of each of the
assessment tools used:

Tools to collect assessment data about achieving the program objectives which
deal with the skills, knowledge, and performance of alumni a few years after
graduation.

Tools to collect assessment data about achieving the program outcomes and
evidence about the program effectiveness. These tools collect assessment data
about students abilities and understanding before and at the time of graduation.

In the following table some tools are designed to collect assessment data about
both program objectives and outcomes. It should be noted that the IE program objectives
and outcomes stated and discussed in the May 2006 Self-Study Report have a one-to-one
correspondence. The skills, abilities, and understandings to be demonstrated are the same,
but the time at which these are demonstrated changes from the time of graduation to a
few years on a job.

Table 4: Assessment Tools

1. Alumni Survey
Responsibility for administration: Department Chair / IE Program Coordinator.
Schedule/frequency: January of each year. Poll two-year alumni and five-year alumni.
Assessment data preparation & evaluation process: Summer of each year. Summarize assessment data in a spreadsheet. Conduct test of hypothesis of fraction satisfied with objectives. Conduct test of hypothesis of mean score for achieving objectives. Note extreme comments. Present findings, identify potential causes, and suggest actions to faculty. Apply the cause-effect diagrams to discuss action plan.

2. Employer Survey
Responsibility for administration: Department Chair / IE Program Coordinator.
Schedule/frequency: January of each year. Poll employers of two-year alumni and five-year alumni. Request alumni to complete the cover page and pass the survey to their supervisor.
Assessment data preparation & evaluation process: Summer of each year. Summarize assessment data in a spreadsheet. Conduct test of hypothesis of fraction satisfied with objectives. Conduct test of hypothesis of mean score for achieving objectives. Present findings, identify potential causes, and suggest actions to faculty. Apply the cause-effect diagrams to discuss action plan.

3. IE Graduate Exit Questionnaire
Responsibility for administration: College of EMS / Department Chair / IE Program Coordinator / IE faculty member teaching the senior design course.
Schedule/frequency: Last week of classes each semester in the senior design course (IE 4930 Industrial Systems Design).
Assessment data preparation & evaluation process: January and June of each year. Summarize assessment data in a spreadsheet. Conduct test of hypothesis of fraction satisfied with objectives and outcomes. Conduct test of hypothesis of mean score for achieving objectives and outcomes. Note severe comments. Present findings, identify potential causes, and suggest actions to faculty. Apply the cause-effect diagrams to discuss action plan.

4. Statistical Data
(a) Fundamentals of Engineering (FE) Examination Results
Responsibility for administration: College of EMS.
Schedule/frequency: Each semester, when the College of EMS receives results of the FE examination from NCEES (National Council of Examiners for Engineering and Surveying).
Assessment data preparation & evaluation process: IE faculty members, May and December of each year. Compute passing rate. Assess performance in each subject area. Present findings, identify potential causes, and suggest actions to faculty. Apply the cause-effect diagrams to discuss action plan.
(b) Career Planning and Placement Statistics
Responsibility for administration: UW-P Career Center.
Schedule/frequency: Each year.
Assessment data preparation & evaluation process: IE faculty members, September of each year. Analyze placement and starting salaries. See the folder containing placement data for the past few years.
(c) Cooperative Education Positions and Summer Internships Held by Graduating Seniors
Responsibility for administration: Department Chair / IE Program Coordinator / IE faculty member teaching the senior design course.
Schedule/frequency: Each semester.
Assessment data preparation & evaluation process: IE faculty members, January and September of each year. Compute percentage of graduates who had cooperative education positions and summer internships. Identify potential causes, and suggest actions to be taken by faculty in advising sessions.
(d) Alumni Survey
Responsibility for administration: Department Chair / IE Program Coordinator.
Schedule/frequency: January of each year. Poll 2-year & 5-year alumni.
Assessment data preparation & evaluation process: January and June of each year. Summarize assessment data for alumni demonstrating professional development. Identify potential causes and suggest actions.

5. Industrial Project Sponsor Survey
Responsibility for administration: IE faculty member teaching the senior design course and other IE courses where industry-sponsored design projects were used to provide realistic hands-on design experience to students.
Schedule/frequency: Each semester, at the final presentation of the project report to the industry; May and December of each year.
Assessment data preparation & evaluation process: January and September of each year. Summarize assessment data in a spreadsheet. Conduct test of hypothesis of fraction satisfied with skills demonstrated by students. Conduct test of hypothesis of mean score for satisfaction. Note extreme comments. Identify potential causes and suggest actions to faculty.

6. Employer Assessment of Academic Preparation
Responsibility for administration: College of EMS, Director of Cooperative Education Program.
Schedule/frequency: Last month of cooperative education or summer internship position.
Assessment data preparation & evaluation process: January and June of each year. Summarize assessment data in a spreadsheet. Conduct test of hypothesis of fraction satisfied. Conduct test of hypothesis of mean score. Note severe comments. Present findings, identify potential causes, and suggest actions to faculty. Apply the cause-effect diagrams to discuss action plan.

7. Student Portfolio
Responsibility for administration: Department Chair / IE Program Coordinator / IE faculty member teaching the senior design course (IE 4930 Industrial Systems Design) and the IE 2130 Fundamentals of Industrial Engineering course.
Schedule/frequency: Each semester; May and December of each year.
Assessment data preparation & evaluation process: January and September of each year. Present findings, identify potential causes, and suggest actions to faculty. Apply the cause-effect diagrams to discuss action plan.

8. Direct Measurement of the Degree to Which ABET/EAC-Specified Outcomes Are Attained in IE Courses (Course Folders/Binders)
Responsibility for administration: IE faculty members.
Schedule/frequency: Frequency depends on the outcome and faculty member. Course materials may be collected once in a few years or each semester.
Assessment data preparation & evaluation process: IE faculty members present their data at faculty meetings in January and September for the previous semester or year.

9. UW-Platteville Academic Planning Council Five-Year Self-Study and Review
Responsibility for administration: Department Chair / IE Program Coordinator.
Schedule/frequency: Once in five years.
Assessment data preparation & evaluation process: See http://www.uwplatt.edu/committees/apc/forms/index.html.

10. ABET/EAC Review
Responsibility for administration: Department Chair / IE Program Coordinator and IE faculty members.
Schedule/frequency: Frequency depends on the final accreditation action by ABET/EAC. In the past two decades, it has been once every six years.
Assessment data preparation & evaluation process: See ABET/EAC Self-Study Reports for 1994, 2000, and 2005.

11. College of EMS Advisory Board & Alumni Board
Responsibility for administration: Department Chair / IE Program Coordinator / IE Faculty Volunteer.
Schedule/frequency: Once every year, but consolidated every three years into action items.
Assessment data preparation & evaluation process: Major revisions and improvements are presented and feedback is used by faculty.

7.1 Tools to Collect Assessment Data about IE Program Objectives


Employer and alumni surveys are the primary instruments used to assess the IE
program objectives. These surveys are done annually and target two groups. The first
group includes alumni who have been out of school for two years and their immediate
supervisors. The second group surveyed is those alumni who have been out of school for
five years and their immediate supervisors. The surveys include questions directly
related to the program objectives. These surveys have evolved and changed over the
three decades the IE program has been in existence at UW-Platteville. The earlier version
of the IE Assessment Manual in 2000 displayed the forms that were administered by the
College of EMS as per the frequency stated above. After the fall 2000 ABET/EAC
general review of the program, these surveys are administered by the Chairperson of the
Department of Mechanical and Industrial Engineering. The same surveys are also used
with the College Advisory Board, Alumni Board, recruiters, and alumni visiting faculty
members. The current alumni and employer surveys will be made available to interested
persons via e-mail when request is sent to balachas@uwplatt.edu.
7.1.1 Alumni Survey
WHO? - The College of Engineering, Mathematics, and Science shall distribute surveys
to alumni who graduated two years and five years previously.
WHEN? - The surveys shall be distributed annually (usually in January)
WHY? - Collect assessment data for achieving program objectives. Offer suggestions for
areas of improvement within Industrial Engineering Program. Indicate the career
progression of alumni
DATA COLLECTION: Surveys shall be returned either by mail or fax
INFORMATION OBTAINED: Basic statistical data regarding Salary ranges, Position,
Registration Status, and Evaluation of Industrial Engineering Program
WHAT TO LOOK FOR: Registration Status will show if the desire for continuous
growth has been instilled
a. Salary
b. Current job title/responsibilities
c. Disagreement to being well prepared for any of the ABET criteria
d. Information regarding the weakness of engineering education
e. Improvements to the industrial and general engineering curriculum at UW-Platteville
f. Willingness to return to UW-Platteville to present to classes or professional
organizations
g. Accessibility to internet/e-mail


WHAT TO DO WITH THE INFORMATION: Registration status, graduate course work,


and other professional activities will show if the desire for continuous growth has been
instilled. If not, the best way to do such must be discovered and implemented.
Salary ranges and job titles/responsibilities will be interesting to students who would like
to see where they can expect to be 2 years or 5 years after graduation. This information
should be made available to students.
If a significant number of alumni agree that a certain ABET criterion was not met, then
this criterion must be reviewed and stressed within course work.
Any comments regarding weaknesses or possible improvements to the industrial or
general engineering curriculum should be read and strongly considered on an individual
basis.
Sending alumni surveys via the internet or e-mail should be considered based upon
alumni accessibility to the internet and e-mail.
7.1.2 Employer Survey
WHO? - The Department of Mechanical Engineering and Industrial Engineering and the
IE Program Coordinator shall distribute employer surveys with the alumni survey to
alumni who graduated two years and five years previously.

WHEN? - The surveys shall be distributed annually (usually in January)


WHY? - Collect assessment data for achieving program objectives. Offer suggestions for
areas of improvement within Industrial Engineering Program
DATA COLLECTION: Surveys shall be either mailed or faxed back
INFORMATION OBTAINED: Basic statistical data regarding employee responsibilities,
strengths and weaknesses of employees based on employee education, possible
improvements to the Industrial Engineering program based on employee education, and
strengths and weaknesses based upon ABET criteria
WHAT TO LOOK FOR: Current job title/responsibilities, Disagreement to being well
prepared for any of the ABET criteria, Information regarding the weakness of
engineering education, and Improvements to the industrial and general engineering
curriculum at UW-Platteville
WHAT TO DO WITH THE INFORMATION: If a significant number of employers
agree that a certain ABET criterion was not met, then this criterion must be reviewed and
stressed within course work. Any comments regarding weaknesses or possible
improvements to the industrial or general engineering curriculum should be read and
strongly considered on an individual basis.
In addition to the above formal survey instruments, faculty members meet with
alumni and employers each semester during Career Fair, senior (capstone) design project
visits to industry, College Advisory Board meetings on campus, homecoming,
presentations by alumni in IE courses, on-campus recruitment visits, etc. These meetings


are sometimes in faculty offices and other times at restaurants, but these settings provide
opportunities for extended and candid discussions about the attainment of the program
objectives. There is overwhelming evidence from the formal surveys and informal
meetings that the constituents of the program concur that the program has evolved and
improved over the past few decades and that it continues to achieve its objectives consistently.

7.2 Tools to Collect Assessment Data about IE Program Outcomes


To assess the IE program outcomes, different tools are used; these correspond
to tools 3 through 10 in the above table. In addition to the tools listed in the table,
there are a few other tools that are used by the college and the program. These are: course
folders collected for ABET/EAC review, the course evaluations conducted by the college
each semester, letters and e-mails received from alumni and employers, conversations
with employers and alumni, and feedback received by the Career Center from alumni and
employers. It must be noted that the tools in this category comprise both direct and
indirect measurement tools.
In general, assessment data is summarized in a spreadsheet that permits multiple
ways to organize data and evidence. In addition, data may be easily imported into Minitab
for additional statistical analysis. A test of hypothesis is used to interpret data, when
distributional assumptions are satisfied, to take into account variability in data and
evidence. These statistical analyses are presented and discussed in the evaluation process
outlined in section 9.
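As a sketch of the two tests named above (fraction satisfied and mean score), assuming responses are coded on a 1-5 scale, "satisfied" means a score of 4 or 5, and using placeholder targets of 70% and 3.5 (none of these values come from the manual), the analysis might look like this in Python:

```python
# Illustrative sketch of the two hypothesis tests described above, using
# hypothetical survey data. Assumes responses are coded 1-5 and "satisfied"
# means a score of 4 or 5; the targets (70% satisfied, mean of 3.5) are
# placeholders, not the program's actual performance levels.
from scipy import stats
from statsmodels.stats.proportion import proportions_ztest

scores = [5, 4, 4, 3, 5, 4, 2, 5, 4, 4, 3, 5]  # hypothetical alumni scores

# Test of hypothesis on the fraction satisfied (H0: p <= 0.70).
satisfied = sum(s >= 4 for s in scores)
z, p_frac = proportions_ztest(satisfied, len(scores), value=0.70,
                              alternative="larger")

# Test of hypothesis on the mean score (H0: mu <= 3.5).
t, p_mean = stats.ttest_1samp(scores, popmean=3.5, alternative="greater")

print(f"fraction satisfied: p-value = {p_frac:.3f}")
print(f"mean score:         p-value = {p_mean:.3f}")
```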
7.2.1 IE Graduate Exit Questionnaire
WHO? - Students graduating in the immediate semester from the Industrial Engineering
Program.

WHEN? - Distribute prior to graduation in the senior design course.


WHY? - To obtain feedback from students graduating from the Industrial Engineering
Program encompassing all of the engineering program and main aspects of the students
education at the university.
DATA COLLECTION: The students must return the surveys to the Dean's Office.
Questionnaires are distributed in IE 4930 Industrial Systems Design or by the IE Program
Secretary. This survey must be completed before a student is allowed to graduate from the
program. Students are given the opportunity to request a private exit interview with the Dean,
Department Chair, or Women in Engineering Director, or may specify with whom they wish to
interview.
INFORMATION OBTAINED: Overall assessment of educational experience relative to the
expectations for graduating engineers.
Assessment of the academic advising process
Feedback about the perceived quality of mathematics, chemistry, physics, and general
engineering courses
Students' opinion of their preparedness for co-op experience and for the Fundamentals of
Engineering Exam
Quality of the Industrial Engineering Program's classes
Personal examples supporting the achievement of the IE Program Objectives
WHAT TO LOOK FOR: Within each category the rating should be above average or
excellent.
The number of co-ops and/or internships per student
Additional comments after each section
Comments about the quality of the engineering education, including:
Strengths
Weaknesses
What is missing
What could be eliminated
Why they would or would not recommend a friend to attend UW-Platteville for
Industrial Engineering
WHAT TO DO WITH THE INFORMATION: If any of the categories within the sections
receive a rating of average or below, the department has to give special attention to each section
individually. The department will have to develop a program to improve the categories which
received a low rating. Studies can be done to see if the same categories always receive a low
rating and why they receive that rating. With regard to the co-op and internship section of the
survey, the department can use the information to correlate a student's previous experience to
receiving a job before graduation. The comments sections can be used to establish
improvements in the Industrial Engineering Program and the courses it provides. Many ideas
given at the end of the survey including the strengths, weaknesses, and what is missing can be
used as building blocks towards the improvement of the Industrial Engineering Program at the
University of Wisconsin - Platteville.
7.2.2 Statistical Data
WHO? - Industrial Engineering students taking the Fundamentals of Engineering (FE) exam:
Graduating Industrial Engineers
Industrial Engineering Alumni
Career Planning and Placement
WHEN? Fundamentals of Engineering (FE) Exam results every semester
Career Planning and Placement statistics once a year
Alumni surveys every semester
WHY? - To provide data about how well students are meeting the Industrial Engineering
Program Objectives and ABET's eleven graduate expectations.
INFORMATION OBTAINED:
Percentage of students passing the FE exam
Number of cooperative education (co-op) experiences and/or internships graduating
students have obtained

Placement and starting salaries of graduating seniors


Number of alumni developing further professionally
WHAT TO LOOK FOR: In comparison to state and national percentages, examine the
percentage of industrial engineering students passing the FE exam.
The number of co-op and/or internship experiences graduates had on the average. Ideally,
each graduate should have at least one educational work experience.
Trends in the percentage of graduating students placed in full-time employment in the field
and starting salary.
Trends in the number of alumni who are developing further professionally.
WHAT TO DO WITH THE INFORMATION: If the percentage of Industrial Engineering
students passing the FE exam is less than 50%, the faculty can take several courses of action.
For those who are taking the Industrial Engineering portion in the afternoon, the faculty
could offer review sessions in these classes. Finally, the faculty could ask students who have
taken the exam to speak with students and encourage those taking the exam to participate in
the review sessions.
If 100% of graduating students have at least one co-op or internship experience, the program
has been successful in motivating students to practice what they have learned. Any other
percentage should provoke the faculty into stressing the need for this type of experience and
helping students understand the opportunities. Furthermore, if less than 50% of the graduating
students have participated in a co-op, the department should evaluate the methods by which a
student chooses whether to participate in a co-op. Once this method is determined, the faculty
can assess it for improvements. Also, professors can draw on experiences that students have
during their cooperative education assignments. This will hopefully motivate students to pursue
co-op opportunities.
If the percentage of alumni who are further developing professionally either by pursuing a
higher degree, seminars, or workshops shows a declining trend over a few years, the faculty
will have to evaluate the methods by which they convey the message that continuous
professional development is necessary. In addition, the professional societies such as IIE,
SME, etc. can meet with their faculty advisors to evaluate how they encourage continuous
education within that particular society. From there, modifications can be made.
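The comparison of the program's FE passing rate with state or national percentages, as suggested under WHAT TO LOOK FOR above, might be sketched as follows; all counts below are hypothetical placeholders, not actual NCEES results.

```python
# Illustrative sketch of comparing the program's FE passing rate against a
# national rate. All counts are hypothetical placeholders, not actual
# NCEES results.
from statsmodels.stats.proportion import proportions_ztest

ie_passed, ie_took = 14, 18                   # hypothetical program results
national_passed, national_took = 5200, 7400   # hypothetical national results

# Two-sample z-test: is the program's passing rate lower than the national rate?
z, p = proportions_ztest([ie_passed, national_passed],
                         [ie_took, national_took], alternative="smaller")

ie_rate = ie_passed / ie_took
print(f"program rate = {ie_rate:.1%}, p-value = {p:.3f}")
if ie_rate < 0.50:
    print("Passing rate below 50%: consider the actions listed in Section 7.2.2.")
```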
7.2.3 Industrial Project Sponsor Survey
WHO? - Any company that sponsors a project in an Industrial Engineering course
WHEN? - At the final presentation for that particular project
WHY? - To provide feedback from an outside source on how well students are meeting

the Industrial Engineering Program Objectives and ABET's eleven graduate expectations.
DATA COLLECTION:
The surveys should be returned one week after their distribution. Surveys may be
returned at different times of the semester due to the variation in final presentation times.
Surveys may be returned via postage paid envelope or fax.


INFORMATION OBTAINED:
Level of skills and abilities students possess in accordance with the Industrial
Engineering Program Objectives
WHAT TO LOOK FOR:
Within the characteristics, all the categories should receive a rating of four or more
Additional comments
WHAT TO DO WITH THE INFORMATION:
OVERALL BASIS:
If any of the items are insufficient, the faculty will have to review the project
administration, integration of the project into the course, and perhaps the course
content. Upon doing so, the group should identify areas where the deficient skills
could be developed further. In addition, the methods by which this material is
conveyed may have to be changed to make it more appealing and applicable to real
world situations.
INDIVIDUAL:
If an individual group receives a rating of four or less, the professor should hold a meeting with the group as soon as the surveys are reviewed. In this meeting, the professor should provide methods by which these students can improve the skills that the sponsor found lacking.
7.2.4 Employer Assessment of Academic Preparation
WHO? - Immediate supervisors of students participating in co-op or internship experiences will complete an evaluation prior to the end of the student's work experience.
WHEN? - The Co-op Office shall distribute evaluations to students during the last month of the co-op or intern position.
WHY? - Assess the preparation of students compared to graduate expectations. Offer suggestions for areas of improvement within the Industrial Engineering Program.
DATA COLLECTION: Evaluations shall be mailed or faxed back.
INFORMATION OBTAINED:
Responsibilities of student worker
Ability of student to meet expectations
Information regarding the weakness of engineering education
Student preparedness based on graduate expectations
WHAT TO LOOK FOR:
Job responsibilities
Disagreement with respect to being well prepared for any of the expectations
Information regarding the weakness of engineering education
Inadequacies in meeting employer expectations based upon education

WHAT TO DO WITH THE INFORMATION:


Job responsibilities should be made available to students looking for co-op and internship experiences.
Any comments regarding weaknesses or possible improvements to the industrial and general engineering curriculum should be strongly considered on an individual basis.
7.2.5 Student Portfolio
WHO? - Each graduating Industrial Engineering student beginning with those graduating
after May 2000
WHEN? - IE 2130 Fundamentals of Industrial Engineering and IE 4930 Industrial Systems Design
WHY? - UWP-IE Assessment, Student Self-Assessment, & Interviewing Aid
DATA COLLECTION:
Student portfolios will be turned in for grading in IE 2130 Fundamentals of Industrial
Engineering and kept in the Industrial Engineering program. Students may check out
their portfolios to add information. Assessors may copy completed portfolios and
students may take the original upon graduation.
INFORMATION OBTAINED:
Reflection of the student's learning
Balance of student learning
Development of student as a professional
WHAT TO LOOK FOR:
Reviewed by instructor and/or advisor
Contents that reflect the student's experience and ability
Consistency and organization of presentation
WHAT TO DO WITH THE INFORMATION:
Improve upon portfolio implementation
Encourage student development
Look for program deficiencies

7.3 Tools Used by College and University


The Faculty Teaching Evaluation form is the same one used by the entire College of Engineering, Mathematics, and Science. The procedure is that used by the UW-Platteville Academic Planning Council (APC) for its Self-Study Report. The IE program was reviewed by the APC in the 2001-2002 and 2006-2007 academic years.


7.4 Rubrics/Performance Criteria for Assessing & Evaluating Outcomes


Rubrics are systematic scoring methods that use pre-determined criteria. Rubrics
help instructors assess student work more objectively and consistently. Assessment data
for each outcome are collected using a rubric, and these rubrics are common to all IE
courses. IE faculty decided to use one rubric for each outcome so that student or
target-group development over a long time period may be tracked and compared; such
longitudinal tracking is not yet in place. See Figures 4(b) through 4(d) for examples of
future application of the rubrics. Figure 4 demonstrates that IE faculty designed the
current rubrics with a strong focus on the future evolution of the assessment and
evaluation process in the IE program. In this sense, this assessment and evaluation
manual is a living document, and all tools and processes in this manual will keep
improving continuously along with the program improvements.
Figure 4(a) summarizes how assessment data collected through the rubrics for
each outcome will be used in the evaluation process. The criterion for concluding that an
outcome is achieved satisfactorily is that the average score or points for that outcome
must be in the acceptable or exceptional range. Alternatively, the % of scores or points in
the acceptable or exceptional range must be more than 60%.
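As a concrete illustration of this criterion, the short sketch below checks one outcome against the more-than-60% rule. The scores and labels are hypothetical, not drawn from the program's actual rubric data.

```python
# Hypothetical rubric scores for one outcome, using the rubric's four levels.
scores = ["exceptional", "acceptable", "acceptable", "marginal",
          "acceptable", "unacceptable", "exceptional", "acceptable"]

# An outcome is judged satisfactorily achieved when more than 60% of the
# scores fall in the acceptable or exceptional range.
ok = sum(s in ("acceptable", "exceptional") for s in scores)
pct_ok = 100 * ok / len(scores)
print(f"{pct_ok:.0f}% acceptable/exceptional -> achieved: {pct_ok > 60}")
```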

[Figure 4(a): Simple Use of Assessment Data from Rubrics in 2006 - % acceptable & exceptional, or average score, plotted by outcome.]
[Figure 4(b): Planned Use of Assessment Data from Rubrics in 2007/2008 - track a target group. % acceptable or exceptional for outcome (g), with the 60% threshold marked, compared between IE 2130 Intro to IE and IE 4930 Capstone Design; rating levels range from unacceptable and marginal to acceptable and exceptional.]

[Figure 4(c): Simple Use of Assessment Data from Rubrics in 2007/2008 - % acceptable & exceptional for each metric of outcome (g) in IE 2130 Intro to IE, with the 60% threshold marked.]


Consider Figures 4(c) and 4(d) and the rubrics for outcome (g). The rubric for outcome
(g), communication, comprises five metrics. Using the rubrics to collect assessment
data allows evaluation of student performance in many ways. For example, student performance
in outcome (g) in the course IE 2130 Introduction to IE may be monitored for each metric
(content, structure, delivery, etc.). Alternatively, improvement of student communication skills
may be tracked longitudinally as students progress through the curriculum.
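As an illustration of this longitudinal tracking, the sketch below compares one cohort's outcome (g) rubric tallies between the intro and capstone courses, in the spirit of Figures 4(b) and 4(d). The tallies are hypothetical, not actual program data.

```python
# Hypothetical rubric tallies for one cohort, scored on outcome (g) first in
# IE 2130 (Intro to IE) and again in IE 4930 (Capstone Design).
cohort = {
    "IE 2130": {"exceptional": 3, "acceptable": 8, "marginal": 6, "unacceptable": 3},
    "IE 4930": {"exceptional": 7, "acceptable": 9, "marginal": 3, "unacceptable": 1},
}

for course, tallies in cohort.items():
    total = sum(tallies.values())
    ok = tallies["exceptional"] + tallies["acceptable"]
    # A rising percentage from IE 2130 to IE 4930 indicates the target
    # group's communication skills improved as they moved through the program.
    print(f"{course}: {100 * ok / total:.0f}% acceptable/exceptional (n={total})")
```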

[Figure 4(d): Planned Use of Assessment Data from Rubrics in 2007/2008 - track performance of a target group. % acceptable & exceptional for each metric of outcome (g), with the 60% threshold marked, compared between IE 2130 Intro to IE and IE 4930 Capstone Design.]

8. ASSESSMENT DATA COLLECTION TIMELINE AND ANALYSIS


The table in the section above provided the timeline for administering each
assessment tool. The last section of that table also summarized how the collected data
will be organized and analyzed. The collected data are summarized in a spreadsheet so
that many statistical tools may be used in the analysis. One faculty member takes
primary responsibility for entering a summary of the assessment data into the
spreadsheet, conducting the analysis, and arriving at interpretations. Further analysis and
interpretation take place at faculty meetings. Any comments, suggestions, and
recommendations resulting from the evaluation process are translated by IE faculty into
actions and changes in teaching methods, course descriptions, assignments, laboratory
projects, curriculum, etc. Changes range from modifying the way a topic is covered in a
course to the addition and deletion of required courses. The faculty are developing a
schedule for collecting assessment data in courses; the schedule being discussed is given
below.

[Table 5: Frequency of Assessment Data Collection - each ABET/EAC outcome/graduate expectation mapped to the collection periods 2007-2008 through 2011-2012.]

8.1 Assessment Data from Alumni Questionnaire (Survey) and Analysis


The alumni questionnaire or survey is sent out each spring semester, in January.
During the summer, the returned survey data are tabulated and a report is written by the
program coordinator. Sometimes a faculty member takes primary responsibility for
entering a summary of the assessment data into the spreadsheet, conducting the analysis,
and arriving at interpretations. The report is made available to the IE faculty at the start
of the fall semester and is presented at the fall meeting of the Advisory Board for
comments and suggestions. Table 6 below presents the interpretations and actions taken
by faculty. Feedback from alumni was positive, and the responses affirm that the
program objectives are attained. Further analysis will be carried out when additional data
become available.
Consider first the assessment data collected from the Alumni Survey. From 1999
through 2001, survey items 13 through 23 covered IE program objectives and outcomes.
In the 2004 survey, these outcomes and objectives were covered in questions 12 through
21. Each survey item had five response levels: Strongly Agree, Agree, Neutral, Disagree,
and Strongly Disagree. To perform statistical analysis of the data, numerical weights
were assigned to the responses: Strongly Agree = 5, Agree = 4, Neutral = 3, Disagree = 2,
and Strongly Disagree = 1. A weighted average score of more than 3 is considered to be
satisfactory attainment of the respective objective or outcome. The corresponding test of
hypothesis is:
H0: Weighted average = 3 (outcome or objective is not achieved)
H1: Weighted average > 3 (outcome or objective is achieved)
Weighted average = Σ (# of responses in a category × weight) / Total # of responses
Normality of the underlying distribution is assumed in this analysis, and this may be
appropriate because of the Central Limit Theorem. The assumption is usually checked
using normal probability plot analysis, and the test of hypothesis is used only when the
underlying distribution is approximately normal. If the null hypothesis is rejected, then
the outcome or objective is achieved. In this test of hypothesis, Z = (Weighted Average - 3)
/ (Std. Dev / Sqrt(n)). The test is carried out for each objective and outcome using a
level of significance of α = 0.05. If the test of hypothesis is not applicable, a simple bar
graph is used to check if the weighted average score is 3 or more. These analyses of
assessment data for 1999 through 2004 establish that each and every one of the program
objectives and outcomes is achieved.
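A minimal sketch of this computation, assuming hypothetical response counts (the actual tallies live in the program's assessment spreadsheet):

```python
import math

# Hypothetical counts per response weight (Strongly Agree=5 ... Strongly Disagree=1).
counts = {5: 14, 4: 9, 3: 4, 2: 2, 1: 1}

n = sum(counts.values())
weighted_avg = sum(w * c for w, c in counts.items()) / n

# Sample standard deviation of the individual weighted responses.
var = sum(c * (w - weighted_avg) ** 2 for w, c in counts.items()) / (n - 1)
std_dev = math.sqrt(var)

# One-sided test of H0: mu = 3 vs. H1: mu > 3 at alpha = 0.05 (critical z = 1.645);
# in practice a normal probability plot would be examined first.
z = (weighted_avg - 3) / (std_dev / math.sqrt(n))
print(f"n={n}, weighted avg={weighted_avg:.2f}, Z={z:.2f}, achieved={z > 1.645}")
```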
Alternatively, the same alumni data may be analyzed using a test of hypothesis
about the attribute p, defined as the fraction of alumni satisfied with an outcome or
objective. The corresponding test of hypothesis for each outcome and objective is:
H0: Fraction satisfied with objective or outcome = 0.6
H1: Fraction satisfied with objective or outcome > 0.6
Fraction satisfied = (# Strongly Agree + # Agree + # Neutral) / Total responses
The normal approximation to the binomial distribution is assumed in this analysis.
The assumption is usually checked using normal probability plot analysis, and the test of
hypothesis is used only when the approximation holds. If the null hypothesis is rejected,
then the outcome or objective is achieved. In this test of hypothesis, Z = (p - 0.6) /
Sqrt(p(1 - p)/n). The test is carried out for each objective and outcome using a level of
significance of α = 0.05. If the test of hypothesis is not applicable, a simple bar graph is
used to check if the fraction satisfied is 0.6 or more. These analyses of assessment data
for 1999 through 2004 establish that each and every one of the program objectives and
outcomes is achieved.
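A companion sketch for the proportion test, again with hypothetical counts. It mirrors the formula exactly as the manual states it, with the sample fraction p in the denominator; textbook treatments often use the hypothesized value 0.6 there instead.

```python
import math

# Hypothetical tallies; "satisfied" pools Strongly Agree, Agree, and Neutral.
satisfied, total = 26, 30
p = satisfied / total

# One-sided test of H0: p = 0.6 vs. H1: p > 0.6 at alpha = 0.05
# (critical z = 1.645), using the normal approximation to the binomial.
z = (p - 0.6) / math.sqrt(p * (1 - p) / total)
print(f"fraction satisfied={p:.2f}, Z={z:.2f}, achieved={z > 1.645}")
```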
Even though statistical analysis of the alumni data reveals that alumni are satisfied
with the overall achievement of objectives and outcomes, faculty decided to use the raw
data from these surveys to identify areas where the program may be improved. Table 6
summarizes analyses of these outliers.

Table 6: Evaluation of Assessment Data from Alumni Questionnaire/Survey - Outlier Responses & Actions

YEAR: 1999-2000
FINDING: One alumnus out of 9 strongly disagreed with "I feel that my education at UW-Platteville enables me to understand the impact of engineering design on society and the environment." The null hypothesis of not achieving this outcome was rejected; the response may be disregarded as an outlier, but faculty decided to address it.
POTENTIAL CAUSES: IE curriculum deals with life cycle management and production; product design is not covered explicitly. See the cause-effect diagram.
ACTION: Life cycle principles, life cycle costing, and life cycle management are being taught in IE courses; relate these concepts to understanding the impact of engineering design on society and the environment. This concept is now introduced in IE 4730 and is emphasized or reinforced in IE 4830. Cumulative assessment data does not show that this continues to be a problem.

YEAR: 1999-2000
FINDING: Two out of 9 disagreed with "acquired effective oral and written communication skills." The null hypothesis of not achieving this outcome was rejected; the response may be disregarded as an outlier, but faculty decided to address it.
POTENTIAL CAUSES: Cause-and-effect possibilities reflect a number of factors, including changes in faculty at the time, student avoidance of opportunities in class, and greater emphasis needed on the job than anticipated.
ACTION: Identify courses where oral and written communication skills will be introduced, emphasized, or reinforced. All IE courses require presentations and reports; faculty will be more rigorous in grading them and will arrange formal feedback so that faculty attempts to introduce, enhance, and reinforce this skill will be remembered. Cumulative assessment data does not show that this continues to be a problem.

YEAR: 2001
FINDING: One out of 10 disagreed with "acquired effective oral communication skills." The null hypothesis of not achieving this outcome was rejected; the response may be disregarded as an outlier, but faculty decided to address it.
POTENTIAL CAUSES: See the cause-effect diagram. No support found.
ACTION: Identify courses where oral and written communication skills will be introduced, emphasized, or reinforced. Cumulative assessment data does not show that this continues to be a problem.

YEAR: 2000-2005 (cumulative data from the Alumni Survey)
FINDING: None; all outcomes were achieved. Test statistic values were very high, and the null hypothesis of not achieving outcomes is rejected with very high certainty.
POTENTIAL CAUSES: The very large cumulative sample size had nullified the few low scores for some survey items. Faculty members are very diligent in teaching IE courses and structuring them very well to achieve IE program objectives and goals; faculty efforts ensure that all outcomes are achieved.
ACTION: Reassure IE faculty that courses seem to be planned, organized, and executed very well; the surveys of graduates demonstrate the achievement of all ABET-specified outcomes. Remind faculty to maintain, or even improve, the level of performance in planning and supervising the industrial projects. Instead of disregarding outlier responses, try to improve achievement of outcomes through more refined attempts in IE courses: target some courses for introducing a concept, emphasizing the topic, and reinforcing the outcomes.

8.2 Assessment Data from Employer Questionnaire (Survey) and Analysis


The employer questionnaire or survey is sent out each spring semester, in January.
During the summer, the returned survey data are tabulated and a report is written by the
program coordinator. Sometimes a faculty member takes primary responsibility for
entering a summary of the assessment data into the spreadsheet, conducting the analysis,
and arriving at interpretations. The report is made available to the IE faculty at the start
of the fall semester and is presented at the fall meeting of the Advisory Board for
comments and suggestions.
As the response was low in some years, the assessment data were analyzed in raw
form each year, and Table 7 below presents the interpretations and actions taken by
faculty with respect to outlier responses. Feedback from employers was positive, and the
responses affirm that the program objectives and outcomes are attained. Further analysis
will be carried out when additional data become available.
In accumulating data for several years, care was exercised to total the responses
for identical questions in each survey. Each survey item had five response levels:
Strongly Agree, Agree, Neutral, Disagree, and Strongly Disagree. To perform statistical
analysis of the data, numerical weights were assigned to the responses: Strongly Agree = 5,
Agree = 4, Neutral = 3, Disagree = 2, and Strongly Disagree = 1. A weighted average
score of more than 3 is considered to be satisfactory attainment of the respective
objective or outcome. The corresponding test of hypothesis is:
H0: Weighted average = 3 (outcome or objective is not achieved)
H1: Weighted average > 3 (outcome or objective is achieved)
Weighted average = Σ (# of responses in a category × respective weight) / Total responses
Normality of the underlying distribution is assumed in this analysis, and this may be
appropriate because of the Central Limit Theorem. The assumption is usually checked
using normal probability plot analysis, and the test of hypothesis is used only when the
underlying distribution is approximately normal. If the null hypothesis is rejected, the
respective outcome or objective is achieved. In this test of hypothesis, Z =
(Weighted Average - 3) / (Std. Dev / Sqrt(n)). The test is carried out for each objective
and outcome using a level of significance of α = 0.05. If the test of hypothesis is not
applicable, a simple bar graph is used to check if the weighted average score is 3 or
more. These analyses of assessment data for 2000 through 2004 establish that each and
every one of the program objectives and outcomes is achieved.
Alternatively, the same employer survey data may be analyzed using a test of
hypothesis about the attribute p, defined as the fraction of employers satisfied with an
outcome or objective. The corresponding test of hypothesis for each outcome and
objective is:
H0: Fraction satisfied with objective or outcome = 0.6
H1: Fraction satisfied with objective or outcome > 0.6
Fraction satisfied = (# Strongly Agree + # Agree + # Neutral) / Total responses
The normal approximation to the binomial distribution is assumed in this analysis
and is checked using normal probability plot analysis; the test of hypothesis is used only
when the approximation holds. If the null hypothesis is rejected, the respective outcome
or objective is achieved. In this test of hypothesis, Z = (p - 0.6) / Sqrt(p(1 - p)/n). The
test is carried out for each objective and outcome using a level of significance of
α = 0.05. If the test of hypothesis is not applicable, a simple bar graph is used to check if
the fraction satisfied is 0.6 or more. These analyses of assessment data for 2000 through
2004 establish that each and every one of the program objectives and outcomes is
achieved.
Even though statistical analysis of the employer data reveals that employers are
satisfied with the overall achievement of objectives and outcomes, faculty decided to use
the raw data from these surveys to identify areas where the program may be improved.
Table 7 summarizes analyses of these outliers.

Table 7: Evaluation of Assessment Data from Employer Questionnaire/Survey - Outlier Responses & Actions

YEAR: 2000
FINDING: One employer disagreed that the alumnus had the ability to function on multidisciplinary teams. The null hypothesis of not achieving this outcome was rejected; the response may be disregarded as an outlier, but faculty decided to address it.
POTENTIAL CAUSES: See the cause-effect diagram.
ACTION: Use the rubrics to provide effective feedback to students. Use the capstone design course to emphasize this skill. Cumulative assessment data does not show that this continues to be a problem.

YEAR: 2000 & 2001
FINDING: One employer disagreed that the alumnus had effective oral and written communication skills. The null hypothesis of not achieving this outcome was rejected; the response may be disregarded as an outlier, but faculty decided to address it.
POTENTIAL CAUSES: See the cause-effect diagram.
ACTION: Use the rubrics to provide effective feedback to students. Use the senior-level and capstone design courses to emphasize this skill. Cumulative assessment data does not show that this continues to be a problem.

8.3 Assessment Data from Employer Assessment of Academic Preparation and Analysis
This survey or questionnaire is completed by the employer or supervisor when an IE
student completes a cooperative education assignment or summer internship. During the
summer, the returned survey data are tabulated and a report is written by the program
coordinator. Sometimes a faculty member takes primary responsibility for entering a
summary of the assessment data into the spreadsheet, conducting the analysis, and
arriving at interpretations. The report is made available to the IE faculty at the start of the
fall semester and is presented at the fall meeting of the Advisory Board for comments
and suggestions. Table 8 below presents the interpretations and actions taken by faculty.
Feedback from employers was positive, and the responses affirm that the program
objectives are attained. Further analysis will be carried out when additional data become
available.
In accumulating data for several years, care was exercised to total the responses
for identical questions in each survey. Each survey item had six response levels:
Excellent, Very Good, Average, Below Average, Very Poor, and N/A. To perform
statistical analysis of the data, numerical weights were assigned to the responses:
Excellent = 5, Very Good = 4, Average = 3, Below Average = 2, Very Poor = 1, and
N/A = 0. A weighted average score of more than 3 is considered to be satisfactory
attainment of the respective objective or outcome. The corresponding test of hypothesis is:
H0: Weighted average = 3 (outcome or objective is not achieved)
H1: Weighted average > 3 (outcome or objective is achieved)
Weighted average = Σ (# of responses in a category × respective weight) / Total responses
Normality of the underlying distribution is assumed in this analysis, and this may be
appropriate because of the Central Limit Theorem. The assumption is usually checked
using normal probability plot analysis, and the test of hypothesis is used only when the
underlying distribution is approximately normal. If the null hypothesis is rejected, the
respective outcome or objective is achieved. In this test of hypothesis, Z =
(Weighted Average - 3) / (Std. Dev / Sqrt(n)). The test is carried out for each objective
and outcome using a level of significance of α = 0.05. If the test of hypothesis is not
applicable, a simple bar graph is used to check if the weighted average score is 3 or
more. These analyses of assessment data for 1999 through 2004 establish that each and
every one of the program objectives and outcomes is achieved. In arriving at this
conclusion, it should be noted that students in cooperative education and summer
internship positions may be sophomores, juniors, or seniors. Further, the N/A response is
quite large in many instances; because N/A is weighted zero, this can lead to failure to
reject the null hypothesis and the conclusion that the respective objective or outcome is
not attained.
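The following sketch shows how heavy N/A counts depress the weighted average when N/A is scored as zero; the counts are hypothetical, loosely echoing the 15-of-57 case noted in Table 8.

```python
# Hypothetical co-op evaluation tallies: Excellent=5 ... Very Poor=1, N/A=0.
counts = {5: 4, 4: 3, 3: 2, 2: 0, 1: 0, 0: 15}  # 15 of 24 responses are N/A

n_all = sum(counts.values())
avg_with_na = sum(w * c for w, c in counts.items()) / n_all

# Recomputing with N/A responses excluded rather than scored as zero:
rated = {w: c for w, c in counts.items() if w > 0}
n_rated = sum(rated.values())
avg_rated = sum(w * c for w, c in rated.items()) / n_rated

print(f"avg with N/A as 0: {avg_with_na:.2f} (n={n_all})")    # ~1.58, looks like failure
print(f"avg over rated only: {avg_rated:.2f} (n={n_rated})")  # ~4.22, clearly satisfactory
```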
Alternatively, the same data may be analyzed using a test of hypothesis about the
attribute p, defined as the fraction of employers satisfied with an outcome or objective.
The corresponding test of hypothesis for each outcome and objective is:
H0: Fraction satisfied with objective or outcome = 0.6
H1: Fraction satisfied with objective or outcome > 0.6
Fraction satisfied = (# Excellent + # Very Good + # Average) / Total responses
The normal approximation to the binomial distribution is assumed in this analysis. The
assumption is usually checked using normal probability plot analysis, and the test of
hypothesis is used only when the approximation holds. If the null hypothesis is rejected,
the respective outcome or objective is achieved. In this test of hypothesis, Z = (p - 0.6)
/ Sqrt(p(1 - p)/n). The test is carried out for each objective and outcome using a level of
significance of α = 0.05. If the test of hypothesis is not applicable, a simple bar graph is
used to check if the fraction satisfied is 0.6 or more. These analyses of assessment data
for 1999 through 2004 establish that each and every one of the program objectives and
outcomes is achieved. In arriving at this conclusion, it should be noted that students in
cooperative education and summer internship positions may be sophomores, juniors, or
seniors. Further, the N/A response is quite large in many instances; this can lead to
failure to reject the null hypothesis and the conclusion that the respective objective or
outcome is not attained.
Even though statistical analysis of this employer data reveals that employers are
satisfied with the overall achievement of objectives and outcomes, faculty decided to use
the raw data from these surveys to identify areas where the program may be improved.
Table 8 summarizes analyses of these outliers.

Table 8: Employer Assessment of Academic Preparation - Outlier Responses & Actions

YEAR: 1999-2004 (cumulative data)
FINDING: The null hypothesis of not achieving this outcome could not be rejected, implying that students on co-op did not have the ability to design and conduct experiments (DOE). The raw data must be used carefully in this case.
POTENTIAL CAUSES: The finding is due to the 15 N/A responses out of a total of 57 responses.
ACTION: DOE is covered extensively in the IE 4430 course and is applied in the IE 4130 and IE 3430 courses.

YEAR: 1999-2004 (data for individual years)
FINDING: The null hypothesis of not achieving this outcome could not be rejected, implying that students on co-op did not have that particular ability.
POTENTIAL CAUSES: See the cause-effect diagram. The findings in every case are due to the large number of N/A responses out of a total of about 7 to 15 responses.
ACTION: The raw data must be used carefully in this case.

8.4 Assessment Data from Industrial Project Sponsor Survey


This survey or questionnaire is completed by all personnel invited by the
industrial project sponsor to attend the team's final presentation. During the summer, the
returned survey data are tabulated and a report is written by the program coordinator.
Sometimes a faculty member takes primary responsibility for entering a summary of the
assessment data into the spreadsheet, conducting the analysis, and arriving at
interpretations. The report is made available to the IE faculty at the start of the fall
semester and is presented at the fall meeting of the Advisory Board for comments and
suggestions. Table 9 below presents the interpretations and actions taken by faculty.
Feedback from sponsors was positive, and the responses affirm that the program
objectives are attained. Further analysis will be carried out when additional data become
available.
Consider the cumulative data for 1998 to 2005 from the Industrial Project Sponsor
Survey. Each survey item had six response levels: Very Competent, Competent,
Average, Below Average, Poor, and N/A. To perform statistical analysis of the data,
numerical weights were assigned to the responses: Very Competent = 5, Competent = 4,
Average = 3, Below Average = 2, Poor = 1, and N/A = 0. A weighted average score of
more than 3 is considered to be satisfactory attainment of the respective objective or
outcome. The corresponding test of hypothesis is:
H0: Weighted average = 3 (outcome or objective is not achieved)
H1: Weighted average > 3 (outcome or objective is achieved)
Weighted average = Σ (# of responses in a category × respective weight) / Total responses
Normality of the underlying distribution is assumed in this analysis, and this may be
appropriate because of the Central Limit Theorem. The assumption is usually checked
using normal probability plot analysis, and the test of hypothesis is used only when the
underlying distribution is approximately normal. If the null hypothesis is rejected, the
respective outcome or objective is achieved. In this test of hypothesis, Z =
(Weighted Average - 3) / (Std. Dev / Sqrt(n)). The test is carried out for each objective
and outcome using a level of significance of α = 0.05. If the test of hypothesis is not
applicable, a simple bar graph is used to check if the weighted average score is 3 or
more. These analyses of assessment data for 1998 through 2005 establish that each and
every one of the program objectives and outcomes is achieved. In arriving at this
conclusion, it should be noted that the N/A response is quite significant in many
instances, but it had very little impact on the final conclusion.
Alternatively, the same data may be analyzed using a test of hypothesis about the
attribute p, defined as the fraction of sponsors satisfied with an outcome or objective.
The corresponding test of hypothesis for each outcome and objective is:
H0: Fraction satisfied with objective or outcome = 0.6
H1: Fraction satisfied with objective or outcome > 0.6
Fraction satisfied = (# Very Competent + # Competent + # Average) / Total responses

The normal approximation to the binomial distribution is assumed in this analysis. The
assumption is usually checked using normal probability plot analysis, and the test of
hypothesis is used only when the approximation holds. If the null hypothesis is rejected,
the respective outcome or objective is achieved. In this test of hypothesis, Z = (p - 0.6)
/ Sqrt(p(1 - p)/n). The test is carried out for each objective and outcome using a level of
significance of α = 0.05. If the test of hypothesis is not applicable, a simple bar graph is
used to check if the fraction satisfied is 0.6 or more. These analyses of assessment data
for 1998 through 2005 establish that each and every one of the program objectives and
outcomes is achieved. In arriving at this conclusion, it should be noted that the N/A
response is quite significant in many instances, but it had very little impact on the final
conclusion.
Even though statistical analysis of the sponsor data reveals that sponsors are satisfied
with the overall achievement of objectives and outcomes, faculty decided to use the raw
data from these surveys to identify areas where the program may be improved. Table 9
summarizes analyses of these outliers.

Table 9: Evaluation of Assessment Data from Industrial Project Sponsor Surveys - N/A Responses & Actions

YEAR: 2000-01
FINDING: Ability to verbally and visually communicate effectively is below average or is not applicable.
POTENTIAL CAUSES: One out of eight persons on the sponsoring team assigned N/A to this survey item. Some personnel on the sponsoring team attended only the final presentation and did not read the project report. Small sample size may have led to this conclusion.
ACTION: Create course assignments that allow students to improve verbal and visual communication skills. Use rubrics to provide effective feedback in all IE courses; see the binders for outcome (g) for evidence that this is not a current problem. Contact sponsors and get more information about why they felt that verbal and visual communication skills were not applicable in their observation.

YEAR: 1998-2005 (cumulative data from the Sponsor Survey)
FINDING: None; all outcomes were achieved.
POTENTIAL CAUSES: The very large cumulative sample size of 93 had nullified the N/A responses for some survey items in 1998, 1999, 2000, and 2001.
ACTION: Reassure IE faculty that industrial projects seem to be planned, organized, and executed very well; these projects demonstrate the achievement of all ABET-specified outcomes. Remind faculty to maintain, or even improve, the level of performance in planning and supervising the industrial projects.

YEAR: 2002-2005
FINDING: Test statistic values were very high, and the null hypothesis of not achieving outcomes is rejected with very high certainty.
POTENTIAL CAUSES: The distributional assumptions may not have been satisfied in the statistical tests used in the analysis. Faculty members are very diligent in teaching IE courses and structuring them very well to achieve IE program objectives and goals; faculty efforts ensure that ABET-specified outcomes are also achieved.
ACTION: Reassure IE faculty that industrial projects seem to be planned, organized, and executed very well; these projects demonstrate the achievement of all ABET-specified outcomes. Remind faculty to maintain, or even improve, the level of performance in planning and supervising the industrial projects.

8.5 Assessment Data from Graduate Exit Survey and Analysis


The Graduate Exit Survey is administered in the capstone design course (IE 4930)
in the last two weeks of every semester. Graduates are required to complete this survey
and return it to the Office of the Dean, College of EMS. During the summer, the returned
survey data are tabulated and a report is written by the program coordinator. Sometimes
a faculty member takes primary responsibility for entering a summary of the assessment
data into the spreadsheet, conducting the analysis, and arriving at interpretations. The
report is made available to the IE faculty at the start of the fall semester and is presented
at the fall meeting of the Advisory Board for comments and suggestions.
The assessment data were analyzed in raw form, and Table 10 below presents the
interpretations and actions taken by faculty with respect to outlier responses. Feedback
from graduates was positive, and the responses affirm that the program objectives and
outcomes are attained. Further analysis will be carried out when additional data become
available.
Consider the cumulative data for 2000 to 2005 from the Graduate Exit Survey. Each
survey item had five response levels: Excellent, Very Good, Average, Below Average,
and Poor. To perform statistical analysis of the data, numerical weights were assigned to
the responses: Excellent = 5, Very Good = 4, Average = 3, Below Average = 2, and
Poor = 1. A weighted average score of more than 3 is considered to be satisfactory
attainment of the respective objective or outcome. The corresponding test of hypothesis is:
H0: Weighted average = 3 (outcome or objective is not achieved)
H1: Weighted average > 3 (outcome or objective is achieved)
Weighted average = Σ (# of responses in a category × respective weight) / Total responses
Normality of the underlying distribution is assumed in this analysis, and this may be
appropriate because of the Central Limit Theorem. The assumption is usually checked
using normal probability plot analysis, and the test of hypothesis is used only when the
underlying distribution is approximately normal. If the null hypothesis is rejected, the
respective outcome or objective is achieved. In this test of hypothesis, Z =
(Weighted Average - 3) / (Std. Dev / Sqrt(n)). The test is carried out for each objective
and outcome using a level of significance of α = 0.05. If the test of hypothesis is not
applicable, a simple bar graph is used to check if the weighted average score is 3 or
more. These analyses of assessment data for 2000 through 2005 establish that each and
every one of the program objectives and outcomes is achieved.
Alternatively, the same graduate survey data may be analyzed using a test of
hypothesis about the attribute p, defined as the fraction of graduates satisfied with an
outcome or objective. The corresponding test of hypothesis for each outcome and
objective is:
H0: Fraction satisfied with objective or outcome = 0.6
H1: Fraction satisfied with objective or outcome > 0.6
Fraction satisfied = (# Excellent + # Very Good + # Average) / Total responses
The normal approximation to the binomial distribution is assumed in this analysis. The
assumption is usually checked using normal probability plot analysis, and the test of
hypothesis is used only when the approximation holds. If the null hypothesis is rejected,
the respective outcome or objective is achieved. In this test of hypothesis, Z = (p - 0.6)
/ Sqrt(p(1 - p)/n). The test is carried out for each objective and outcome using a level of
significance of α = 0.05. If the test of hypothesis is not applicable, a simple bar graph is
used to check if the fraction satisfied is 0.6 or more. These analyses of assessment data
for 2000 through 2005 establish that each and every one of the program objectives and
outcomes is achieved.
Even though statistical analysis of the graduate data reveals that graduates are
satisfied with the overall achievement of objectives and outcomes, faculty decided to use
the raw data from these surveys to identify areas where the program may be improved.
Table 10 summarizes analyses of these outliers.

Table 10: Evaluation of Assessment Data from Graduate Exit Surveys - Outlier Responses & Actions

YEAR: Fall 2001, Spring 2002, F2004 & S2005
FINDING: Failure to understand the effects that the products they develop will have on the environment.
POTENTIAL CAUSES: IE curriculum deals with management and production; product design is not covered explicitly. Sample size may be small enough to distort sample statistics. Distributional assumptions may not be satisfied in the statistical tests used in the analysis. The test statistic value may be very close to the critical value, so even though the null hypothesis was not rejected, the failure to achieve the outcome was not very severe.
ACTION: Life cycle principles, life cycle costing, and life cycle management are being taught in IE courses. This concept is now introduced in IE 4730 and is emphasized or reinforced in IE 4830. Cumulative assessment data from 2000 to 2005 does not show that this continues to be a problem. The finding is contradicted by the other test of hypothesis and may be a false negative.

YEAR: Fall 2002, Fall 2003, Spring 2004, F2004, S2005
FINDING: Failure to use industrial-quality laboratory equipment and engineering software for analysis, testing, design, and communications.
POTENTIAL CAUSES: The Human Performance Lab has equipment for measuring human skills and performance, but it does not look like industrial-quality equipment. The IE Systems Design Lab has recent versions of AutoCAD, AutoMod, MS Office, Minitab, etc., but not all available software can be bought due to budget limitations. Sample size may be small enough to distort sample statistics. Distributional assumptions may not be satisfied in the statistical tests used in the analysis. The test statistic value may be very close to the critical value, so even though the null hypothesis was not rejected, the failure to achieve the outcome was not very severe.
ACTION: Continue the current process of using the regular capital equipment budget and periodic DIN budgets to upgrade software and lab equipment. A new faculty member has chosen and ordered software for use in the IE courses he teaches. Cumulative assessment data from 2000 to 2005 does not show that this continues to be a problem. The finding is contradicted by the other test of hypothesis and may be a false negative.

YEAR: F2004-S2005
FINDING: Academic preparation as an engineer not satisfactory.
POTENTIAL CAUSES: The finding is contradicted by the other test of hypothesis and may be a false negative.
ACTION: Cumulative assessment data from 2000 to 2005 does not show that this continues to be a problem, so the small sample size distorted the results. Contact these graduates as alumni and see if the survey reveals different results. Investigate this issue further and develop action items to improve the curriculum.

YEAR: 2000-2005 (cumulative data from the Graduate Exit Survey)
FINDING: None; all outcomes were achieved. Test statistic values were very high, and the null hypothesis of not achieving outcomes is rejected with very high certainty.
POTENTIAL CAUSES: The very large cumulative sample size of 91 had nullified the few low scores for some survey items in 2004 and 2005. Faculty are very diligent in teaching IE courses and structuring them very well to achieve IE program objectives and goals; faculty efforts ensure that ABET-specified outcomes are also achieved.
ACTION: Reassure IE faculty that courses seem to be planned, organized, and executed very well; the surveys of graduates demonstrate the achievement of all ABET-specified outcomes. Remind faculty to maintain, or even improve, the level of performance in planning and supervising the industrial projects.

8.6 Cause-Effect Diagram


When assessment data are tabulated and analyzed, the evaluation process may lead
to the conclusion that an objective or an outcome is not achieved. In such cases, a formal
procedure is required to determine the root cause of the failure to attain the respective
objective or outcome. The cause-effect diagram provides a roadmap for the root cause
analysis procedure.
8.7 FE Examination Results
IE students have scored well above the national and state averages on the FE
examination for several decades. These data permit IE faculty to redesign IE courses
periodically so that students will continue to do well. Current and past examination data
establish that the program prepares students very well for this examination.
8.8 Past Assessment Reports
IE faculty meeting minutes in the IE Office contain records of assessment data
evaluation and IE curriculum improvements. This manual revises the past format for
assessment reports.
8.9 Additional Assessment Evaluations that are Informal
Course folders are reviewed on an ad-hoc basis, and the results are informally
conveyed from the person doing the checking to the person who created the folder.
Comments and suggestions from the Advisory Board are discussed as part of any changes
that stem from other assessments. Benchmarking against other IE programs is done on an
ad-hoc basis; typically, the result is a repackaging of topics in existing courses, if deemed
appropriate, and the introduction of new textbooks, handouts, and software into IE
courses. Graduate placement and starting salaries are tracked every semester. No direct
action is generally taken as a result, beyond trying to understand or informally explain
any trends noticed.

9. EVALUATION PROCESS AND PROGRAM IMPROVEMENT


Throughout the year, the assessment tools will be used to determine areas that
need improvement. These improvements will be implemented as agreed upon by the IE
faculty. If there are program objectives that are not being met, the faculty will take steps
to fix the problem. The steps include, but are not limited to:
1. Take appropriate corrective action in the specified course
2. Make curriculum changes to address the deficiencies
3. Reevaluate the program outcomes
4. Revise assessment tools and procedures
This assessment plan will be reviewed annually by the IE faculty and amended
with appropriate changes.
The following is a summary of the cumulative improvements implemented by IE
faculty in courses during the past six years.
IE 2130: Fundamentals of Industrial Engineering
Introduction of lean manufacturing concepts
Expanded discussion of facility layout and assembly line balancing
Expanded discussion of professional ethics and NSPE Code of Ethics
IE 3630: Work Measurement and Design
Expanded discussion of Principles of Work Design especially as related to hand tool
design
Inclusion of development of standard data
Expanded discussion of professionalism and dealing with conflicts during time study
IE 4030: Production and Operations Analysis
Expanded discussion of lean manufacturing concepts
Expanded discussion of supply chain management
Inclusion of ethical and societal implications of decisions
IE 3430: Human Factors Engineering
Update textbook to re-emphasize application
Include greater emphasis on design for special populations, ADA
Include activity to emphasize evaluation of computer workstation and input devices
Include career planning and advancement information regarding conferences and
certification as Human Factors Professional or Certified Professional Ergonomist
Emphasize applications in work and workplace design in Health Systems and other
non-traditional fields
IE 4730: Engineering Management
Include greater emphasis on ethics for working with technology
Include greater emphasis on current literature and topics including: outsourcing and
global competition
Include current business topics from Wall Street Journal and Harvard Business
Review
Add safety program design including hazard identification and amelioration
Add experience with self-assessment and 360 evaluation processes
Add diversity issues for managers and ethical implications
Include work on managing for innovation and Intellectual Property issues
IE 4750: Project Management
Include emphasis on team skills, communications, status reporting
Semester projects in service to community or university
Add concepts of high performance work groups and autonomous work groups from
current literature
Add information on Project Management Institute certification for career
advancement
Invite guest speakers on project management in industry
IE 4830: Cost and Value Analysis
Eliminated accounting aspects of the course
Added Lean concepts for production and office applications
Added Value Stream Management for both office and production applications
Added 5S project
Include Value Engineering concepts
Add current applications from industry and current publications
IE 3530: Operations Research I
Used updated computer programs to solve problems
Included material on Markov Processes
IE 4230: Facilities Design
Included material on Kanban calculations
Included material on cell design
Used new software program for solving layout problems
IE 4330: Material Handling and Warehousing
Used new software program
Included coverage of current topics including RFID and WMS
IE 4630: Manufacturing Systems Design
Used updated computer programs
Included new hands-on lab equipment
IE 4780: Principles and Design of Engineering Management Information
Systems
Used updated computer programs
Covered new topics including updated information on:
o IT Infrastructure and Platforms
o Wireless Revolution
o IS security and Control
o Enterprise Applications for Digital Integration
IE 4130: Simulation
Use of papers on application of simulation modeling in the military, urban planning,
spread of diseases, spread of invasive vegetation, etc.
More detailed coverage of ethics and professional responsibility.
Use of full factorial design of experiments, collection of experimental data from
simulation runs, analysis of data, and interpretation of data.
Use of life-long learning activities in an assignment.
IE 4430: Total Quality Management
Emphasis on six-sigma quality improvement approach
Coverage of black belt skills
Coverage of ISO 9000-2000, ISO 16949, cGMP, and related standards.
Coverage of global issues and environmental issues
Assessment of knowledge of quality tools
Case studies on ethics with emphasis on quality management
Common framework for social accountability, environmental, and quality standards.
Emphasis on the use of Minitab.
IE 4930: Industrial Systems Design
Emphasis on different industries in WI
Cover many areas of industrial engineering
Current projects that are useful to the sponsoring industry
Emphasis on life-long learning
Emphasis on graduate school information, investment planning, and retirement planning
Emphasis on the Sarbanes-Oxley Act
Emphasis on current topics relevant to the industrial projects

APPENDICES ARE DELETED TO REDUCE THE SIZE OF HANDOUT.
SEE TABLE OF CONTENTS FOR LIST OF APPENDICES.
REFERENCES
1. ISO 9000:2005, Quality management systems -- Fundamentals and vocabulary, International Organization for Standardization (ISO), 1, ch. de la Voie-Creuse, Case postale 56, CH-1211 Geneva 20, Switzerland.
2. Montgomery, D., Introduction to Statistical Quality Control, John Wiley, 2005.
3. Rogers, G., Assessment Planning Flow Chart, 5233 Wagon Shed Circle, Owings Mills, MD 21117, email: grogers@abet.org.
4. Felder, R.M., and Brent, R., "The ABC's of Engineering Education: ABET, Bloom's Taxonomy, Cooperative Learning, and So On," Proceedings of the 2004 American Society for Engineering Education Annual Conference & Exposition, Session 1375.
5. Assessment Mechanisms, http://www.unh.edu/ccec/assessment/assessment.html.
6. Bloom, B.S. (Ed.), Taxonomy of Educational Objectives: The Classification of Educational Goals, pp. 201-207, Susan Fauer Company, Inc., 1956.
7. Rubrics for Evaluating Student Work, http://www.engr.sjsu.edu/assessment/topic/t1.html.
8. Use of Bloom's Taxonomy to Enumerate Attributes of EC-2000 Outcomes, http://www.engrng.pitt.edu/~ec2000/ec2000_attributes.html.
9. Besterfield-Sacre, M., Shuman, L.J., Wolfe, H., Atman, C.J., McGourty, J., Miller, R.L., Olds, B.M., and Rogers, G.M., "Defining the Outcomes: A Framework for EC-2000," IEEE Transactions on Education, Vol. 43, No. 2, May 2000.
10. Deming, W.E., Out of the Crisis, Massachusetts Institute of Technology, Center for Advanced Engineering Study, Cambridge, MA, 1986.
11. The PDCA Cycle, http://www.dartmouth.edu/~ogehome/CQI/PDCA.html.
12. Outcomes Assessment Rubrics, Department of Chemical Engineering, West Virginia University, http://www.che.cemr.wvu.edu/ugrad/outcomes/.
13. Assessment: Tutorial, Rubrics, etc., Electrical and Computer Engineering, Indiana University-Purdue University Indianapolis, http://www.engr.iupui.edu/ece/assessment/scoringRubrics.html.
14. California State University Engineering Assessment Clearinghouse, http://www.engr.sjsu.edu/assessment/.
15. Assessment Rubrics, Department of Chemical Engineering, Auburn University, http://eng.auburn.edu/programs/chen/programs/accreditation/assessment-rubrics.html.
16. Authentic Assessment Toolbox, http://jonathan.mueller.faculty.noctrl.edu/toolbox/.
17. Assessment Tools and Resources, University of Delaware, http://www.assessment.udel.edu/.
18. Environmental Engineering Undergraduate Program - Scoring Rubrics for Our 12 Defined Outcomes, Dept. of Civil and Environmental Engineering, University of Delaware, http://www.ce.udel.edu/ABET/Current%20Documentation/ABET_scoring_rubrics_enveng.html.
19. Criteria for Accrediting Engineering Programs, Effective for Evaluations During the 2007-2008 Accreditation Cycle (incorporates all changes approved by the ABET Board of Directors as of October 28, 2006), Engineering Accreditation Commission, ABET, Inc., 111 Market Place, Suite 1050, Baltimore, MD 21202.
20. Rogers, Gloria, "Rubrics: What Are They Good for Anyway? Part I," Assessment 101 column in ABET's Community Matters newsletter, September 2006.
21. Rogers, Gloria, "Rubrics: What Are They Good for Anyway? Part II," Assessment 101 column in ABET's Community Matters newsletter, October 2006.
22. Rogers, Gloria, "Rubrics: What Are They Good for Anyway? Part III," Assessment 101 column in ABET's Community Matters newsletter, November 2006.
23. Rogers, Gloria, Assessment Planning, http://www.abet.org/assessment.shtml#Assessment%20matrix.
