
Running head: COMMUNITY CONNECTIONS MODEL EVALUATION PLAN

Community Connections Model Evaluation Plan


Nicole Gottleib, Zachary Lindsey, Ashley Trewartha
Loyola University Chicago

Table of Contents

Evaluation Plan for the Residence Life Community Connections Model ......... 3
    Context and History of the Program ..................................... 4
    Rich Description of the Program ........................................ 4
    Statement of the Problem ............................................... 6
    Significance of the Problem ............................................ 7
    Stakeholder Analysis ................................................... 7
    Role of the Evaluators ................................................ 10
    Logic Model ........................................................... 11
Methodological Strategy ................................................... 13
    Quantitative Evaluation Approach ...................................... 13
        Population and Sampling Strategy .................................. 14
        Survey Instrument ................................................. 14
        Implementation of Survey Instrument ............................... 17
        Statistical and Data Analysis Procedures .......................... 20
        Validity Concerns ................................................. 20
        Quantitative Data Presentation .................................... 23
        Quality Considerations ............................................ 23
    Qualitative Evaluation Approach ....................................... 24
        Population and Sampling Strategy .................................. 25
        Protocol Instrument ............................................... 25
        Implementation of Exit Interview Protocol ......................... 27
        Quality and Validity Considerations ............................... 29
        Data Analysis Procedures .......................................... 31
        Qualitative Data Presentation ..................................... 32
Conclusion ................................................................ 32
    Limitations ........................................................... 32
    Timeline .............................................................. 33
    Next Steps ............................................................ 34
References ................................................................ 36
Appendices ................................................................ 38
    Appendix A: Community Connections Model ............................... 38
    Appendix B: Department of Residence Life Organizational Chart ......... 39
    Appendix C: Loyola University Organizational Chart .................... 40
    Appendix D: Logic Model for Community Connections Model ............... 41
    Appendix E: Screenshots of the Community Connections Model Survey ..... 42
    Appendix F: Pre-Test Community Connections Model Survey ............... 44
    Appendix G: Mid-Test Community Connections Model Survey ............... 49
    Appendix H: Post-Test Community Connections Model Survey .............. 54
    Appendix I: Exit Interview Email Invitation ........................... 59
    Appendix J: Exit Interview Confirmation Email ......................... 60
    Appendix K: Exit Interview Protocol (Question Guide for RDs) .......... 61
    Appendix L: Handout on the Loyola Experience .......................... 63
    Appendix M: Interview Submission Form ................................. 64
    Appendix N: Timeline and Budget ....................................... 68
    Appendix O: Presentation of Evaluation Plan ........................... 69

Evaluation Plan for the Residence Life Community Connections Model


The Department of Residence Life at Loyola University Chicago is a large department,
made up of professional staff members, graduate student assistants, undergraduate student staff
members, and student residents of the various Residence Halls. With student staff members
involved most closely in the day-to-day lives of the residents, the Department recently developed
a model for student staff (referred to from now on as Resident Assistants, or RAs) to use when
engaging with members of their communities. This model is called the Community Connections
Model, or CCM, and was created by the Loyola University Department of Residence Life
Academic Support and Programming Committee to enhance the resident and student staff
experience. In order to align with the Division of Student Development, the committee created
the new programming model around the Loyola Experience. The Loyola Experience was
developed as a four-year plan for students, which consists of key opportunities that guide the
overall student experience.
According to Fitzpatrick, Sanders, and Worthen (2010), evaluation's primary purpose is
"to provide useful information to those who hold a stake in whatever is being evaluated
(stakeholders), often helping them to make a judgment or decision" (p. 9). This is different from
research because evaluation's focus is on determining judgments or decisions, whereas research
emphasizes finding conclusions (Fitzpatrick et al., 2010). Additionally, and perhaps more
directly, evaluation aims to provide details around program improvement. With this in mind, the
purpose of this evaluation project is to assess the impact the CCM implementation has on the
leadership efficacy of the Resident Assistants (RAs). It is important to note that there are also
student staff members in the Residence Halls who work directly with Living Learning
Communities, called Learning Community Assistants, but due to the different nature of how they
engage with the CCM, their roles will not be addressed in this evaluation. The following
sections will outline details about the CCM, the program being evaluated, including the
context, history, and stakeholders involved with the program. Our evaluation is also addressed,
including the specific problem we are looking at and questions we are asking.
Context and History of the Program
The CCM was created in 2014 and first implemented for the 2015-2016 academic year.
It was created in response to inconsistencies within the Department of Residence Life and issues
implementing a programming model. RAs and students were not receiving the same experiences
across Residence Life, and it was causing a strain on professional staff. Therefore, the CCM was
developed as a method to relieve those concerns. It was based on a review of similar models at
other institutions, an extensive review of both formal and informal feedback on the old model
from student staff, and input from divisional stakeholders. Additionally, before the final model
was implemented, feedback on the model was solicited from departmental leadership and
professional staff who would be responsible for the implementation of the model.
Rich Description of the Program
The CCM is a guideline for how student staff build community within their residence
halls. It comprises three major components designed to incorporate all major aspects and
responsibilities of Resident Assistants (RAs) in the Department of Residence Life. The three
components include Programmatic Connections, Resident Connections, and Administrative
Connections. The specific requirements for each of these components are outlined in Appendix
A. All components are intended to provide RAs a framework in which they can interact with
both individual residents as well as the community as a whole.
Programmatic Connections requires individual RAs to create six small scale programs for
their residents each semester. These programs can range in size, scope, and learning outcome
based on the needs of the individual community as determined by the RA and their supervisor.
The second component is Resident Connections, which requires RAs to interact individually
with twenty percent of their floor each week. These interactions can be brief but should be a
meaningful way for RAs to connect with individuals in their community. The final component
of the CCM is Administrative Connections. This component outlines the administrative tasks of
the RA role and is intended to help RAs use these tasks to continue building community rather
than as simple tasks to check off a list.
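The Resident Connections requirement implies a simple weekly interaction target that scales with floor size. The short Python sketch below is our illustration only, not a departmental tool, and the floor sizes in it are hypothetical; it rounds up so the twenty percent threshold is always met.

```python
import math

def weekly_connections(floor_size, rate=0.20):
    """Resident Connections target: individual interactions owed per week.

    Rounds up so an RA always reaches at least 20% of the floor.
    """
    return math.ceil(rate * floor_size)

# Hypothetical floor sizes
print(weekly_connections(48))  # a 48-resident floor requires 10 connections
print(weekly_connections(30))  # a 30-resident floor requires 6 connections
```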
The CCM is overseen by the Academic Support and Programming Committee of the
Department of Residence Life and Marci Walton, the Assistant Director (AD) for Academic
Support and Learning Communities. In addition to strategic and planning oversight led by AD
Walton and the Academic Support and Programming Committee, day-to-day implementation of
the program and supervision of RAs is handled by Residence Life professional staff, made up of
eleven Resident Directors and twelve Assistant Resident Directors. Collectively, the
professional staff supports and supervises 115 RAs and Learning Community Assistants (LCAs,
not assessed in this evaluation), who are ultimately responsible for meeting the requirements of
the model on their floors. In order to facilitate this work, professional staff have programming
budgets based on the number of students in their respective halls or areas.
Student participation and engagement with the CCM includes the 115 RAs, and the
approximately 4,600 residential students at Loyola University Chicago. All on-campus students
participate in the CCM at least indirectly, even if they choose not to actively attend programs.
This is because the model includes those administrative tasks, via the Administrative
Connections component, that encourage RAs to continue to build community on their floors in
addition to the programs they are required to develop.
Statement of the Problem
With the CCM currently in its infancy stage, having just been implemented during the
2015-2016 academic year, this program is in need of a formative evaluation, in order "to provide
information for program improvement" (Fitzpatrick et al., 2010, p. 20). Additionally, because
the program is still new, this formative evaluation will allow necessary changes to be made
before the CCM becomes routine and therefore harder to alter, and will help gain stakeholder buy-in.
Rather than evaluate every aspect of the program, the purpose of this evaluation is to assess the
impact of the CCM on the Resident Assistants (RAs) who utilize the model to engage with their
communities of residents. In this evaluation, we seek to determine how well the CCM helps
develop the leadership skills of the RAs, specifically in relation to effective relationship building
and leadership efficacy.
The information gained as a result of the evaluation will be useful to professional
Residence Life staff members and other key stakeholders. It will allow them to consider the
impact that this model has on the student staff members, rather than just on the residents, which
is the type of information the program itself is already collecting. Ideally, this would allow for
improving hiring practices, training procedures, and the overall retention of RAs. It could also
enhance how professional Residence Life staff support RAs and help them reflect on the
experiences they are having throughout their time in the position.
We propose a mixed-methods approach to evaluate the impact of the CCM on RA
leadership development, including quantitative and qualitative measures. The quantitative
measures will include pretest and posttest surveys that allow us to look at the outcomes of the
CCM and its effectiveness (Schuh et al., 2009). The qualitative measure will involve exit
interviews with RAs who are leaving the position by choice or due to graduation at the end of the
academic year.
Significance of the Problem
The CCM is an important element in the overall function of the Department of Residence
Life. It incorporates a significant portion of the job responsibilities of RAs and it also exists as
the central plan for how the department carries out its mission in the residence halls. The Loyola
University Chicago Department of Residence Life vision statement states, "We contribute to
students' transformative education by offering student centered programs, services and
environments that foster student involvement, responsibility and leadership" (Loyola University
Chicago, 2015). The CCM is a means through which the department is able to carry out this
vision by providing a structure for RAs to create programs and build inclusive communities on
their floors. The CCM also potentially affects the RA experience at LUC, as RAs are still
students first and foremost in addition to being staff members. The CCM provides a structure
that the department hopes will help build leadership skills of RAs. Through this evaluation, we
hope to assist the Department of Residence Life by showing how well this is being done and how
it can be improved.
Stakeholder Analysis
According to Fitzpatrick et al. (2010), stakeholders are "the clients, sponsors,
participants, and affected audiences of a program, the people who are influenced or affected by
whatever is being evaluated" (p. 316). The CCM has many stakeholders who work directly with
the development, implementation, and daily use of the program. For this evaluation, we decided
to work directly with a few key stakeholders of the program in order to enhance the validity of
the study "[and to] increase the use of the results" (Fitzpatrick et al., 2010, p. 317). In order to
identify important stakeholders involved with the CCM and our evaluation, a power versus
interest grid (Table 1) was created. According to Bryson, Patton, and Bowman (2011), power
versus interest grids typically help determine "which players' interests and power bases must be
taken into account in order to produce a credible evaluation" (p. 5). There are four categories of
stakeholders: the players, or those with high interest and high power; the subjects, those with
high interest and low power; the context setters, those with low interest and high power; and the
crowd, or those with low interest and low power (Bryson, Patton, & Bowman, 2011).
The players include Marci Walton, the Assistant Director for Academic Support and
Learning Communities, who is the most influential player and stakeholder within this evaluation.
We are working closely with Marci to develop the evaluation and determine the ideal questions
to ask and the best methods to assess the CCM. Additionally, Marci sits on the Academic
Support and Programming Committee, another player stakeholder, who assists her in providing
us feedback and information about the program and evaluation ideals. The final player we have
identified is the Training Task Force, which will be the group of Residence Life staff members
who will implement the evaluation beginning in the spring of 2016. Additionally, we would like to
note that the Department of Residence Life is currently without a Director, so we are unable to
identify if that role would be considered a player or a context setter in relation to this program.
That information will be updated if the position is filled, or will need to be taken into
consideration when implementing the evaluation if it is filled within that timeframe.
The next level of stakeholders, who have high interest in the program but less power than
the players, are the subjects. We have identified the remainder of people affiliated with
Residence Life as subjects, including Resident Directors (RDs) and Assistant Resident Directors
(ARDs) who are not specifically affiliated with the Academic Support and Programming
Committee or the future Training Task Force. Other subjects in the program include the
Resident Assistants (RAs), who are very invested in the CCM but who have little power in the
overall program. Finally, all residents within the Department of Residence Life have been
identified as subjects in relation to their stakeholder status.
The additional two levels of stakeholders include the context setters, or those with high
power but low interest, and the crowd, who have low power and low interest. We have identified
two individuals as context setters in the CCM. Ray Tennison, the Associate Director for
Residence Life, and Jane Neufeld, Vice President for the Division of Student Development, are
the two high power but low interest stakeholders. Please see Appendices B and C for
organizational charts of the Department of Residence Life and for the institution as a whole,
which provide a visualization of how Ray Tennison (as well as Marci Walton and other Residence
Life staff members) and Jane Neufeld are connected. Finally, the general student body, and
specifically students who do not live in a Residence Hall on campus, are considered the crowd,
who may or may not be informed about the program and evaluation.
Table 1

Players (High Power, High Interest)
o Marci Walton, Assistant Director for Academic Support & Learning Communities
o The Academic Support and Programming Committee, made up of Resident Directors and
  Assistant Resident Directors
o The Training Task Force (when it comes to implementing the evaluation)

Subjects (Low Power, High Interest)
o RDs (those not involved with the Committee and Task Force)
o ARDs (those not involved with the Committee and Task Force)
o RAs
o Residents (students who live in Residence Halls)

Context Setters (High Power, Low Interest)
o Ray Tennison, Associate Director for Residence Life
o Jane Neufeld, Vice President of Student Development

The Crowd (Low Power, Low Interest)
o General student body (students who do not live on campus in Residence Life)

Role of the Evaluators


According to Fitzpatrick et al. (2010), the adjectives internal and external "distinguish
between evaluations conducted by program employees and those conducted by outsiders" (p.
27). Our evaluation consists of both internal and external evaluators, which provides us with the
advantages of both types of evaluators and helps eliminate some of the disadvantages of only
having one type of evaluator, all of which are outlined below. Zachary Lindsey and Ashley
Trewartha are internal evaluators, as they are both employed as Assistant Resident Directors in
the Department of Residence Life. Nicole Gottleib is an external evaluator, who has no
connection to the Department and was recently introduced to the CCM.
The advantages of the internal evaluators are as follows: the benefit of having insider
knowledge of the development and implementation of the program, contact with the main
stakeholders of the program, and the ability to navigate systems of power within the program.
Additionally, as we aim to implement the first stage of our evaluation in the spring of 2016, their
time of employment also provides them with the advantage of aiding in that implementation and
assessing the results. The disadvantages of internal evaluators are as follows: having limited
power in the Department of Residence Life and management of the CCM, direct connection to
and daily interaction with the model and the RAs who implement it, and a biased desire to want
to see it succeed. The advantages of the external evaluator are as follows: unbiased perspective
of the model, distance and "perceived objectivity" about the Department of Residence Life and
the program, and the ability to rearticulate questions and elements of the evaluation without
using unclear or insider terminology (Fitzpatrick et al., 2010, p. 28). The disadvantages of the
external evaluator are as follows: distance from the Department of Residence Life and the
program, a lack of understanding of the day-to-day experiences in Residence Halls, and limited
relationships with key stakeholders of the programs.
Logic Model
Our team created a logic model (see Appendix D) that details the components of the
CCM as it relates to the training and leadership development of RAs. According to McLaughlin
and Jordan (2010), a logic model is "a plausible and sensible model of how a program will work
under certain environmental conditions to solve identified problems" (p. 56). The logic model
outlines program inputs, outputs, and outcomes related to the development and implementation
of the CCM. Though the CCM has a variety of intended outcomes related to the development
and sense of belonging of residents who live in the residence halls, our team narrowed the scope
of the logic model to focus on the outcomes specifically related to the development of RAs.
The logic model begins with inputs, or resources, that support the CCM. The inputs
consist of personnel, time, and financial resources. The personnel that currently support the
model include members of the Department of Residence Life Leadership Team, such as the
Associate Director for Residence Life and the Assistant Directors who oversee staff development
and training as well as academic support. The program is also supported by residence hall staff,
including Resident Directors (RDs), Assistant Resident Directors (ARDs), and student staff
members, specifically the Resident Assistants (RAs). The Academic Support and Programming
Committee developed the CCM in 2014 and oversees the continued support of the program. Training
for professional staff and student staff is essential in ensuring that staff are able to implement the
program in each residence hall, which is ultimately supported through the programming funds of
each hall.
The next section of the logic model is outputs, which includes the activities and
participants within the program. Residence Life staff utilize student staff training, weekly
meetings between RDs, ARDs, and RAs, and departmental meetings to ensure that the program
is being implemented and to provide updates about different components of the program. The
components of the program, specifically ongoing events that RAs lead, occur regularly in each
hall. Participants of the program include residents in the residence halls and RA staff, as well as
professional staff that support RAs in creating events that are outlined in the CCM. Finally, a
variety of campus partners work with RAs and hall staff to help create meaningful events for
residents.
Together, the inputs and outputs aim to result in a variety of short-term and long-term
outcomes. Though the CCM was designed to support the development and community building
of residents, the program also affects the RAs who implement it on their residence hall floors.
Because our team's assessment specifically focuses on how the CCM affects RA staff, the
outcomes in our logic model outline the intended outcomes of the program on student staff.
Whereas the short-term outcomes measure the basic understanding of the CCM and the Loyola
Experience, all of the long-term outcomes incorporate leadership development and leadership
efficacy.
Finally, our logic model includes contextual assumptions and external factors related to
the CCM. RAs are hired based on exhibiting a certain level of role modeling and leadership skills, so it
is assumed that RAs have built some of the skills that we are measuring already. Additionally,
professional staff play a crucial role in supporting RAs, so our logic model assumes that this
support is occurring. External factors that affect the implementation of the CCM include the
efforts of campus partners throughout the Division of Student Development. Additionally,
because RAs are undergraduate students, their academic and personal experiences outside of the
RA position can affect the program and their own development.
Methodological Strategy
Quantitative Evaluation Approach
The quantitative evaluation approach that we are taking in order to assess the leadership
efficacy of the Resident Assistants (RAs) in relation to their understanding and use of the CCM
is an electronic survey that will be shared with them on OrgSync, a platform they all utilize on
a regular basis. We have developed a longitudinal survey, including pre-test and post-test
elements, that will be shared with the participants at various points within the academic year in
which they hold their RA position. We chose to do pre-test and post-test methods in order to
assess the effect the CCM has on the leadership efficacy of the RAs. In order to assess that
effect, we plan to collect survey responses at three different points in an RA position cycle. The
first collection point will be our pre-test, which will occur during the spring training session that
occurs the academic year prior to the beginning of the position. This allows us to collect data on
the participants leadership efficacy prior to their connection with the CCM. The mid-test
collection will then occur after the two-week long, in-depth fall training period, which transpires
at the beginning of the academic year in which the position is held. This will be the RAs' first
true experience learning about the CCM, and will allow us to assess how the in-depth training
process and introduction to the CCM impacts leadership efficacy. Our post-test collection for
our quantitative evaluation will occur at the end of the fall semester, during the check-out
process for the residence halls. This will allow us to collect information on how the CCM is
affecting leadership efficacy. Ideally, this will show a continuation of growth in the RAs
between the pre- and post-tests, and the outcomes around leadership development will have been
met by that time.
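The pre/post comparison described above amounts to pairing each respondent's scores across waves and summarizing the change. The Python sketch below is an illustrative outline only, not the evaluation's actual analysis code; the respondent IDs and composite scores in it are hypothetical.

```python
def mean_change(pre, post):
    """Mean post-minus-pre difference over respondents who completed both waves."""
    shared = pre.keys() & post.keys()  # only RAs present in both surveys
    if not shared:
        raise ValueError("no respondents completed both waves")
    return sum(post[r] - pre[r] for r in shared) / len(shared)

# Hypothetical composite leadership-efficacy scores on a 5-point scale
pre_scores = {"ra01": 3.2, "ra02": 3.8, "ra03": 2.9}
post_scores = {"ra01": 3.9, "ra02": 4.1, "ra03": 3.4}

print(round(mean_change(pre_scores, post_scores), 2))  # average growth of 0.5
```

Pairing by respondent, rather than comparing group averages, keeps RAs who withdrew mid-year from skewing the comparison.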
Population and Sampling Strategy
The population that is being assessed is the Resident Assistants in the Residence Halls at
Loyola University Chicago. This is a population of 115 student employees, who serve various
communities within the Residence Halls. Because the full population of RAs is a manageable
number of participants for an assessment, our quantitative evaluation approach will be shared
with all RAs at Loyola, or a census sample of the RAs. A census sample indicates that we will
share the assessment with the full population, and collect data from as many participants as
possible (Wholey, Hatry, & Newcomer, 2010). Additionally, we are using a single stage
sampling design, as we have access to names in the population and can "sample the
people directly" (Creswell, 2009, p. 148). The population of RAs will not be stratified, as all
RAs will be selected for our assessment (Creswell, 2009).
Survey Instrument
We developed the quantitative survey to assess the extent to which the leadership
approach of RAs changes over a year in the RA position, RAs' understanding of the Loyola
Experience that is used in the CCM, and their ability to integrate the Loyola Experience and their
leadership approach in the CCM. The largest component of the survey comprises the questions
related to leadership, which are based on the Authentic Leadership Questionnaire
described below. The next section of the survey will assess some of the short- and long-term
learning outcomes as outlined in our Logic Model (see Appendix D). Finally, demographic
information will be collected to examine to what extent target social identities affect responses.
To assess leadership, we decided to use the Authentic Leadership Questionnaire (ALQ)
developed by Walumbwa, Avolio, Gardner, Wernsing, and Peterson (2008) because the
leadership constructs measured in the ALQ align with how the Department of Residence Life
expects RAs to approach leadership within their roles. Authentic leadership, as defined by
Walumbwa et al. (2008), is:
A pattern of leader behavior that draws upon and promotes both positive psychological
capacities and a positive ethical climate, to foster greater self-awareness, an internalized
moral perspective, balanced processing of information, and relational transparency on the
part of leaders working with followers, fostering positive self-development (p. 94).
Self-awareness is related to how individuals make meaning of the world and how that affects the
ways in which they view themselves (Walumbwa et al., 2008). Self-awareness also includes
awareness of strengths and weaknesses and understanding how individuals affect others
(Walumbwa et al., 2008). Through the CCM, RAs are meant to learn how they affect their residence
hall community and residents' experiences. Relational transparency refers to presenting one's
"authentic self...to others" (Walumbwa et al., 2008, p. 95). The CCM, which is flexible enough for
RAs to implement in ways that align with who they are, hopefully allows for RAs to bring who
they are and what their passion areas are into the programs and interactions they have with their
floor. Balanced processing refers to the ability and willingness to make decisions based on their
own and others' views (Walumbwa et al., 2008). The CCM allows for RAs to create programs that
match their residents' interests, which has the potential to build balanced processing. Finally,
internalized moral perspective is how well individuals' decision-making and behavior align with
their moral standards and values (Walumbwa et al., 2008). The flexibility within the CCM allows
RAs to respond to situations and issues arising in their residence hall community. Not only
should the CCM allow RAs to enact authentic leadership within their roles; it should also help
RAs build the skills to enact authentic leadership.
The ALQ is a 5-point Likert-scale questionnaire made up of 16 statements
(Walumbwa et al., 2008). The ALQ measures self-awareness, internalized moral perspective,
balanced processing, and relational transparency to measure authentic leadership as a whole.
Our ALQ, which can be found in its three versions in Appendices F-H, was adapted from a
sample questionnaire intended for practical use (Northouse, 2016). To measure the
extent to which the CCM affects RAs' leadership approach, rather than broadly measuring
authentic leadership, ALQ statements were modified to add "The Community Connections
Model helps me" to the beginning of each statement. The components of the CCM
(programming, interactions with residents, interactions with supervisors and peers, and
administrative tasks) are clearly outlined in the instructions to help RAs recall the CCM and
specific aspects of their job that we are assessing.
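Scoring an instrument of this shape reduces to averaging the 1-5 responses within each subscale. The sketch below is an illustration only: the item-to-subscale groupings are placeholders, not the published ALQ scoring key, and the sample responses are hypothetical.

```python
# Placeholder item groupings for a 16-item, four-subscale instrument;
# NOT the published ALQ key.
SUBSCALES = {
    "self_awareness":          [0, 1, 2, 3],
    "internalized_moral":      [4, 5, 6, 7],
    "balanced_processing":     [8, 9, 10, 11],
    "relational_transparency": [12, 13, 14, 15],
}

def score(responses):
    """Average the 1-5 item responses within each subscale."""
    if len(responses) != 16:
        raise ValueError("expected 16 item responses")
    if any(not 1 <= r <= 5 for r in responses):
        raise ValueError("responses must be on the 1-5 scale")
    return {name: sum(responses[i] for i in items) / len(items)
            for name, items in SUBSCALES.items()}

# One hypothetical RA's responses, listed in item order
answers = [4, 5, 3, 4, 5, 5, 4, 4, 3, 3, 4, 2, 5, 4, 4, 5]
print(score(answers)["self_awareness"])  # (4 + 5 + 3 + 4) / 4 = 4.0
```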
The survey also includes six Likert-scale statements related to some of the learning
outcomes outlined in the Logic Model (Appendix D). Specifically, these items aim to assess
RAs' understanding of the Loyola Experience and how their own experiences mirror components
of the Loyola Experience as well as how they will implement the CCM in their residence hall
community. Statements were worded in such a way that they can be applicable across different
points in time to ensure consistency of questions throughout the longitudinal assessment.
The final components of the quantitative surveys are two sections of demographic
information. Because RAs work in a variety of types of residence halls, work with students of
various ages, and may be employed for more than one year, questions regarding what type of hall
and student population RAs work with, as well as how many years they have been working as an
RA, will help the Department of Residence Life. The Department can use this information to
compare how effective the CCM is within different residence halls and if multiple years in the
RA position affect scores, particularly those related to authentic leadership development. This
demographic information was intentionally placed at the beginning of the survey and labeled as
"Your RA Role" to distinguish this less-personal information from the demographic details
around social identities collected at the end of the survey.
Because a leader's social identities can affect their ability to be authentic and can affect
interactions with others, stakeholders within the Department of Residence Life wanted to have
some data on respondents' various target and agent identities. The diversity of identities within
the RA population is limited, and the power dynamic between the department, which is conducting
and analyzing the survey, and the RAs, whose job status is determined by the department, creates
challenges in asking identity-based questions. Both to protect the anonymity of RAs and to
encourage RAs to be honest in their responses, demographic items were phrased so that RAs
could indicate whether they hold target or agent identities, instead of disclosing specific
identities. For example, respondents are asked to indicate whether they have a target or agent
racial identity rather than being asked to disclose their specific racial identity. Additionally, an
option is offered for respondents to not disclose their social identities. These questions help
retain anonymity while still collecting data that the Department of Residence Life is interested in
understanding. Because all of the surveys will be anonymous, we want to collect this
information to see how results change over time.
Implementation of Survey Instrument
The survey instrument will be used at multiple points in the year in order to collect
multiple data points from respondents to show development over time. The survey will be
administered through OrgSync, an online platform used by Residence Life to communicate
information to both RAs and the student body. This platform is identified by Loyola University Chicago (LUC) as LUCentral; the names OrgSync and LUCentral can be used interchangeably. Using a platform RAs are familiar with will increase their comfort with the survey instrument.
There are several other advantages to OrgSync. First, the platform allows the instrument to be collected confidentially. The identities of respondents will be known only to the platform (i.e., OrgSync, likely as data collected by the organization) and will not be accessible to either the assessors or Residence Life. The platform also keeps respondents' data together over multiple points in time, so that if a student withdraws from the RA position and therefore does not complete all of the surveys, their data can be removed from the aggregate. OrgSync also allows a great deal of accessibility for respondents: the survey instrument can be completed on any Internet-enabled device, including tablets, smartphones, and computers.
The first survey will be given to RAs at the end of spring training, during the last session
of that training. Spring training is a fundamental orientation to the RA role for all newly hired
RAs during the spring semester in the academic year prior to the official start date of their
position. This initial assessment will be given to RAs before they have been exposed to the
CCM and will serve as the pre-test for the assessment model. All RAs will be strongly
encouraged to complete the survey before leaving spring training. After finishing the survey, a page will appear stating that the survey was completed and instructing respondents to show facilitators the page on their device. This will both ensure that all RAs take the survey, as it is a built-in requirement of training, and make clear to respondents that they remain anonymous, since they need only show the completion page to the facilitators.
For the mid-test, the survey will be collected after the completion of fall training, at the
start of the academic year in which the RA holds the position. This is the largest training RAs
receive, is required of the position, and occurs in August, just before the beginning of the school
year. This session is also when RAs are trained on the CCM. Collecting survey data at this point will provide insight into the short-term effectiveness of the CCM training and how learning about the model affects RA leadership efficacy. The data will also show where RAs stand in terms of authenticity and efficacy before they implement the CCM in their communities. This data point supplements the pre-test because recognition of the tools provided via the CCM may itself affect leadership efficacy. Completion of the survey will
occur during the last training session of the overall fall training period, and will be verified with
facilitators following the same procedure as outlined for spring training.
The final data point will be collected at the end of the fall semester. The CCM is a
semester-long model, and therefore it will be important to collect survey data from students after
completion of the model. Collecting the post-test will allow us to see if RAs develop and grow
as a result of the model, if development stagnates, or if no change occurs during the model
implementation in residence hall communities. This will also be measured by comparing the results of first-year RAs with those of second- and third-year RAs, who will have completed the model at least once before the assessment year. Supervisors will verify completion of the survey
before RAs are dismissed for winter break each semester. This will be done by a similar method
to the first two collections, where a completion page appears on the LUCentral instrument after
concluding the survey that instructs respondents to show the page to supervisors.
Statistical and Data Analysis Procedures
The data collected will be analyzed using several methods. The first is a t-test comparing the responses of RAs with target and agent identities on each survey question, in order to see whether the CCM impacts students differently based on social identity (Newcomer & Conger, 2010). The evaluators will then examine the data using a repeated-measures analysis of variance (ANOVA) (Schuh et al., 2009). This will allow evaluators to examine how RAs' responses change over time by comparing the three data points. Additionally, this method of data analysis will allow us to assess differences across social identities, as well as between first-year and second/third-plus-year RAs. Finally, the evaluators will use the analytic method described by Northouse (2016) for the ALQ. This method involves summing the responses to the items that correspond with each of the four constructs addressed in a previous section: self-awareness, internalized moral perspective, balanced processing, and relational transparency (Northouse, 2016). Higher scores, between 16 and 20, indicate stronger authentic leadership in that construct, while lower scores, 15 and below, indicate weaker authentic leadership. This allows evaluators to assess which components of the authentic leadership process are stronger or weaker through a comparison of scores in each construct (Northouse, 2016). This differs from a factor analysis because the criteria have been set within the ALQ analytic method, and evaluators sum the results rather than examining each construct separately (Newcomer & Conger, 2010).
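As a rough sketch, the construct-scoring step described above might look like the following (the item-to-construct mapping and the 1-5 Likert scale shown here are placeholders for illustration only, not the actual ALQ key):

```python
# Sketch of the ALQ-style construct scoring described above.
# NOTE: the item-to-construct mapping below is a placeholder;
# the real ALQ item key belongs to the published instrument.
CONSTRUCT_ITEMS = {
    "self_awareness": [1, 5, 9, 13],
    "internalized_moral_perspective": [2, 6, 10, 14],
    "balanced_processing": [3, 7, 11, 15],
    "relational_transparency": [4, 8, 12, 16],
}

def score_constructs(responses):
    """Sum a respondent's Likert answers (dict: item number -> rating)
    into one score per construct, then label each score strong (16-20)
    or weak (15 and below), per the cutoffs described above."""
    scores = {}
    for construct, items in CONSTRUCT_ITEMS.items():
        total = sum(responses[i] for i in items)
        scores[construct] = (total, "strong" if total >= 16 else "weak")
    return scores

# Example: answering 4 on every item sums to 16 per construct,
# landing at the bottom of the "strong" band.
example = score_constructs({i: 4 for i in range(1, 17)})
```

Because the cutoffs are fixed by the ALQ analytic method, the scoring is a simple sum-and-compare rather than an exploratory factor analysis.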
Validity Considerations
Validity concerns, as well as methods to maintain validity, have been considered for the quantitative aspect of our assessment. Threats to validity lead to the "inability to conclude that the intervention affects an outcome and not some other factor"
(Creswell, 2009, p. 162). Internal and external threats to validity have been considered in the
design of our survey. A key internal threat is that data collection is framed as part of the RAs' position requirements. Although there will be no punishment or repercussions if RAs do not complete the surveys (supervisors will have no way to see which individuals have or have not completed them, given the anonymity of the collection method), RAs will still be encouraged to complete the surveys as part of their position requirements. This concern has informed our emphasis on having the data collected anonymously. For the same reason, however, there is less concern around typical internal validity threats such as selection or mortality threats (Creswell, 2009). The RAs have already been selected for a paid position through a detailed application and interview process, and are likely to share specific characteristics beneficial to the position before we complete our census selection of the RA population.
If RAs do leave the position during data collection, we will maintain the validity of the data by having our key stakeholder, Marci Walton, use information available in OrgSync to identify the data collected from that specific individual and remove it from the collection. Marci, or the individual in that position, will be the only person with access
to the data in a way that is not anonymous. However, we do not consider this a threat to validity
or an ethical concern because Marci does not supervise the students in the RA positions and
would have to go through a detailed process to access individual data in OrgSync. Additionally,
LUCentral has the ability to make a survey anonymous and not require a LUC email/password to
access the survey. This would allow us to store the data securely on the LUCentral account, but
the answers would not be linked to the student LUCentral accounts.
External threats to validity affect our ability to draw inferences from the assessment data,
and include possible issues around testing and instrumentation (Creswell, 2009). Due to the
relatively frequent nature of our longitudinal survey, and the fact that RAs can hold their position
for more than one year, there is a risk that participants may become familiar with the instrument
and the questions asked, and respond in a similar manner during later testing. However, this
concern has been taken into consideration in the survey's question design; because the questions center on participants' self-reflection about their leadership, repetitive responses should be less of an issue over time. Additionally, repetitive responses would indicate
that participants are not benefiting from the CCM, and would be valuable data. As for the
instrumentation, there is a risk that by using a third-party vendor to house our assessment, there
may be changes made to the design and function of OrgSync over time. However, this concern
will be lessened by the ability to move the instrument to another platform without changing the information it collects.
In relation to the ALQ, which was used as a model for our quantitative surveys, research
indicates there is validation around the assumptions of authentic leadership within the results of
the questionnaire. According to Northouse (2016), Walumbwa and associates "validated the dimensions of the instrument and found it positively related to outcomes such as organizational citizenship, organizational commitment, and satisfaction with supervisor and performance" (p. 217). These concepts relate to the outcomes outlined in the Logic Model (Appendix D) for RA leadership development: organizational citizenship and commitment speak to the development of community, a key aspect of the CCM, and satisfaction with performance can be interpreted as leadership efficacy.
Quantitative Data Presentation
Results from this survey instrument will be compiled into both an Excel spreadsheet and a written narrative. The presentation of the data analyses will follow the tips for
presenting such information as outlined in Newcomer and Conger (2010), including a clear
identification of tables and figures, the exclusion of abbreviations and acronyms, and the
inclusion of graphics to aid in the visual representation of the results. These results will first be
presented to Marci Walton, AD for Academic Support and Learning Communities, and then to
an additional set of stakeholders, the Academic Support and Programming Committee, by the
internal evaluators after the spring training for new RAs during the 2015-2016 academic year. If the committee decides it is appropriate, the results will then be presented to departmental leadership and the entire Department of Residence Life staff at regular staff meetings by members of the Academic Support and Programming Committee. During the implementation of
the mid-test at the end of fall training, as well as the post-test at the end of fall semester, data
collected will be compiled by Marci Walton or another member of the Department of Residence
Life who is tasked with analyzing and presenting the data.
Quality Considerations
In order to maintain the quality of our results, the internal evaluators will be involved
with the implementation of the first round of pre-tests, during the spring semester of the
academic year prior to when the RAs' official positions begin. They will aid stakeholders in
implementing the surveys during the spring training, which will also allow them to check the
quality of the electronic surveying method on OrgSync. Additionally, they will be available to
respond to questions from the RAs, as well as questions from the stakeholders as we roll out this
new assessment plan. They will also be able to check on responses and data being collected, in
order to review our surveying method as it rolls out and make any necessary changes or fixes
prior to the mid-test in the fall semester of the following academic year (Newcomer & Triplett,
2010). The quality will also be controlled due to the nature of our electronic survey method. All
data will be sent out from and collected in one source, so information about when the surveys
were sent or made available will be collected, as well as data on when the surveys were
completed (Wholey et al., 2010).
Qualitative Evaluation Approach
We will use exit interviews as our qualitative evaluation approach to assess the leadership efficacy of Resident Assistants (RAs) in relation to their understanding and use of the
CCM. Exit interviews with RAs who are graduating, moving on to a different position, or who
are leaving by choice will allow evaluators to gain insight as to what RAs consider leadership
and how they have developed their understanding as a result of implementing the CCM in their
position. Interviews will be conducted in April prior to the end of the semester and the closing
process in the residence halls.
This specific time frame, in the spring semester, was chosen because conducting an
intervention on survey participants before the last data collection would skew the results of the
quantitative piece. The exit interviews will have to be conducted by professional staff members of Residence Life, specifically Resident Directors, due to the limited resources available to the evaluation. This means that the individuals conducting the qualitative interviews have power over the participants in the form of supervision, however indirect. Interviewing RAs who are leaving the position at the end of the year reduces this power dynamic. Only RAs who
chose not to reapply to the position, or who could not because they are graduating, are being
invited to interview. This is to eliminate any potential bias in the data that would have been
collected from RAs who were not rehired or were dismissed from the position.
Population and Sampling Strategy
Due to the time limitations at the end of academic year, and especially within the
Department of Residence Life around the hall closing process, we will use a convenience sample
in order to collect qualitative data. A convenience sample is ideal because not all RAs have an
equal chance of being selected; some will not be selected at all, because they are returning to the
position (Schuh et al., 2009). Additionally, due to the scheduling constraints and limited
resources of RDs conducting the interviews, only RAs who are available at the times RDs are
also available will be selected for interviews. This represents a convenience sample as outlined
in Schuh et al. (2009). All RAs who are graduating, moving on to a different position, or who
are leaving by choice will be invited to participate via a letter delivered by e-mail from the acting
Director of Residence Life (see Appendix I). Typically, between 50 and 60 RAs choose not to return to the position each year. The RAs who respond to the letter and sign up for an interview through an online scheduling tool will then be considered participants in the
qualitative portion of our assessment.
Protocol Instrument
As previously outlined, the qualitative instrument we have created is an exit interview
that will be conducted by Resident Directors with Resident Assistants (RAs) who are not
returning to their position the following year. RAs who are not returning to the position include
those who are graduating, moving on to a different position, or who are leaving by choice. The
exit interviews will not include students who have been fired from the RA role or those who
applied to return to the position but were not rehired by the department. This is to eliminate
potential bias in the sample, such as individuals holding a negative perception of the department due to termination or lack of rehiring. Each exit interview will be conducted by one RD with one RA at a time.
The protocol instrument includes 10 questions that the RD will ask in every interview; interviews are scheduled in 30-minute time slots. We aim to make sure all 10 questions are asked of each participant so that the data collection maintains consistency. Therefore, participants will be notified of the risk of interviews running over the allotted time; however, the time period will be tested and confirmed or altered during pilot tests
of this instrument prior to implementation (Krueger & Casey, 2010; Schuh et al., 2009). The
questions included in the protocol revolve around the RAs' application of the CCM and their leadership development (or lack thereof) through the use of the model.
The questions are in the form of an interview guide, otherwise known as the Exit
Interview Protocol (Schuh et al., 2009; see Appendix K). Questions move from general questions about the RA's experience at Loyola to specific questions about the CCM and the RA's skills, values, interests, and ability to be their authentic self in the RA role. These questions directly relate to the questions asked in the instruments that make up our quantitative analysis and will be used to "support and enhance" the quantitative data (Schuh et al., 2009, p. 127). Additionally, the questions assess the ways in which the CCM has or has not provided RAs with the resources to develop their leadership skills and their efficacy as leaders, and how the model relates to their ability to be authentic in their leadership.
Probing questions, both general suggestions for probing questions that allow the
interviewer to dig deeper into responses as well as specific probing questions for a few of the
main questions, have been included in the protocol in Appendix K. These add-ons have been
included in order to prepare for RAs who provide limited responses in their interviews, and ideally to allow for better data collection and more detailed notes that make the data collection and coding process easier (Krueger & Casey, 2010).
Implementation of Exit Interview Protocol
As touched upon briefly in the above section, the implementation of this protocol will
involve the participation of Resident Directors (RDs) in the Department of Residence Life. The
evaluators will not be conducting the exit interviews for a few reasons. The first is the time commitment required when there are only three interviewers (the two internal evaluators and the external evaluator); in comparison, there are 11 RDs in the department. Additionally, interviews will not be conducted by the evaluators even during the first year of implementation of this assessment, due to the internal evaluators' relationships with the RAs at Loyola, including the RAs that they currently supervise. This conflict is avoided by having RDs conduct the interviews and not allowing RDs to interview any RAs they supervise. Although
potentially 50-60 RAs could be invited to interview, because interviews will only be scheduled
with RAs that have times available when RDs are also available, we anticipate each RD
interviewing one to two individuals. This will still lead to data from 11-24 individuals, which
would suffice as a saturation point. Finally, it is unrealistic to have the evaluators conduct the
interviews due to the longevity of the assessment, which will take place annually and for as long
as the department wishes. RDs are better suited for this process, as they will always be available
to conduct the interviews even if specific individuals have left and new individuals have taken on
the role at Loyola.
The evaluators are not concerned about the RDs ability to conduct effective exit
interviews and to collect the necessary data that will be coded and analyzed in this part of the
assessment. This is because RDs interact with students on a regular basis, including in conduct meetings that can involve sensitive topics, and generally have good rapport with students as a requirement of their positions. The evaluators trust that any RD conducting
the exit interview will be able to follow the protocol outlined in Appendix K and ask any
necessary probing questions successfully. The protocol will initially be addressed with RDs
during a RD staff meeting, with stakeholder Marci Walton providing details. The protocol
outlines the necessary information the RD needs to conduct each interview, and encourages them
to record the interview (with, and only with, permission from the interviewee) in order to take thorough notes that will be submitted for data collection. Recordings must be deleted upon completion of the Interview Submission Form (see Appendix M).
In terms of the timeline of the protocol, the exit interviews will occur between completion of the post-test, which occurs between April 1 and April 15 of each academic year, and the last day of residence hall closing, which is when RAs are released for the summer. Invitations will be sent out via email the week following the post-test deadline (see
Appendix I). The invitation will include a link to GenBook, which is an online appointment
scheduling tool used by the Department of Residence Life. As the available interview dates will
change each time this assessment occurs, due to the schedules of the RDs conducting the
interviews and the timeline of when invitations for the exit interview are sent out prior to hall
closing, RDs will be responsible for creating the GenBook schedule. RDs will be able to use that
tool to develop time slots that work with their schedules and then the RAs will be able to select
the option that works for them. Additionally, RAs will be able to see which RDs are offering
which time slots, and select the RD they would prefer to speak with or feel most comfortable
with in the interview. Once schedules are set, the Department of Residence Life should send out
the confirmation email (see Appendix J) to each participating RA.
As stated in the invitation email, and reiterated in the confirmation email, the exit
interviews are allotted to take 30-45 minutes per participant. There is a risk that the allotted time
will not be completely accurate due to having to respond to all 10 required questions. However,
this risk will be minimized through preliminary tests of the interview process, allowing us to
confirm the 30-45 minute timeframe or alter it to reflect a more accurate schedule based on the
questions. The general topics that will be addressed in the questions are shared in the confirmation email (see Appendix J) so that participants may feel more comfortable and better prepared coming into the interview.
The general purpose of this protocol, in addition to the quantitative instruments within
this assessment, is to provide the evaluators and stakeholders with more detailed information
about the RA experience using the CCM and how that relates to their leadership development. A
face-to-face conversation allows us to gain richer feedback on these experiences that will
supplement the exit surveys. Additionally, as a supplement to our quantitative data collection,
the combination of the two methods creates a holistic approach to our assessment (Schuh et al.,
2009).
Quality and Validity Considerations
Due to the number of interviews occurring and the power dynamic that exists between the department, which will be collecting and analyzing the data, and the participating student staff, recording interviews for the Academic Support and Programming Committee to code is unrealistically time-consuming and could also affect respondents' honesty and vulnerability in
the interview. Since multiple individuals are conducting interviews and because interviews will
not be recorded, this poses concerns for validity.
One concern is the relationship between the interviewer and respondent and the
positionality of the interviewers and those analyzing the data. Positionality, which includes
social identities, power within those identities, and position within the department, can affect
what respondents are willing to share and how data is interpreted (Schuh et al., 2009).
Triangulation, which is the use of multiple forms of data collection and multiple individuals
collecting and analyzing data (Schuh et al., 2009), will be used to address positionality and
validity concerns. When coding, we encourage the Academic Support and Programming
Committee, along with interviewers, to review interview data and revisit the quantitative surveys
to develop themes for coding all data. This will ensure that the themes assessed are consistent
among quantitative and qualitative components. We also encourage the committee and
interviewers to code independently for each submission form, then cross-check to verify that
coding themes and where they are located in the notes are consistent (Creswell, 2009). Having
multiple individuals creating codes can ensure that multiple perspectives, identities, and
positionalities are represented so that data does not get dismissed as irrelevant; however, making
sure codes are consistent will aid in reliability.
To help address interviewer positionality, respondents will choose their interviewer,
which can help ensure that the respondent is comfortable with the interviewer and that they trust
that their responses will be valued by the interviewer. The agency provided to respondents to
choose can minimize risk and can lessen the effect of positionality because respondents will most
likely choose someone they believe understands and relates to them. Despite this, the interviewer's positionality as a staff member of the department can still skew the data. For example,
we will want to know whether or not the RA position was mentioned in examples or in their
development of leadership. Respondents may, then, already be thinking about the RA position,
making it easier to recall examples from the RA position than from other leadership experiences.
Ultimately, the comfort of the participants and the increased honesty are more important, since we will already have quantitative data that will be used to assess change over time.
A final concern is the intent of the RAs for participating. For example, RAs may
participate because they had a really positive experience or they may participate because they
had a negative experience or have animosity toward the department. For this reason, we chose
not to interview any RA who may have been fired or any RA who went through the application
process and did not get rehired, which could ultimately skew results. The RAs who will be given
the opportunity to participate will only be individuals who have elected not to reapply for the
position, whether for graduation or other reasons. This will ensure that there is still a range of
experiences represented, both individuals who enjoyed the position and others who may not have
enjoyed the position.
Data Analysis Procedures
Data will be entered by interviewers on a LUCentral form (see Appendix M).
Interviewers will be instructed to include as much data as they have in their notes, including any
direct quotes they have written and additional follow up questions they ask during the interview.
Because we are not recording each interview, we will use the notes entered in the LUCentral
form to code for themes related to authentic leadership, the CCM, and the Loyola Experience.
We encourage referencing the outcomes listed on the logic model, the data collected in the
quantitative component, and the responses of the interviews to develop coding schemas
(Creswell, 2009). We recommend that once interviews are conducted, the Academic Support
and Programming Committee, which is overseeing the assessment, read through one or two interviews for macro topics that may not be included in the initial coding themes before coding all of the interviews (Schuh et al., 2009). Each individual should read through an
interview and write a list of general topics. Individual lists should then be compared for trends to
determine topics to code in the data. We recommend assigning each code a different color to
highlight in each of the interview notes to make it easier to compile into topics after all
interviews are coded.
Qualitative Data Presentation
Qualitative findings will be compiled and organized around themes related to components of authentic leadership, the use of the Loyola Experience and the CCM, and overall feedback. Respondents' identities will remain confidential in both the submission form and the narrative report of findings. Frequencies of the codes will be compiled and listed in a frequency
table that includes the code, the frequency, and the relative frequency (Huck, 2012). Once data
is analyzed, we recommend compiling findings into a presentation for the department prior to
training for the upcoming academic year so that the department and RA supervisors can
brainstorm ways to be more intentional in leadership development.
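To make the three-column frequency table concrete, a minimal sketch of the tallying step might look like the following (the coded theme names are illustrative placeholders, not the actual coding schema):

```python
from collections import Counter

# Hypothetical coded themes tallied across interview notes;
# the theme labels here are placeholders for illustration only.
codes = ["authenticity", "community", "authenticity", "ccm_use",
         "community", "authenticity"]

def frequency_table(codes):
    """Return rows of (code, frequency, relative frequency), the
    three-column format described above (Huck, 2012)."""
    counts = Counter(codes)
    total = sum(counts.values())
    return [(code, n, round(n / total, 2))
            for code, n in counts.most_common()]

table = frequency_table(codes)
# e.g. [('authenticity', 3, 0.5), ('community', 2, 0.33), ('ccm_use', 1, 0.17)]
```

Relative frequencies let the department compare themes across cohorts of different sizes, which matters since the number of exiting RAs varies each year.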
Conclusion
Limitations
Due to the limited resources available to the assessment team, the scope of the evaluation
is restricted. While the Department of Residence Life had hoped for a comprehensive evaluation
of the CCM, the lack of available resources to do such an evaluation required the assessment
team to determine the best way to assess the overall impact of the model without trying to
conduct a comprehensive evaluation. A budget outline using these limited resources can be seen
in Appendix N. In choosing to evaluate how the CCM impacts RA ability to build relationships
and leadership efficacy, the evaluation will not directly examine how residential students are
impacted by the CCM. This is one area in which further assessment is recommended.
Another limitation of the evaluation model is the limited qualitative component of the
model. Due to the already heavy workload of professional staff in the department, and the lack
of any resources to hire additional staff or interns in the department to conduct the analysis of
qualitative instruments, the assessment team decided to limit the scope of our qualitative
assessment. The amount of time otherwise asked of staff to code and analyze the data would exceed the capabilities of the department, which will already be stretched by the qualitative assessment as it currently exists.
A final limitation of our evaluation procedure relates to the power dynamics within the
Department of Residence Life itself. Because all the data collected will be collected by
professional staff in the department, there is a chance that data will be skewed as a result. While the assessment team has tried to address this as best as possible and has designed an assessment that limits the risk of skewed data, the lack of a budget or other resources restricts our recommendations to an assessment conducted by internal employees of the department.
Timeline
The assessment will take a full calendar year to run, from April to April/May. The assessment will begin with the pre-test Community Connections Model Survey, which will be administered in April, after RA staff for the upcoming academic year (August-May) are selected. The mid-test survey will be administered the following September, after RA staff have
completed RA Training. The post-test survey will be administered in December at the end of the
Fall semester before RAs are released for winter break. At this point, all quantitative data will

have been collected. However, because the assessment is not fully complete, none of the data
will be analyzed at this point in the process.
The final component of the assessment, the qualitative portion (the exit interviews), will be
administered the following April, at the end of the RAs' full academic year of employment.
Staff conducting the interviews will turn in the interview data no later than the
beginning of May. This will also be when the next year of assessment will begin, as outlined
above, with the pre-test survey administered in April to the incoming cohort of RA staff for the
upcoming year. Once all components of one cycle of the assessment (pre-, mid-, and post-test
surveys and exit interviews) have been completed, data analysis can begin. We encourage the
Department of Residence Life to have the data analyzed and summarized for presentation to all
departmental professional staff members during staff training in July to make any necessary
changes before the next academic year begins. This will not impact the assessment timeline or
validity of the assessment, even though the pre-test for that year would have been administered
in April prior to that July meeting, because the pre-test assesses where RAs are before
introduction to the CCM (which occurs during fall training in September each year). A general
timeline for implementation of the evaluation beginning in Spring 2016 is found in Appendix N.
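As a supplementary illustration, the annual cycle described above can be sketched as a small script. This is only a sketch of the schedule; the function and field names are our own and are not part of any departmental system:

```python
from datetime import date

def assessment_milestones(start_year):
    """Milestone dates for one assessment cycle (illustrative).

    Mirrors the schedule described above: pre-test in April after RA
    selection, mid-test in September after fall RA training, post-test
    in December at the end of the fall semester, exit interviews the
    following April, interview data due by early May, and analysis
    presented at professional staff training in July.
    """
    return {
        "pre_test_survey": date(start_year, 4, 1),
        "mid_test_survey": date(start_year, 9, 1),
        "post_test_survey": date(start_year, 12, 1),
        "exit_interviews": date(start_year + 1, 4, 1),
        "interview_data_due": date(start_year + 1, 5, 1),
        "analysis_and_summary": date(start_year + 1, 7, 1),
    }

cycle = assessment_milestones(2016)
# Exit interviews fall a full year after the pre-test survey.
assert (cycle["exit_interviews"] - cycle["pre_test_survey"]).days == 365
```

Because each cycle's pre-test overlaps the previous cycle's exit interviews, representing the milestones this way also makes it easy to see that the two cycles do not conflict.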
Next Steps
The purpose of this assessment is to evaluate the leadership development and efficacy of
RA staff within the Department of Residence Life as they implement the CCM. Additionally,
the evaluators and stakeholders hope to assess whether leadership development and
efficacy are occurring across residence hall populations and RA identities. Next steps will depend
on the findings of the initial implementation of the assessment. If the findings show a positive
change in leadership development and efficacy consistently, no programmatic changes will be

suggested other than continued assessment to ensure that the department continues to help RAs
build leadership and efficacy. If the intended changes in leadership development and efficacy
are not met consistently across groups, we encourage the department to critically evaluate
why certain groups are not receiving the same benefits or making the same gains in
leadership development, and to consider changes to the program or other departmental
processes to address this.
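One hedged sketch of how such a group-level check might look, assuming (purely for illustration) that each RA's pre- and post-test efficacy scores have been tabulated alongside a grouping variable such as residence hall; the function name and record fields here are hypothetical:

```python
from collections import defaultdict
from statistics import mean

def mean_gain_by_group(records):
    """Average pre-to-post change in a survey score, per group.

    `records` is a list of dicts with hypothetical keys: 'group'
    (e.g., a residence hall or a self-reported identity category),
    and numeric 'pre' and 'post' leadership-efficacy scores.
    """
    gains = defaultdict(list)
    for r in records:
        gains[r["group"]].append(r["post"] - r["pre"])
    return {group: mean(vals) for group, vals in gains.items()}

# Illustrative data only; real scores would come from the three surveys.
sample = [
    {"group": "Hall A", "pre": 3.0, "post": 4.0},
    {"group": "Hall A", "pre": 3.5, "post": 4.5},
    {"group": "Hall B", "pre": 3.0, "post": 3.2},
]
gains = mean_gain_by_group(sample)
# Hall A averages a full point of growth while Hall B gains far less,
# which would flag Hall B for the closer qualitative look described above.
```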
The Department of Residence Life plans to implement this assessment in April 2016 for
the incoming 2016-2017 cohort of RAs. To have the assessment ready for implementation, this
plan will be presented at an upcoming Academic Support and Programming Committee meeting,
where the evaluators will provide an overview of the plan and answer
any questions stakeholders may have about the instruments and implementation process. Once
the assessment is finalized by the department, we encourage the Academic Support and
Programming Committee to present the assessment to all professional staff within the
Department of Residence Life before administering the pre-test survey to RAs in April
2016. Finally, if the Department of Residence Life seeks to expand the assessment of the CCM,
we recommend developing an additional assessment to address the impact of the CCM on
residents in the residence halls.

References
Bryson, J. M., Patton, M. Q., & Bowman, R. A. (2011). Working with evaluation stakeholders:
A rationale, step-wise approach and toolkit. Evaluation and Program Planning,
34(1), 1-12.
Creswell, J. (2009). Research design: Qualitative, quantitative, and mixed methods approaches
(3rd ed.). Thousand Oaks, CA: Sage Publications.
Fitzpatrick, J. L., Sanders, J. R., & Worthen, B. R. (2010). Program evaluation: Alternative
approaches and practical guidelines (4th ed.). Upper Saddle River, NJ: Pearson
Education, Inc.
Huck, S. W. (2012). Reading statistics and research (6th ed.). Boston, MA: Pearson Education,
Inc.
Krueger, R. A., & Casey, M. A. (2010). Focus group interviewing. In J. S. Wholey, H. P. Hatry,
& K. E. Newcomer (Eds.), Handbook of practical program evaluation (3rd ed., pp. 378-403).
San Francisco, CA: Jossey-Bass.
Loyola University Chicago. (2015). Mission, values, & vision. Retrieved from
http://www.luc.edu/reslife/about/mission/
McLaughlin, J. A., & Jordan, G. B. (2010). Using logic models. In J. S. Wholey, H. P. Hatry, &
K. E. Newcomer (Eds.), Handbook of practical program evaluation (3rd ed., pp. 55-80).
San Francisco, CA: Jossey-Bass.
Newcomer, K. E., & Conger, D. (2010). Using statistics in evaluation. In J. S. Wholey, H. P.
Hatry, & K. E. Newcomer (Eds.), Handbook of practical program evaluation (3rd ed.,
pp. 454-491). San Francisco, CA: Jossey-Bass.

Newcomer, K. E., & Triplett, T. (2010). Using surveys. In J. S. Wholey, H. P. Hatry, & K. E.
Newcomer (Eds.), Handbook of practical program evaluation (3rd ed., pp. 262-297). San
Francisco, CA: Jossey-Bass.
Northouse, P. G. (2016). Leadership: Theory and practice (7th ed.). Thousand Oaks, CA: Sage
Publications, Inc.
Schuh, J. H., & Associates. (2009). Assessment methods for student affairs. San Francisco, CA:
Jossey-Bass.
Walumbwa, F. O., Avolio, B. J., Gardner, W. L., Wernsing, T. S., & Peterson, S. J. (2008).
Authentic leadership: Development and validation of a theory-based measure. Journal of
Management, 34(1), 89-126.
Weiss, C. H. (1998). Evaluation: Methods for studying programs and policies (2nd ed.).
Englewood Cliffs, NJ: Prentice-Hall.

Appendix A
Community Connections Model

Appendix B
Department of Residence Life Organizational Chart

Appendix C
Loyola University Organizational Chart

[Full-page image of the Loyola University organizational chart, last updated September 14,
2015, showing the Board of Trustees and Chairman Robert L. Parkinson, Jr., Interim President
John P. Pelissero, PhD, and the university's academic, student development, advancement,
administrative, and health sciences divisions. Bold-outlined boxes indicate members of the
President's Senior Leadership Team.]
Appendix D
Logic Model for Community Connections Model

Inputs
Personnel:
- 1 Assoc. Dir. for Residence Life
- 1 Asst. Dir. for Staff Development & Training
- 1 Asst. Dir. for Academic Support and Learning Communities
- 11 RDs
- 12 ARDs (GAs)
- 115 RAs
Time:
- Academic Programming Committee meetings
- Professional Staff Training
- Summer & Fall Training
Financial:
- Programming Budget

Outputs
Activities:
- Training for Student Staff
- Weekly One-on-One meetings
- Weekly Hall Staff meetings
- Weekly Department Meetings
- Ongoing hall events
Participation:
- Students: Residents in residence halls; Student Staff (RAs)
- Professional Staff: Leadership Team; RDs and ARDs
- Campus Partners: Wellness Teams (Campus Ministry and Wellness Center); Campus Partners
  for Programming

Outcomes
Short Term (RAs will be able to):
- articulate the Loyola Experience
- connect their own undergraduate experience to the Loyola Experience
- recognize what has affected their Loyola Experience
- understand how their leadership can impact the experiences of their residents
Long Term (RAs will be able to):
- create programming for their floor community that connects to the Loyola Experience using
  the Community Connections Model
- evaluate their approach by reflecting on their floor programming and the changes that they
  have noticed in their residents mid-semester
- articulate the skills they have built over the course of the year by implementing the
  Community Connections Model

Assumptions:
- RA staff have preexisting role modeling skills
- Professional staff are adequately supporting development of student staff

External Factors:
- Support, programming efforts, and collaboration efforts of campus partners in the Division
  of Student Development
- Academic, social, and personal experiences of RA staff
Appendix E
Screenshots of the Community Connections Model Survey
These images provide a visual representation of the surveys as RAs will see them on
their electronic devices. Appendices F-H are the printable versions of the surveys;
although the full content is provided there, the layout differs slightly from how it
will appear to RAs.

Appendix F
Pre-Test Community Connections Model Survey: Spring Training Survey

Appendix G
Mid-Test Community Connections Model Survey: Fall Training Survey

Appendix H
Post-Test Community Connections Model Survey: End of Fall Semester Survey

Appendix I
Exit Interview Email Invitation
Dear [STUDENT NAME],
As a Resident Assistant who will not be returning to the position next year, the
Department of Residence Life invites you to participate in an exit interview with a
Resident Director (who is not your direct supervisor) during the final weeks of the
semester. This interview, as part of the assessment procedures you have participated in
throughout the academic year via LUCentral, will help the Department to evaluate and
improve the Community Connections Model, the model that you use in creating
programming, interacting with residents, and completing administrative tasks. The
interview will cover topics such as:
- Your experience this year at Loyola
- What leadership means to you
- Resident interactions
- Use of the Community Connections Model
- Your values and interests
This interview will take approximately 30-45 minutes and can be scheduled via
GenBook: [INCLUDE LINK TO SCHEDULE OPTIONS]. If no available times listed in GenBook
work with your schedule but you would still like to have an exit interview, we encourage
you to connect with your supervisor to arrange a meeting time with an RD that works for
both parties. Please select your preferred meeting time in GenBook no later than
[DEADLINE], and note that meeting times are scheduled on a first come, first served basis.
Once we have received your meeting date and time request, we will schedule the exit
interview and send you a confirmation email with more details.
Thank you for assisting Residence Life with our ongoing assessment of the Community
Connections Model. We hope you will consider scheduling this final meeting to discuss your
role and experience as an RA.
In Maroon & Gold,
The Department of Residence Life

Residence Life
1032 West Sheridan Road
Chicago, IL 60660
Phone: 773-508-3300
Email: res-life@luc.edu
http://www.luc.edu/reslife/


Appendix J
Exit Interview Confirmation Email
Dear [STUDENT NAME],
Thank you for scheduling an exit interview with the Department of Residence Life. We
have scheduled your meeting for [DATE] at [TIME] at [LOCATION]. The Resident
Director you will be meeting with is [NAME OF RD]. If you are no longer able to meet
at this time, or would prefer to meet with a different RD, please let us know as soon as
possible.
In order to help you feel more prepared for your conversation with the RD, please note
that the following topics will be addressed in the questions you will be asked:
- Your experience this year at Loyola
- What leadership means to you
- Resident interactions
- Use of the Community Connections Model
- Your values and interests
Each interview will take approximately 30-45 minutes. Please plan accordingly, though we
do not anticipate the length being an issue for most interviews.
Thank you again for participating in this exit interview. We look forward to hearing
your feedback on your experience as an RA at Loyola and in using the Community
Connections Model.
Please RSVP by emailing res-life@luc.edu to confirm your attendance.
We look forward to seeing you on [DATE].
In Maroon & Gold,
The Department of Residence Life

Residence Life
1032 West Sheridan Road
Chicago, IL 60660
Phone: 773-508-3300
Email: res-life@luc.edu
http://www.luc.edu/reslife/


Appendix K
Exit Interview Protocol (Question Guide for RDs)
Preliminary
Instructions for Resident Director
Greet the interviewee, introduce yourself if you do not already know each other, and
articulate the purpose of the interview.
Purpose of interview: As part of the ongoing assessment procedures that have
been taking place over the course of the academic year in Residence Life, this
conversation will allow us to learn more about your experience as an RA and in
using the Community Connections Model. A face-to-face conversation allows us
to gain richer feedback on your experience that will supplement the surveys
already completed by you and all RAs this year. Your identity will not be tied
to your feedback from today when I submit my notes on our conversation to the
department.
In order to collect data during the exit interview, please take thorough notes on each
response the RA provides to the questions listed below. Recording the meeting will help
you take better notes, so you are encouraged to do so. However, you must obtain
permission from the interviewee before recording, and the recording must not be shared
with anyone else (due to confidentiality concerns). It is only a tool for your
note-taking and should be deleted upon completion of the Interview Submission Form on
LUCentral.
Exit Interview Questions
Please ask all of the following questions (probing questions are only necessary if the
RA's response is not detailed enough for thorough notes). Let the participant know that
although the meeting is scheduled for 30 minutes, you need to ask all 10 questions that
make up this exit interview, and therefore the interview may run slightly longer than
30 minutes or could run short, depending on the participant's responses.
During the exit interview, please monitor your reactions and body language so as not to
lead students on or provide false encouragement or disapproval of their opinions. When
student answers are unclear, you may have to repeat their statement to make sure you
understand. You may also need to ask further follow-up questions (beyond the specific
probing questions listed with the main questions below), such as the following:
- What do you mean?
- Can you give me an example?
- Tell me more about that.
- How does that compare with what you said before?
- What happened after that? How did you feel about that?

Questions to ask (you must ask all 10 main questions):

1. What are some major highlights from this year for you as a Loyola student? Is
there anything that you would change about your experience this year?
2. How has your undergraduate experience compared to the Loyola Experience?
o A handout on the Loyola Experience has been provided with this protocol
and can be used as a guide for the RA if they would like to review the
specifics of the Loyola Experience when responding to this question.
3. Describe what leadership means to you.
o Probing question: Can you share a specific example of how you have
utilized this definition of leadership or how you have seen it enacted by
someone else?
4. What have your conversations been like with your residents in your interactions
with them this year?
5. What has affected your ability to connect with your residents (positively or
negatively)?
o Probing question: Have your identities or your residents' identities
affected this?
6. What skills do you feel confident about that you've gained or improved upon
from using the Community Connections Model?
o Probing question: Can you provide some examples?
7. What are some of your core values that you have or have not enacted in your RA
role? Why do you think they were utilized, or why weren't they?
8. How have you incorporated the needs and/or interests of your residents
throughout the year?
9. Do you feel like you've been able to be yourself in this position and/or when
interacting with residents?
o Probing question: If so, why do you think that is?
o Probing question: If not, what's caused that for you?
10. Do you have any additional comments or feedback about your role as an RA?
Conclusion
Thank the participant for meeting with you today and remind them that their
name/identity will not be included in the notes you submit to the Department of
Residence Life. If there is additional time remaining, allow them to provide you with
any additional comments, questions, or feedback. Finally, thank the participant for all
the work they have done as an RA and for helping Residence Life evaluate and improve the
Community Connections Model.


Appendix L
Handout on the Loyola Experience

The Loyola Experience
A four-year plan to develop students into graduates that are reflective of a Jesuit
education.

Year One
- Connect With Community
- Commit To Faith, Justice, And Service
- Engage In Chicago And The World
- Expand Your Knowledge
- Lead With Values
- Build Your Skills
- Focus On Your Well-being

Year Two
- Connect With Community
- Commit To Faith, Justice, And Service
- Engage In Chicago And The World
- Expand Your Knowledge
- Lead With Values
- Find Your Calling

Year Three
- Commit To Faith, Justice, And Service
- Connect With Community
- Create Your Future
- Engage In Chicago And The World
- Expand Your Knowledge
- Find Your Calling
- Lead With Values

Year Four
- Reflection
- Celebrate Your Achievements
- Milestones

Graduate Students
- Build Your Skills
- Connect With Community
- Expand Your Knowledge
- Commit To Faith, Justice, And Service
- Engage In Chicago And The World
- Focus On Your Well-being
- Lead With Values
- Reflection
- Celebrate Your Achievements
- Milestones

Appendix M
Interview Submission Form

Appendix N
Timeline and Budget
General Timeline of Evaluation (as implemented in Spring 2016)

Budget Outline

Activity               Item                                               Cost
Quantitative Portion   Electronic surveys (OrgSync)                       $0.00*
Qualitative Portion    GenBook (interview scheduling software)            $0.00
                       Resident Director time                             $0.00
                       Academic Support and Programming Committee time    $0.00
                       Electronic submission form (OrgSync)               $0.00*
                       Printing of completed submission forms
                       (for coding purposes)                              $0.00**

Total cost: $0.00

* The annual cost of the OrgSync subscription is already factored into the Department of
Residence Life's budget.
** Printing costs will be absorbed into the overall printing budget of the Department of
Residence Life.

Additional notes:
- The annual cost of the GenBook subscription is already factored into the Department of
  Residence Life's budget.
- All RD time-related needs will be scheduled within working hours, so the cost is already
  factored into the individual salaries of staff.
- All committee time-related needs will be scheduled within working hours, so the cost is
  already factored into the individual salaries of staff.

Appendix O
Presentation of Evaluation Plan


Community Connections Model Context
- Created in 2014; first implemented in the 2015-2016 academic year
- Developed in order to enhance both the resident and student staff experience
- Overseen by the Academic Support and Programming Committee of the Department of
  Residence Life and Marci Walton, Assistant Director (AD) for Academic Support and
  Learning Communities
- 115 student staff and approximately 4,600 student residents
