1. Rooted in Research: Establishing Coherent Partnerships between Institutional Research and the Quality Enhancement Plan
Dr. Ghazala Hashmi, Coordinator of the Quality Enhancement Plan
Dr. Jackie Bourque, Director, Office of Institutional Effectiveness
J. Sargeant Reynolds Community College, Richmond, Virginia
SACS Annual Conference, Orlando, Florida, December 2011
2. The Session’s Goals
• Present J. Sargeant Reynolds Community
College’s use of institutional research to
1) identify, develop, and implement its QEP, and
2) bring a variety of college units into effective
partnerships.
• Share the challenges and the successes the
College has encountered in gathering data and in
making effective use of data to improve student
learning in distance education.
3. Outcomes of this Session
By the end of this session we hope you will be able to
• Identify best practices in effectively gathering and using
institutional data points for the selection, development,
and implementation of a QEP topic.
• Develop effective, collaborative partnerships between
the QEP Team and the Office of Institutional Research.
• Share with your colleagues templates for gathering the grounding
institutional data that helps
- to guide QEP selection teams towards an effective QEP topic,
- to develop the substance of the QEP, and
- to assess the effectiveness of the QEP during implementation.
4. The Ripple Effect: Transforming Student Success in Distance
Learning, One Student & One Instructor at a Time
The Quality Enhancement Plan at J. Sargeant Reynolds Community College
[Diagram: The Ripple Effect in Distance Learning. Student Readiness, Student
Orientation & Student Support, and Faculty Training ripple outward toward
Student Success.]
14. Growing the Plan
External Research
Best practices: Identifying national standards
relevant to our QEP Topic
Other institutional efforts: identifying initiatives similar to our
own, particularly at comparable colleges
A review of the literature: Developing annotated
bibliographies in order to present an academic
review of the material
15. Growing the Plan
Internal Research
Evaluating topical data:
Distance Learning Student Survey and Report – what do our students think about
course design; who are our online students; what are the barriers to their success?
Discipline Review of Online Courses demonstrating broad gaps between face-to-face
and online student success rates
Consulting faculty:
Distance Learning Faculty Focus Group – what are their perceptions; what do they
consider to be the barriers to student success?
Faculty training needs – evaluating needs in technology training and instruction in
pedagogy, course design, assessment of student learning.
Evaluating general college data:
Ongoing review of enrollments, success rates, and persistence rates
16. An Example of Data Collected and Evaluated:
FTES Comparison, Fall 2008 vs. Fall 2009, by Campus

Campus          Fall 2008    Fall 2009    FTES       %
                (10/20/08)   (10/19/09)   Change     Change
Campus One        3,060.93     3,375.27    314.33     10.27
Campus Two        1,692.20     1,867.67    175.47     10.37
Campus Three        228.93       250.87     21.93      9.58
Off-Campus           12.00         9.60     -2.40    -20.00
Off-Campus          830.60       482.27   -348.33    -41.94
Virtual             792.00     1,056.20    264.20     33.36
Unknown              12.40        0.00    -12.40   -100.00
Total             6,629.07     7,041.87    412.80      6.23
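The Change and % Change columns are simple arithmetic on the two fall snapshots. A minimal sketch (figures taken from the table above; the helper name is our own illustration, not part of the College's tooling):

```python
# FTES change between two fall snapshots; figures from the table above.
def ftes_change(fall_2008: float, fall_2009: float) -> tuple:
    """Return (absolute change, percent change) in FTES, rounded to 2 places."""
    change = fall_2009 - fall_2008
    pct_change = change / fall_2008 * 100
    return round(change, 2), round(pct_change, 2)

# Virtual campus: 792.00 FTES in Fall 2008, 1,056.20 in Fall 2009.
print(ftes_change(792.00, 1056.20))  # -> (264.2, 33.36)
```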
18. The Driver:
the QEP Assessment Plan
Using the Emerging Data to Drive the Plan
Student Readiness –
1. A student profile emerges through SmarterMeasure.
2. We evaluate the relationships between this profile and
student success.
3. We construct our distance learning orientation around
institutional data.
Student Orientation –
1. Students and faculty provide qualitative feedback.
2. We evaluate success of orientation by measuring impact on
students.
3. We modify the orientation based on the data.
19. The Driver:
the QEP Assessment Plan (continued)
Faculty Training –
1) Faculty provide self-assessments of their own skills and
understanding of course design and online teaching.
2) Ongoing peer-to-peer reviews provide qualitative and
quantitative data.
3) We evaluate student success and student persistence rates
of trained and untrained faculty.
4) Faculty provide feedback about the impact of the training.
5) We assess our own training services through the modules
that have been designed and delivered.
6) The QEP Team makes modifications based on the results of
the data.
20. Partnerships in Institutional Research
[Organizational chart: the QEP Coordinator, a faculty member with online
teaching experience, reports to the Executive Vice President and works in
partnership with]
• Office of Institutional Effectiveness (Research Analyst)
• Center for Distance Learning
• Office of Academic Affairs
• Office of Student Affairs
• QEP Assistant Coordinator
• Office of Technology Training
• Professional Development
21. The Detours
Assessment leads to new sprouts.
Leaving room for growth and expansion
into new territories has been both
challenging and rewarding.
Peer Academic Leaders (PALs) Program
Rewards and Recognitions Program
New Faculty Certification for Distance Learning Policy
New Communications Tools
The New Faculty Development Database
22. Peer Academic Leaders (PALs) Program
Although the PALs program grew out of the QEP and out of a limited,
on-campus program, it presented new challenges:
• Funding
• Recruitment and Training of Peer Leaders
• Administrative Oversight
• New Marketing
• New Assessment Tools and Activities
24. A New Faculty Development Database
that also integrates HRMSIS and the Knowledge Center
Fall 2010 Statistics

Current Distance Learning Faculty: 153
Completed either Tier One or Tier Two training: 48
Not yet completed either Tier One or Tier Two training: 105

[Bar chart: current distance learning faculty who have and have not
completed either TOP or IDOL.]
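The Fall 2010 headcounts also yield a quick training completion rate; a minimal sketch (numbers from the statistics above; the variable names are our own):

```python
# Fall 2010: of 153 current distance learning faculty, 48 had completed
# either Tier One or Tier Two training and 105 had not.
completed, not_completed = 48, 105
total = completed + not_completed
assert total == 153  # matches the reported faculty headcount

completion_rate = completed / total * 100
print(f"{completion_rate:.1f}% of current DL faculty trained")  # -> 31.4%
```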
25. The Documentation
using data to support QEP initiatives
• Reporting to College Executives: WEAVE
Online
• Reporting to Broader QEP Team: SharePoint
• Annual Reports to general college audience
• Regular summaries and reports of ongoing
assessment efforts and results on public blog
• Ongoing presentations to internal audiences
26. Digging, Driving, Documenting
In summary, we have found that the ongoing
effectiveness of and enthusiasm for the QEP are
built upon three primary factors, all of which
relate to the research of the QEP:
Digging into the data (gathering, evaluating, discussing)
Driving with data as the guide (building, daydreaming,
detouring)
Documenting the data (communicating and sharing)
27. For More Information
Jackie Bourque
Director, Office of Institutional Effectiveness
jbourque@reynolds.edu
Ghazala Hashmi
Coordinator, Quality Enhancement Plan
ghashmi@reynolds.edu
www.reynolds.edu/qep