This is a presentation I developed to propose an evaluation and assessment plan for a new degree program at the University of Illinois: the iMBA (online MBA). It was created prior to any job offer and was NEVER implemented as an official evaluation plan. It is shared here to show self-generated work outlining a proposed assessment and evaluation plan.
2. BACKGROUND
• iMBA
  • Online, stackable MBA program
  • Offered through UIUC
  • A true UIUC degree
• Low cost
  • ~$20,000
  • Compared to other online programs at ~$75,000-$100,000
4. COURSERA PARTNERSHIP
• “Students”
  • Individuals enrolled through UIUC and taking the prescribed courses on Coursera
  • Receive course credit and, ultimately, the MBA
• “Participants”
  • Individuals consuming the content provided on Coursera without paying
  • Receive no credit and no MBA
5. IS IT A “MOOC MBA”? YES AND NO
• Yes
  • The content is offered for free, with open enrollment, to anyone interested who has Internet access
• No
  • The MBA itself is not open or free of charge, so it will not be “massive” compared to the free offering
6. POTENTIAL KNOWLEDGE GAINS
• MOOCs vs. non-MOOCs
  • Compare curricula, students/participants, progress, and learning with identical content
• Graduate-level MOOCs
  • Very few truly open graduate-level courses exist
  • Differences in demographics, engagement, and success
• MOOC-based learning
  • Is learning any different in MOOC spaces due to the massive size of courses and the diversity of interaction?
7. ASSESSMENT AND EVALUATION
ASSESSMENT
• Improvement-focused
• Engages faculty, content creators, administrators, and assessment personnel as “partners” in improvement
• Cyclical, focusing on areas that can be improved and are of top priority/need
PROGRAM EVALUATION
• Decision-focused and often finite in length
• Compares variables to expectations/baselines
• Informs decision-makers and administrators
• Typically descriptive and easier to understand
8. ASSESSMENT AND EVALUATION, CONT.
• Similarities
  • Data-based process
  • Extensive overlap in process/methods
• Which to pick?
  • Both!
  • Evaluation: inform decision-makers and ensure that expectations are being met
  • Assessment: work with practitioners and faculty to ensure top-quality content and instruction
9. STEP 1: ESTABLISH PRIORITIES AND OUTCOMES
• August-December 2015
• Individual/Group Meetings
  • Dean/Associate Deans
  • Director of eLearning
  • Select faculty
  • iMBA-specific staff
  • Graduate College Dean/Associate Deans
  • Provost office representatives
10. PLAN, CONT.
• Purpose of Meetings: To identify key priorities of leadership, hone expected outcomes, and garner buy-in from all levels
• Product of Meetings: Rough outline of expected outcome variables, logic models, and basic data strategies
11. LOGIC MODELS
• A Good Assessment Starts With a Complete Understanding of Process
• STEP ONE: FYE Logic Model (a sketch of how this structure maps to measurable variables follows this slide)
  – Inputs
  – Activities/Outputs
  – Outcomes
    • Short-Term (0-24 months)
    • Medium-Term (2-5 years)
    • Long-Term (6+ years)
  – Impacts (Abstract)
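Since the deck itself carries no code, here is a minimal Python sketch of how a logic model like the one above could be encoded so that each stage maps onto concrete variables. Every name and example value below is a hypothetical placeholder for illustration, not an actual iMBA metric.

```python
# A minimal, hypothetical encoding of a logic model: each stage maps to
# variables that can later be measured. These entries are illustrative
# placeholders, not actual iMBA metrics.
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    inputs: list = field(default_factory=list)       # resources invested
    activities: list = field(default_factory=list)   # what the program does/produces
    outcomes: dict = field(default_factory=dict)     # measurable results, keyed by time horizon
    impacts: list = field(default_factory=list)      # abstract; not directly measured

imba_logic_model = LogicModel(
    inputs=["faculty time", "Coursera platform", "iMBA staff"],
    activities=["course delivery", "discussion forums", "graded assessments"],
    outcomes={
        "short-term (0-24 months)": ["course completion", "learning gains"],
        "medium-term (2-5 years)": ["degree completion", "career advancement"],
        "long-term (6+ years)": ["alumni outcomes"],
    },
    impacts=["broadened access to graduate business education"],
)

# Per the next slide, assessment would cover everything except `impacts`.
measurable = (imba_logic_model.inputs + imba_logic_model.activities
              + [v for vs in imba_logic_model.outcomes.values() for v in vs])
```

Encoding the model this way makes the connection points between stages explicit, which is exactly what the later analyses would test.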
12. LOGIC MODELS
• Repeats the logic model diagram from the previous slide, with the assessment scope overlaid
ASSESSMENT
* Assessment measures all steps of the process, except for the “abstract concepts” (Impacts)
13. LOGIC MODELS, CONT.
• Purpose
  • Establishes the necessary variables that can and should be measured
  • Identifies connection points between inputs, activities, and expected outcomes
    • Analyses can then answer whether those connections exist and how strong they are
  • Establishes organizational unity on priorities and inquiry
14. STEP 2: DEVELOP ASSESSMENT AND EVALUATION PLAN
• November-December 2015
• Identify activities, data, sources, and next steps
  • Variables established from logic models and data sources (Banner, Coursera instruments, surveys, etc.)
  • Collection plan (pre-course, post-course, instrument)
  • Quantitative and qualitative data are both key to activities
• Rough analytical planning (see the sketch after this list)
  • Regression of outcomes on demographic/input variables
  • Comparison of iMBA students, participants, and traditional MBA students
  • Approaches for semester-to-semester program evaluation
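To make the analytical planning concrete, here is a rough Python sketch of the two analyses named above, using pandas and statsmodels. The file name and every column (final_score, age, prior_gpa, group) are hypothetical assumptions; the real variables would come from Banner and the Coursera instruments listed in the plan.

```python
# A rough sketch of the planned analyses, assuming a merged dataset with one
# row per learner. All file and column names are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("imba_merged.csv")  # hypothetical Banner + Coursera extract

# Regression of an outcome on demographic/input variables; C() treats the
# learner group (iMBA student, open participant, traditional MBA) as categorical.
model = smf.ols("final_score ~ age + prior_gpa + C(group)", data=df).fit()
print(model.summary())

# Simple descriptive comparison across the three learner groups.
print(df.groupby("group")["final_score"].describe())
```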
15. STEP 3: EXECUTE PLAN
• January-May 2016
• First iteration of survey instruments, evaluative approaches, etc.
• High level of qualitative data collection on satisfaction, progress, etc., from:
  • Faculty
  • Students
  • Participants
16. STEP 4: INITIAL REPORTING
• May-August 2016
• First semester report
  • First quantitative analyses
  • PR/Marketing
• Interim survey of faculty and administrators
  • Address satisfaction with iMBA
17. SUMMER 2016 DELIVERABLES
• Semester Report
• Baseline Report
  • Used for comparative purposes in follow-up analyses
• PR Materials
  • Brochures and publications with statistics, anecdotes, etc.
• Scholarly Publications, as accepted
  • Peer-reviewed journal articles
  • Conference presentations
18. FUTURE YEARS
• Assessment Cycles
  • Academic years
  • Focus on 2-3 issues of highest priority
• Evaluation Cycles
  • Focus on shifts year-to-year
  • Highlight major changes, improvements, and issues
• Priority will be automating the process so metrics are available in real time and require minimal human effort after initial implementation (see the sketch after this list)
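One possible shape for that automation, purely as a sketch: a small Python script run on a schedule (e.g., via cron) that recomputes headline metrics and writes them out for a dashboard. The database, table, and column names below are all assumptions, not an actual iMBA schema.

```python
# Hypothetical automation sketch: recompute per-term metrics from a local
# warehouse copy and write a dashboard-ready JSON file. Schema is invented.
import json
import sqlite3

conn = sqlite3.connect("imba_warehouse.db")  # hypothetical warehouse extract
cur = conn.execute(
    """
    SELECT term,
           COUNT(*)          AS enrolled,
           AVG(completed)    AS completion_rate,
           AVG(satisfaction) AS mean_satisfaction
    FROM enrollments
    GROUP BY term
    """
)
columns = [c[0] for c in cur.description]
metrics = [dict(zip(columns, row)) for row in cur.fetchall()]

with open("imba_metrics.json", "w") as f:
    json.dump(metrics, f, indent=2)  # picked up by a reporting dashboard
```

Once a script like this runs unattended each night, semester-to-semester evaluation becomes a matter of reading the output rather than rebuilding analyses by hand.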
19. IDEAL HUMAN RESOURCES
• Evaluation and Assessment Specialist, 1.0 FTE
  • Liaison with decision-makers
  • Craft the assessment/evaluation plan
  • Carry out initial and advanced data collection and analyses
• Graduate Research Assistants, 0.5 FTE
  • As needed, can provide detail on specific research questions and carry out basic analyses
• Data Specialist, 0.25 FTE
  • Develop needed data queries
  • Work on visualization tools