Drinking The Kool-Aid: Assessment for New Professionals, by New Professionals
1. Drinking the Kool-Aid!
Assessment for New Professionals,
by New Professionals
Ryan Greelish, Resident Director, Bridgewater State University
Julie Hayes, Resident Director, Bridgewater State University
P. Max Quinn, Resident Director, Bridgewater State University
3. Program Abstract
“Don’t understand why assessment is so important?
Unsure how or why we should demonstrate the value of our work?
Learn how we as new professionals have taken the plunge into the world
of assessment and why we would advocate that you join us in “drinking
the Kool-Aid”!
This peer-to-peer session will focus on ways we have used assessment to
improve our work, to promote student learning, and to incorporate it
into job searches, building relationships, and educating students.
We are excited to share our opportunities and experiences with you
because assessment is here to stay, and it is up to us to embrace it.
OOH-YEAH!”
#NEACUHO2013 @PMaxQuinn @J_E_Hayes @RyanGreelish
4. Goals
Learn how we, as new professionals, have
taken the plunge into the world of
assessment and why we would advocate
that you join us in “drinking the Kool-Aid”!
This peer-to-peer session will focus on ways
we have used assessment to improve our
work, to promote student learning, and to
incorporate it into job searches, building
relationships, and educating students.
5. Learning Outcomes
Participants will be able to recognize the
importance of embracing assessment.
Participants will be able to identify the high
value assessment provides in our work.
Participants will be able to describe how to
incorporate assessment into their jobs.
6. Assessing the Kool-Aid...
Who has had assessment in formal education?
In informal education?
What experience do you have with assessment?
None
Some
Much
I’m an Expert
What are your thoughts and feelings about
assessment?
7. Kool-Aid for Thought...
“Every student affairs professional I
know is overworked, underpaid, and
underappreciated.
I truly understand that the last thing
you and your colleagues have time for is
another responsibility...
Yet, one could argue that assessment is
not an added responsibility and that
high-quality programming has always
required an evaluation component to
learn if that programming has been
effective”
(Bresciani, 2003, p. 14).
“I don’t have time
to do assessment.
Just give me the
form to complete
or tell me which
box to check.
I just want to be
done with it”
(Upcraft & Schuh, 1996, as cited in Bresciani, 2003, p. 15).
8. Why Drink The Kool-Aid?
What is Assessment?
Why is assessment important?
Accountability
Improvement
ACPA ASK Standards
“A milestone development in the
history of ACPA’s commitment to
assessing student learning and
development, the ASK Standards
now place this work among the
necessary responsibilities of student
affairs professionals.”
- Peggy Maki, Sept ‘06
9. Why Drink The Kool-Aid?
Who is it important to?
What are the implications if
you don’t drink the Kool-Aid?
10. Kool-Aid Basics
Learning Outcomes
SWBAT (Students Will Be Able To) + Bloom verb + Condition
Goals
Direct vs. Indirect
Quantitative vs. Qualitative
11. Surveys
Not the Only Type of Assessment!
Develop in a Team
Types of Questions
Layout
Common Mistakes/ Challenges
12. Rubrics
Important Questions
Types of Rubrics
Holistic vs. Analytical
Considerations
Review Rubric
Benefits
http://rubistar.4teachers.org/
13. Interpreting & Reporting
Why Document Your Results?
Considerations
Interpreting Data
Quantitative vs. Qualitative
Reporting
Trustworthiness & Credibility
Institutional Surveys
14. Common Myths
Complex Analysis is What Impresses People
Analysis Happens at the End
Quantitative Data is the Most Important
Data is the Most Important
Stating Limitations Weakens the Evaluation
15. Developing a Kool-Aid Plan
Identify Assessment Methods
Develop a plan for collecting data
Prioritize goals and set timelines
Implement the Assessment Plan
Use data to improve processes
Communicate Results
16. Considerations
Simple
Aligned with the institutional mission
Focus on one or two at a time
Borrow ideas from other institutions
Start Backwards
18. Assessment is Kool...
Follow the leader, not the wolf...
Give IN to peer-pressure...
Find Kool-mentors
Sell your own brand of Kool-Aid
19. Preparing the Kool-Aid
for our Students
Teach them HOW...
to Write learning outcomes (LOs)
Why they are important
How to link LOs to Mission
to Write a program evaluation
Why satisfaction evaluations are important in
student programming
Teach them WHY...
You, your department & institution drink Kool-Aid
You need information to make decisions
We must measure student learning
21. Assessment is never done; it is a cycle
The cycle:
Kool-Aid = 1 part water, 2 parts
mix, 3 parts love - stir until mixed
well, enjoy over ice, repeat until
satisfied
Assessment = 1 part value, 2 parts
continuous improvement, 3 parts
student development - mix
together, celebrate with colleagues
and repeat as necessary!
22. Drink the Kool-Aid...
What have you done to embrace assessment?
What are you afraid of?
What is scary about assessment?
What does the Kool-Aid taste like?
23. A Special Thanks To:
@BethMoriarty
Director of Residence Life & Housing,
Bridgewater State University
&
@CHolbrook357
Associate Vice President of Student Affairs,
Bridgewater State University
25. Kool-Resources
ACPA/NASPA/ACUHO-I Professional Competencies
NASPA Leadership Exchange, Fall 2003: “Assessment of Student
Learning and Development: More Than a Plug ’N’ Play,”
by Marilee J. Bresciani
http://www.naspa.org/membership/mem/pubs/lex/03fall.pdf
Evaluation Methodology Basics: The Nuts and Bolts of Sound
Evaluation
Share our experiences, backgrounds, and professional development. What experience and opportunities do we have in assessment?
Julie: graduate education, prior institutional experience with EBI
Ryan: Assessment Institute, EBI & Assessment Planning Committee
Max: working for the AVP, learning outcomes for RLCs
Julie
Brief Program Description: Some may say, “I don’t have time to do assessment,” but it is key to know how important it is to show the value of our work and the impact it has on student development. This peer-to-peer session will prove the need to assess our work and why “drinking the assessment Kool-Aid” is a necessary evil.
Program Relevance: Many new professionals have not had any, or enough, training in assessment. We will explain the importance of assessment and how it can demonstrate the value of our work. This is a peer-to-peer session sharing information about assessment and discussing how to gain more knowledge and experience.
Julie: Topics we will cover
Max
Max
Critical Questions: In a small group, ask everyone to identify two questions they hope to have answered during the presentation or session; in a large group, select “volunteers” to ask the questions or identify objectives.
Categories: Have members of the group arrange themselves into groups by their favorite dessert, sport, color, movie, car, etc. This is a good activity to get people up and moving and to find out common likes. You can shift from one category to another: “Now group by favorite vacation spot.”
Desert Island: Group people in teams of 5 or 6 and tell them they will be marooned on a desert island. Give them 30 seconds to list all the things they want to take; each person has to contribute at least 3 things. At the end of the time, tell the teams they can only take three things. Have the person who suggested each item tell why they suggested it and defend why it should be chosen. This helps the team learn how each of them thinks, get to know each other’s values, and see how they solve problems.
Max
This presentation will explain why engaging in assessment is worth your consideration and investment, and why assessment requires meaningful reflection, not just a fill-in-the-blank response.
Max
What is Assessment?
Palomba and Banta (1999) define assessment as the “systematic collection, review, and use of information about educational programs undertaken for the purpose of improving student learning and development.” To engage in assessment, simply ask yourself the following questions (Bresciani, 2002) about any of your programs:
• What are we trying to do and why are we doing it?
• What do we expect the student to know or to do as a result of our program?
• How well are we doing it?
• How do we know?
• How do we use the information to improve?
• Does that work?
These questions may seem very simple. When we are busy, we rarely take time to reflect on what we are trying to accomplish. Merely setting aside the time to think about the reasons behind our programs and evaluating end results can immensely improve our day-to-day thinking about assessment. There are some who would advocate that the evaluation process can be completed quickly and dumped into a form. In reality, meaningful assessment requires insightful reflection and an investment of the self.
What and Why?
Many professionals do reflect on “What are we trying to do and why are we doing it?” as they formulate missions and goals for their programs. If assessment were that easy, two boxes from a standard form could be supplied: one for mission and one for goals. You could pop answers into them and continue. However, thoughtful answers can truly inform your missions and goals and ensure that what you have articulated accurately reflects what you wish your programs to accomplish. (Bresciani, 2003, p. 15)
• Focused process
• Department effectiveness
Why is assessment important? We exist for a purpose.
ACPA ASK Standards (handout in folder)
Max
Who is it important to? Different levels: institutional (strategic plan), divisional, departmental.
Who is your audience? (handout)
Ryan
Basic definitions; provide further resources so you can learn on your own.
LEARNING OUTCOMES (handout)
Lorin Anderson, a student of Bloom’s, revised the taxonomy; the handouts show the original version of Bloom’s.
What students learn as a result of a program. Ex. RAs, SOPs.
Ex. SAW:
Students will be able to identify community resources that maintain safety and security.
Students will be able to demonstrate healthy lifestyle choices.
Students will be able to identify ways to positively contribute to the residential community.
Outcomes must be measurable/demonstrable (not “since joining” unless you have a pre-test and post-test).
Measure one thing at a time. Ex. “Able to describe social issues in the region.”
Use a different format if there are different players (RLC, department, university); this is sometimes complicated because there are 3 RLCs, each with some outcomes and some shared by all.
GOALS
Broad; what you are going to accomplish. Ex. SAW: complete 2 programs in the fall semester.
DIRECT
Standardized tests, rubrics, checklists, portfolios. By computer (Excel, Microsoft Access, SPSS, Survey Monkey).
INDIRECT
Surveys (campus climate, every 3 years, staff, students, etc.), focus groups, interviews, one-minute reflections.
QUANTITATIVE
The easier one. Surveys, closed-ended interviews, standardized tests. Numerical format. Use when you want to generalize findings to the larger population.
QUALITATIVE
Trickier. Observation, open-ended interviews or questions on applications, document review, journaling, Facebook comments to posted questions.
Open-ended questions (interviews/focus groups): use probing questions; record interviews; aim for thick, detailed description.
Use when you want to learn more, explore a phenomenon, and gain richness.
Language to use: explain, understand, describe, develop, discover. Avoid: compare, effects, prove.
Ex. “Are they learning?” (quantitative) vs. “What are they learning?” (qualitative).
Ryan
TYPES OF QUESTIONS
Never ask leading questions. Only ask for the information you NEED!
Multiple choice: options should have no relation to each other; be specific.
Ordinal ranking: voting, ranking movies from 1-20.
Likert scale (interval): continuous, words or numbers. “Agree” measures opinion; “important” measures satisfaction.
Ratio scale: a true 0 starting point, equidistant intervals. Ex. income or credits.
Open-ended: qualitative; state that responses are anonymous.
Closed-ended: quantitative.
LAYOUT
Demographics at the end. Light to heavy. Logical to the reader, not to you. Start with something that interests them. Confirm the survey works on Macs, PCs, and tablets. Older CHOs: use a larger font. Check with your institution.
COMMON MISTAKES/CHALLENGES
Too many choices. Inaccurate ranges. Keep the window no longer than 2 months, not the whole semester. Do not ask 2 questions in 1 (ex. “Have you ever felt unsafe or been a victim of violence?”). Avoid acronyms. Use few words and the active voice. Distinguish between “neutral” and “undecided.” Use an equal number of positive and negative options. Make equal comparisons, not apples vs. oranges. Use “check all that apply” sparingly. Keep it as short as possible. Include a status bar.
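The Likert-scale notes above can be made concrete with a short sketch. This is a minimal Python example using made-up responses (not data from the session) to show the kind of summary a closed-ended 5-point item yields: a distribution, a mean, and a "percent agree" figure.

```python
from collections import Counter
from statistics import mean

# Hypothetical responses to one 5-point Likert item
# (1 = strongly disagree ... 5 = strongly agree).
responses = [4, 5, 3, 4, 2, 5, 4, 3, 5, 4]

# Count how many respondents chose each point on the scale.
counts = dict(sorted(Counter(responses).items()))

# Mean rating and the share who chose "agree" or "strongly agree".
avg = round(mean(responses), 2)
pct_agree = round(100 * sum(1 for r in responses if r >= 4) / len(responses))

print("Distribution:", counts)
print("Mean rating:", avg)
print("% agree (4 or 5):", pct_agree)
```

Because a Likert item is ordinal, many evaluators report the distribution and percent-agree figures rather than relying on the mean alone.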
Ryan: 2 examples (handout)
Checklists: you may be doing assessment and not realizing it. Most familiar from classes. A measuring tool.
“Rubrics are sets of criteria or scoring guides that describe levels of performance or understanding” (Szpyrka & Smith, 1995).
Rubrics consist of:
Criteria used to evaluate performance
Levels to describe the potential range of performance
IMPORTANT QUESTIONS
What do you want to accomplish? How well do you want to perform?
GUIDING QUESTIONS
What is necessary to describe excellent achievement? How many levels does the range need? What are clear descriptions of achievement? How will the rubric be used? After using the rubric, what changes does it need? Results will show areas for improvement, so people need to be open.
TYPES OF RUBRICS
Holistic: a single criterion with multiple levels of performance (1-3; start by writing the extremes).
Analytical: multiple criteria with multiple levels of performance; the standard checkbox form, like NEACUHO evaluations.
CONSIDERATIONS
Ex. written and oral skills; for the presenter or the participants. Order matters: 1-3 reads as working toward the aim; 3-1 reads more positively. Know what you are aiming toward. Ex. LEADS applications: scoring 1-3, aiming for a 2 for emerging leaders. Weigh the categories (more weight on the open-ended question than on references). Leave spaces blank.
REVIEW THE RUBRIC
Do the categories reflect the major learning objectives? Are there distinct levels with assigned names and point values? Are the descriptions clear? Are they on a continuum that allows for student growth? Is the language clear and easy for students to understand? Is it easy for the instructor to use? Validity: can the rubric be used to evaluate the work? Can it be used for assessing needs?
BENEFITS
Clear expectations. Consistency in scoring student work. Useful for self-evaluation. Can help identify student weaknesses. Explains what improvements are needed or desired.
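The weighted-categories idea in the considerations above can be sketched in a few lines. This is a hypothetical example (the category names, weights, and scores are invented, not from LEADS): an analytic rubric on a 1-3 scale where the open-ended essay counts more than the references.

```python
# Hypothetical analytic rubric: each criterion scored 1-3,
# with weights that sum to 1.0 (essay weighted heaviest).
weights = {"essay": 0.5, "interview": 0.3, "references": 0.2}

# One applicant's scores on each criterion.
scores = {"essay": 3, "interview": 2, "references": 2}

# Weighted total stays on the same 1-3 scale as the levels.
weighted_total = sum(weights[c] * scores[c] for c in weights)
print(f"Weighted score: {weighted_total:.2f} out of 3.00")
```

Keeping the weights summing to 1.0 means the result stays on the rubric's own scale, so a target like "aiming for a 2" is still meaningful after weighting.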
Ryan
WHY DOCUMENT YOUR RESULTS?
Better understanding of the program. Better budget decisions. Maximize your effectiveness.
WHO IS IT IMPORTANT TO?
Language, formality, setup, graphs, length (ex. Board of Trustees: a 1-page summary, but have the full information on hand).
INTERPRETING
Analyzing the data: go back to your original questions and describe your data with the question in mind. Present the data graphically (charts, graphs). Summarize and interpret conclusions.
Think about: What would you like to discover? What is your design and layout? What information do you want to report? How many responses, the average score, a comparison to another group, changes from pre to post? Remember that people do vent. Consider validity, reliability, and the type of measuring.
Quantitative: collected in a standardized manner; analyzed using statistical techniques. Remove incomplete responses; for pre- and post-tests, remove records with only one response.
Qualitative: remove outliers (people vent). Narrative format: review, highlight, take notes, code for themes, sort, interpret patterns, describe.
Good assessment will be mixed methods.
REPORTING
Know your audience (directors vs. Board of Trustees). What do you want them to know? Show highlights. Or are they interested in the research? Communicate the right information to the right people in the right ways.
Know your reporting options. Annual reports can change to 2x or 4x per year. Understand how to use visuals: pie charts, data charts, bar graphs, line graphs, infographics. Different software: Survey Monkey, Qualtrics, Excel (YouTube videos). If you get a 20% response rate, you can’t generalize.
Present your findings! Presentations, open forums, articles, speaking at parent associations. Briefings/executive summaries, annual reports, fact sheets with talking points, posters, brochures.
Ex. St. Mary’s College of Maryland: http://www.smcm.edu/studentaffairs/assessment.html
Good briefs/executive summaries. Ex. BSU: http://www.bridgew.edu/StudentAffairs/Assessment/
Elevator speech; infographics (handout).
University of North Carolina Wilmington flyers: “you said this, we did this,” 62 (handout): http://uncw.edu/studentaffairs/assessment/documents/voiceflyer.pdf
Chart (handout) with examples of how to report data.
TRUSTWORTHINESS & CREDIBILITY
Member checking (email a colleague to edit). Peer review/debriefing: explain your coding, how you narrowed it down, and the process. Mixed methods. Clarifying bias.
INSTITUTIONAL SURVEYS
They already exist. Talk to the IRB on your campus; you may need approval.
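The cleaning rule in the notes above (drop incomplete responses, and drop pre/post records with only one side answered) can be sketched briefly. The data and field names here are hypothetical, purely to illustrate the step before any statistical analysis:

```python
# Hypothetical pre/post survey records; None marks a missing answer.
records = [
    {"id": 1, "pre": 2, "post": 4},
    {"id": 2, "pre": 3, "post": None},  # incomplete: no post-test
    {"id": 3, "pre": 1, "post": 3},
]

# Keep only records with BOTH a pre and a post response.
complete = [r for r in records
            if r["pre"] is not None and r["post"] is not None]

# Report the mean pre-to-post change on the cleaned data.
changes = [r["post"] - r["pre"] for r in complete]
mean_change = sum(changes) / len(changes)
print(f"{len(complete)} complete records; mean change: {mean_change:.1f}")
```

Doing this filtering explicitly, before computing any averages, keeps the analysis honest and makes the exclusion rule easy to state in the report's limitations section.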
Ryan
Debunking Myths: Common Misunderstandings About Data
COMPLEX ANALYSIS IS WHAT IMPRESSES PEOPLE
You don’t have to do everything complex or use big words. Most people appreciate practical and understandable analysis.
ANALYSIS HAPPENS AT THE END
Untrue: it is cyclical; you are always thinking and analyzing. Think about analysis up front so that you HAVE the data you WANT to analyze. Have the outcome in mind.
QUANTITATIVE DATA IS THE MOST IMPORTANT
Ex. of qualitative: Beth, presenting in an architecture building, showed stats and put quotes at the bottom of the slides for the audience to read during the presentation. Use a mix of both. It is the quality of the analysis process that matters.
DATA IS THE MOST IMPORTANT
Data must be interpreted; numbers do not speak for themselves.
STATING LIMITATIONS WEAKENS THE EVALUATION
All analyses have weaknesses. It is more honest and responsible to acknowledge them.
Julie
Characteristics of good departmental-level assessment (we are new professionals):
Comprehensive: encompassing students, staff, faculty, and resources.
Valued and supported by the division and university culture.
Implemented gradually, in careful steps.
Practical, with clear implications for students.
Self-renewing: data and information that feed back at both the department and division level.
Departments should develop an assessment plan.
Identify assessment methods:
Direct: standardized tests, portfolios.
Indirect: surveys, campus climate surveys, alumni surveys, focus groups, interviews.
Ex. our department plan: committee presentations at staff meetings. What went well or badly? Why? Was it resources (people or material)?
Ask students: if you didn’t like the activity, what did you expect to see?
If things are going well, focus on the low-hanging fruit.
Julie
Borrow ideas from other departments; talk to people; talk to us.
Start backwards: what do you want at the end? What main question do you want answered?
Be careful not to introduce bias; work with another colleague.
How can you gain more opportunities?
Ryan: FYRE, LEADS, assessment plan, SA Assessment Institute, rubrics (2 examples).
Julie: RA/OA evaluations, BCD rubrics, ACUHO-I EBI, on-call logs, applications through exit interviews -> RA learning. Institutional tools: Survey Monkey, Google, Facebook metrics. What data are you currently gathering?
Max: RLCs, self-evaluation (fall and spring 1:1 evaluations), student program evaluations (satisfaction), program evaluations (done by students), staff development (writing learning outcomes), RLC climate survey, StrengthsQuest, MBTI, many others...
Max
Follow the leader, not the wolf. The wolf will tell you not to do assessment and, in turn, blow the house down. The leader should be the one influencing us, empowering and supporting us as we help assess the damage to the house, and working with us to rebuild it.
GIVE IN
This is one instance where we would ask you to allow peer pressure to overcome you: harness it and ask lots of questions! Join the conversation; don’t sit back and wait for the invite, jump in! The more you know, the farther you will go. Remember, assessment is here to stay!
Find mentors who LOVE assessment. Cathy Holbrook (Assessment Institute): working with her got me excited about assessment as an undergrad. A safe person to help you overcome mistakes.
Brand yourself and your assessment. Digital Identity Development Model: think outside of the box! Assess yourself (NASPA/ACPA competencies) and boast about it! Use social media to articulate your brand. Utilize your knowledge when job searching and networking: the more competent you are, the better candidate you are!
Max
IF WE CAN GET STUDENT LEADERS TO DRINK THE KOOL-AID TOO, OUR JOBS BECOME EASIER AND WE BECOME MORE PURPOSEFUL AND EFFECTIVE.
Take the time to explain the BIG PICTURE to them: the institutional mission and how it funnels down to you and them. The more they understand and “buy in” to the mission and vision, the more they are able to embrace assessment and use it effectively.
Build these necessary tools into student staff trainings: teach them, assess them, and empower them. Allow them to create.
Max
Assessment, as you know, does not stop with identifying your mission and goals. The next tough question is “What do we expect the student to know or to do as a result of our program?” While many times the next box to complete asks how we are going to accomplish the stated mission and goals, assessment takes it one step further by asking you to articulate your end result. In other words, if you plan six workshops on leadership development, what will students who attend one or more of those workshops be able to do, and what will they know, as a result of their participation? Again, assessment asks you to articulate the identifiable, observable, or measurable end result of that program. That measurable end result is called an outcome.
There are a number of informational resources to assist you with the assessment process. If you need more time, there is no such thing. You may need to reallocate time spent on other activities to time spent on assessment. In addition, you can leverage time and expertise from your students: both student leaders and those students who cause you to question why you are in this field.
Banta, Black, and Kline (2001) write that student affairs professionals “need to provide credible evidence of the value and effectiveness of their programs. More importantly, assessment is a means of discovering new information about our programs that will help us improve them.” Think of assessment as discovery and improvement.
Understanding how to develop your program not only requires meaningful reflection on “why” and “what,” but also careful planning. While there are many tools and templates to assist you in implementing your program and its evaluation, those tools have little value to you, your program, or your students if you simply pick them up and drop them into place with no regard for your institutional culture and climate. Reflection is inevitable. Done well, your investment in assessment will have many valuable uses. (Bresciani, 2003, p. 16)
Ryan
DEMONSTRATION & THE AUDIENCE GETS TO DRINK KOOL-AID