Welcome!
MILEX: March 2014
TU’s Excellent Rubric Assessment Adventures:
On the RAILS
Shana Gass Claire Holmes Lisa Sweeney
sgass@towson.edu cholmes@towson.edu sweeney@towson.edu
Agenda for Today:
• Background on Assessment, RAILS & Rubrics
• Norming & Rating Sessions
• Working Lunch:
Create Draft Rubrics
• Reflections & Questions
Assessment…
• Knowing what you are doing
• Knowing why you are doing it
• Knowing what students are learning as a
result
• Changing because of the information
(~Debra Gilchrist, Dean of Libraries and Institutional
Effectiveness, Pierce College)
(Figure) The Information Literacy Instruction Assessment Cycle (ILIAC): identify learning outcomes → create and enact learning activities → gather data to check for learning → interpret data → enact decisions to increase learning → (back to identifying outcomes).
Oakleaf, Megan. "The Information Literacy Instruction Assessment Cycle: A Guide for Increasing Student Learning and Improving Librarian Instructional Skills." Journal of Documentation 65.4 (2009).
The Institute of Museum and Library Services is the primary
source of federal support for the nation’s 123,000 libraries and
17,500 museums. The Institute's mission is to create strong
libraries and museums that connect people to information and
ideas.
Megan Oakleaf,
founder of all things
RAILS.
RAILS Project Purpose
• Investigated an analytic rubric
approach to IL assessment in
higher education
• Developed a suite of IL rubrics
• Investigated rubric reliability & validity
• Developed training materials for rater training, norming, and scoring
• Explored indicators of rater expertise
Cook’s RAILS Purpose
• Gain rubric experience:
creating/norming/rating
• Identify assessment opportunities
within TU’s Core Curriculum
• Develop a rubric for use on campus
• Assess students’ information literacy skills
• Examine instructional practices
• Begin cycle of tracking
student learning.
• Begin cycle of tracking
instruction practices.
• Begin cycle of collecting
aggregated & anonymous
data.
• Reinforce regular opportunities
for reflection & discussion
among library instruction
colleagues.
(facilitate development of a
Community of Reflective
Practice)
Our assessment
adventure…
Understanding by Design
1. What do you want students to learn?
(outcome)
2. How will you know that they have learned it?
(assessment)
3. What activities will help them learn, and at the same
time, provide assessment data?
(teaching method & assessment)
(Wiggins & McTighe, 2006)
Performance/Integrated Assessment
Students reveal their learning when they are provided with complex, authentic LEARNING ACTIVITIES to explain, interpret, apply, shift perspective, empathize, and self-assess.
What we assess. What they learn.
(Megan Oakleaf, Assessment: Demonstrating the Educational Value of the Academic Library, ACRL
Assessment Immersion, 2011.)
5 Questions for Assessment Design:
1. Outcome: What do you want the student to be able to do?
2. IL Curriculum: What does the student need to know in order to do this well?
3. Pedagogy: What type of instruction will best enable the learning?
4. Assessment: How will the student demonstrate the learning?
5. Criteria for evaluation: How will you know the student has done well?
(Lisa Hinchliffe, Student Learning Assessment Cycle. ACRL Assessment Immersion, 2011)
Evidence of “authentic” student learning:
For instance, the research worksheet in your
packet that asks students to break down and
practice sequential steps in the search process.
Brainstorm…
What other possible examples of evidence of
student learning do we collect? What could we
collect?
• 2 dimensions
1. criteria
2. levels of performance
• grid or table format
• judges quality
• translates unwieldy
data into accessible
information
Criteria
1.“the conditions a [student] must meet to be
successful” (Wiggins)
2.“the set of indicators, markers, guides, or a list of
measures or qualities that will help [a scorer]
know when a [student] has met an outcome”
(Bresciani, Zelna & Anderson)
3.what to look for in [student] performance “to
determine progress…or determine when mastery
has occurred” (Arter)
Performance Levels
mastery, progressing, emerging, satisfactory, marginal, proficient, high, middle, beginning, advanced, novice, intermediate, sophisticated, competent, professional, exemplary, needs work, adequate, developing, accomplished, distinguished
(or numerical…)
SAMPLE RAILS RUBRIC
(green handout in your packet)
Each criterion is scored at one of four performance levels describing what the student does, from Performance Level 3 (strongest) to Performance Level 0.

1. Determines key concepts
Level 3: Determines multiple key concepts that reflect the research topic/thesis statement accurately.
Level 2: Determines some concepts that reflect the research topic/thesis statement, but concept breakdown is incomplete or repetitive.
Level 1: Determines concepts that reflect the research topic/thesis statement inaccurately.
Level 0: Does not determine any concepts that reflect the research question/thesis statement.

2. Identifies synonyms and related terms
Level 3: Identifies relevant synonyms and/or related terms that match key concepts.
Level 2: Attempts synonym (or related term) use, but synonym list is incomplete or not fully relevant to key concepts.
Level 1: Identifies synonyms that inaccurately reflect the key concepts.
Level 0: Does not identify synonyms.

3. Constructs a search strategy using relevant operators
Level 3: Constructs a search strategy using an appropriate combination of relevant operators (for example: and, or, not) correctly.
Level 2: Constructs a search strategy using operator(s), but uses operators in an incomplete or limited way.
Level 1: Constructs a search strategy using operators incorrectly.
Level 0: Does not use operators.

4. Uses evaluative criteria to select source(s)
Level 3: Uses evaluative criteria to provide in-depth explanation of rationale for source selected.
Level 2: Uses evaluative criteria to provide a limited/superficial explanation of rationale for source selected.
Level 1: Attempts to use evaluative criteria, but does so inaccurately or incorrectly.
Level 0: Does not use evaluative criteria.

5. Uses citations
Level 3: Uses an appropriate standard citation style consistently and correctly.
Level 2: Uses an appropriate standard citation style consistently (bibliographic elements intact), but with minimal format and/or punctuation errors.
Level 1: Attempts an appropriate standard citation style, but does not include all bibliographic elements consistently or correctly.
Level 0: Does not include common citation elements or does not include citations.

(A small data-structure sketch of this grid follows the rubric.)
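To make the rubric's two dimensions concrete, here is a minimal sketch (not part of the RAILS handout) of how the grid above, criteria by performance levels 0-3, might be represented and a single rating recorded. The Python names and the score are hypothetical.

# Hypothetical sketch, not from the RAILS materials: the rubric grid above
# as a simple data structure (criteria x performance levels 0-3),
# plus one illustrative rating for a single work sample.

RUBRIC = {
    "Determines key concepts": {
        3: "Multiple key concepts reflect the topic/thesis accurately.",
        2: "Some concepts reflect the topic, but the breakdown is incomplete or repetitive.",
        1: "Concepts reflect the topic inaccurately.",
        0: "No concepts reflecting the research question/thesis.",
    },
    # ... the remaining criteria (synonyms, search strategy,
    #     evaluation, citations) would be filled in the same way.
}

# One rater's score for one (made-up) student work sample.
sample_rating = {"Determines key concepts": 2}

for criterion, level in sample_rating.items():
    print(f"{criterion}: level {level} -- {RUBRIC[criterion][level]}")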
Workshop Norming Practice
Round 1
• For the first student work sample, Claire will
“norm aloud.”
• Participants will rate 2 work samples
individually.
• Group discussion: Can we reach consensus
for what constitutes evidence for each
performance level?
Norming: Round 2
• Participants will rate 2 more work samples
individually.
• Group discussion: Are we closer to consensus?
(a simple agreement check is sketched after this list)
• Do we establish rating ground rules?
• Does the rubric need to be modified?
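As a rough illustration of the "are we closer to consensus?" question (a sketch, not part of the workshop materials): for each rubric criterion, compute what share of raters chose the most common performance level. All rater names and scores below are invented.

from collections import Counter

# Invented round-2 scores: each rater's 0-3 levels, keyed by rubric criterion.
round2_scores = {
    "rater_a": {"key_concepts": 3, "synonyms": 2, "search_strategy": 2, "evaluation": 1, "citations": 2},
    "rater_b": {"key_concepts": 3, "synonyms": 2, "search_strategy": 1, "evaluation": 1, "citations": 2},
    "rater_c": {"key_concepts": 2, "synonyms": 2, "search_strategy": 2, "evaluation": 1, "citations": 2},
}

def agreement_by_criterion(scores):
    """Share of raters who picked the most common level for each criterion."""
    criteria = next(iter(scores.values())).keys()
    agreement = {}
    for criterion in criteria:
        levels = [rater_scores[criterion] for rater_scores in scores.values()]
        modal_count = Counter(levels).most_common(1)[0][1]
        agreement[criterion] = modal_count / len(levels)
    return agreement

for criterion, share in agreement_by_criterion(round2_scores).items():
    print(f"{criterion}: {share:.0%} of raters on the modal level")

Criteria with low agreement are the natural place to focus the group discussion when deciding whether to set ground rules or modify the rubric.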
Keep in mind…
• An info lit skills rubric does not score
discipline content; it scores information
literacy skills.
• You can only score what you can see.
Rubrics – Benefits
Learning
• Articulate and communicate agreed upon
learning goals
• Provide direct feedback to learners
• Facilitate self-evaluation
• Focus on learning standards
Creating a rubric:
1. What are our expectations of students completing this assignment?
2. What does successful learning of this type look like?
3. What specific learning outcomes do we want to see in the completed assignment?
4. What evidence can we find that will demonstrate learning success?
More benefits of a (normed) rubric…
Data
• Facilitate consistent, accurate, unbiased scoring
• Deliver data that is easy to understand, defend,
and convey (see the aggregation sketch after this list)
• Offer detailed descriptions necessary for informed
decision-making
• Can be used over time or across multiple programs
Other
• Are inexpensive ($) to design & implement
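As a rough illustration of data that is "easy to understand, defend, and convey" (a sketch, not anything from the original deck): aggregating anonymous 0-3 ratings into a per-criterion summary. The ratings below are made up.

from collections import defaultdict
from statistics import mean

# Made-up anonymous ratings: one dict of criterion -> level (0-3) per artifact.
ratings = [
    {"key_concepts": 3, "citations": 1},
    {"key_concepts": 2, "citations": 2},
    {"key_concepts": 2, "citations": 0},
]

# Group the levels by criterion across all artifacts.
by_criterion = defaultdict(list)
for rating in ratings:
    for criterion, level in rating.items():
        by_criterion[criterion].append(level)

# Report a mean level and the share of artifacts at level 2 or above.
for criterion, levels in by_criterion.items():
    at_or_above_2 = sum(level >= 2 for level in levels) / len(levels)
    print(f"{criterion}: mean level {mean(levels):.1f}, "
          f"{at_or_above_2:.0%} at level 2 or above")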
Rubrics – Limitations
• Possible design flaws that impact data quality
• Require significant time for development
• Sometimes fail to balance holistic and analytic focus
• May fail to balance generalized wording and detailed description
• Can lack differentiation between
performance levels
RAILS Lessons
• Explicit, detailed performance
descriptions are crucial to achieve
inter-rater reliability.
• Raters appear to be more confident
about their ratings when student
artifacts under analysis are concrete,
focused, and shorter in length.
• The best raters “believe in” outcomes, value constructed consensus (or “disagree and commit”), negotiate meaning across disciplines, develop shared vocabulary, etc.
(Figure) The Information Literacy Instruction Assessment Cycle (ILIAC): identify learning outcomes → create and enact learning activities → gather data to check for learning → interpret data → enact decisions to increase learning → (back to identifying outcomes).
Oakleaf, Megan. "The Information Literacy Instruction Assessment Cycle: A Guide for Increasing Student Learning and Improving Librarian Instructional Skills." Journal of Documentation 65.4 (2009).
References
Arter, J. (2000). Rubrics, scoring guides, and performance criteria:
Classroom tools for assessing and improving student learning.
Retrieved from http://eric.ed.gov/?id=ED446100
Bresciani, M., Zelna, C. & Anderson, J. (2004). Assessing student learning
and development: A handbook for practitioners. Washington, DC:
NASPA-Student Affairs Administrators in Higher Education.
Wiggins, G. P., & McTighe, J. (2006). Understanding by design. Upper Saddle
River, NJ: Pearson Education.
Wiggins, G. P. (1998). Educative assessment: Designing assessments
to inform and improve student performance. San Francisco,
CA: Jossey-Bass.
Selected Readings:
Diller, K. R., & Phelps, S. F. (2008). Learning outcomes, portfolios, and rubrics, oh my! Authentic
assessment of an information literacy program. Portal: Libraries and the Academy, 8 (1),
75-89.
Fagerheim, B. A., & Shrode, F. G. (2009). Information literacy rubrics within the disciplines.
Communications in Information Literacy, 3(2), 158-170.
Holmes, C., & Oakleaf, M. (2013). The official (and unofficial) rules for norming rubrics successfully.
Journal of Academic Librarianship, 39(6), 599-602.
Knight, L. A. (2006). Using rubrics to assess information literacy. Reference Services Review, 34(1),
43-55.
Oakleaf, M. (2007). Using rubrics to collect evidence for decision-making: What do librarians need to
learn? Evidence Based Library and Information Practice, 2(3), 27-42.
Oakleaf, M. (2009). The information literacy instruction assessment cycle: A guide for increasing
student learning and improving librarian instructional skills. Journal of
Documentation, 65(4), 539-560.
Oakleaf, M., Millet, M., & Kraus, L. (2011). All together now: Getting faculty, administrators, and staff
engaged in information literacy assessment. Portal: Libraries and the Academy, 11(3), 831-852.
Stevens, D. D., & Levi, A. (2005). Introduction to rubrics: An assessment tool to save grading time,
convey effective feedback, and promote student learning. Sterling, VA: Stylus Publishing.
MILEX: March 2014
TU’s Excellent Rubric Assessment Adventures:
On the RAILS
Shana Gass Claire Holmes Lisa Sweeney
sgass@towson.edu cholmes@towson.edu sweeney@towson.edu
SlideShare URL:
http://www.slideshare.net/claireholmes/milex-assess-norm2014
Thank you!