• Explain the five logics of mixed methods research designs
• Appreciate the link between research inquiry and research design
• Describe the purpose and importance of the Methods section of a research report
• Compare and contrast qualitative and quantitative intellectual inquiries
• Identify the major reporting components (subheadings) of qualitative and quantitative research reports
• Compare and contrast the most agreed-to approaches and terms about research integrity, rigor, and quality that are used in each of the three methodologies, and learn the attendant strategies to meet the standard for the specific research methodology
Introduction
This chapter focuses on the constructs of research design and the Methods section in a research paper. Research design is a larger construct than methods, to be explained shortly. But within a research paper, once the authors have stated the research question, developed an introduction to the study, and presented a review of the literature (and maybe a theoretical framework), their next step is to provide a description of the strategies used to collect and analyze data pursuant to the research question—that is, their methods. This chapter provides a generic discussion of methods, followed with much more detail in Chapter 9 (qualitative methods) and in Chapter 10 (quantitative and mixed methods).
As a caveat, a detailed discussion of how to use specific methods is beyond the scope of this overview chapter.
Etymology and Definition of Methods and Research Design
Method is Greek methodus, "mode of inquiry or investigation." It stems from meta, "after," and hodos, "a travelling, a pursuit, a way of teaching or going" (Harper, 2016). In effect, method refers to investigating or inquiring into something by going after or pursuing it, especially in accordance with a plan. It involves techniques, procedures, and tasks used in a systematic, logical, and orderly way (Anderson, 2014). Within the context of conducting and reporting research, it is the stage wherein researchers design instruments, apparatus, or procedures or gain site access (if relevant), obtain a sample, and then collect and analyze data from that sample (or entire population) (Johnson & Christensen, 2012). As was discussed in Chapter 2, this book distinguishes between method and methodology, with the latter connoting the philosophical underpinnings of the study.
The other term used in this chapter is research design. Research is French recercher, "to search." In the context of this book, it refers to the accumulation of data that are interpreted, leading to new knowledge. Design is Latin designare, "to mark out, devise, choose, designate." A design can be defined as a plan used to show the workings of something before it is made or created. It can also mean the underlying purpose of something, in this case, the search for knowledge (Anderson, 2014; Harper, 2016). From a technical stance, the research design refers to the overall strategy that researchers choose to integrate the different components of the study, shaped by (a) their methodological orientation (which shapes the research questions and all assumptions underlying the effort), and (b) the type of research inquiry they are conducting. In short, (a) exploratory research strives to reach a greater understanding of a problem, usually laying the groundwork for future studies; (b) descriptive research seeks more information so as to accurately describe something in more detail, creating a fuller picture by mapping the terrain; and (c) explanatory research seeks to connect ideas to understand causal inferences (explain relationships) (de Vaus, 2001; Suter, 2012; Yin, 1984). These approaches apply to both quantitative and qualitative research methodologies (except explanatory), with qualitative also seeking to (d) illuminate meaning and subjective experiences and (e) understand processes and structures (Blaxter, 2013; Shank & Brown, 2007).
Articulating Research Purpose in Research Design
Each of these five types of research inquiry represents the deeper purpose of the study (the problem), or the reasons for doing it, which is why Yin (1984) said research design is logical (i.e., it entails reasoned judgments). Each type of inquiry offers a different reason for why the study is needed (e.g., to describe, explore, find meaning, or theorize). Authors must not confuse research purpose (reason for inquiry) with methodology, research design, research question, or methods (see example 8.1). When identifying the nature of their research inquiry, they can use headings in their paper such as Justification for the Study, Importance of the Study, or Objectives of the Study (Newman, Ridenour, Newman, & DeMarco, 2003). A clearly stated research
search lacks design" (p. 215). Instead, qualitative research requires a broader and less restrictive concept of research design, in which researchers use "'logic-in-use' [as well as] 'reconstructed logic' [to accommodate the] 'design in use' [principle]" (p. 216). This is called an emergent research design wherein the original plan changes as the research unfolds, meaning it is nonlinear (Creswell, 2009) (discussed in Chapter 9). Regardless, the end result is data that are then analyzed, interpreted, and discussed, leading to conclusions, implications, and recommendations (de Vaus, 2001; Suter, 2012; Yin, 1984).
As a final caveat, de Vaus (2001) explained that researchers should not equate a particular logistical method with a particular research design logic. It is also erroneous to equate a particular research design with either quantitative, qualitative, or mixed methods approaches. Instead, authors need to bear in mind the link between (a) the purpose of the research (logical inquiry) and (b) their research design (both logic and logistics) (Yin, 1984) and then introduce their Methods section accordingly (see examples 8.2 and 8.3).
Example 8.2 Quantitative research design and method
This exploratory, quantitative research inquiry employed a cross-sectional research design. Data were collected from a purposive sample using the survey method, specifically a piloted questionnaire designed for this study. Descriptive statistics were used to analyze the data using Minitab software, and the results were reported using frequencies, percentages, and means (averages).
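As an illustration only (not from the book, which references Minitab), the kind of descriptive statistics named in Example 8.2 — frequencies, percentages, and means — can be sketched for a single made-up Likert-scale questionnaire item:

```python
# Hypothetical sketch: descriptive statistics of the sort Example 8.2 reports.
# The responses below are invented for illustration (1-5 Likert scale).
from collections import Counter
from statistics import mean

responses = [4, 5, 3, 4, 4, 2, 5, 4, 3, 4]  # made-up item responses

freq = Counter(responses)                 # frequency of each response value
n = len(responses)
percentages = {value: 100 * count / n for value, count in sorted(freq.items())}
average = mean(responses)

print(sorted(freq.items()))  # [(2, 1), (3, 2), (4, 5), (5, 2)]
print(percentages)           # {2: 10.0, 3: 20.0, 4: 50.0, 5: 20.0}
print(average)               # 3.8
```

Summaries like these are typical of descriptive and exploratory designs, where the goal is to characterize the sample rather than to test hypotheses.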
Example 8.3 Qualitative research design and method
This qualitative research inquiry employed an emergent research design, using the phenomenological method. Data were collected from a
Page 9 of 27 Understanding and Evaluating Research: A Critical Guide
tions (i.e., explained the logic used when creating their research design, especially what type of data were needed to answer their research questions)
□ Per the above, determine if they included a section titled Justification for or Importance of the Study
□ Determine if they properly referred to reconstructed (deductive) logic (quantitative) or logic-in-use (qualitative emergent research design) or if they referenced mixed methods logics
□ Determine if they clarified their research design (see Table 8.2)
□ Determine if they explicitly stated the type of research inquiry they employed (exploratory, descriptive, explanatory, meaning seeking, or understanding processes and structures)
Most Common Research Designs
Table 8.2 summarizes the most common research designs for each of qualitative, quantitative, and mixed methods studies, discussed in much more detail in Chapters 9 and 10. These approaches to designing research differ because of methodological distinctions, discussed in more detail in the second part of this overview chapter.
Table 8.2 Main Types of Qualitative, Quantitative, and Mixed Methods Research Designs

Qualitative Research Designs (involve changing tactics over the course of the study):
• Interpretive—insights from interpreting data change the research design
• Investigative—traces out a phenomenon in its natural field setting
• Participatory—research design is codeveloped with participants
• Illuminative—strategically focuses on one aspect of research design
• Instrumentation—study creates a new data collection instrument
• Sensitization (descriptive)—sensitizes readers to participants' situation
• Conceptualization (theory building)

Quantitative Research Designs (involve adhering to a formal plan with no deviation):
• Descriptive—describes what actually exists, as well as its frequency, and then categorizes the information
• Correlational—examines whether a change in a variable (no manipulation) is related to change in another
• Comparative—measures variables that occur naturally in existing groups, then compares them to determine their influence on the dependent variable
• Experimental—manipulates independent variables, measures changes in dependent variable (experiment and control), and infers causal links
• Quasi-experimental—employs an experimental and control design using existing groups, then cautiously infers causation
• Predictive exploratory—determines how variables may be

Mixed Methods Research Designs (involve some prioritized combination of strategy and tactics):
• Use qualitative methods to explain quantitative data (words to explain

This book clearly distinguishes between methodology and methods (see Chapter 2). Methodology (ology) is focused on what is involved in creating new knowledge and refers to the branch of philosophy that analyzes the principles and axioms of research. The word method refers to a system of strategies used to obtain information for a study.
Many disciplines' use of the word methodology to refer to methods (Schneider, 2014) most likely occurs because the empirical (quantitative) research paradigm is so prevalent. Given its dominance, authors tend to
ous work. Their own Methods section should clearly set out a well-articulated set of procedures that can be consistently reapplied (quantitative) or appropriately adopted in another context (qualitative). By making their measurement choices explicit, authors help readers decide if the study was done well or needs improvement (Harris, 2014).
Following this convention, the Methods section serves the purpose of fostering ongoing debate about how to improve measurement instruments and research procedures, whether qualitative or quantitative. Authors should try to avoid using or perpetuating inconsistent measures and procedures because this creates discontinuity in the literature about the particular phenomenon being measured (Choudhuri, Glauser, & Peregoy, 2004; Harris, 2014). As examples, Harris (2014) noted that scholars have developed 200 ways to measure self-esteem, 16 ways to measure aspiration, and hundreds of instruments to measure quality of life, and they have not settled on how to measure gender identity or prejudice. These are examples of discontinuities perpetuated in the literature.
Authors may choose to select from and adapt previous attempts to measure a phenomenon, and if so, they
information as appropriate when addressing each strand of their research design: qualitative and quantitative.
Table 8.3 Main Differences Between Qualitative and Quantitative Intellectual Inquiry

Qualitative Inquiry:
• Assumes reality is socially constructed and subjective
• Appreciates complexity and multiple truths
• Research is value bound, and the researcher's values are accounted for
• The researcher is the primary instrument (observations, interviews)
• Contextualizes findings and applies ideas across contexts
• Portrays natural settings and contexts
• Few participants, many variables
• Understands the insider's view
• Human behavior is situational
• Interprets human behavior in context
• Understands perspectives (empathetic) and exploration
• Widely, deeply examines phenomena

Quantitative Inquiry:
• Assumes there is an objective reality ready to be discovered
• Favors parsimony and assumes a single truth
• Research is value neutral, and the researcher's values are muted
• Uses inanimate instruments (scales, questionnaires, checklists, tests)
• Generalizes results from a sample to a population
• Manipulates and controls variables
• Few variables, many subjects
• Presents the objective outsiders' view
• Human behavior is regular
• Predicts human behavior
• Provides causal explanations and predictions
• Narrowly tests specific hypotheses
Major Components (Report Subheadings) of Qualitative and Quantitative Research
Table 8.4 compares the basic stages or major components of both quantitative and qualitative research methods and provides the typical subheadings authors would use to report their respective methods for a study. Purposefully using these headings greatly facilitates others' ability to critically read the Methods section of a research report. If authors fail to explicitly indicate which methodology informed their study, readers can take cues from their subheadings. Absence of these subheadings—or, worse yet, the content relevant to each stage—raises unnecessary flags about the study's integrity and quality. These headings are used in Chapters 9 and 10 to organize the discussion of how to report both qualitative and quantitative research or their strands within a mixed methods study.
• Strives for trustworthy, credible data (qualitative inquiry) • Strives for reliable and valid data (quantitative inquiry)
Table 8.4 Basic Steps (Report Subheadings) of Qualitative and Quantitative Methods

Qualitative Methods (NOTE: these steps are not always linear and sequential):
• Site selection and access (gaining access to the site from which the sample will be drawn)
• Sampling (people, artifacts from the site[s])
• Ethical considerations
When critically reading a research report, you would
□ Ascertain whether the authors used language and vocabulary reflective of the research inquiry approach that informed their study (see Table 8.3)
□ Determine if they used methodology-specific headings to organize their Methods section (see Table 8.4) and fully accounted for and shared their research design logic and logistics
□ If subheadings are missing, determine if the authors at least included pertinent details for each stage of their respective methodology's research design
Qualitative Methods (continued):
• Data analysis (thematic, patterned examination of the thick data, often done in concert with data collection)
• Account for trustworthiness (along several criteria)
• Data security and management
• Limitations of emergent research design

Quantitative Methods (NOTE: these steps are linear and sequential):
• Data security and management
• Limitations of predetermined research design (normally follows the Discussion section)
Table 8.5 Comparison of Criteria to Ensure High-Quality Quantitative and Qualitative Research

Quantitative (Positivistic, Empirical, Deterministic) versus Qualitative (Postpositivistic, Naturalistic, Interpretive, Critical)

Striving for unbiased data (results are true if no bias was introduced, made possible if the researcher's personal preferences, prejudices, and opinions are held at bay during the entire research process).
Strategies: judiciously address issues of internal validity to ensure that the study design, imple-
the truth is found. Judgments about the evidence should not coincide with the researcher's orientation (despite that science is not really neutral; relative value neutrality is more likely than absolute neutrality).
Strategies: embrace the tenets of the scientific method and empirical inquiry; do not distort research or let one's values intrude by drawing on personal worldviews, motives, self-interest, or customs or by capitulating to external pressures (researchers are especially vulnerable to value intrusion during the interpretation and discussion stage).

Confirmability (subjectivity) [qualitative]:
Refers to the researcher's neutrality when interpreting data (i.e., self-awareness and control of one's bias); appreciating that values are central to the research process, researchers still have to be sure their findings can be confirmed or corroborated by others (i.e., their values did not take over). It is the extent to which findings are shaped by the respondents themselves, rather than the researcher's bias.
Strategies: reflexivity (involves self-critique and disclosure of what one brings to the research, especially one's predispositions); audit trails; method triangulation; peer review and debriefing.
Internal validity:
This refers to the integrity of the research design. The word internal pertains to the inner workings of the research process, designed and conducted to ensure that the researcher measured what was intended to be measured (producing strong, valid data instead of weak, invalid data). Also, the research design should follow the principle of cause and effect. There are seven major threats to internal validity (i.e., measuring something other than what was intended): (a) contamination by an extraneous event (history effect); (b) participants aging or tiring (maturation effect); (c) loss of subjects or attrition between testing (mortality effect); (d) sensitizing subjects with a pretest (testing effect); (e) extremely high or low pretest scores (statistical regression effect); (f) subjects not carefully assigned to test groups (selection bias effect); and (g) unreliability of an assessment instrument (instrumentation effect).
Strategies: take the steps necessary to mitigate threats to internal validity (e.g., account for contamination, maturation, attrition, sampling size, group formation and assignment, instrumentation alignment, and testing sensitization).
Credibility (credible to the participants):
Did the researchers create a faithful accounting of people's lived experiences (i.e., an accurate representation of their reality, from their perspective)? Did the researchers get a full answer to their research question? Also, can others have confidence in the truth shared by the researchers (i.e., in their ob-
settings—that is, used more widely by others. It is the researcher's responsibility to provide accurate, detailed, and complete descriptions of the context and the participants so that users of the study can determine if the findings and conclusions apply (are transferable) in their context (based on similarity of deep descriptors).
Strategies (quantitative): judiciously choose appropriate research design protocol (especially sample size and bias). Then, before asserting that the results are valid in other populations, situations, and conditions, researchers must recognize, consider, and report on factors that mitigate these assertions, notably any interactions (a) among treatment and subjects, settings, and history as well as (b) between subjects and settings. Researchers often temper their assertions by setting out study limitations.
Strategies (qualitative): cross-case comparisons; literature comparisons; detailed, thick descriptions; researcher reflexivity to mitigate invalidity; state study limitations (account for selection, setting, and history effects that might make the study unique to only a single group [i.e., not transferable]).
Reliability (of the instrument and methods):
Refers to the extent to which someone else can follow the research design with the same sample and get the same results. Are the methods reproducible and consistent, and is sufficient information provided so others can repeat the approach and procedures? To what extent are variations controlled?
The reliability of the instrument depends on six types of validity: (a) face validity (subjects think the test is measuring what it is supposed to measure); (b) expert judges think the test is valid; (c) test items actually contain the content being measured; (d) comparing a new test with a previously validated test (concurrent validity); (e) taking a test is a good prediction of a score when the test is taken again in the future (predictive validity); and (f) construct validity (a mix of all of the others—did the test measure the intended higher-order construct and nothing else related to it, determined by how the variables are operationalized?).
Strategies: standardized administration of instrument or procedure; internal consistency (i.e., ensure instrument items are actually measuring the underlying construct, reflected in Cronbach's alpha); increase number of test items; use objective scoring; test-retest; ensure that two different forms of one test measure the same thing.

Dependability [qualitative]:
Related to reliability, researchers have to responsibly provide sufficient information so others can repeat the research design protocol in their context but not necessarily get the same results. It refers to the stability of findings over time and in changing research contexts (i.e., others can rely [depend] on the study). The latter means the findings, conclusions, and interpretations must be supported

Table 8.5 provides an overview of the most agreed-to approaches and terms used by both types of researchers and of attendant strategies to meet the standard for the specific research methodology (Ary et al.,
Generalizability (breadth of applicability):
Researchers want to make broad statements from their specific case (they used a small random sample from a whole population). They want their conclusions to hold for others not in their study. Based on statistical assumptions, generalizability refers to the extent to which results and conclusions can be applied to people, settings, or conditions beyond those represented in the study.
Strategies: account for external validity.

Authenticity (realness for participants):
Researchers want to make specific statements about only the people they studied (how the latter see their world). So, authenticity refers to the extent to which participants' voices and agency are ensured, and it strives for assurances that the researcher has represented all views of all participants (authentic means "original, genuine, undisputed").
Strategies: collaboration with participants; member checking; researcher reflexivity (in-
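The reliability row of Table 8.5 mentions internal consistency "reflected in Cronbach's alpha." As a hedged sketch (the three-item instrument and the scores below are invented for illustration), alpha can be computed directly from its standard formula, alpha = k/(k-1) * (1 - sum of item variances / variance of total scores):

```python
# Hypothetical sketch of Cronbach's alpha for a made-up 3-item instrument
# answered by 5 respondents (rows = respondents, columns = items).
from statistics import pvariance

scores = [
    [4, 5, 4],
    [3, 4, 3],
    [5, 5, 4],
    [2, 3, 2],
    [4, 4, 5],
]

k = len(scores[0])                                    # number of items
item_vars = [pvariance(col) for col in zip(*scores)]  # variance of each item
total_var = pvariance([sum(row) for row in scores])   # variance of total scores

alpha = (k / (k - 1)) * (1 - sum(item_vars) / total_var)
print(round(alpha, 2))  # 0.9
```

A commonly cited rule of thumb (not a standard from this book) treats alpha of about 0.70 or higher as acceptable internal consistency; higher values suggest the items are measuring the same underlying construct.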
The length of the Methods section in a mixed methods study depends on which strand was prioritized, qualitative or quantitative. Lynch (2014) clarified that the length of a qualitative Methods section is dictated by how much detail is required to describe site selection, access, sampling, data collection, and analytical procedures. Authors also have to make available an audit trail (detail) that readers can follow to access researchers' thinking while they implemented and adjusted their emergent research design. The same principle of detail holds for a quantitative paper. The quantitative Methods section should be detailed enough that (a) it can be repeated by others because its essential characteristics have been recounted (reliability) and (b) readers can judge whether the results and conclusions are valid (i.e., did the study measure what it intended to measure?) (Kallet, 2004). More specifically, detail means those things that could logically be expected to influence the results. "Insufficient detail leaves the reader with questions; too much detail burdens the reader with irrelevant information" (American Psychological Association, 2001, p. 18).
In all three methodologies, authors have to ensure that readers can follow what was done and judge its rigor and quality. Labaree (2016) opined that authors should assume readers possess a basic understanding of the method. This assumption means authors do not have to go into great detail about specific procedures; rather, they should focus on how they "applied a method, not on the mechanics of doing the method." The accepted convention is to provide adequate citations to support the choice and application of the methods employed
and coherent" (Hesson & Fraias-Hesson, 2010b, p. 461).
"The organization of the method section depends on the author's presentation logic" (Rocco & Plakhotnik, 2011, p. 167). (a) The most common approach is chronological, meaning authors would arrange the discussion of their method in the order that things occurred. (b) Sometimes, in order to describe a complex aspect of their research design, authors may have to shift to a most-to-least-important structure within the chronological approach. (c) Another common organizational pattern is general-to-specific (Boylorn, 2008; Hesson & Fraias-Hesson, 2010b; Labaree, 2016). (d) Authors can also organize their Methods section using the major components of their research design, identified with subheadings, taking direction from Table 8.4 for each of qualitative and quantitative reports (Boylorn, 2008; Hesson & Fraias-Hesson, 2010b; Rocco & Plakhotnik, 2011).
Objective Versus Subjective Writing
When preparing quantitative papers, authors are encouraged to use descriptive writing so they can ensure concise, adequate, logical, and detailed descriptions of their methods (Goodson, 2017; Labaree, 2016; Rocco & Plakhotnik, 2011). Goodson (2017) explained that, as ironic as it sounds, when using descriptive writing, authors strive to be objective and avoid subjective judgments of what happened during the sampling or data collection stages. She provided these examples (p. 177):
Example 8.4 Descriptive (objective) writing "After examining the pictures, the researcher asked
has already happened (Boylorn, 2008; Hesson & Fraias-Hesson, 2010a; Kallet, 2004; Labaree, 2016). There are a few exceptions. Sentences describing standard procedures commonly used by others are written in present tense (e.g., "This assessment instrument is often used in studies focused on student intelligence") (Lynch, 2014). Also, authors should try to avoid using the imperative (e.g., "Add 5 grams of the solid to the solution") because it sounds like a recipe approach. A narrative structure using past tense is preferred to a step-by-step, recipe model (The Writing Center, 2014b).
Voice
Authors of quantitative papers are encouraged to use passive voice because it places the focus on what was done, not who did it. Occasionally, the passive voice is used with a by phrase, naming the agent as well as the action (e.g., "The survey was administered by the high school principal") (Boylorn, 2008; Hesson & Fraias-Hesson, 2010a). While passive voice should always be used in quantitative papers, authors of qualitative papers can consciously choose what voice they will use (Boylorn, 2008). Normally, authors of qualitative papers employ active voice, which focuses on who did the action. This writing strategy makes sense because "qualitative research recognises, and even foregrounds, the role played by individuals—the researcher, the informants and other participants" (Lynch, 2014, p. 33).
Example 8.6 Passive voice Stress was applied to the rubber segments in gradually increasing increments. [focus on what was done]
identified, followed with general introductions to (a) the major differences between qualitative and quantitative inquiry, (b) the major reporting components (subheadings) of each type of research report, and (c) the topic of rigor and quality in each of the three methodologies. The chapter wrapped up with an overview of the basic grammatical and organizational conventions of reporting and writing up the Methods section of a research paper.
Review and Discussion Questions
Based on the approach used in this book, how do methods differ from methodologies? How do methods differ from the research design?
Distinguish between research design as logical and as logistical (Figure 8.1).
How is the research design tied with the type of research inquiry?
What are the main differences between qualitative and quantitative inquiry and their approach to scholarship (see Table 8.3)? Which of these aspects of scholarly inquiry did you struggle with the most, and why?
Compare qualitative research design logic with quantitative research design logic.
Identify the basic steps for conducting and reporting both qualitative and quantitative studies, commenting on the issue of linearity and sequentiality. Which method do you feel most comfortable with, and why? How did these differ from approaches to designing mixed methods studies?
Explain to someone else in plain language the basic differences in reporting the methods used for
o Review and Engagement
o Integrity of Research Designs
o Integrity of Qualitative and Quantitative Research Designs
o Integrity of Mixed Methods Research Designs
o Technical Aspects of Reporting Methods
o Length
o Organizational Logic and Approaches
o Objective Versus Subjective Writing
o Person, Tense, and Voice
o Person
o Tense
o Voice
o Review and Engagement
o Final Judgment on Research Design and Methods Section
o Chapter Summary
o Review and Discussion Questions
Discussion Rubric

Criterion: Content/Comprehension (20 pts). This criterion is linked to a Learning Outcome.
• Excellent (20 to >15.0 pts): Post demonstrates depth of understanding of course content; addresses discussion prompt completely; offers clear point of view and detail
• Satisfactory (15 to >10.0 pts): Post demonstrates adequate depth of understanding, but does not address all of discussion prompt; point of view is somewhat unclear and detail is limited
• Needs Improvement (10 to >0 pts): Post does not demonstrate depth of understanding of course content; discussion prompt is minimally addressed; point of view is unclear and detail is underdeveloped

Criterion: Engagement/Classroom Interaction (20 pts) (one to two response posts; please refer to the week's discussion for specific requirements)
• Excellent (20 to >14.0 pts): Submits required number of response posts; responses extend the discussion by making connections, relating to others' ideas and adding supporting detail
• Satisfactory (14 to >9.0 pts): Submits required number of response posts; some connections are made with relevant explanation and detail
• Needs Improvement (9 to >0 pts): Responses are not submitted; responses are generic, limited, do not extend the discussion or add detail

Criterion: Timeliness (10 pts)
• Excellent (10 pts): Submits initial post by deadline
• Satisfactory (4 pts): Submits initial post one to three days late (after Wednesday)
• Needs Improvement (0 pts): Submits initial post four days late (after Sunday)

Criterion: Spelling/Grammar/Mechanics (10 pts)
• Excellent (10 to >8.0 pts): Posts have 0-1 spelling or grammatical errors; properly cites work in APA format where required
• Satisfactory (8 to >5.0 pts): Posts have 2-3 spelling or grammatical errors; cites work in APA format where required with few errors
• Needs Improvement (5 to >0 pts): Posts have 3 or more spelling or grammatical errors; does not cite work