
                                       CHAPTER 3

                   RESEARCH DESIGN AND METHODOLOGY

Introduction

The review of literature has produced recurring themes emphasizing the

importance of technological literacy for citizens in the 21st century (Garfinkel, 2003;

Hall, 2001; Lemke, 2003; Murray, 2003; NAE, 2002; Partnership for 21st Century Skills,

2003; Rose & Dugger, 2003; Zhao & Alexander, 2002; U.S. Department of Education,

2004; Technology Counts, 2005). Education is a critical component in preparing students

for a knowledge-based, digital society. According to Hall (2001), available technologies,

our perceptions of those technologies, and how they are used will determine the shape of

our world. Citizens of the future will face challenges that depend on the development and

application of technology. Are we preparing students, the citizens of tomorrow, for these

challenges?

Purpose of the Study

       This study developed and implemented a faculty survey and a student assessment.

The purpose of the faculty survey was to determine what basic computer skills are needed

by undergraduate students for academic success in post-secondary education. This phase

of the study examined the data collected for trends and differences between the

independent variables of subject/content area, institution, gender, and years of faculty

experience. The purpose of the student assessment was to evaluate the computer

competencies of students entering post-secondary education. This phase of the study

examined the data collected for trends and differences between the independent variables

of home state, number of high school computer courses taken, gender, and major field of

study. Data collection and analysis assisted in determining whether students possess the

necessary computer/technology skills upon entering a post-secondary institution or if a need

exists for a general education course to teach computer literacy/skills to the

undergraduate student population. This study also provided valuable information

regarding the content of such a course.

Research Questions

   1. What technology skills do post-secondary faculty members deem important for all

       students to possess at the college level?

   2. Are there differences between the student technology skills post-secondary

       faculty members deem important when grouped by subject/content area,

institution/stratum, gender, or years of faculty experience?

   3. What technology skills can students demonstrate proficiently upon entering a

       post-secondary institution?

   4. Are there differences between the proficiency level of students’ technology skills

       when grouped by home state, number of high school computer courses, gender, or

       major field of study?

Are students technologically ready upon entering post-secondary education, or does a

       need exist for a computer literacy/skills course for all undergraduate students?

Instrumentation

       Two instruments were employed for data collection in this research study: a

faculty survey and a student assessment. A faculty survey was designed by the researcher

to help identify technology/computer skills deemed important for undergraduate students

to possess in order to be successful in their post-secondary endeavors. A survey research

design was applied to investigate the research questions.

       A second instrument was developed and implemented to assess technology skills

of freshmen undergraduate students who had not yet taken a post-secondary computer

literacy/skills course. A description of the two instruments used in this study follows.

Faculty Survey

Introduction

       According to Leedy and Ormrod (2001), “Research is a viable approach to a

problem only when there are data to support it” (p. 94). Nesbary (2000) defines survey

research as “the process of collecting representative sample data from a larger population

and using the sample to infer attributes of the population” (p. 10). The main purpose of a

survey is to estimate, with significant precision, the percentage of population that has a

specific attribute by collecting data from a small portion of the total population (Dillman,

2000; Wallen & Fraenkel, 2001). The researcher wanted to find out from members of the

population their view on one or more variables. As noted by Borg and Gall (1989),

studies involving surveys comprise a significant amount of the research done in the

education field. Data are ever-changing and survey research portrays a brief moment in

time to enhance our understanding of the present (Leedy & Ormrod, 2001). Educational

surveys are often used to assist in planning and decision making, as well as to evaluate

the effectiveness of an implemented program (McNamara, 1994; Borg & Gall, 1989).

       An online faculty survey was conducted to identify computer literacy skills that

faculty members deem important for an undergraduate student to possess in order to be

academically successful at the post-secondary level.

Population and Sample

       The population for this faculty survey consisted of post-secondary faculty

members at four-year public institutions in the state of Missouri. Four-year public

institutions were determined by visiting the Missouri Department of Higher Education

Web site at http://www.cbhe.state.mo.us/Institutions/pubinst.htm. Private or independent

institutions and community colleges were not included in the population. Thus the

sampling frame consisted of all faculty members at thirteen institutions in Missouri, as

summarized in Table 1.

Table 1
Summary of 4-Year Public Institutions in Missouri
 Central Missouri State University
 Harris-Stowe State College
 Lincoln University
 Missouri Southern State University
 Missouri Western State College
 Northwest Missouri State University
 Southeast Missouri State University
 Southwest Missouri State University
 Truman State University
 University of Missouri-Columbia
 University of Missouri-Kansas City
 University of Missouri-Rolla
 University of Missouri-St. Louis


       A sampling frame is the actual list of individuals in the population (Nesbary,

2000); here it comprised approximately 4821 faculty members, from which the sample

was drawn. According to Patten (2004), the quality of the

sample affects the quality of the research generalizations. Nesbary (2000) suggests that the

larger the sample size, the greater the probability the sample will reflect the general

population. However, a large sample size alone does not guarantee the ability to generalize.

Patten (2004) states that obtaining an unbiased sample is the main criterion when

evaluating the adequacy of a sample. Patten also identifies an unbiased sample as one in

which every member of a population has an equal opportunity of being selected in the

sample. Therefore, random sampling was used in this study to help ensure an unbiased

sample population. Because random sampling may introduce sampling error, efforts

were made to reduce it, and thus increase precision, by increasing the

sample size and by using stratified random sampling. To obtain a stratified random

sample, the population was divided into strata according to institutions as shown in Table

2. Typically, for stratified random sampling, the same percentage of participants, not the

same number, is drawn from each stratum (Patten, 2004); a sketch of this proportional

allocation follows Table 2.

Table 2
Strata (subgroups) for Stratified Random Sampling
 Instructors and professors at Central Missouri State University
 Instructors and professors at Harris-Stowe State College
 Instructors and professors at Lincoln University
 Instructors and professors at Missouri Southern State University
 Instructors and professors at Missouri Western State College
 Instructors and professors at Northwest Missouri State University
 Instructors and professors at Southeast Missouri State University
 Instructors and professors at Southwest Missouri State University
 Instructors and professors at Truman State University
 Instructors and professors at University of Missouri-Columbia
 Instructors and professors at University of Missouri-Kansas City
 Instructors and professors at University of Missouri-Rolla
 Instructors and professors at University of Missouri-St. Louis
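
       To make the proportional allocation concrete, the following minimal Python
sketch draws the same percentage of faculty members from every stratum. The
institution names come from Table 2, but the member lists and the helper function
are hypothetical illustrations, not the study’s actual sampling procedure.

    import random

    # Hypothetical strata: institution names from Table 2, placeholder members.
    faculty_by_institution = {
        "Truman State University": ["faculty_001", "faculty_002", "faculty_003"],
        "Lincoln University": ["faculty_004", "faculty_005"],
        # ... one entry per institution listed in Table 2
    }

    def stratified_sample(strata, fraction, seed=1):
        """Draw the same percentage of members from each stratum."""
        rng = random.Random(seed)
        sample = []
        for members in strata.values():
            k = max(1, round(len(members) * fraction))  # at least one per stratum
            sample.extend(rng.sample(members, k))
        return sample

    # A goal of n = 357 out of N = 4821 implies a sampling fraction of about 7.4%.
    selected = stratified_sample(faculty_by_institution, fraction=357 / 4821)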

       Patten (2004) suggests that a researcher should first consider obtaining an

unbiased sample and then seek a relatively large number of participants. A table of

recommended sample sizes (n) for finite population sizes (N), developed by Krejcie and

Morgan and adapted by Patten (2004), was used to estimate the required sample size. For

an estimated population of N = 4821, the table yields a sample size goal of n = 357.
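
       The Krejcie and Morgan table is generated by a closed-form formula, so the
goal above can be checked directly. The sketch below assumes the constants used in
the published table (chi-square = 3.841 for 1 degree of freedom at the .05 level,
P = .5, d = .05); it returns 356 for N = 4821, consistent with the table value of
357 read at the nearest tabulated population size.

    import math

    def krejcie_morgan(N, chi2=3.841, P=0.5, d=0.05):
        """Krejcie & Morgan (1970) sample size for a finite population N:
        s = chi2*N*P*(1-P) / (d^2*(N-1) + chi2*P*(1-P))."""
        return math.ceil(chi2 * N * P * (1 - P)
                         / (d ** 2 * (N - 1) + chi2 * P * (1 - P)))

    print(krejcie_morgan(4821))  # 356; the published table lists 357 for N = 5000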

Survey Procedures

       In 1998, according to Nesbary (2000), Web surveys were almost non-existent in

the public sector. Nesbary conducted three surveys to

compare response rate and response time of Web surveys to regular mail surveys. Survey

results and respondent feedback from all three surveys indicated that Web surveys were

more cost-effective and easier to use, with faster response times and higher response rates. One

of Nesbary’s Web surveys was distributed to selected universities. Of those surveyed,

respondents indicated a strong preference for use of technology to take advantage of

speed and convenience.

       The researcher used a Web-based survey for the faculty survey portion of this

study. UNL IRB approval was obtained (Appendix A). Two approvals for change of

protocol were also obtained, one for changing the title of the study (Appendix B) and the

other for changing the survey format (Appendix C). From the original IRB request, the

survey was condensed to reduce the number of items, shortening the survey to increase

response rate.

Ethical Issues

       McNamara (1994) identifies five ethical concerns to be considered when

conducting survey research. These guidelines deal with voluntary participation, no harm

to respondents, anonymity and confidentiality, identifying purpose and sponsor, and

analysis and reporting. Each guideline will be addressed individually with explanations to

help eliminate or control any ethical concerns.

   First, researchers need to make sure that participation is completely voluntary.

However, voluntary participation can sometimes conflict with the need to have a high

response rate. Low return rates can introduce response bias (McNamara, 1994). In order

to encourage a high response rate, Dillman (2000) suggests multiple contacts. For this

study, up to five contacts were made per potential participant. The first email contact

(Appendix D) was sent a few days before the survey, not only to verify email

addresses, but also to inform possible participants of the importance and justification for

the study. The second email contact (Appendix E) was the actual email cover letter

explaining the study objectives in more depth. This email contained a link to the Web-

based survey and a password to enter. By clicking on the link provided and logging into

the secure site, the participant indicated agreement to participate in the research study.

The third email contact (Appendix F) was sent a week later to remind those who had not

responded. The fourth email contact (Appendix G) was sent two weeks after the actual

survey email reemphasizing the importance of faculty expertise in providing input to the

study. The fifth and final email contact (Appendix H) was sent three weeks after the

actual survey email to inform faculty that the study was drawing to a close and that their

input was valuable to the results of the study.

       McNamara’s (1994) second ethical guideline is to avoid possible harm to

respondents. This could include embarrassment or feeling uncomfortable about questions.

This study did not include sensitive questions that could cause embarrassment or

uncomfortable feelings. Harm could also arise in data analysis or in the survey results.

Solutions to these concerns are discussed under the confidentiality and report-writing

guidelines.

       A third ethical guideline is to protect a respondent’s identity. This can be

accomplished by exercising anonymity and confidentiality. A survey is anonymous when

a respondent cannot be identified on the basis of a response. A survey is confidential

when a response can be identified with a subject, but the researcher promises not to

disclose the individual’s identity (McNamara, 1994). To avoid confusion, the cover email

clearly identified the survey as confidential with regard to responses and the

reporting of results. Participant identification was kept confidential and was only used in

determining who had not responded for follow-up purposes.

       McNamara’s (1994) fourth ethical guideline is to let all prospective respondents

know the purpose of the survey and the organization that is sponsoring it. The purpose of

the study was provided in the cover email indicating a need to identify technology skills

necessary for students to be successful in their academic coursework and to determine if a

general education computer literacy/skills course should be required of all undergraduate

students. The cover email also explained that the results of the study would be used in a

dissertation as partial fulfillment for a Doctoral degree.

       The fifth ethical guideline, as described by McNamara (1994), is to accurately

report both the methods and the results of the surveys to professional colleagues in the

educational community. Because advancements in academic fields come through honesty

and openness, the researcher assumes the responsibility to report problems and

weaknesses experienced as well as the positive results of the study.

Validity and Reliability Issues

       An instrument is valid if it measures what it is intended to measure and accurately

achieves the purpose for which it was designed (Patten, 2004; Wallen & Fraenkel, 2001).

Patten (2004) emphasizes that validity is a matter of degree and discussion should focus

on how valid a test is, not whether it is valid or not. According to Patten (2004), no test

instrument is perfectly valid. The researcher needs some kind of assurance that the

instrument being used will result in accurate conclusions (Wallen & Fraenkel, 2001).

       Validity involves the appropriateness, meaningfulness, and usefulness of

inferences made by the researcher on the basis of the data collected (Wallen & Fraenkel,

2001). Assessments of validity often rest on judgment. According to Patten (2004),

content validity is determined by judgments on the appropriateness of the instrument’s

content. Patten (2004) identifies three principles to improve content validity: 1) use a

broad sample of content rather than a narrow one, 2) emphasize important material, and

3) write questions to measure the appropriate skill. These three principles were addressed

when writing the survey items. To provide additional content validity of the survey

instrument, the researcher formed a focus group of five to ten experts in the field of

computer literacy who provided feedback and suggestions on survey items. Members

of the focus group were educators at the college and/or high school level who have taught

or are currently teaching computer literacy skills. Comments from the focus group

indicated that the skills listed in the survey were basic/intermediate skills and were

appropriate for all college students to know and be able to do. Some members of the

focus group suggested that the survey might be a bit long and that skills could be

generalized and consolidated for a more concise survey. The researcher categorized

application skills and condensed the application component items from 20 per application

to eight items per application. The computer concepts component was reduced from 22

items to eight items.

       According to Patten (2004), “. . . validity is more important than reliability” (p.

71). However, reliability does need to be addressed. Reliability relates to the consistency

of the data collected (Wallen & Fraenkel, 2001). Cronbach’s coefficient alpha was used

to determine the internal reliability of the instrument. The faculty survey instrument was

tested in its entirety, and the subscales of the instrument were tested independently.
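
       As a concrete illustration of this reliability check, the short Python sketch
below computes Cronbach’s coefficient alpha from a respondents-by-items score matrix.
The toy ratings are invented for the example and are not study data.

    import numpy as np

    def cronbach_alpha(item_scores):
        """Cronbach's alpha: k/(k-1) * (1 - sum(item variances) / variance(totals))."""
        scores = np.asarray(item_scores, dtype=float)
        k = scores.shape[1]                        # number of items
        item_vars = scores.var(axis=0, ddof=1).sum()
        total_var = scores.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_vars / total_var)

    # Toy data: four respondents rating three items on the 1-4 importance scale.
    print(cronbach_alpha([[4, 3, 4], [2, 2, 3], [3, 3, 3], [4, 4, 4]]))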

Data Collection

       An informal pilot study was conducted with a small group of faculty members at

the researcher’s home institution. Conducting a local pilot study allowed the researcher to

ask participants for feedback and suggestions on the survey and also helped reduce author

bias. Once the pilot survey had been modified based on this expert feedback,

the survey was administered online to the stratified random sample.

        Participants of the study were contacted by email explaining the research

objective and asking them to participate. The objective of the research was to gather

information about technology skills, in particular, what technology skills students should

possess to be successful in their post-secondary courses. The email also contained a

link to the Web-based faculty survey and a password to enter the survey. Follow-up email

contacts were sent to increase the response rate. Upon completion of the survey,

respondents were directed to a Web page thanking them for their responses and offering

them a copy of the study results if they were interested. Screen shots of the Web-based

faculty survey are presented in Appendix I.

        The Web-based survey was conducted using surveymonkey.com, a survey

software program offered online. For a small fee, the program offered many features

including an unlimited number of survey questions, the ability to add a personalized logo,

custom redirects, result filtering, and the capability to export data for statistical analysis.

The program provided a list-management tool that tracked responses by

email address, which proved very useful for follow-up emails. The program also

provided security options, including SSL (Secure Sockets Layer), to encrypt

data in transit and protect responses.

Responses to the survey were recorded, exported to a spreadsheet, and transferred

to a statistical software package for in-depth analysis. Descriptive statistics were

calculated and data relationships were analyzed.

Variables and Measures

       Variables used in the survey have been summarized in Table 3. The variables

consisted of seven independent variables that grouped respondents by common

characteristics and five dependent variables that grouped responses by content categories.

The independent variables included professional title, institution, department/content

area, school size, gender, number of years at current institution, and total number of years

in education. The dependent variables included word processing, spreadsheet,

presentation, database, and computer concepts.

Table 3
Summary of Dependent and Independent Variables in the Faculty Survey
 Independent Variables (n = 7)                  Dependent Variables (n = 5)
 Professional title                             Word processing
 Institution                                    Spreadsheet
 Department                                     Presentation
 School size                                    Database
 Gender                                         Computer concepts
 Years as faculty member at current institution
 Years in education


Data Analysis Plan

       To begin the data analysis process, descriptive statistics were calculated on the

independent variables to summarize and describe the data collected. Survey results were

measured by category. There were five categories (subscales), representing the five

dependent variables. Responses to the survey items were coded from 1 to 4 according to

the importance of each skill: 1 represented ‘not important’, 2 ‘somewhat important’, 3

‘important’, and 4 ‘very important’. The codes for all survey items in the same

category were summed together for a composite

score per category. This category composite score was used for statistical analysis. Item

analysis was conducted to determine the internal consistency and reliability of each

individual item as well as each subscale. Cronbach’s alpha was also used to test

internal reliability.
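
       The coding-and-summing step described above is simple enough to show directly.
The pandas sketch below assumes a hypothetical layout in which each column name
carries its category as a prefix; the column names and ratings are illustrative,
not the study’s actual data file.

    import pandas as pd

    # Hypothetical responses: one row per respondent, columns prefixed by category.
    responses = pd.DataFrame({
        "wordproc_q1": [4, 3], "wordproc_q2": [3, 4],
        "spreadsheet_q1": [2, 1], "spreadsheet_q2": [3, 2],
    })

    # Composite score per category: sum the 1-4 codes of that category's items.
    composites = responses.T.groupby(lambda col: col.split("_")[0]).sum().T
    print(composites)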

        Inferential statistics were used to reach conclusions and make generalizations

about the characteristics of populations based on data collected from the sample.

Frequencies and/or percentages were used to identify computer skills that faculty

members deem important for all students to possess. Independent t-tests and/or simple

analysis of variance (ANOVA) were used to look for significant differences between the

student technology skills faculty members deem important when grouped by

department/content area, institution/stratum, gender, or years of faculty experience. The

types of tests used to answer specific research questions are summarized in

Table 4. A statistical software program, SPSS (Statistical Package for the Social Sciences),

was used for in-depth data analyses.

Table 4
Summary of Data Sources, Types and Measures Applied by Research Question

Research Question    Data Source                 Response Type   Data Type   Analysis Plan
1                    Faculty survey responses    Likert scale    Nominal     f, %
2                    Faculty survey responses    Likert scale    Nominal     t test, ANOVA
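
       As a concrete picture of the two tests named in Table 4, the sketch below runs
an independent-samples t test and a one-way ANOVA with SciPy. The study itself ran
these analyses in SPSS; the scores here are invented composite values for
illustration only.

    from scipy import stats

    # Invented composite scores grouped by gender and by institution (stratum).
    male = [24, 27, 22, 30, 26]
    female = [28, 25, 29, 31, 27]
    inst_a, inst_b, inst_c = [24, 26, 25], [29, 31, 28], [22, 27, 24]

    t, p_t = stats.ttest_ind(male, female)           # independent-samples t test
    F, p_F = stats.f_oneway(inst_a, inst_b, inst_c)  # one-way ANOVA across strata
    print(f"t = {t:.2f} (p = {p_t:.3f}); F = {F:.2f} (p = {p_F:.3f})")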

Student Assessment

Introduction

        To assist in evaluating the technology skills of students, a technology assessment

was conducted to determine computer literacy and performance skills of students entering

a post-secondary institution prior to taking a computer course at the post-secondary level.

Population and Sample

       The population for the student assessment consisted of college freshmen from a

small Midwestern university who were enrolled in a computer literacy course.

Permission from students to use their scores in the study was requested through informed

consent forms. This resulted in a sample size of 164 students.

Student Assessment Procedures

       The purpose of the student assessment was to describe specifically what a typical

student entering post-secondary education knows about computer operations and

concepts as well as the computer skills they can demonstrate proficiently.

       An additional IRB approval from Northwest Missouri State University (Appendix J)

was obtained, and student consent forms (Appendix K) were collected from participants. A

series of assessments was given to all students during the first few weeks of the computer

literacy course to determine the computer skills students possessed prior to instruction.

Measurement Instrument

The student assessment consisted of a few demographic questions and two major

components: 1) computer concepts and 2) computer application skills.

       The computer concepts component of the assessment covered six different

modules. Module one questions covered computer and information literacy, introduction

to application software, word processing concepts, and inside the system. Module two

questions covered understanding the Internet, email, system software, and exploring the

Web. Module three questions covered spreadsheet concepts, current issues, emerging

technologies, and data storage. Module four covered presentation packages, special

purpose programs, multimedia/virtual reality, and input/output. Module five questions

covered database concepts, telecommunications, and networks. Module six questions

covered creating a Web page, ethics, and security. The assessment for the computer

concepts component of the study consisted of 150 questions, 25 questions randomly

selected from each of six module test banks. The number of questions in each module test

bank ranged from 143 to 214 questions. This portion of the study was administered using

an online program called QMark (Question Mark). The assessment was automatically

graded and scores were recorded on a server at Northwest Missouri State University.
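
       The random draw just described (25 questions from each of six module test
banks) can be sketched in a few lines of Python. The bank contents below are
placeholders; the actual banks held between 143 and 214 questions each.

    import random

    # Placeholder banks: six modules of 150 questions each (real banks: 143-214).
    banks = {m: [f"module{m}_q{i}" for i in range(1, 151)] for m in range(1, 7)}

    rng = random.Random()
    assessment = [q for bank in banks.values() for q in rng.sample(bank, 25)]
    assert len(assessment) == 150  # 25 questions x 6 modules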

         The computer application skills component was assessed using a commercial

software program called SAM (Skills Assessment Manager). SAM is a unique

performance-based testing software program that utilizes realistic, powerful simulations.

The software package works just like the actual Microsoft Word, Excel, Access, and

PowerPoint applications, but without the need for preinstalled Microsoft Office software.

Course Technology, the publisher of SAM software, provided the researcher with a site

license for use in this research. The application skills assessed for this study included word

processing, spreadsheet, presentation, and database skills.

Validity and Reliability Issues

       Patten’s (2004) three principles for improving content validity, described earlier

for the faculty survey, were also addressed when developing the assessment items.

       In 1998, Course Technology introduced SAM 1997 and has continued to update

the product through SAM 2000, SAM 2003, and now SAM XP. As described above, SAM

simulates the Microsoft Office applications without requiring preinstalled software.

According to Course Technology (2002),

       SAM is becoming the provider of the most widely-used and effective technology-

       based assessment product line for Microsoft Office used in educational institutions

       today. SAM is used at high schools, colleges, career colleges, MBA programs, and

       in the workplace as a screening tool for placing people in the right training

       courses, a ‘test-out’ tool to determine students’ proficiency before they take a

       course, and a seamless, in-course assessment tool to allow students to demonstrate

       their proficiency as they go through a course.

Proficiency skills on the assessment matched categories on the faculty survey so results

could be compared.

Data Collection

       All students enrolled in the course completed the assessments during the first few

weeks of class using the SAM and QMark software to determine their computer

literacy/skill proficiency level prior to taking the computer literacy course. The

assessments were graded online, and results were available immediately. Prior to taking the

assessment, the participants provided data on a few demographic questions. A list of the

demographic questions can be found in Appendix L. Computer concepts questions were

randomly selected from a test bank of questions. Sample screen shots can be found in

Appendix M. A list of proficiency skills for the computer applications component of the

student assessment can be found in Appendix N. Student names were kept confidential to

ensure individual privacy.

Variables and Measures

       Variables used in the student assessment have been summarized in Table 5. The

variables consisted of five independent variables that grouped participants by common

characteristics and five dependent variables that grouped responses by content categories.

The independent variables included student home state, size of high school graduating

class, number of high school computer courses taken, gender, and post-secondary major

field of study. The dependent variables included computer skills grouped into five

categories: word processing, spreadsheet, presentation, database, and computer

concepts.

Table 5
Summary of Dependent and Independent Variables in the Student Assessment
        Independent Variables (n = 5)           Dependent Variables (n = 5)
Home state                                    Word processing
High school size                              Spreadsheet
Number of high school computer courses        Presentation
Gender                                        Database
Major                                         Computer concepts

Data Analysis Plan

       Results of the student assessment were recorded in a spreadsheet and transferred

to SPSS for statistical analysis. Descriptive statistics and data relationships were

calculated. Independent t-tests and simple analysis of variance (ANOVA) were used to

look for significant differences between the proficiency level of students’ technology

skills when grouped by home state, number of high school computer courses taken,

gender, and major field of study. As with the faculty survey, SPSS was used for

in-depth data analyses. The types of tests used to answer specific research questions

are summarized in Table 6.

Table 6
Summary of Data Sources, Types and Measures Applied by Research Question

Research Question    Data Source                 Response Type   Data Type   Analysis Plan
3                    Student assessment score    Percentage      Interval    f, %
4                    Student assessment score    Percentage      Interval    t test, ANOVA
5                    Student assessment score    Percentage      Interval    f, %

Más contenido relacionado

La actualidad más candente

An Example of a Qualitative Research Design
An Example of a Qualitative Research DesignAn Example of a Qualitative Research Design
An Example of a Qualitative Research Designdianakamaruddin
 
Article Review
Article ReviewArticle Review
Article Reviewfatinnah
 
Awareness and use of career information sources among secondary school studen...
Awareness and use of career information sources among secondary school studen...Awareness and use of career information sources among secondary school studen...
Awareness and use of career information sources among secondary school studen...Alexander Decker
 
TEACHER’S ATTITUDE TOWARDS UTILISING FUTURE GADGETS IN EDUCATION
TEACHER’S ATTITUDE TOWARDS UTILISING FUTURE GADGETS IN EDUCATION TEACHER’S ATTITUDE TOWARDS UTILISING FUTURE GADGETS IN EDUCATION
TEACHER’S ATTITUDE TOWARDS UTILISING FUTURE GADGETS IN EDUCATION ijcax
 
2009 educational data mining 8 43-2-pb
2009 educational data mining 8 43-2-pb2009 educational data mining 8 43-2-pb
2009 educational data mining 8 43-2-pbStefano Lariccia
 
An Analysis of Behavioral Intention toward Actual Usage of Open Source Softwa...
An Analysis of Behavioral Intention toward Actual Usage of Open Source Softwa...An Analysis of Behavioral Intention toward Actual Usage of Open Source Softwa...
An Analysis of Behavioral Intention toward Actual Usage of Open Source Softwa...IJAEMSJORNAL
 
Thesis PROPOSAL Defense Presentation - March 26
Thesis PROPOSAL Defense Presentation - March 26 Thesis PROPOSAL Defense Presentation - March 26
Thesis PROPOSAL Defense Presentation - March 26 Hermes Huang
 
DETERMINING FACTORS THAT INFLUENCE STUDENTS’ INTENTION TO ADOPT MOBILE BLACKB...
DETERMINING FACTORS THAT INFLUENCE STUDENTS’ INTENTION TO ADOPT MOBILE BLACKB...DETERMINING FACTORS THAT INFLUENCE STUDENTS’ INTENTION TO ADOPT MOBILE BLACKB...
DETERMINING FACTORS THAT INFLUENCE STUDENTS’ INTENTION TO ADOPT MOBILE BLACKB...ijma
 
A critical review journal
A critical review journalA critical review journal
A critical review journalVladimir Cuevas
 
Journal article review
Journal article reviewJournal article review
Journal article reviewrosie_sellwood
 
RESEARCH TRENDS İN EDUCATIONAL TECHNOLOGY İN TURKEY: 2010-2018 YEAR THESIS AN...
RESEARCH TRENDS İN EDUCATIONAL TECHNOLOGY İN TURKEY: 2010-2018 YEAR THESIS AN...RESEARCH TRENDS İN EDUCATIONAL TECHNOLOGY İN TURKEY: 2010-2018 YEAR THESIS AN...
RESEARCH TRENDS İN EDUCATIONAL TECHNOLOGY İN TURKEY: 2010-2018 YEAR THESIS AN...ijcax
 

La actualidad más candente (15)

An Example of a Qualitative Research Design
An Example of a Qualitative Research DesignAn Example of a Qualitative Research Design
An Example of a Qualitative Research Design
 
Article Review
Article ReviewArticle Review
Article Review
 
Article review
Article reviewArticle review
Article review
 
Awareness and use of career information sources among secondary school studen...
Awareness and use of career information sources among secondary school studen...Awareness and use of career information sources among secondary school studen...
Awareness and use of career information sources among secondary school studen...
 
Article Review
Article ReviewArticle Review
Article Review
 
TEACHER’S ATTITUDE TOWARDS UTILISING FUTURE GADGETS IN EDUCATION
TEACHER’S ATTITUDE TOWARDS UTILISING FUTURE GADGETS IN EDUCATION TEACHER’S ATTITUDE TOWARDS UTILISING FUTURE GADGETS IN EDUCATION
TEACHER’S ATTITUDE TOWARDS UTILISING FUTURE GADGETS IN EDUCATION
 
2009 educational data mining 8 43-2-pb
2009 educational data mining 8 43-2-pb2009 educational data mining 8 43-2-pb
2009 educational data mining 8 43-2-pb
 
An Analysis of Behavioral Intention toward Actual Usage of Open Source Softwa...
An Analysis of Behavioral Intention toward Actual Usage of Open Source Softwa...An Analysis of Behavioral Intention toward Actual Usage of Open Source Softwa...
An Analysis of Behavioral Intention toward Actual Usage of Open Source Softwa...
 
Thesis PROPOSAL Defense Presentation - March 26
Thesis PROPOSAL Defense Presentation - March 26 Thesis PROPOSAL Defense Presentation - March 26
Thesis PROPOSAL Defense Presentation - March 26
 
Writing chapter 1
Writing chapter 1Writing chapter 1
Writing chapter 1
 
DETERMINING FACTORS THAT INFLUENCE STUDENTS’ INTENTION TO ADOPT MOBILE BLACKB...
DETERMINING FACTORS THAT INFLUENCE STUDENTS’ INTENTION TO ADOPT MOBILE BLACKB...DETERMINING FACTORS THAT INFLUENCE STUDENTS’ INTENTION TO ADOPT MOBILE BLACKB...
DETERMINING FACTORS THAT INFLUENCE STUDENTS’ INTENTION TO ADOPT MOBILE BLACKB...
 
A critical review journal
A critical review journalA critical review journal
A critical review journal
 
Journal article review
Journal article reviewJournal article review
Journal article review
 
RESEARCH TRENDS İN EDUCATIONAL TECHNOLOGY İN TURKEY: 2010-2018 YEAR THESIS AN...
RESEARCH TRENDS İN EDUCATIONAL TECHNOLOGY İN TURKEY: 2010-2018 YEAR THESIS AN...RESEARCH TRENDS İN EDUCATIONAL TECHNOLOGY İN TURKEY: 2010-2018 YEAR THESIS AN...
RESEARCH TRENDS İN EDUCATIONAL TECHNOLOGY İN TURKEY: 2010-2018 YEAR THESIS AN...
 
CBE Life Sci Educ-2016-Maton-Bailey
CBE Life Sci Educ-2016-Maton-BaileyCBE Life Sci Educ-2016-Maton-Bailey
CBE Life Sci Educ-2016-Maton-Bailey
 

Similar a Chapter3

Review
ReviewReview
Reviewronda3
 
WHY DO LEARNERS CHOOSE ONLINE LEARNING THE LEARNERS’ VOI.docx
WHY DO LEARNERS CHOOSE ONLINE LEARNING  THE LEARNERS’ VOI.docxWHY DO LEARNERS CHOOSE ONLINE LEARNING  THE LEARNERS’ VOI.docx
WHY DO LEARNERS CHOOSE ONLINE LEARNING THE LEARNERS’ VOI.docxmansonagnus
 
James Jurica and Lori Webb - Published National Refereed Article in NATIONAL ...
James Jurica and Lori Webb - Published National Refereed Article in NATIONAL ...James Jurica and Lori Webb - Published National Refereed Article in NATIONAL ...
James Jurica and Lori Webb - Published National Refereed Article in NATIONAL ...William Kritsonis
 
Dr. Lori Webb and Dr. James Jurica, NATIONAL FORUM OF EDUCATIONAL ADMINISTRAT...
Dr. Lori Webb and Dr. James Jurica, NATIONAL FORUM OF EDUCATIONAL ADMINISTRAT...Dr. Lori Webb and Dr. James Jurica, NATIONAL FORUM OF EDUCATIONAL ADMINISTRAT...
Dr. Lori Webb and Dr. James Jurica, NATIONAL FORUM OF EDUCATIONAL ADMINISTRAT...William Kritsonis
 
www.nationalforum.com - Dr. Lorie Webb and Dr. James Jurica - NATIONAL FORUM ...
www.nationalforum.com - Dr. Lorie Webb and Dr. James Jurica - NATIONAL FORUM ...www.nationalforum.com - Dr. Lorie Webb and Dr. James Jurica - NATIONAL FORUM ...
www.nationalforum.com - Dr. Lorie Webb and Dr. James Jurica - NATIONAL FORUM ...William Kritsonis
 
Online Reading Comprehension Assessment
Online Reading Comprehension AssessmentOnline Reading Comprehension Assessment
Online Reading Comprehension AssessmentGreg Mcverry
 
Computers & Education 91 (2015) 32e45Contents lists availabl
Computers & Education 91 (2015) 32e45Contents lists availablComputers & Education 91 (2015) 32e45Contents lists availabl
Computers & Education 91 (2015) 32e45Contents lists availablLynellBull52
 
PREDICTING SUCCESS: AN APPLICATION OF DATA MINING TECHNIQUES TO STUDENT OUTCOMES
PREDICTING SUCCESS: AN APPLICATION OF DATA MINING TECHNIQUES TO STUDENT OUTCOMESPREDICTING SUCCESS: AN APPLICATION OF DATA MINING TECHNIQUES TO STUDENT OUTCOMES
PREDICTING SUCCESS: AN APPLICATION OF DATA MINING TECHNIQUES TO STUDENT OUTCOMESIJDKP
 
PREDICTING SUCCESS: AN APPLICATION OF DATA MINING TECHNIQUES TO STUDENT OUTCOMES
PREDICTING SUCCESS: AN APPLICATION OF DATA MINING TECHNIQUES TO STUDENT OUTCOMESPREDICTING SUCCESS: AN APPLICATION OF DATA MINING TECHNIQUES TO STUDENT OUTCOMES
PREDICTING SUCCESS: AN APPLICATION OF DATA MINING TECHNIQUES TO STUDENT OUTCOMESIJDKP
 
A Mixed Methods Sampling Methodology For A Multisite Case Study
A Mixed Methods Sampling Methodology For A Multisite Case StudyA Mixed Methods Sampling Methodology For A Multisite Case Study
A Mixed Methods Sampling Methodology For A Multisite Case StudyFiona Phillips
 
Student perspectives on formative feedback: an exploratory comparative study
Student perspectives on formative feedback: an exploratory comparative studyStudent perspectives on formative feedback: an exploratory comparative study
Student perspectives on formative feedback: an exploratory comparative studymcjssfs2
 
Attitude of esl chinese students towards call
Attitude of esl chinese  students towards callAttitude of esl chinese  students towards call
Attitude of esl chinese students towards callAyuni Abdullah
 
AWARENESS AND USER PATTERN OF ERESOURCES.pdf
AWARENESS AND USER PATTERN OF ERESOURCES.pdfAWARENESS AND USER PATTERN OF ERESOURCES.pdf
AWARENESS AND USER PATTERN OF ERESOURCES.pdfAyyanar k
 
EFFECT OF MIND MAPS ON STUDENTS’ INTEREST AND ACHIEVEMENT IN MEASURES OF CENT...
EFFECT OF MIND MAPS ON STUDENTS’ INTEREST AND ACHIEVEMENT IN MEASURES OF CENT...EFFECT OF MIND MAPS ON STUDENTS’ INTEREST AND ACHIEVEMENT IN MEASURES OF CENT...
EFFECT OF MIND MAPS ON STUDENTS’ INTEREST AND ACHIEVEMENT IN MEASURES OF CENT...Gabriel Ken
 
Action Research final project.pdf.pdf
Action Research final project.pdf.pdfAction Research final project.pdf.pdf
Action Research final project.pdf.pdfDaphne Smith
 
Intervention forEducationMarkis’ EdwardsJanuary 29, 2018.docx
Intervention forEducationMarkis’ EdwardsJanuary 29, 2018.docxIntervention forEducationMarkis’ EdwardsJanuary 29, 2018.docx
Intervention forEducationMarkis’ EdwardsJanuary 29, 2018.docxnormanibarber20063
 
Fuzzy Measurement of University Students Importance Indexes by Using Analytic...
Fuzzy Measurement of University Students Importance Indexes by Using Analytic...Fuzzy Measurement of University Students Importance Indexes by Using Analytic...
Fuzzy Measurement of University Students Importance Indexes by Using Analytic...IRJESJOURNAL
 

Similar a Chapter3 (20)

Review
ReviewReview
Review
 
WHY DO LEARNERS CHOOSE ONLINE LEARNING THE LEARNERS’ VOI.docx
WHY DO LEARNERS CHOOSE ONLINE LEARNING  THE LEARNERS’ VOI.docxWHY DO LEARNERS CHOOSE ONLINE LEARNING  THE LEARNERS’ VOI.docx
WHY DO LEARNERS CHOOSE ONLINE LEARNING THE LEARNERS’ VOI.docx
 
James Jurica and Lori Webb - Published National Refereed Article in NATIONAL ...
James Jurica and Lori Webb - Published National Refereed Article in NATIONAL ...James Jurica and Lori Webb - Published National Refereed Article in NATIONAL ...
James Jurica and Lori Webb - Published National Refereed Article in NATIONAL ...
 
Dr. Lori Webb and Dr. James Jurica, NATIONAL FORUM OF EDUCATIONAL ADMINISTRAT...
Dr. Lori Webb and Dr. James Jurica, NATIONAL FORUM OF EDUCATIONAL ADMINISTRAT...Dr. Lori Webb and Dr. James Jurica, NATIONAL FORUM OF EDUCATIONAL ADMINISTRAT...
Dr. Lori Webb and Dr. James Jurica, NATIONAL FORUM OF EDUCATIONAL ADMINISTRAT...
 
www.nationalforum.com - Dr. Lorie Webb and Dr. James Jurica - NATIONAL FORUM ...
www.nationalforum.com - Dr. Lorie Webb and Dr. James Jurica - NATIONAL FORUM ...www.nationalforum.com - Dr. Lorie Webb and Dr. James Jurica - NATIONAL FORUM ...
www.nationalforum.com - Dr. Lorie Webb and Dr. James Jurica - NATIONAL FORUM ...
 
Online Reading Comprehension Assessment
Online Reading Comprehension AssessmentOnline Reading Comprehension Assessment
Online Reading Comprehension Assessment
 
Computers & Education 91 (2015) 32e45Contents lists availabl
Computers & Education 91 (2015) 32e45Contents lists availablComputers & Education 91 (2015) 32e45Contents lists availabl
Computers & Education 91 (2015) 32e45Contents lists availabl
 
PREDICTING SUCCESS: AN APPLICATION OF DATA MINING TECHNIQUES TO STUDENT OUTCOMES
PREDICTING SUCCESS: AN APPLICATION OF DATA MINING TECHNIQUES TO STUDENT OUTCOMESPREDICTING SUCCESS: AN APPLICATION OF DATA MINING TECHNIQUES TO STUDENT OUTCOMES
PREDICTING SUCCESS: AN APPLICATION OF DATA MINING TECHNIQUES TO STUDENT OUTCOMES
 
PREDICTING SUCCESS: AN APPLICATION OF DATA MINING TECHNIQUES TO STUDENT OUTCOMES
PREDICTING SUCCESS: AN APPLICATION OF DATA MINING TECHNIQUES TO STUDENT OUTCOMESPREDICTING SUCCESS: AN APPLICATION OF DATA MINING TECHNIQUES TO STUDENT OUTCOMES
PREDICTING SUCCESS: AN APPLICATION OF DATA MINING TECHNIQUES TO STUDENT OUTCOMES
 
A Mixed Methods Sampling Methodology For A Multisite Case Study
A Mixed Methods Sampling Methodology For A Multisite Case StudyA Mixed Methods Sampling Methodology For A Multisite Case Study
A Mixed Methods Sampling Methodology For A Multisite Case Study
 
Student perspectives on formative feedback: an exploratory comparative study
Student perspectives on formative feedback: an exploratory comparative studyStudent perspectives on formative feedback: an exploratory comparative study
Student perspectives on formative feedback: an exploratory comparative study
 
First monday
First mondayFirst monday
First monday
 
Attitude of esl chinese students towards call
Attitude of esl chinese  students towards callAttitude of esl chinese  students towards call
Attitude of esl chinese students towards call
 
AWARENESS AND USER PATTERN OF ERESOURCES.pdf
AWARENESS AND USER PATTERN OF ERESOURCES.pdfAWARENESS AND USER PATTERN OF ERESOURCES.pdf
AWARENESS AND USER PATTERN OF ERESOURCES.pdf
 
EFFECT OF MIND MAPS ON STUDENTS’ INTEREST AND ACHIEVEMENT IN MEASURES OF CENT...
EFFECT OF MIND MAPS ON STUDENTS’ INTEREST AND ACHIEVEMENT IN MEASURES OF CENT...EFFECT OF MIND MAPS ON STUDENTS’ INTEREST AND ACHIEVEMENT IN MEASURES OF CENT...
EFFECT OF MIND MAPS ON STUDENTS’ INTEREST AND ACHIEVEMENT IN MEASURES OF CENT...
 
Action Research final project.pdf.pdf
Action Research final project.pdf.pdfAction Research final project.pdf.pdf
Action Research final project.pdf.pdf
 
Research-Info.docx
Research-Info.docxResearch-Info.docx
Research-Info.docx
 
First monday
First mondayFirst monday
First monday
 
Intervention forEducationMarkis’ EdwardsJanuary 29, 2018.docx
Intervention forEducationMarkis’ EdwardsJanuary 29, 2018.docxIntervention forEducationMarkis’ EdwardsJanuary 29, 2018.docx
Intervention forEducationMarkis’ EdwardsJanuary 29, 2018.docx
 
Fuzzy Measurement of University Students Importance Indexes by Using Analytic...
Fuzzy Measurement of University Students Importance Indexes by Using Analytic...Fuzzy Measurement of University Students Importance Indexes by Using Analytic...
Fuzzy Measurement of University Students Importance Indexes by Using Analytic...
 

Último

Introduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptxIntroduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptxpboyjonauth
 
Sanyam Choudhary Chemistry practical.pdf
Sanyam Choudhary Chemistry practical.pdfSanyam Choudhary Chemistry practical.pdf
Sanyam Choudhary Chemistry practical.pdfsanyamsingh5019
 
Mastering the Unannounced Regulatory Inspection
Mastering the Unannounced Regulatory InspectionMastering the Unannounced Regulatory Inspection
Mastering the Unannounced Regulatory InspectionSafetyChain Software
 
URLs and Routing in the Odoo 17 Website App
URLs and Routing in the Odoo 17 Website AppURLs and Routing in the Odoo 17 Website App
URLs and Routing in the Odoo 17 Website AppCeline George
 
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptxPOINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptxSayali Powar
 
Z Score,T Score, Percential Rank and Box Plot Graph
Z Score,T Score, Percential Rank and Box Plot GraphZ Score,T Score, Percential Rank and Box Plot Graph
Z Score,T Score, Percential Rank and Box Plot GraphThiyagu K
 
The basics of sentences session 2pptx copy.pptx
The basics of sentences session 2pptx copy.pptxThe basics of sentences session 2pptx copy.pptx
The basics of sentences session 2pptx copy.pptxheathfieldcps1
 
Advanced Views - Calendar View in Odoo 17
Advanced Views - Calendar View in Odoo 17Advanced Views - Calendar View in Odoo 17
Advanced Views - Calendar View in Odoo 17Celine George
 
Nutritional Needs Presentation - HLTH 104
Nutritional Needs Presentation - HLTH 104Nutritional Needs Presentation - HLTH 104
Nutritional Needs Presentation - HLTH 104misteraugie
 
Paris 2024 Olympic Geographies - an activity
Paris 2024 Olympic Geographies - an activityParis 2024 Olympic Geographies - an activity
Paris 2024 Olympic Geographies - an activityGeoBlogs
 
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...EduSkills OECD
 
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdfssuser54595a
 
Beyond the EU: DORA and NIS 2 Directive's Global Impact
Beyond the EU: DORA and NIS 2 Directive's Global ImpactBeyond the EU: DORA and NIS 2 Directive's Global Impact
Beyond the EU: DORA and NIS 2 Directive's Global ImpactPECB
 
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...Marc Dusseiller Dusjagr
 
Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...
Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...
Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...Krashi Coaching
 
How to Make a Pirate ship Primary Education.pptx
How to Make a Pirate ship Primary Education.pptxHow to Make a Pirate ship Primary Education.pptx
How to Make a Pirate ship Primary Education.pptxmanuelaromero2013
 
Industrial Policy - 1948, 1956, 1973, 1977, 1980, 1991
Industrial Policy - 1948, 1956, 1973, 1977, 1980, 1991Industrial Policy - 1948, 1956, 1973, 1977, 1980, 1991
Industrial Policy - 1948, 1956, 1973, 1977, 1980, 1991RKavithamani
 
Employee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptxEmployee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptxNirmalaLoungPoorunde1
 
A Critique of the Proposed National Education Policy Reform
A Critique of the Proposed National Education Policy ReformA Critique of the Proposed National Education Policy Reform
A Critique of the Proposed National Education Policy ReformChameera Dedduwage
 

Último (20)

Código Creativo y Arte de Software | Unidad 1
Código Creativo y Arte de Software | Unidad 1Código Creativo y Arte de Software | Unidad 1
Código Creativo y Arte de Software | Unidad 1
 
Introduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptxIntroduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptx
 
Sanyam Choudhary Chemistry practical.pdf
Sanyam Choudhary Chemistry practical.pdfSanyam Choudhary Chemistry practical.pdf
Sanyam Choudhary Chemistry practical.pdf
 
Mastering the Unannounced Regulatory Inspection
Mastering the Unannounced Regulatory InspectionMastering the Unannounced Regulatory Inspection
Mastering the Unannounced Regulatory Inspection
 
URLs and Routing in the Odoo 17 Website App
URLs and Routing in the Odoo 17 Website AppURLs and Routing in the Odoo 17 Website App
URLs and Routing in the Odoo 17 Website App
 
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptxPOINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
 
Z Score,T Score, Percential Rank and Box Plot Graph
Z Score,T Score, Percential Rank and Box Plot GraphZ Score,T Score, Percential Rank and Box Plot Graph
Z Score,T Score, Percential Rank and Box Plot Graph
 
The basics of sentences session 2pptx copy.pptx
The basics of sentences session 2pptx copy.pptxThe basics of sentences session 2pptx copy.pptx
The basics of sentences session 2pptx copy.pptx
 
Advanced Views - Calendar View in Odoo 17
Advanced Views - Calendar View in Odoo 17Advanced Views - Calendar View in Odoo 17
Advanced Views - Calendar View in Odoo 17
 
Nutritional Needs Presentation - HLTH 104
Nutritional Needs Presentation - HLTH 104Nutritional Needs Presentation - HLTH 104
Nutritional Needs Presentation - HLTH 104
 
Paris 2024 Olympic Geographies - an activity
Paris 2024 Olympic Geographies - an activityParis 2024 Olympic Geographies - an activity
Paris 2024 Olympic Geographies - an activity
 
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
 
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
 
Beyond the EU: DORA and NIS 2 Directive's Global Impact
Beyond the EU: DORA and NIS 2 Directive's Global ImpactBeyond the EU: DORA and NIS 2 Directive's Global Impact
Beyond the EU: DORA and NIS 2 Directive's Global Impact
 
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
 
Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...
Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...
Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...
 
How to Make a Pirate ship Primary Education.pptx
How to Make a Pirate ship Primary Education.pptxHow to Make a Pirate ship Primary Education.pptx
How to Make a Pirate ship Primary Education.pptx
 
Industrial Policy - 1948, 1956, 1973, 1977, 1980, 1991
Industrial Policy - 1948, 1956, 1973, 1977, 1980, 1991Industrial Policy - 1948, 1956, 1973, 1977, 1980, 1991
Industrial Policy - 1948, 1956, 1973, 1977, 1980, 1991
 
Employee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptxEmployee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptx
 
A Critique of the Proposed National Education Policy Reform
A Critique of the Proposed National Education Policy ReformA Critique of the Proposed National Education Policy Reform
A Critique of the Proposed National Education Policy Reform
 

Chapter3

  • 1. 36 CHAPTER 3 RESEARCH DESIGN AND METHODOLOGY Introduction The review of literature has produced reoccurring themes emphasizing the importance of technological literacy for citizens in the 21st Century (Garfinkel, 2003; Hall, 2001; Lemke, 2003; Murray, 2003; NAE, 2002; Partnership for 21st Century Skills, 2003; Rose & Dugger, 2003; Zhao & Alexander, 2002; U.S. Department of Education, 2004; Technology Counts, 2005). Education is a critical component in preparing students for a knowledge-based, digital society. According to Hall (2001), available technologies, our perceptions of those technologies, and how they are used will determine the shape of our world. Citizens of the future will face challenges that depend on the development and application of technology. Are we preparing students, the citizens of tomorrow, for these challenges? Purpose of the Study This study developed and implemented a faculty survey and a student assessment. The purpose of the faculty survey was to determine what basic computer skills are needed by undergraduate students for academic success in post-secondary education. This phase of the study examined the data collected for trends and differences between the independent variables of subject/content area, institution, gender, and years of faculty experience. The purpose of the student assessment was to evaluate the computer competencies of students entering a post-secondary education. This phase of the study examined the data collected for trends and differences between the independent variables
  • 2. 37 of home state, number of high school computer courses taken, gender, and major field of study? Data collection and analysis assisted in determining if students possess the necessary computer/technology skills entering a post-secondary institution or if a need exists for a general education course to teach computer literacy/skills to the undergraduate student population. This study also provided valuable information in regards to the content of such a course. Research Questions 1. What technology skills do post-secondary faculty members deem important for all students to possess at the college level? 2. Are there differences between the student technology skills post-secondary faculty members deem important when grouped by subject/content area, institution/stratum, gender, years of faculty experience? 3. What technology skills can students demonstrate proficiently upon entering a post-secondary institution? 4. Are there differences between the proficiency level of students’ technology skills when grouped by home state, number of high school computer courses, gender, or major field of study? 5. Are students technologically ready entering post-secondary education or does a need exist for a computer literacy/skills course for all undergraduate students? Instrumentation Two instruments were employed for data collection in this research study: a faculty survey and a student assessment. A faculty survey was designed by the researcher
  • 3. 38 to help identify technology/computer skills deemed important for undergraduate students to possess in order to be successful in their post-secondary endeavors. A survey research design was applied to investigate the research questions. A second instrument was developed and implemented to assess technology skills of freshmen undergraduate students who had not yet taken a post-secondary computer literacy/skills course. A description of the two instruments used in this study follows. Faculty Survey Introduction According to Leedy and Ormrod (2001), “Research is a viable approach to a problem only when there are data to support it” (p. 94). Nesbary (2000) defines survey research as “the process of collecting representative sample data from a larger population and using the sample to infer attributes of the population” (p. 10). The main purpose of a survey is to estimate, with significant precision, the percentage of population that has a specific attribute by collecting data from a small portion of the total population (Dillman, 2000; Wallen & Fraenkel, 2001). The researcher wanted to find out from members of the population their view on one or more variables. As noted by Borg and Gall (1989), studies involving surveys comprise a significant amount of the research done in the education field. Data are ever-changing and survey research portrays a brief moment in time to enhance our understanding of the present (Leedy & Ormrod, 2001). Educational surveys are often used to assist in planning and decision making, as well as to evaluate the effectiveness of an implemented program (McNamara, 1994; Borg & Gall, 1989).
An online faculty survey was conducted to identify the computer literacy skills that faculty members deem important for an undergraduate student to possess in order to be academically successful at the post-secondary level.

Population and Sample

The population for this faculty survey consisted of post-secondary faculty members at four-year public institutions in the state of Missouri. Four-year public institutions were identified using the Missouri Department of Higher Education Web site at http://www.cbhe.state.mo.us/Institutions/pubinst.htm. Private or independent institutions and community colleges were not included in the population. Thus the sampling frame consisted of all faculty members at the thirteen institutions in Missouri summarized in Table 1.

Table 1

Summary of 4-Year Public Institutions in Missouri

Central Missouri State University
Harris-Stowe State College
Lincoln University
Missouri Southern State University
Missouri Western State College
Northwest Missouri State University
Southeast Missouri State University
Southwest Missouri State University
Truman State University
University of Missouri-Columbia
University of Missouri-Kansas City
University of Missouri-Rolla
University of Missouri-St. Louis

A sampling frame is the actual list of individuals included in the population (Nesbary, 2000); in this study, the frame comprised approximately 4,821 faculty members, from which a sample was drawn.
According to Patten (2004), the quality of the sample affects the quality of the research generalizations. Nesbary (2000) suggests that the larger the sample size, the greater the probability the sample will reflect the general population. However, sample size alone does not guarantee the ability to generalize. Patten (2004) states that obtaining an unbiased sample is the main criterion when evaluating the adequacy of a sample, and identifies an unbiased sample as one in which every member of the population has an equal opportunity of being selected. Therefore, random sampling was used in this study to help ensure an unbiased sample.

Because random sampling may still introduce sampling error, efforts were made to reduce sampling error, and thus increase precision, by increasing the sample size and by using stratified random sampling. To obtain a stratified random sample, the population was divided into strata according to institution, as shown in Table 2. Typically, for stratified random sampling, the same percentage of participants, not the same number of participants, is drawn from each stratum (Patten, 2004); a procedural sketch of this allocation follows Table 2.

Table 2

Strata (Subgroups) for Stratified Random Sampling

Instructors and professors at Central Missouri State University
Instructors and professors at Harris-Stowe State College
Instructors and professors at Lincoln University
Instructors and professors at Missouri Southern State University
Instructors and professors at Missouri Western State College
Instructors and professors at Northwest Missouri State University
Instructors and professors at Southeast Missouri State University
Instructors and professors at Southwest Missouri State University
Instructors and professors at Truman State University
Instructors and professors at University of Missouri-Columbia
Instructors and professors at University of Missouri-Kansas City
Instructors and professors at University of Missouri-Rolla
Instructors and professors at University of Missouri-St. Louis
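To make the allocation concrete, the following is a minimal sketch of proportional stratified random sampling in Python. The per-institution frame sizes, identifier names, and the 7.4% sampling fraction (357/4821) are illustrative assumptions, not the actual faculty lists used in the study.

```python
import random

# Hypothetical sampling frame: institution -> list of faculty identifiers.
# Real per-institution frame sizes were not reported; these are placeholders.
frame = {
    "Truman State University": [f"truman_{i}" for i in range(350)],
    "Lincoln University": [f"lincoln_{i}" for i in range(250)],
    "University of Missouri-Columbia": [f"umc_{i}" for i in range(1500)],
    # ... the remaining ten strata would be listed the same way
}

sampling_fraction = 357 / 4821  # target sample / frame size (about 7.4%)

random.seed(42)  # for reproducibility of the illustration
sample = []
for institution, faculty in frame.items():
    # Same percentage, not the same number, drawn from each stratum.
    k = round(len(faculty) * sampling_fraction)
    sample.extend(random.sample(faculty, k))

print(f"Total sampled: {len(sample)}")
```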
Patten (2004) suggests that a researcher should first consider obtaining an unbiased sample and then seek a relatively large number of participants. A table of recommended sample sizes (n) for populations (N) of finite size, developed by Krejcie and Morgan and adapted by Patten (2004), was used to estimate the required sample size. Based on that table, and for the purposes of this study, an estimated population size of N = 4821 yielded a sample size goal of n = 357.
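The tabled values can be reproduced from the Krejcie and Morgan (1970) formula. A worked example, assuming the conventional parameters (chi-square value of 3.841 at one degree of freedom for 95% confidence, population proportion 0.5, and a 5% margin of error):

```python
import math

def krejcie_morgan(N, chi2=3.841, P=0.5, d=0.05):
    """Required sample size for a finite population of size N."""
    return math.ceil(chi2 * N * P * (1 - P) / (d**2 * (N - 1) + chi2 * P * (1 - P)))

# Gives 356 for N = 4821; the published table rounds slightly differently
# and lists 357 for populations of roughly this size.
print(krejcie_morgan(4821))
```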
Survey Procedures

In 1998, according to Nesbary (2000), Web surveys were almost non-existent in the public sector. Nesbary decided to test the waters and conducted three surveys comparing the response rates and response times of Web surveys with those of regular mail surveys. Survey results and respondent feedback from all three surveys indicated that Web surveys were more cost effective, easier to use, faster to return, and drew more responses. One of Nesbary's Web surveys was distributed to selected universities; of those surveyed, respondents indicated a strong preference for the use of technology to take advantage of its speed and convenience. Accordingly, the researcher used a Web-based survey for the faculty survey portion of this study.

UNL IRB approval was obtained (Appendix A). Two approvals for change of protocol were also obtained, one for changing the title of the study (Appendix B) and the other for changing the survey format (Appendix C). From the original IRB request, the survey was condensed to reduce the number of items, shortening the survey in order to increase the response rate.

Ethical Issues

McNamara (1994) identifies five ethical concerns to be considered when conducting survey research, dealing with voluntary participation, no harm to respondents, anonymity and confidentiality, identification of purpose and sponsor, and analysis and reporting. Each guideline is addressed below with an explanation of how ethical concerns were eliminated or controlled.

First, researchers need to ensure that participation is completely voluntary. However, voluntary participation can conflict with the need for a high response rate, and low return rates can introduce response bias (McNamara, 1994). To encourage a high response rate, Dillman (2000) suggests multiple contacts. For this study, up to five contacts were made per potential participant. The first email contact (Appendix D) was sent a few days preceding the survey, both to verify email addresses and to inform possible participants of the importance of and justification for the study. The second email contact (Appendix E) was the actual cover letter explaining the study objectives in more depth; it contained a link to the Web-based survey and a password for entry. By clicking on the link provided and logging into the secure site, the participant indicated agreement to participate in the research study. The third email contact (Appendix F) was sent a week later as a reminder to those who had not responded. The fourth email contact (Appendix G) was sent two weeks after the survey email, reemphasizing the importance of faculty expertise as input to the study. The fifth and final email contact (Appendix H) was sent three weeks after the
actual survey email, informing faculty that the study was drawing to a close and that their input was valuable to the results of the study.

McNamara's (1994) second ethical guideline is to avoid possible harm to respondents, which could include embarrassment or discomfort with particular questions. This study did not include sensitive questions that could cause embarrassment or discomfort. Harm could also arise in data analysis or in the survey results; solutions to these potential harms are discussed under the confidentiality and report-writing guidelines.

A third ethical guideline is to protect a respondent's identity, which can be accomplished through anonymity and confidentiality. A survey is anonymous when a respondent cannot be identified on the basis of a response; a survey is confidential when a response can be identified with a subject but the researcher promises not to disclose the individual's identity (McNamara, 1994). To avoid confusion, the cover email clearly identified the survey as confidential with regard to responses and the reporting of results. Participant identification was kept confidential and was used only to determine who had not responded, for follow-up purposes.

McNamara's (1994) fourth ethical guideline is to let all prospective respondents know the purpose of the survey and the organization sponsoring it. The purpose of the study was provided in the cover email, indicating a need to identify technology skills necessary for students to be successful in their academic coursework and to determine whether a general education computer literacy/skills course should be required of all undergraduate
students. The cover email also explained that the results of the study would be used in a dissertation in partial fulfillment of a doctoral degree.

The fifth ethical guideline, as described by McNamara (1994), is to accurately report both the methods and the results of the survey to professional colleagues in the educational community. Because advancements in academic fields come through honesty and openness, the researcher assumed the responsibility to report problems and weaknesses experienced as well as the positive results of the study.

Validity and Reliability Issues

An instrument is valid if it measures what it is intended to measure and accurately achieves the purpose for which it was designed (Patten, 2004; Wallen & Fraenkel, 2001). Patten (2004) emphasizes that validity is a matter of degree and that discussion should focus on how valid a test is, not on whether it is valid; no test instrument is perfectly valid. The researcher needs some assurance that the instrument being used will lead to accurate conclusions (Wallen & Fraenkel, 2001). Validity involves the appropriateness, meaningfulness, and usefulness of the inferences the researcher makes on the basis of the data collected (Wallen & Fraenkel, 2001), and is thus largely a matter of judgment.

According to Patten (2004), content validity is determined by judgments about the appropriateness of the instrument's content. Patten identifies three principles for improving content validity: 1) use a broad sample of content rather than a narrow one, 2) emphasize important material, and 3) write questions that measure the appropriate skill. These three principles were addressed when writing the survey items. To provide additional content validity of the survey
instrument, the researcher formed a focus group of five to ten experts in the field of computer literacy who provided input and suggestions on the survey items. Members of the focus group were educators at the college and/or high school level who have taught or are currently teaching computer literacy skills. Comments from the focus group indicated that the skills listed in the survey were basic-to-intermediate skills appropriate for all college students to know and be able to perform. Some members of the focus group suggested that the survey might be somewhat long and that skills could be generalized and consolidated into a more concise survey. In response, the researcher categorized the application skills, condensing the application component from 20 items per application to eight items per application and the computer concepts component from 22 items to eight items.

According to Patten (2004), “. . . validity is more important than reliability” (p. 71). However, reliability must also be addressed. Reliability relates to the consistency of the data collected (Wallen & Fraenkel, 2001). Cronbach's coefficient alpha was used to determine the internal reliability of the instrument; the faculty survey was tested in its entirety, and the subscales of the instrument were tested independently.
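For reference, Cronbach's alpha can be computed directly from the item responses. A minimal sketch, assuming responses are arranged as a respondents-by-items matrix; the simulated data stand in for the actual survey responses:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: 2-D array, rows = respondents, columns = survey items."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# Hypothetical 1-4 Likert responses for one eight-item subscale,
# generated from a shared latent rating so the items correlate.
rng = np.random.default_rng(0)
trait = rng.normal(0.0, 1.0, size=(50, 1))
noise = rng.normal(0.0, 0.7, size=(50, 8))
subscale = np.clip(np.round(2.5 + trait + noise), 1, 4)
print(round(cronbach_alpha(subscale), 3))
```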
Data Collection

An informal pilot study was conducted with a small group of faculty members at the researcher's home institution. Conducting a local pilot study allowed the researcher to ask participants for feedback on the survey and also helped reduce author bias. Once the pilot survey had been modified per the educational experts' feedback, the survey was administered online to the stratified random sample.

Participants were contacted by email explaining the research objective and asking them to participate. The objective of the research was to gather information about technology skills, in particular which technology skills students should possess to be successful in their post-secondary courses. The email also contained a link to the Web-based faculty survey and a password to enter the survey. Follow-up email contacts were sent to increase the response rate. Upon completion of the survey, each respondent was directed to a Web page thanking them for their response and offering a copy of the study results to those interested. Screen shots of the Web-based faculty survey are presented in Appendix I.

The Web-based survey was conducted using surveymonkey.com, an online survey program. For a small fee, the program offered many features, including an unlimited number of survey questions, the ability to add a personalized logo, custom redirects, result filtering, and the capability to export data for statistical analysis. The program provided a list management tool through which responses could be tracked by email address, which proved very useful for follow-up emails. The program also provided security options, including SSL (Secure Sockets Layer) for data encryption and protection. Responses to the survey were recorded, exported to a spreadsheet, and transferred to a statistical software package for in-depth analysis, where descriptive statistics were calculated and data relationships were analyzed.
Variables and Measures

The variables used in the survey are summarized in Table 3. They consisted of seven independent variables that grouped respondents by common characteristics and five dependent variables that grouped responses by content category. The independent variables were professional title, institution, department/content area, school size, gender, number of years at the current institution, and total number of years in education. The dependent variables were word processing, spreadsheet, presentation, database, and computer concepts.

Table 3

Summary of Dependent and Independent Variables in the Faculty Survey

Independent Variables (n = 7)                    Dependent Variables (n = 5)
Professional title                               Word processing
Institution                                      Spreadsheet
Department                                       Presentation
School size                                      Database
Gender                                           Computer concepts
Years as faculty member at current institution
Years in education

Data Analysis Plan

To begin the data analysis process, descriptive statistics were calculated on the independent variables to summarize and describe the data collected. Survey results were measured by category; there were five categories (subscales), representing the five dependent variables. Responses to the survey items were coded from 1 to 4 according to the importance of each skill: one represented 'not important', two 'somewhat important', three 'important', and four 'very important'. The codes for all survey items in the same category were summed to produce a composite score per category, and this composite score was used for statistical analysis.
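A minimal sketch of the coding and composite scoring just described, using pandas; the item names, subscale groupings, and sample values are illustrative assumptions, not the actual survey data:

```python
import pandas as pd

# Hypothetical coded responses (1 = not important ... 4 = very important)
# for two respondents, on two-item word processing and spreadsheet subscales.
responses = pd.DataFrame({
    "wp_format_text": [4, 3],
    "wp_insert_table": [3, 2],
    "ss_create_chart": [2, 4],
    "ss_use_formulas": [3, 4],
})

subscales = {
    "word_processing": ["wp_format_text", "wp_insert_table"],
    "spreadsheet": ["ss_create_chart", "ss_use_formulas"],
}

# Composite score per category: sum of the item codes in that category.
composites = pd.DataFrame({
    name: responses[cols].sum(axis=1) for name, cols in subscales.items()
})
print(composites)
```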
Item analysis was conducted to determine the internal consistency and reliability of each individual item as well as of each subscale, and Cronbach's alpha was used to test internal reliability. Inferential statistics were then used to reach conclusions and make generalizations about the characteristics of the population based on the data collected from the sample. Frequencies and/or percentages were used to identify the computer skills that faculty members deem important for all students to possess. Independent t-tests and/or simple analysis of variance (ANOVA) were used to look for significant differences between the student technology skills faculty members deem important when grouped by department/content area, institution/stratum, gender, or years of faculty experience. The tests used to answer specific research questions are summarized in Table 4, and a sketch of these comparison tests follows the table. SPSS (Statistical Package for the Social Sciences) was used for in-depth data analyses.

Table 4

Summary of Data Sources, Types and Measures Applied by Research Question

Research Question  Data Source               Response Type  Data Type  Analysis Plan
1                  Faculty survey responses  Likert scale   Nominal    f, %
2                  Faculty survey responses  Likert scale   Nominal    t test, ANOVA
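As an illustration of the comparisons in Table 4, the following sketch runs an independent t-test (two groups, e.g., gender) and a one-way ANOVA (several groups, e.g., institution/stratum) on composite scores, using SciPy in place of SPSS; the data are fabricated placeholders:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical word processing composite scores grouped by gender.
male_scores = rng.normal(24, 4, size=60)
female_scores = rng.normal(25, 4, size=70)
t_stat, t_p = stats.ttest_ind(male_scores, female_scores)
print(f"t = {t_stat:.2f}, p = {t_p:.3f}")

# Hypothetical composite scores grouped by three institutions/strata.
groups = [rng.normal(mu, 4, size=40) for mu in (23, 24, 26)]
f_stat, f_p = stats.f_oneway(*groups)
print(f"F = {f_stat:.2f}, p = {f_p:.3f}")
```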
Student Assessment

Introduction

To assist in evaluating the technology skills of students, a technology assessment was conducted to determine the computer literacy and performance skills of students entering a post-secondary institution, prior to their taking a computer course at the post-secondary level.

Population and Sample

The population for the student assessment consisted of college freshmen at a small mid-western university enrolled in a computer literacy course. Permission to use students' scores in the study was requested through informed consent forms, resulting in a sample size of 164 students.

Student Assessment Procedures

The purpose of the student assessment was to describe specifically what a typical student entering post-secondary education knows about computer operations and concepts, as well as which computer skills they can demonstrate proficiently. Northwest Missouri State University IRB approval (Appendix J) was obtained, and student consent forms (Appendix K) were collected from participants. A series of assessments was given to all students during the first few weeks of a computer literacy course to determine the computer skills students possessed prior to taking the course.

Measurement Instrument

The student assessment consisted of a few demographic questions and two major components: 1) computer concepts and 2) computer application skills. The computer concepts component of the assessment covered six modules. Module one covered computer and information literacy, introduction to application software, word processing concepts, and inside the system. Module two covered understanding the Internet, email, system software, and exploring the Web. Module three covered spreadsheet concepts, current issues, emerging technologies, and data storage.
Module four covered presentation packages, special-purpose programs, multimedia/virtual reality, and input/output. Module five covered database concepts, telecommunications, and networks. Module six covered creating a Web page, ethics, and security.

The computer concepts assessment consisted of 150 questions: 25 questions randomly selected from each of the six module test banks, whose sizes ranged from 143 to 214 questions. This portion of the study was administered using an online program called QMark (Question Mark). The assessment was automatically graded, and scores were recorded on a server at Northwest Missouri State University.
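The random selection of 25 questions per module can be sketched as follows; the bank sizes fall within the reported range, but the bank contents are placeholders rather than the actual QMark test banks:

```python
import random

# Hypothetical module test banks (question IDs only); the real banks
# held between 143 and 214 questions each.
bank_sizes = {1: 143, 2: 180, 3: 214, 4: 160, 5: 150, 6: 175}
banks = {m: [f"m{m}_q{i}" for i in range(size)] for m, size in bank_sizes.items()}

random.seed(7)
assessment = []
for module, questions in banks.items():
    assessment.extend(random.sample(questions, 25))  # 25 per module

print(len(assessment))  # 150 questions in total
```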
The computer application skills component was assessed using a commercial software program called SAM (Skills Assessment Manager). SAM is a performance-based testing program built on realistic simulations: it behaves like the actual Microsoft Word, Excel, Access, and PowerPoint applications but does not require preinstalled Microsoft Office software. Course Technology, the publisher of SAM, provided the researcher with a site license for use in this research. The application skills assessed in this study were word processing, spreadsheet, presentation, and database skills.

Validity and Reliability Issues

Patten's (2004) three principles for improving content validity, 1) use a broad sample of content rather than a narrow one, 2) emphasize important material, and 3) write questions that measure the appropriate skill, were addressed when developing the assessment items.

In 1998 Course Technology introduced SAM 1997, and it has continued to update the product through SAM 2000, SAM 2003, and now SAM XP. Course Technology describes SAM as a powerful testing and reporting tool and, according to Course Technology (2002), SAM is becoming the most widely used and effective technology-based assessment product line for Microsoft Office in educational institutions today. SAM is used at high schools, colleges, career colleges, MBA programs, and in the workplace: as a screening tool for placing people in the right training courses, as a 'test-out' tool to determine students' proficiency before they take a course, and as a seamless in-course assessment tool allowing students to demonstrate their proficiency as they progress through a course. The proficiency skills on the assessment matched the categories on the faculty survey so that results could be compared.

Data Collection

All students enrolled in the course completed the assessments during the first few weeks of class using the SAM and QMark software to determine their computer literacy/skill proficiency level prior to taking the computer literacy course. The assessments were graded online and the results were immediate. Prior to taking the assessment, participants answered a few demographic questions. A list of the
demographic questions can be found in Appendix L. The computer concepts questions were randomly selected from a test bank of questions; sample screen shots can be found in Appendix M. A list of the proficiency skills for the computer applications component of the student assessment can be found in Appendix N. Student names were kept confidential to ensure individual privacy.

Variables and Measures

The variables used in the student assessment are summarized in Table 5. They consisted of five independent variables that grouped participants by common characteristics and five dependent variables that grouped results by content category. The independent variables were student home state, size of high school graduating class, number of high school computer courses taken, gender, and post-secondary major field of study. The dependent variables were the computer skills grouped into five categories: word processing, spreadsheet, presentation, database, and computer concepts.

Table 5

Summary of Dependent and Independent Variables in the Student Assessment

Independent Variables (n = 5)             Dependent Variables (n = 5)
Home state                                Word processing
High school size                          Spreadsheet
Number of high school computer courses    Presentation
Gender                                    Database
Major                                     Computer concepts

Data Analysis Plan

Results of the student assessment were recorded in a spreadsheet and transferred to SPSS for statistical analysis. Descriptive statistics and data relationships were calculated. Independent t-tests and simple analysis of variance (ANOVA) were used to
look for significant differences between the proficiency levels of students' technology skills when grouped by home state, number of high school computer courses taken, gender, and major field of study. As with the faculty survey, SPSS was used for the in-depth data analyses. The tests used to answer specific research questions are summarized in Table 6.

Table 6

Summary of Data Sources, Types and Measures Applied by Research Question

Research Question  Data Source               Response Type  Data Type  Analysis Plan
3                  Student assessment score  Percentage     Interval   f, %
4                  Student assessment score  Percentage     Interval   t test, ANOVA
5                  Student assessment score  Percentage     Interval   f, %