Assessment Solutions
White Paper
October 2007
Table of Contents

EXECUTIVE SUMMARY
INTRODUCTION
ITEM AUTHORING
  EXAM OBJECTIVES
  BLUEPRINTING
  TEST CREATION
  REVIEWING
  FIELD TESTING
HOSTED ONLINE ASSESSMENT ENGINE
  AUTHORING AND QUESTION BANK MAINTENANCE
    Practice and Certification Tests
    Review / Workflow
  TEST CONFIGURATION & DELIVERY
    Test Generation and Delivery
    Test Instruction
    Question Pages
    Review Questions
    Feedback and Hints
    Marks Configuration
    Auto Grading
    Importing and Exporting Tests
    Proctoring
  REPORTING
    Individual Student Performance Chart
    Comparative Analysis
    Individual Student Activity Details
    Performance
  SCHEDULING AND ADMINISTRATION
    User Management
    Test Schedule Management
    Extensibility
    Multilingual
    High Availability
    Open API based Architecture
    Ease of Customization
HOSTING
ONLINE DELIVERY
TESTING CENTERS
OTHER UNIQUE ASSESSMENT TYPES
PROCTORING
MEASUREMENT THEORIES AND MODELS
  CLASSICAL TEST THEORY
    Item Analysis
    Internal Consistency Analysis
    Test Reliability Analysis
    Test Validity Analysis
    Standardization and Norm Setting
  ITEM RESPONSE THEORY (IRT)
    One-Parameter Logistic Model
    Two-Parameter Logistic Model
    Three-Parameter Logistic Model
  TOOLS AND TECHNOLOGY
OTHER SERVICES
  CUSTOMIZATION AND INTEGRATION
  TECHNICAL SUPPORT
  HELPDESK
  MENTORING
  TEST-PROCESS AUDITING
  PRE-TEST SCREENING
  SCHEDULING AND ADMINISTRATION
Executive Summary
NIIT’s Assessment Solutions provides a complete range of offerings from strategy and
design to implementation and administration to its customers in the Corporate,
Education and Government segments.
NIIT provides full solutions as well as individual components of its offerings to customers
for both low-stakes and high-stakes tests. The products and services include:
• Assessment Strategy
• Design & BluePrinting
• Item Authoring
• Assessment Engine
• Hosting
• Delivery
• Reporting and Psychometric Analysis
• Scheduling, Administration, Helpdesk, Proctoring, Tech Support
Figure: NIIT's assessment offerings span Strategy, Assessment Design, Test Item Development & Management, Psychometric Services, Technology Integration & Management, and Assessment Administration. They cover strategy for diagnostic assessment, formative tests and summative certification based on program needs; item bank creation based on overall strategy, audience and exposure size, together with item sun-setting strategies; planning for difficulty levels, discrimination requirements and the statistical plan; field testing for item characteristics, test characteristics, reliability and validity analyses; a hosted test engine and data management services with an IMS QTI compliant test bank, flexible workflows and advanced statistical analysis tools; and over 2,500 physical locations for test delivery.
Introduction
There is a growing need to professionally assess knowledge, skills, abilities and attitudes of
people in almost every sphere of life – particularly in education and at the workplace.
In schools and colleges, as in professional education, assessments can be used very
effectively as a tool to diagnose what learning needs to be imparted (diagnostic assessments), as a
tool to aid learning (formative assessments), and as a tool to measure the extent of learning, and
hence the effectiveness of the learning or teaching process (summative assessments).
At the workplace, it has been a common practice to assess the competence of new hires through
tests. Increasingly, employers are assessing, through online testing, the ongoing acquisition of
skills and knowledge for each new technology, product, project, process or client. While this
phenomenon started with the ICT and related industries, the ubiquity of the Internet and
computers has meant that almost every industry is actively embracing online assessments.
A pioneer in computer education, NIIT has been delivering online tests for nearly a decade to its
students in the retail education business. NIIT’s feature-rich assessment solution comprises:
1. Item Authoring
2. Hosted Assessment Engine
3. Delivery
4. Psychometric Analysis
5. Other Services including Helpdesk, Technical Support, Certification Process Audit,
Scheduling and Administration.
Figure: NIIT Assessment Solutions at a glance. Item Authoring: design, blueprinting, authoring/reviewing, field testing, item banking by objective. Hosted Engine: web-based authoring, standards compliance, multiple question types, randomization, security. Delivery/Admin: testing centers, online delivery, scheduling/delivery, proctoring, reports, security. Psychometric Analysis: Classical Test Theory, Item Response Theory, item analysis, reliability, validity, calibration, descriptive statistics. Other Services: customization, tech support, helpdesk, pre-test screening, certification process audit, training/mentoring, scheduling/administration.
Item Authoring
Item authoring is a specialized skill that requires both formal training and supporting systems and
processes if the items are to be scientific and effective.
For any test on any subject, item authoring follows a standard process. On an ongoing basis,
over-exposed and poor-performing items must be retired and new ones added.
Test Creation follows the process outlined below:
Figure: Test creation process. Objectives from Client → Blueprint Creation → Test Creation → NIIT Test Review & Fixes → Review by Client → Completed Test, with approved items stored in the Test Item Bank.
Exam Objectives
In this phase, the client provides objectives, identified according to the skill level the test is
required to measure. The objectives are further split into specific outcomes of assessment items
(SOAs), which specify the cognitive ability to be assessed.
Blueprinting
Next, the NIIT Test Design Team carries out Blueprint Creation. A test designer creates the
blueprint with the help of the instructors and an analyst, and the program chair reviews it. The
blueprint is a specification table that determines the configuration of a test and lays down rules for
the composition of the test. The blueprint ensures test reliability by defining the:
• Exam objectives
• Difficulty level for each test item
• Types of questions, such as multiple choice, sequencing, or match the following
• Percentage distribution across various ability levels for each objective
The blueprint also enables the analyst and designer to decide the weight assigned to a topic or
an SOA, which in turn defines the number of test items to be created for a topic/SOA. The weight
assigned to a topic is decided according to the:
• Importance of a topic to measure the particular ability
• Importance of the topic in the context of the overall assessment
The weight assigned to a topic decides the relative importance of the topic and helps define the
marks to be allocated to each test item.
The blueprint is developed to ensure that:
• The weight given to each topic in each test is appropriate, so that the important topics are
not neglected. This contributes to the validity of the test.
• The abilities tested are appropriate. For example there are sufficient questions requiring
application and understanding of logical reasoning.
• Weight of topics and abilities are consistent from test to test; this contributes to the
reliability of the test.
Final Blueprint
A Test Configuration Table is derived from the blueprint by applying the testing conditions.
Testing conditions, such as the number of items in a test, the time allowed, the maximum marks and
the marks assigned to each test item, should be determined after careful consideration. The table
consists of the following (a sketch appears after the list):
• Randomization strategy - Randomization could be based on item difficulty, exam objectives,
or a combination of both.
• Item scoring details
• Negative scoring (Y/N)
• Time allocation
• Number of questions
• Cut score
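
As an illustration only, the sketch below shows how such a test configuration might be captured in software; the field names are hypothetical and are not the actual CLiKS schema.

# Hypothetical sketch of a Test Configuration Table; field names are
# illustrative, not the actual CLiKS schema.
test_config = {
    "randomization": "difficulty_and_objective",  # or "difficulty", "objective"
    "item_scoring": {"easy": 1, "medium": 2, "hard": 3},  # marks per difficulty level
    "negative_scoring": True,     # the Y/N flag from the blueprint
    "negative_mark": 0.25,        # fraction deducted per wrong answer
    "time_allowed_minutes": 90,
    "number_of_questions": 60,
    "cut_score_percent": 70,
}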
Test Creation
Test Creation follows blueprint creation. This involves the actual writing of the test items and is
done by the NIIT Test Creation Team. To do this, the team:
• Identifies the difficulty level for each identified SOA based on Bloom's Taxonomy of
cognitive ability
• Identifies the item type for each SOA based on the analysis and the difficulty level
• Creates each test item based on the NIIT Test Item Creation Standards and Guidelines. These
guidelines are based on sound Instructional Design principles and correct use of language.
Reviewing
After the items have been authored, each item must go through a series of rigorous reviews to
eliminate errors, ambiguity and biases of any kind.
NIIT Review
After Item Creation, items are reviewed and fixed in the NIIT Test Review and Fixes phase.
Items undergo a rigorous review process. Each test item is checked against various parameters
to ensure that the right ability is tested with the right test item. Only those items that clear the
review process are used in a test. Reviews are of the following types:
• ID Review – Ensures items are in accordance with Instructional Design principles
• Language Review – Ensures clarity of language
• Technical Review – Ensures items are technically correct.
Review by Client
The Review phase is followed by the Review by Client. In this step, the Program Chair reviews
the items and identifies any required changes. Fixes (if any) suggested by the client are made by
the NIIT Test Creation Team. The test is now ready for delivery.
Field Testing
Once the items are ready for deployment, they are put through a field test where a statistically
significant number of test-takers who are representative of the final test-taker audience respond
to all the items in a controlled environment. The results collected from this round of testing are
subjected to statistical analysis to assess item difficulty, discrimination and the performance of the
distractors. Poor-performing items are modified or dropped.
Item Maintenance
Over time, based on exposure and on how well each item has performed in tests, some items
need to be retired periodically and replaced by new items. This is an ongoing activity.
Hosted Online Assessment Engine
NIIT’s CLiKS Online Assessment Engine is a web-based comprehensive testing and item banking
solution that supports the creation, development, delivery, and management of online
assessments. The entire solution is hosted at NIIT’s Data Center facility.
Being an open and customizable framework, CLiKS Online Assessment Engine can be
implemented quickly and tailored to meet the specific business needs of any customer.
The CLiKS Assessment Engine offers the following features:
• Test Item Authoring: Provides workflow based question authoring functionality with a built-
in multi-level review mechanism.
• Test Configuration: Allows exams to be generated based on specifications. Interfaces with
Authoring for the specification and sends the generated item list back to Authoring.
• Test Delivery: Launches an individual candidate's test using the test pack allocated to
that candidate. Interfaces with Authoring and sends the results in a standard format.
• Performance Reporting: Interfaces with all the components above and generates various
standard and custom reports for analysis.
• Test and Candidate Administration: Allows for scheduling exams for individuals or batches
along with proctor keys.
Authoring and Question Bank Maintenance
A question consists of a stem, which presents the question to the candidate, along with options;
the options depend on the question type. Instructional Design experts hold different theories about
how to evaluate or test a user, so individual experts often arrive at different designs for the
questions they would like to ask.
The solution supports various types of questions as described later. These question types are
exhaustive enough to fulfill most Instructional Design requirements.
For each question, the author can add feedback, entered at the option level. When the learner
selects a wrong option, the corresponding feedback is shown to improve the learning experience.
Feedback can be disabled for certification tests.
The question editor provides formatting features like bold, italics, underline, text alignment,
ordered list (sequence), unordered list (Bullets), tables, font size, font face, font color, superscript
and subscript. In addition, special symbols, images, hyperlinks, audio, and video clips can be
attached to the question.
The system stores the stem and options in an encrypted form to protect data from unauthorized
access.
There are some parameters that are common to all questions, including positive and negative
marking, difficulty levels, hints, and expiry dates for a question. The number of permissible
attempts for a question is also a configurable item.
The assessment engine provides ready templates for the following Question Types:
• Multiple Choice Single Select
• Multiple Choice Multiple Select
• True and False
• Fill in the blanks (Text and Numeric)
• Match the Following
• Free text/Essay Type (Subjective response)
The above question types can be presented to the user in different presentation styles. The
assessment engine maintains two sets of information for each question: the type of question it
represents and the corresponding presentation format.
Additional question types can easily be incorporated into the engine depending on the
requirements of the organization/university.
Practice and Certification Tests
A test can be defined as a Practice Test or a Certification Test. Practice tests enable users to
check their understanding of the subject and pursue remediation depending on the feedback.
Feedback is displayed for each question in a practice test.
Certification is an acknowledgment of the skills possessed by an individual. It involves evaluating
the skills a person possesses and providing a result along with feedback for the same.
Tests can be configured in both static and randomized modes. In a static test, all candidates
taking the test get the same set of questions. In a randomized test, the question set presented to
each candidate is unique. The assessment engine can pick up questions for a test based on the
test configuration from a large question bank.
The system allows sections to be created within a test. The sequence of questions within a
section can be predetermined, along with specifications for the distribution of questions and
statistics-based online analysis.
It is also possible to configure sections so that a user is allowed/not allowed to move ahead to the
next section without successfully completing a section.
Review / Workflow
Content is always reviewed before it is allowed for publication. CLiKS provides the Workflow
module to facilitate the online review and correction of content before it is published.
When a user creates a content piece such as an item or a test, the engine requires it to be
reviewed and approved through a defined (and configurable) workflow process before it is published.
After the initial creation, the content follows a specified path and goes through the levels
of review and edits. At each level, other roles have specific jobs to perform on the item. Reviewers
can send an item back to its creator for changes; alternatively, items can be forwarded to the
next-level reviewer until final approval.
The Workflow module supports the 3Rs, ‘Routes, Rules, and Roles’.
• Route is the path that content takes while undergoing review. The path has levels, which
are assigned to appropriate roles.
• Roles are the system roles, which are assigned to act upon the content.
• Rules are the conditions specified while setting up a workflow cycle to define the decisions
and actions that a reviewer can take on the content at different levels.
Together, these three Rs facilitate the functioning of the workflow process in CLiKS.
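
The following minimal sketch, with purely illustrative role names, shows one way the Routes/Rules/Roles idea can be modelled; it is not CLiKS source code.

# Minimal sketch of the Routes/Rules/Roles workflow (illustrative names only).
from dataclasses import dataclass

@dataclass
class WorkflowRoute:
    # Route: an ordered list of levels, each assigned to a role.
    levels: list
    position: int = 0

    def act(self, decision: str):
        # Rules: the decisions a reviewer may take at a level.
        if decision == "approve":
            self.position = min(self.position + 1, len(self.levels) - 1)
        elif decision == "send_back":
            self.position = 0  # return the item to its creator for changes
        else:
            raise ValueError("unknown decision")

    @property
    def current_role(self):
        # Roles: the system role that must act on the item at this level.
        return self.levels[self.position]

route = WorkflowRoute(["author", "id_reviewer", "language_reviewer", "approver"])
route.act("approve")        # item moves from the author to ID review
print(route.current_role)   # -> "id_reviewer"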
Test Configuration & Delivery
The screen shown in the figure below is a launch page for an Assessment. In this scenario, the
assessment is embedded within a course and its learning plan. It is also possible to create
assessments as independent activities or standalone assessments.
Figure: The Course Dashboard Screen
Test Generation and Delivery
When a candidate starts a test, a test configuration from the appropriate test pack is
randomly picked and assigned to the candidate. Thereafter, the test is delivered to that test-taker
using the same set of questions.
The system searches for questions in the Question Pool depending upon the configuration of the
test assigned to the candidate. The system processes questions in batches from the generated
questions pool and displays them on the screen. The system displays the first set of questions as
soon as the Testing Engine selects them. While the test-taker attempts these questions, the
engine keeps selecting further questions and caching them. Further questions are displayed to
the test-taker from this cache. This improves performance, as the test-taker need not wait until
the engine has selected all the questions for the test. Nor does the system need to wait for the
student to attempt the displayed question before fetching the next question from the question pool.
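
The sketch below illustrates this batched fetch-and-cache pattern in Python. It is a simplified model of the behaviour described above, not CLiKS code; select_batch stands in for whatever routine picks questions from the generated pool.

# Illustrative sketch of batched question selection with background caching.
import queue
import threading

def deliver_test(select_batch, batch_size=5):
    """select_batch(n) is assumed to return the next n questions chosen
    from the generated question pool (an empty list when exhausted)."""
    cache = queue.Queue()

    def prefetch():
        while True:
            batch = select_batch(batch_size)
            if not batch:
                break
            for q in batch:
                cache.put(q)
        cache.put(None)  # sentinel: no more questions

    # Selection runs in the background while the test-taker answers, so the
    # candidate never waits for the whole paper to be generated.
    threading.Thread(target=prefetch, daemon=True).start()

    while (question := cache.get()) is not None:
        yield question  # displayed to the test-taker from the cache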
Test Instruction
The test starts with a set of instructions. The figure below displays the attributes and parameters
of the test as they appear on the instructions screen.

Figure: Instructions for Assessment screen
Question Pages
Each question in the assessment is displayed on its own page, and each question page includes
the following buttons:
• Review: To review and revise all questions in the section. Make sure that all questions
are “Attempted” before pressing the End Assessment button.
• Section List: To move to the next section, once you have attempted all questions in the
current section.
• Instructions: To return to the initial instructions page.
• Previous Question: To go back to the previous question.
• Submit Answer: To submit your answer and go to the next question.
• Skip Question: To skip the current question and go on to the next question.
Un-attempted questions will be marked incorrect, so make sure that you attempt all the
questions before clicking End Assessment.
• End Assessment: To be used when you have reviewed, answered and submitted an
answer for every question of the assessment. You can confirm that all the questions have
been attempted by clicking the Review button.
Figure: The Assessment Screen
Question Types
The Assessment Engine can incorporate a diverse set of questions that meets the
requirements of most Instructional Design experts. The following question types are
supported in the system through readily available templates:
• Multiple Choice Single Select
• Multiple Choice Multiple Select
• True and False
• Fill in the Blanks (Text and Numeric)
• Match the Following
• Free text/ Essay Type (Subjective response)
Review Questions
The Review Questions page, as shown in the following figure, displays all the questions and
indicates whether or not an answer has been submitted for each question. The page shows the
status of every attempt made (or not made) by the learner, allowing the learner to keep track of
progress during the assessment.
Figure: The Questions Review Page on the Assessment Screen
Feedback and Hints
The assessment engine enables a test creator to provide feedback on learners' responses. The
feedback corrects the learner and reinforces the concept. The assessment engine also enables a
test creator to provide hints to learners while they are taking a test. In a certification test, it is
possible to penalize learners if they make use of the hints.
Marks Configuration
This enables a test creator to configure marks at the time of test creation. Learners can be
marked either in grades or in percentages. The test creator can configure negative marks for each
wrong response, or full marks or no marks for a particular question, and the marks can be based on
the rendering type of the question or on its difficulty level. This gives the marking system
flexibility and the test creator discretionary power.
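
A hypothetical scoring routine along these lines is sketched below; the field names and the all-or-nothing marking rule are illustrative assumptions, not the engine's actual implementation.

# Hypothetical scoring routine reflecting the options above: per-question
# marks, optional negative marking (field names are illustrative).
def score_response(question, response):
    marks = question["marks"]                 # may derive from difficulty or rendering type
    if response is None:                      # un-attempted question
        return 0.0
    if response == question["correct"]:
        return float(marks)                   # full marks for a correct answer
    return -question.get("negative_mark", 0.0)  # penalty for a wrong answer

q = {"marks": 2, "correct": "B", "negative_mark": 0.5}
print(score_response(q, "B"))   # 2.0
print(score_response(q, "C"))   # -0.5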
Auto Grading
A test creator can select auto grading of the test. In such a case, the test creator need not
configure the marking at the time of creating a test. The learner will be automatically graded
according to the default marks set in the system.
Importing and Exporting Tests
The assessment engine is IMS QTI compliant. Hence, tests and questions can easily be
imported from and exported to other standards-compliant systems, promoting reuse of
existing content.
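
To give a flavour of the standard, the sketch below builds a heavily abridged QTI 2.x-style multiple-choice item in Python. A real export carries the full IMS namespace, metadata and response processing; this is not output generated by CLiKS.

# Schematic QTI 2.x-style multiple-choice item, abridged for illustration.
import xml.etree.ElementTree as ET

item = ET.Element("assessmentItem", identifier="Q001", title="Sample MCQ",
                  adaptive="false", timeDependent="false")
rd = ET.SubElement(item, "responseDeclaration", identifier="RESPONSE",
                   cardinality="single", baseType="identifier")
ET.SubElement(ET.SubElement(rd, "correctResponse"), "value").text = "B"
body = ET.SubElement(item, "itemBody")
ci = ET.SubElement(body, "choiceInteraction", responseIdentifier="RESPONSE",
                   shuffle="true", maxChoices="1")
ET.SubElement(ci, "prompt").text = "Which layer of the OSI model routes packets?"
for ident, text in [("A", "Transport"), ("B", "Network"), ("C", "Data link")]:
    ET.SubElement(ci, "simpleChoice", identifier=ident).text = text

print(ET.tostring(item, encoding="unicode"))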
Proctoring
In a certification test, the candidates need to take the test in a controlled environment where their
identity is co-signed by the invigilating authority assigned for the test. The delivery engine has an
interface for the proctor to co-sign candidates for the test. The proctors are assigned to the tests
when the test is being scheduled.
CLiKS provides the facility to proctor tests through the following mechanisms:
Individual Co-signing
In this form of proctoring, the assessment engine launches a screen to accept the proctor's login
credentials as the candidate starts a test. This applies only to tests for which proctoring
was enabled during configuration.
In another mechanism of individual co-signing, the invigilator is provided a unique random
key for every examination. The invigilator can write up or announce the key to the students in the
examination hall.
Mass Co-signing or Group Proctoring
Using this mechanism, a proctor can co-sign a group or batch of students attempting a test from a
central location, using an interface provided by CLiKS. The interface displays the students taking
an assessment; after the proctor has verified the students, they can all be selected and
co-signed.
Reporting
The purpose of the Performance Tracker is to generate reports that enable analysis of
performance and evaluation of the outcomes of students' learning and their assessments. It is also
possible to generate administration reports for monitoring usage and the details of key
information set up in the system.
Individual Student Performance Chart
This report shows the performance of a single student in graphical format: the student's scores
in various tests. It enables the viewer to compare a student's performance across the tests in a
module. The intended end users of the report are the teaching staff, who can review the
performance of students in their own batches.
Comparative Analysis
This report shows the trend of a batch in a test in graphical format, indicating how the
students are performing in a particular test and enabling the viewer to judge the average
scoring ability of candidates. The report presents scores in blocks of 10 along the x-axis and
the percentage of candidates falling in each block along the y-axis.
Individual Student Activity Details
This report lists the details of activities for a candidate in a batch in tabular format. The intended
end users of the report are the teaching staff, who can review the details of online tests taken by
students. The report allows the teaching staff to choose a student from their batches and view
that student's progress.
Performance
CLiKS can support a large number of concurrent users, enables streaming of digital content,
supports concurrent streams, and manages bandwidth using defined policies and priorities.
Scheduling and Administration
User Management
CLiKS provides online registration of users in the system and can assign them to the desired
roles. It provides the flexibility to assign multiple roles to a user who is required to undertake the
corresponding tasks. Further, CLiKS supports scenarios where a user needs permission for a
specific function of a different role rather than for all functions of that role.
Permission for an individual function in the CLiKS system can also be revoked explicitly,
overriding previously assigned roles. This gives the administrator the flexibility to grant or revoke
permissions at the function level.
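
A minimal sketch of this permission model, with hypothetical role and function names, might look as follows; explicit revocation overrides role membership, and individual functions can be granted outside a role.

# Illustrative model of function-level grants and explicit revocations.
ROLE_FUNCTIONS = {
    "author":   {"create_item", "edit_item"},
    "reviewer": {"review_item", "approve_item"},
}

def permitted(user, function):
    if function in user.get("revoked", set()):       # explicit revoke wins
        return False
    if function in user.get("extra_grants", set()):  # single function from another role
        return True
    return any(function in ROLE_FUNCTIONS[r] for r in user["roles"])

user = {"roles": ["author"], "extra_grants": {"review_item"}, "revoked": {"edit_item"}}
print(permitted(user, "review_item"))  # True:  granted individually
print(permitted(user, "edit_item"))    # False: revoked despite the author role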
Test Schedule Management
Tests for candidates taking certification or practice tests are scheduled in advance. The
test details, time, and duration are provided along with the list of candidates who will
take the test. Candidates log on to the system using their assigned user identification and
password. After a candidate logs on, if the current time falls within the stipulated time for the
test, a link is displayed for the candidate to start the test. Tests can be re-scheduled for
candidates who are unable to take them at the allotted time.
After being configured and published, tests can be delivered to candidates using the
delivery engine. The delivery engine receives the candidates' responses and evaluates the
results, which are stored as part of each student's assessment records.
An important feature of the assessment engine is its ability to keep track of individual question
papers. A learner can abandon a test midway and resume it at a later date. The engine
keeps track of the questions given to the learner in the incomplete test and the options that
had been marked by the learner.
Caching is another important feature of the Online Testing Engine. Questions and their
attributes are downloaded and cached on the local machine while the person is answering the
first question. This is done in the background without affecting the display and improves
system performance, as the learner need not wait for the questions to be downloaded.
Time for the test is calculated as real time and it does not include the download time. This means
that low or high bandwidth does not affect the time allowed to take the test.
Extensibility
CLiKS is a component-based system, which allows a CLiKS component to be replaced with any
external system available in the market. The system is open to learning, collaborative, and
knowledge base tools developed by other vendors. This is achieved by the implementation of
open APIs. Any external system can interface with the existing components using the APIs
provided for the component.
Multilingual
CLiKS supports the multibyte UTF-8 character set, which allows it to be used for most of the
world's languages, including Asian languages such as Japanese, Chinese, and Korean. The data
stored in the database, field labels, and error messages can all be multilingual.
High Availability
The scalable architecture of CLiKS allows it to be configured with more than one application
server and web server. In such a configuration, the application remains available to users even
when one of the servers is not available.
Open API based Architecture
CLiKS follows an open API based architecture, which allows easy integration with external
systems in an enterprise. The design enforces that programs in a module directly access only the
data of their own module; the data of any other module is accessed through published APIs. Any
other application system can therefore get data from, or pass data to, CLiKS modules by calling
the published APIs, allowing rapid integration with existing applications in the
organization.
The system is also compatible with LDAP standards allowing it to be integrated with any other
standard application to achieve Single Sign On through an enterprise portal.
Ease of Customization
CLiKS allows easy customization based on organizational needs. All static text in the system
(screen titles, field labels, error messages, etc.) is retrieved from a multilingual file. This allows
for rapid modifications to screen names, field names, and other UI elements, thus eliminating the
need for any re-programming.
CLiKS is a rule-based system, where many rules can be defined for each function at the time of
configuration. This once again offers tremendous flexibility during implementation without any
need for re-programming.
The look-and-feel and other changes in organizational policies can be effected in an extremely
short duration.
Technical overview of the CLiKS Assessment Engine:
• J2EE Application server (Pramati)
• Oracle RDBMS
• Scalable, load-balancing-ready, multilingual architecture
• Fully Internet browser based
• Portable across all operating systems, including Linux (Red Hat)
• Allows a 3-tiered implementation where the web server, application server and database
servers can be separated by firewalls.
Hosting
NIIT offers a fully hosted and managed eLearning services environment, providing the
production facility and infrastructure appropriate to your technical requirements and strategic
business goals.
NIIT offers the scalability, security and transparency of a hosted eLearning infrastructure,
without requiring investment in hardware and software evaluation, deployment and maintenance.
Features:
• 99.5% Server uptime guarantee
• 24x7 application and systems monitoring
• System, database and network administration
• Back-ups
• Redundant and high resilience Internet connectivity
• Security
• Fully managed staging environment
• Rapid application deployment
• Scalable storage
Application Setup
NIIT takes care of the installation of the hardware, software and application, removing the need
to hire and train additional IT staff to set up the software or configure the hardware.
Availability
NIIT provides 99.5% uptime for your eLearning application, with the ability to deliver
uninterrupted service.
Test Delivery
Physical Test Delivery
NIIT has an extensive and growing network of education and test centers in more than 30
countries around the world.
Online Delivery
NIIT delivers online tests to a number of its education and corporate customers
around the world. These tests are scheduled and delivered at locations of the customer's choice
and are used for a variety of purposes:
Education:
1. Entrance into a program
2. As part of courses, to aid in the learning process
3. Ongoing evaluation in various courses
4. Final exams/graduation
Corporate
1. For hiring new employees
2. Assessing the impact of training programs
3. Pre-promotion assessment of abilities and behavioral aspects
Testing Centers
NIIT is setting up a network of dedicated testing centers across India to conduct tests at scale
for its customers. Besides providing large facilities in various cities, each capable of conducting
over 1,000 tests per day, these centers also support pre-assessment screening and post-test
interviewing.
The testing centers also have the capability to conduct various new types of assessments,
including automatic assessment of language, voice and accent skills through patent-pending
applications developed by NIIT.
Other Unique Assessment Types
NIIT is continuously investing in research and development to improve the effectiveness of
computer-assisted assessments. In a significant change, NIIT relies less on multiple-choice
questions, which test only the general abilities or theoretical knowledge of test-takers and not
their suitability for a specific job role.
NIIT now has several new types of assessments which significantly improve the effectiveness of
the assessment process. Some examples of new item/assessment types are:
• Computer Screen Simulations
• Computer assisted Role-plays
• Automated Voice and Accent assessment
• Online programming skills testing
Proctoring
NIIT provides a proctored environment for conducting high-stakes assessments in its dedicated
testing centers. These centers are equipped with closed-circuit cameras and physical proctoring
by trained proctors.
Psychometric Analysis
Psychometric Analysis is an integral part of NIIT’s assessment solution. Psychometric analysis is
performed at two levels:
• After field testing: to evaluate item performance and to (re)calibrate items for
difficulty. This forms an integral part of the item authoring process.
• On an ongoing basis: to continuously evaluate the performance of tests and of individual
items within the tests.
Measurement Theories and Models
NIIT uses a variety of models for psychometric analysis. The model is selected in
close discussion with the customer and also depends on the nature of the tests, the types of items
and the number of test-takers. The popular models used include:
• Classical Test Theory (CTT)
• Item Response Theory (IRT)
o Rasch Model (one-parameter logistic model, or 1PLM)
o Two-Parameter Logistic Model (2PLM)
o Three-Parameter Logistic Model (3PLM)
Classical Test Theory
Item Analysis
During the construction of tests, test developers need to determine the effectiveness of each
individual item in the test. This process of evaluating each item is called item analysis.
On the basis of the information item analysis provides about each item's effectiveness, items can
be modified or eliminated from the final test, ultimately ensuring the reliability and validity of
the test.
Three important item functions that need to be evaluated through item analysis are:
1. Item Difficulty
2. Item Discrimination
3. Distractor Efficiency
Item Difficulty Analysis
Item difficulty is a measure of the overall difficulty of a test item. In CTT, it is represented as the
percentage of test takers who answer the item correctly. An easy item is one that is correctly
answered by a majority of test takers; conversely, a difficult item is answered correctly by only a
few test-takers.
Typically, an effective test must have a balanced distribution of items of varying degrees of
difficulty. With a “difficulty value” assigned to each item, it is easy for the test developer to select
items of varying difficulty levels for inclusion in the final test.
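
Computing the difficulty index is straightforward; the sketch below expresses the standard CTT definition (proportion of correct responses) in Python, with invented data.

# CTT difficulty index: proportion (often quoted as a percentage) of
# test-takers who answer the item correctly.
def difficulty_index(responses):
    """responses: list of 0/1 scores for one item across all test-takers."""
    return sum(responses) / len(responses)

print(difficulty_index([1, 1, 0, 1, 0, 1, 1, 0, 1, 1]))  # 0.7: a fairly easy item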
Item Discrimination Analysis
The most important function of a test is to differentiate between test takers in terms of their ability,
knowledge, skills and personality. Therefore, it is crucial that the building blocks of the test, i.e.,
the items, have the ability to differentiate, say, between high and low performers on an arithmetic
test.
Through item discrimination analysis, it is possible to evaluate the performance of an item with
respect to its ability to discriminate between people of varying abilities or traits. With a definite
discrimination index for each item, the test developer can select items with higher
discrimination indices for the test.
The Point-Biserial and Extreme Group Methods are commonly used approaches for arriving at the
Discrimination Index of individual items. Besides detailed reports on the discriminative
power of individual items, interpretative comments on the psychometric implications of each
item's discrimination index are shared with test developers to facilitate the test review
process.
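
The point-biserial index can be computed directly from the item scores and the total test scores, as the standard-formula sketch below shows; the data values are invented for illustration.

# Point-biserial correlation between an item (scored 0/1) and the total
# test score: the standard CTT discrimination index.
from statistics import mean, pstdev

def point_biserial(item_scores, total_scores):
    correct   = [t for i, t in zip(item_scores, total_scores) if i == 1]
    incorrect = [t for i, t in zip(item_scores, total_scores) if i == 0]
    p = len(correct) / len(item_scores)        # proportion answering correctly
    s = pstdev(total_scores)                   # spread of total scores
    return (mean(correct) - mean(incorrect)) / s * (p * (1 - p)) ** 0.5

item  = [1, 1, 0, 1, 0, 1, 0, 0]
total = [18, 17, 9, 15, 11, 16, 8, 10]
print(round(point_biserial(item, total), 3))   # high value: the item discriminates well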
Distractor Efficiency Analysis
In the case of tests that use multiple-choice items, the incorrect answer choices to a question
(distractors) play an important role.
A “good” distractor, which is unambiguously incorrect and yet can confuse the less
knowledgeable test taker, adds to the discrimination value of the question. The effectiveness of
distractors is analyzed by measuring the distribution of responses across all the distractors.
Distractors that are not selected at all or often enough by test-takers may be discarded/ modified/
replaced to improve the efficiency of the item.
The Point-Biserial Method is used to compare the group that chooses a distractor with the
group that chooses the correct option. A distractor efficiency matrix is prepared for all items in a
test, and recommendations are provided to the item developers.
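
A distractor analysis starts from the simple distribution of responses across the options, as in the illustrative sketch below (response data invented).

# Response distribution across options: distractors that almost no one
# picks add nothing to the item and are candidates for replacement.
from collections import Counter

responses = ["B", "B", "A", "C", "B", "A", "B", "D", "B", "B"]  # key is "B"
dist = Counter(responses)
for option in "ABCD":
    share = dist.get(option, 0) / len(responses)
    print(f"{option}: {share:.0%}")  # e.g. D is barely chosen: a weak distractor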
Internal Consistency Analysis
Items in a test must have internal consistency in measuring the proposed construct or variable.
That is, the items chosen for a test designed to measure a particular ability or trait must
assess only that ability or trait and must, therefore, correlate highly among themselves and with
the test as a whole.
In effect, information from such analysis is essential to know the internal structure of the test and
to make decisions on the need to further enhance the quality of the test. With high internal
consistency indices, the test developer can confidently rely on the test’s ability to assess the
proposed ability or trait.
The internal consistency of a test is measured by means of different statistical tools. Common
tools used to analyze internal consistency are Cronbach’s Alpha Technique, KR20 Coefficient
and Spearman-Brown Alpha.
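
As an illustration, Cronbach's Alpha can be computed from an items-by-persons score matrix as below; with dichotomous (0/1) items the same quantity is the KR-20 coefficient. The data are invented.

# Cronbach's alpha from an items-by-persons score matrix.
from statistics import pvariance

def cronbach_alpha(matrix):
    """matrix[i][j]: score of person i on item j."""
    k = len(matrix[0])                                           # number of items
    item_vars = [pvariance([row[j] for row in matrix]) for j in range(k)]
    total_var = pvariance([sum(row) for row in matrix])          # variance of total scores
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

data = [[1, 1, 1, 0], [1, 1, 0, 0], [1, 0, 0, 0], [1, 1, 1, 1], [0, 0, 0, 0]]
print(round(cronbach_alpha(data), 3))  # 0.8 for this toy matrix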
Test Reliability Analysis
A good test needs to be consistent in its performance. That is, the same test given to a candidate
today should produce identical results a few weeks or months later when the candidate takes it
again. In a similar way, two or more parallel tests that assess the same ability or trait must show
similar/identical results.
In the psychometric analysis of ability/personality/skills tests, two different methods are used to
evaluate the reliability of tests - Test-Retest Reliability analysis and Alternate-Form or Split-Half
Reliability Analysis.
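
The split-half approach can be illustrated as follows: correlate scores on the odd and even items, then apply the Spearman-Brown correction to estimate reliability at full test length. The sketch assumes Python 3.10+ for statistics.correlation, and the data are invented.

# Split-half reliability with the Spearman-Brown correction.
from statistics import correlation  # Python 3.10+

def split_half_reliability(matrix):
    """matrix[i][j]: score of person i on item j."""
    odd  = [sum(row[0::2]) for row in matrix]   # half-score on odd items
    even = [sum(row[1::2]) for row in matrix]   # half-score on even items
    r_half = correlation(odd, even)
    return 2 * r_half / (1 + r_half)            # Spearman-Brown prophecy formula

data = [[1, 1, 1, 0], [1, 1, 0, 0], [1, 0, 0, 1], [1, 1, 1, 1], [0, 0, 0, 0]]
print(round(split_half_reliability(data), 3))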
Test Validity Analysis
It is absolutely essential that a test measure what it was originally supposed to measure. At
various stages of its development, a test needs to be evaluated for its validity. Validity of a test is
assessed in different ways: Face validity, content validity, concurrent validity, predictive validity,
and construct validity.
Standardization and Norm Setting
The process of standardization of a test ensures the representativeness of the test to the target
audiences. It enables the test to be administered and scored under uniform conditions so that the
test can produce comparable results across different situations and target audiences. As part of
the process, norms and benchmarks for different groups and situations are set for an unbiased
comparison of individual scores on the test.
According to the specific requirements of the customer, the team of psychometricians creates
standard rules for the administration, scoring and interpretation of the test. Standardized scores,
such as percentiles, T-scores and STEN scores, along with group-specific comparison norms or
benchmarks, are developed for accurate interpretation of individual test scores.
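
These conversions follow directly from the z-score: T = 50 + 10z and STEN = 5.5 + 2z (bounded to 1-10). The sketch below applies them against an invented norm group.

# Standard score conversions used in norm setting: z, T (mean 50, SD 10)
# and STEN (mean 5.5, SD 2, bounded 1-10).
from statistics import mean, pstdev

def standardize(raw, norm_group):
    z = (raw - mean(norm_group)) / pstdev(norm_group)
    t = 50 + 10 * z
    sten = min(10, max(1, round(5.5 + 2 * z)))
    return z, t, sten

norms = [42, 55, 61, 48, 50, 53, 47, 58, 44, 52]
print(standardize(61, norms))  # a raw score well above the norm-group mean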
Item Response Theory (IRT)
Item Response Theory takes a different approach from CTT to analyzing the
psychometric properties of a test and its items. While CTT evaluates a test and its items against
the normative population on which they were psychometrically analyzed and standardized, IRT
analyses are not based on such relative measures. IRT uses probabilistic models to
arrive at the psychometric properties of test items, assuming that latent variables such as ability
and personality traits have an underlying measurement scale on which individual items, tests and
individual test takers can be compared.
IRT is a statistical procedure used to model item responses with certain parameters to determine
the proficiency level of an examinee. IRT computes an estimated item characteristic curve (ICC)
for each test item. IRT can use one to three parameters to specify the item response model. IRT
has the following advantages when compared to classical item analysis:
• Item parameter estimates are independent of the group of examinees selected from the
population for whom the test was designed.
• Examinee ability estimates are independent of the particular choice of test items used
from the population of items that were calibrated.
• The precision of the ability estimates is known.
One-Parameter Logistic Model
This IRT model uses only the difficulty parameter for its analysis. Analysis results include item-
specific information in the form of an “Item Characteristic Curve” as well as test-specific
information in the form of a “Test Characteristic Curve”. The location of the curve on the latent
ability scale represents the difficulty of an item. Because the model takes only the difficulty
parameter into account, it assumes a fixed value for the discrimination parameter.
Two-Parameter Logistic Model
The two-parameter IRT model utilizes both the difficulty and the discrimination parameters for
the psychometric analysis. The resulting Item Characteristic Curve conveys both the difficulty
level and the discrimination power of the test item: while the location of the curve on the ability
scale represents item difficulty, the height or steepness of the curve indicates the discrimination
index of the item.
Three-Parameter Logistic Model
In addition to the difficulty and discrimination parameters, the Three-Parameter Model takes a
third dimension into consideration, i.e., the guessing parameter. This third parameter indicates
the probability of getting the item correct by guessing alone. Therefore, the Item Characteristic
Curve resulting from this analysis has information about item difficulty, item discrimination and the
“guessability”.
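
The three models share one functional form. The sketch below implements the 3PL item characteristic curve; setting c = 0 recovers the 2PL, and additionally fixing the discrimination a recovers the one-parameter (Rasch) model. Parameter values are illustrative.

# 3PL item characteristic curve: P(theta) = c + (1 - c) / (1 + exp(-a(theta - b))).
import math

def icc_3pl(theta, a, b, c):
    """Probability of a correct response at ability theta:
    a = discrimination, b = difficulty, c = guessing (lower asymptote)."""
    return c + (1 - c) / (1 + math.exp(-a * (theta - b)))

# A four-option MCQ often has c near 0.25 (pure guessing).
for theta in (-2, 0, 2):
    print(theta, round(icc_3pl(theta, a=1.2, b=0.0, c=0.25), 3))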
Tools and Technology
The Assessment Practice of NIIT has built its own proprietary software tools for performing
statistical analysis based on the various models of IRT.
Other Services
NIIT provides a range of other services related to assessments to its customers around the world.
Customization and Integration
For medium to large implementations of the CLiKS Assessment Engine, NIIT customizes the
engine for its customers. Areas of customization include:
• User Interface
• Functionality
• Reports
Since CLiKS uses open APIs and supports industry standards, it is also possible to
interface/integrate CLiKS with other internal systems that customers may have. Single Sign-On,
ERP and client HR systems are some examples of applications that CLiKS may be interfaced
with.
Technical Support
NIIT provides technical support to its customers and test delivery partners to help resolve any
technical query.
Helpdesk
Since many test-takers are taking an online test for the first time, they may need some hand-
holding. NIIT's 24x7 toll-free helpdesks provide this service to our test sponsors.
Mentoring
When tests are used as a practice or preparation mechanism, test-takers need to discuss their
performance with someone who is familiar with both the test items and the subject area.
NIIT provides this service to several of its customers through phone, email and chat.
Test-Process Auditing
Some of our customers use our technology to deliver certification tests through a network of their
partners/franchisees. In such cases, NIIT provides a service to audit the test delivery process
through scheduled visits as well as through mystery shoppers at test delivery locations. This
helps our customers maintain a high level of integrity in their testing process.
Pre-test Screening
For certain kinds of tests, e.g., pre-hire assessments, test sponsors may have a set of criteria for
screening candidates for eligibility. NIIT provides both manual and automated options for
screening.
Scheduling and Administration
When physical test delivery (for proctored tests) is a constraint, tests need to be scheduled and
administered in a manner that optimizes the availability of all resources. NIIT provides services
for doing this task.
 
Cie igcse chinese vocabulary (foreign language 0547) v2022 2024 sample
Cie igcse chinese vocabulary (foreign language 0547) v2022 2024 sampleCie igcse chinese vocabulary (foreign language 0547) v2022 2024 sample
Cie igcse chinese vocabulary (foreign language 0547) v2022 2024 sample
 
Edexcel GCSE Chinese Foundation Tier 1800 Vocabulary (1CN0F 2021 Edition) sa...
Edexcel GCSE Chinese Foundation Tier 1800 Vocabulary (1CN0F 2021 Edition)  sa...Edexcel GCSE Chinese Foundation Tier 1800 Vocabulary (1CN0F 2021 Edition)  sa...
Edexcel GCSE Chinese Foundation Tier 1800 Vocabulary (1CN0F 2021 Edition) sa...
 
Edexcel IGCSE Chinese Vocabulary 4CN1-3 V2021-Edexcel GCSE 中学会考汉语水平考试词汇 sample
Edexcel IGCSE Chinese Vocabulary 4CN1-3 V2021-Edexcel GCSE 中学会考汉语水平考试词汇 sampleEdexcel IGCSE Chinese Vocabulary 4CN1-3 V2021-Edexcel GCSE 中学会考汉语水平考试词汇 sample
Edexcel IGCSE Chinese Vocabulary 4CN1-3 V2021-Edexcel GCSE 中学会考汉语水平考试词汇 sample
 
AQA GCSE Chinese Foundation Tier 1200 Vocabulary (8673F 2021 Edition) sample
AQA GCSE Chinese Foundation Tier 1200 Vocabulary (8673F 2021 Edition)  sampleAQA GCSE Chinese Foundation Tier 1200 Vocabulary (8673F 2021 Edition)  sample
AQA GCSE Chinese Foundation Tier 1200 Vocabulary (8673F 2021 Edition) sample
 
Cie igcse chinese vocabulary second language (0523) v2022 2024 sample
Cie igcse chinese vocabulary second language (0523) v2022 2024 sampleCie igcse chinese vocabulary second language (0523) v2022 2024 sample
Cie igcse chinese vocabulary second language (0523) v2022 2024 sample
 
Cie igcse chinese vocabulary first language (0509) v2022 2024 sample
Cie igcse chinese vocabulary first language (0509) v2022 2024 sampleCie igcse chinese vocabulary first language (0509) v2022 2024 sample
Cie igcse chinese vocabulary first language (0509) v2022 2024 sample
 
Publication: Space Debris: Applied Technologies and Policy Prescriptions
Publication: Space Debris: Applied Technologies and Policy PrescriptionsPublication: Space Debris: Applied Technologies and Policy Prescriptions
Publication: Space Debris: Applied Technologies and Policy Prescriptions
 
Indice servizi
Indice serviziIndice servizi
Indice servizi
 
FinalPBISubmission
FinalPBISubmissionFinalPBISubmission
FinalPBISubmission
 
MICON - NI 43-101 Technical Resource Report
MICON - NI 43-101 Technical Resource ReportMICON - NI 43-101 Technical Resource Report
MICON - NI 43-101 Technical Resource Report
 
Cdc Safewater Systems Manual
Cdc Safewater Systems ManualCdc Safewater Systems Manual
Cdc Safewater Systems Manual
 

Similar to Niit assessment practice white paper oct 07

Oaktree funding non-prime_select_guidelines
Oaktree funding non-prime_select_guidelinesOaktree funding non-prime_select_guidelines
Oaktree funding non-prime_select_guidelinesJesse B. Lucero
 
Non Prime Select Guidelines call Jesse B Lucero 702-551-3125
Non Prime Select Guidelines call Jesse B Lucero 702-551-3125Non Prime Select Guidelines call Jesse B Lucero 702-551-3125
Non Prime Select Guidelines call Jesse B Lucero 702-551-3125Jesse B. Lucero
 
01132017_short-term_rental_report
01132017_short-term_rental_report01132017_short-term_rental_report
01132017_short-term_rental_reportJason Vincent, AICP
 
#VirtualDesignMaster 3 Challenge 1 - Steven Viljoen
#VirtualDesignMaster 3 Challenge 1 - Steven Viljoen#VirtualDesignMaster 3 Challenge 1 - Steven Viljoen
#VirtualDesignMaster 3 Challenge 1 - Steven Viljoenvdmchallenge
 
Strategy Field Project Report
Strategy Field Project ReportStrategy Field Project Report
Strategy Field Project ReportSaritaMishra62
 
Stage 4 - Final Proposal
Stage 4 - Final ProposalStage 4 - Final Proposal
Stage 4 - Final ProposalVincent Trinh
 
3GPP Release 10 and beyond
3GPP Release 10 and beyond3GPP Release 10 and beyond
3GPP Release 10 and beyondskripnikov
 
Hilltop, Columbus, Ohio Neighborhood Stabilization Program Recommendations Re...
Hilltop, Columbus, Ohio Neighborhood Stabilization Program Recommendations Re...Hilltop, Columbus, Ohio Neighborhood Stabilization Program Recommendations Re...
Hilltop, Columbus, Ohio Neighborhood Stabilization Program Recommendations Re...amandajking
 
Developing small and medium enterprises in traditional handicraft villages in...
Developing small and medium enterprises in traditional handicraft villages in...Developing small and medium enterprises in traditional handicraft villages in...
Developing small and medium enterprises in traditional handicraft villages in...https://www.facebook.com/garmentspace
 
emerging technologies _act_2005
emerging technologies _act_2005emerging technologies _act_2005
emerging technologies _act_2005Garry Putland
 
Gestion De Los Procesos En Distribucion
Gestion De Los Procesos En DistribucionGestion De Los Procesos En Distribucion
Gestion De Los Procesos En Distribuciontoniomadrid
 
Gestion de Procesos en Distribucion
Gestion de Procesos en DistribucionGestion de Procesos en Distribucion
Gestion de Procesos en Distribuciontoniomadrid
 
Seth Forgosh - - Challenge 1 - Virtual Design Master
Seth Forgosh - - Challenge 1 - Virtual Design MasterSeth Forgosh - - Challenge 1 - Virtual Design Master
Seth Forgosh - - Challenge 1 - Virtual Design Mastervdmchallenge
 
Management by competencies tulay bozkurt
Management by competencies tulay bozkurtManagement by competencies tulay bozkurt
Management by competencies tulay bozkurtTulay Bozkurt
 

Similar to Niit assessment practice white paper oct 07 (20)

Oaktree funding non-prime_select_guidelines
Oaktree funding non-prime_select_guidelinesOaktree funding non-prime_select_guidelines
Oaktree funding non-prime_select_guidelines
 
Non Prime Select Guidelines call Jesse B Lucero 702-551-3125
Non Prime Select Guidelines call Jesse B Lucero 702-551-3125Non Prime Select Guidelines call Jesse B Lucero 702-551-3125
Non Prime Select Guidelines call Jesse B Lucero 702-551-3125
 
Enrollment Management Plan
Enrollment Management PlanEnrollment Management Plan
Enrollment Management Plan
 
raking.pdf
raking.pdfraking.pdf
raking.pdf
 
01132017_short-term_rental_report
01132017_short-term_rental_report01132017_short-term_rental_report
01132017_short-term_rental_report
 
#VirtualDesignMaster 3 Challenge 1 - Steven Viljoen
#VirtualDesignMaster 3 Challenge 1 - Steven Viljoen#VirtualDesignMaster 3 Challenge 1 - Steven Viljoen
#VirtualDesignMaster 3 Challenge 1 - Steven Viljoen
 
Strategy Field Project Report
Strategy Field Project ReportStrategy Field Project Report
Strategy Field Project Report
 
Stage 4 - Final Proposal
Stage 4 - Final ProposalStage 4 - Final Proposal
Stage 4 - Final Proposal
 
3GPP Release 10 and beyond
3GPP Release 10 and beyond3GPP Release 10 and beyond
3GPP Release 10 and beyond
 
Hilltop, Columbus, Ohio Neighborhood Stabilization Program Recommendations Re...
Hilltop, Columbus, Ohio Neighborhood Stabilization Program Recommendations Re...Hilltop, Columbus, Ohio Neighborhood Stabilization Program Recommendations Re...
Hilltop, Columbus, Ohio Neighborhood Stabilization Program Recommendations Re...
 
Developing small and medium enterprises in traditional handicraft villages in...
Developing small and medium enterprises in traditional handicraft villages in...Developing small and medium enterprises in traditional handicraft villages in...
Developing small and medium enterprises in traditional handicraft villages in...
 
tr-4308
tr-4308tr-4308
tr-4308
 
emerging technologies _act_2005
emerging technologies _act_2005emerging technologies _act_2005
emerging technologies _act_2005
 
Gestion De Los Procesos En Distribucion
Gestion De Los Procesos En DistribucionGestion De Los Procesos En Distribucion
Gestion De Los Procesos En Distribucion
 
Gestion de Procesos en Distribucion
Gestion de Procesos en DistribucionGestion de Procesos en Distribucion
Gestion de Procesos en Distribucion
 
M&E Report SEAL 2012
M&E Report SEAL 2012M&E Report SEAL 2012
M&E Report SEAL 2012
 
Seth Forgosh - - Challenge 1 - Virtual Design Master
Seth Forgosh - - Challenge 1 - Virtual Design MasterSeth Forgosh - - Challenge 1 - Virtual Design Master
Seth Forgosh - - Challenge 1 - Virtual Design Master
 
BW Power User TG R4a-1
BW Power User TG R4a-1BW Power User TG R4a-1
BW Power User TG R4a-1
 
Crossing The Next Regional Frontier 2009
Crossing The Next Regional Frontier 2009Crossing The Next Regional Frontier 2009
Crossing The Next Regional Frontier 2009
 
Management by competencies tulay bozkurt
Management by competencies tulay bozkurtManagement by competencies tulay bozkurt
Management by competencies tulay bozkurt
 

Recently uploaded

CCXG global forum, April 2024, Tiza Mafira
CCXG global forum, April 2024,  Tiza MafiraCCXG global forum, April 2024,  Tiza Mafira
CCXG global forum, April 2024, Tiza MafiraOECD Environment
 
LCCXG global forum, April 2024, Lydie-Line Paroz
LCCXG global forum, April 2024,  Lydie-Line ParozLCCXG global forum, April 2024,  Lydie-Line Paroz
LCCXG global forum, April 2024, Lydie-Line ParozOECD Environment
 
CCXG global forum, April 2024, Brian Motherway and Paolo Frankl
CCXG global forum, April 2024,  Brian Motherway and Paolo FranklCCXG global forum, April 2024,  Brian Motherway and Paolo Frankl
CCXG global forum, April 2024, Brian Motherway and Paolo FranklOECD Environment
 
CCXG global forum, April 2024, Surabi Menon
CCXG global forum, April 2024, Surabi MenonCCXG global forum, April 2024, Surabi Menon
CCXG global forum, April 2024, Surabi MenonOECD Environment
 
CCXG global forum, April 2024, Niklas Höhne
CCXG global forum, April 2024,  Niklas HöhneCCXG global forum, April 2024,  Niklas Höhne
CCXG global forum, April 2024, Niklas HöhneOECD Environment
 
XO2 high quality carbon offsets and Bamboo as a Climate Solution
XO2 high quality carbon offsets and Bamboo as a Climate SolutionXO2 high quality carbon offsets and Bamboo as a Climate Solution
XO2 high quality carbon offsets and Bamboo as a Climate SolutionAlexanderPlace
 
Slide deck for the IPCC Briefing to Latvian Parliamentarians
Slide deck for the IPCC Briefing to Latvian ParliamentariansSlide deck for the IPCC Briefing to Latvian Parliamentarians
Slide deck for the IPCC Briefing to Latvian Parliamentariansipcc-media
 
CCXG global forum, April 2024, Amar Bhattacharya
CCXG global forum, April 2024,  Amar BhattacharyaCCXG global forum, April 2024,  Amar Bhattacharya
CCXG global forum, April 2024, Amar BhattacharyaOECD Environment
 
CCXG global forum, April 2024, Marcia Rocha
CCXG global forum, April 2024,  Marcia RochaCCXG global forum, April 2024,  Marcia Rocha
CCXG global forum, April 2024, Marcia RochaOECD Environment
 
CCXG global forum, April 2024, Mia Ryan
CCXG global forum, April 2024,  Mia RyanCCXG global forum, April 2024,  Mia Ryan
CCXG global forum, April 2024, Mia RyanOECD Environment
 
CCXG global forum, April 2024, Raphaël Jachnik
CCXG global forum, April 2024, Raphaël JachnikCCXG global forum, April 2024, Raphaël Jachnik
CCXG global forum, April 2024, Raphaël JachnikOECD Environment
 
CCXG global forum, April 2024, Sirini Jeudy-Hugo
CCXG global forum, April 2024,  Sirini Jeudy-HugoCCXG global forum, April 2024,  Sirini Jeudy-Hugo
CCXG global forum, April 2024, Sirini Jeudy-HugoOECD Environment
 
CCXG global forum, April 2024, David Mutisya
CCXG global forum, April 2024,  David MutisyaCCXG global forum, April 2024,  David Mutisya
CCXG global forum, April 2024, David MutisyaOECD Environment
 
CCXG global forum, April 2024, Thomas Spencer
CCXG global forum, April 2024,  Thomas SpencerCCXG global forum, April 2024,  Thomas Spencer
CCXG global forum, April 2024, Thomas SpencerOECD Environment
 
CCXG global forum, April 2025, Key takeaways
CCXG global forum, April 2025, Key takeawaysCCXG global forum, April 2025, Key takeaways
CCXG global forum, April 2025, Key takeawaysOECD Environment
 
CCXG global forum, April 2024, Adriana Bonilla
CCXG global forum, April 2024,  Adriana BonillaCCXG global forum, April 2024,  Adriana Bonilla
CCXG global forum, April 2024, Adriana BonillaOECD Environment
 
CCXG global forum, April 2024, XU Huaqing
CCXG global forum, April 2024,  XU HuaqingCCXG global forum, April 2024,  XU Huaqing
CCXG global forum, April 2024, XU HuaqingOECD Environment
 
CCXG global forum, April 2024, Julio Cordano
CCXG global forum, April 2024,  Julio CordanoCCXG global forum, April 2024,  Julio Cordano
CCXG global forum, April 2024, Julio CordanoOECD Environment
 
ENVIRONMENTAL ISSUES AND AWARENESS - Presentation
ENVIRONMENTAL ISSUES AND AWARENESS - PresentationENVIRONMENTAL ISSUES AND AWARENESS - Presentation
ENVIRONMENTAL ISSUES AND AWARENESS - PresentationTaruna Deshwal
 
Broiler SBA.docx for agricultural science csec
Broiler SBA.docx for agricultural science csecBroiler SBA.docx for agricultural science csec
Broiler SBA.docx for agricultural science csecLaceyannWilliams
 

Recently uploaded (20)

CCXG global forum, April 2024, Tiza Mafira
CCXG global forum, April 2024,  Tiza MafiraCCXG global forum, April 2024,  Tiza Mafira
CCXG global forum, April 2024, Tiza Mafira
 
LCCXG global forum, April 2024, Lydie-Line Paroz
LCCXG global forum, April 2024,  Lydie-Line ParozLCCXG global forum, April 2024,  Lydie-Line Paroz
LCCXG global forum, April 2024, Lydie-Line Paroz
 
CCXG global forum, April 2024, Brian Motherway and Paolo Frankl
CCXG global forum, April 2024,  Brian Motherway and Paolo FranklCCXG global forum, April 2024,  Brian Motherway and Paolo Frankl
CCXG global forum, April 2024, Brian Motherway and Paolo Frankl
 
CCXG global forum, April 2024, Surabi Menon
CCXG global forum, April 2024, Surabi MenonCCXG global forum, April 2024, Surabi Menon
CCXG global forum, April 2024, Surabi Menon
 
CCXG global forum, April 2024, Niklas Höhne
CCXG global forum, April 2024,  Niklas HöhneCCXG global forum, April 2024,  Niklas Höhne
CCXG global forum, April 2024, Niklas Höhne
 
XO2 high quality carbon offsets and Bamboo as a Climate Solution
XO2 high quality carbon offsets and Bamboo as a Climate SolutionXO2 high quality carbon offsets and Bamboo as a Climate Solution
XO2 high quality carbon offsets and Bamboo as a Climate Solution
 
Slide deck for the IPCC Briefing to Latvian Parliamentarians
Slide deck for the IPCC Briefing to Latvian ParliamentariansSlide deck for the IPCC Briefing to Latvian Parliamentarians
Slide deck for the IPCC Briefing to Latvian Parliamentarians
 
CCXG global forum, April 2024, Amar Bhattacharya
CCXG global forum, April 2024,  Amar BhattacharyaCCXG global forum, April 2024,  Amar Bhattacharya
CCXG global forum, April 2024, Amar Bhattacharya
 
CCXG global forum, April 2024, Marcia Rocha
CCXG global forum, April 2024,  Marcia RochaCCXG global forum, April 2024,  Marcia Rocha
CCXG global forum, April 2024, Marcia Rocha
 
CCXG global forum, April 2024, Mia Ryan
CCXG global forum, April 2024,  Mia RyanCCXG global forum, April 2024,  Mia Ryan
CCXG global forum, April 2024, Mia Ryan
 
CCXG global forum, April 2024, Raphaël Jachnik
CCXG global forum, April 2024, Raphaël JachnikCCXG global forum, April 2024, Raphaël Jachnik
CCXG global forum, April 2024, Raphaël Jachnik
 
CCXG global forum, April 2024, Sirini Jeudy-Hugo
CCXG global forum, April 2024,  Sirini Jeudy-HugoCCXG global forum, April 2024,  Sirini Jeudy-Hugo
CCXG global forum, April 2024, Sirini Jeudy-Hugo
 
CCXG global forum, April 2024, David Mutisya
CCXG global forum, April 2024,  David MutisyaCCXG global forum, April 2024,  David Mutisya
CCXG global forum, April 2024, David Mutisya
 
CCXG global forum, April 2024, Thomas Spencer
CCXG global forum, April 2024,  Thomas SpencerCCXG global forum, April 2024,  Thomas Spencer
CCXG global forum, April 2024, Thomas Spencer
 
CCXG global forum, April 2025, Key takeaways
CCXG global forum, April 2025, Key takeawaysCCXG global forum, April 2025, Key takeaways
CCXG global forum, April 2025, Key takeaways
 
CCXG global forum, April 2024, Adriana Bonilla
CCXG global forum, April 2024,  Adriana BonillaCCXG global forum, April 2024,  Adriana Bonilla
CCXG global forum, April 2024, Adriana Bonilla
 
CCXG global forum, April 2024, XU Huaqing
CCXG global forum, April 2024,  XU HuaqingCCXG global forum, April 2024,  XU Huaqing
CCXG global forum, April 2024, XU Huaqing
 
CCXG global forum, April 2024, Julio Cordano
CCXG global forum, April 2024,  Julio CordanoCCXG global forum, April 2024,  Julio Cordano
CCXG global forum, April 2024, Julio Cordano
 
ENVIRONMENTAL ISSUES AND AWARENESS - Presentation
ENVIRONMENTAL ISSUES AND AWARENESS - PresentationENVIRONMENTAL ISSUES AND AWARENESS - Presentation
ENVIRONMENTAL ISSUES AND AWARENESS - Presentation
 
Broiler SBA.docx for agricultural science csec
Broiler SBA.docx for agricultural science csecBroiler SBA.docx for agricultural science csec
Broiler SBA.docx for agricultural science csec
 

NIIT Assessment Practice White Paper (October 2007)

Executive Summary

NIIT's Assessment Solutions provides a complete range of offerings, from strategy and design to implementation and administration, to customers in the Corporate, Education and Government segments. NIIT delivers full solutions as well as individual components of its offerings, for both low-stakes and high-stakes tests. The products and services include:
• Assessment Strategy
• Design and Blueprinting
• Item Authoring
• Assessment Engine
• Hosting
• Delivery
• Reporting and Psychometric Analysis
• Scheduling, Administration, Helpdesk, Proctoring and Technical Support

Figure: Overview of the service areas. Assessment Design: strategy for diagnostic assessment, formative tests and summative certification based on program needs. Test Item Development & Management: item bank creation based on overall strategy, audience and exposure size; item sunsetting strategies; planning for difficulty levels, discrimination requirements and the statistical plan. Psychometric Services: field testing for item characteristics, test characteristics, and reliability and validity analyses. Technology Integration & Management: hosted test engine, data management services, IMS QTI-compliant test bank, flexible workflows and advanced statistical analysis tools. Assessment Administration: over 2,500 physical locations for test delivery.
Introduction

There is a growing need to professionally assess the knowledge, skills, abilities and attitudes of people in almost every sphere of life, particularly in education and at the workplace. In schools and colleges, as in professional education, assessments are used effectively as a tool to determine what learning needs to be imparted (diagnostic assessments), to aid learning (formative assessments), and to measure the extent of learning, and hence the effectiveness of the learning or teaching process (summative assessments).

At the workplace, it has long been common practice to assess the competence of new hires through tests. Increasingly, employers also use online testing to assess the ongoing acquisition of skills and knowledge for each new technology, product, project, process or client. While this phenomenon started in the ICT and related industries, the ubiquity of the Internet and computers means that almost every industry is actively embracing online assessments.

A pioneer in computer education, NIIT has been delivering online tests to students in its retail education business for nearly a decade. NIIT's feature-rich assessment solution comprises:
1. Item Authoring
2. Hosted Assessment Engine
3. Delivery
4. Psychometric Analysis
5. Other Services, including Helpdesk, Technical Support, Certification Process Audit, Scheduling and Administration

Figure: Components of NIIT Assessment Solutions. Item Authoring (design, blueprinting, authoring/reviewing, field testing, item banking); Hosted Engine (banking by objective, web-based authoring, standards compliance, multiple question types, randomization, security, customization); Delivery/Administration (scheduling, online delivery, testing centers, proctoring, security); Psychometric Analysis (Classical Test Theory, Item Response Theory, item analysis, reliability, validity, descriptive statistics, calibration); Other Services (helpdesk, pre-test screening, certification process audit, training/mentoring, scheduling/administration, technical support).
Item Authoring

Item authoring is a specialized skill that requires both formal training and supporting systems and processes if items are to be scientific and effective. For any test on any subject, item authoring follows a standard process, and on an ongoing basis over-exposed and poor-performing items must be retired and new ones added. Test creation follows this sequence: Objectives from Client, Blueprint Creation, Test Creation, NIIT Test Review & Fixes, Review by Client, Completed Test, Test Item Bank.

Exam Objectives

In this phase, the client provides the objectives, identified according to the skill level to be tested. The objectives are further split into specific outcomes of assessment items (SOAs), which specify the cognitive ability to be assessed.

Blueprinting

Next, the NIIT Test Design Team creates the blueprint. A test designer prepares the blueprint with the help of the instructors and an analyst, and the program chair reviews it. The blueprint is a specification table that determines the configuration of a test and lays down the rules for its composition. The blueprint underpins test reliability and defines:
• The exam objectives
• The difficulty level for each test item
• The types of questions, such as multiple choice, sequencing, or match the following
• The percentage distribution across ability levels for each objective

The blueprint also enables the analyst and designer to decide the weight assigned to a topic or SOA, which in turn defines the number of test items to be created for that topic or SOA. The weight assigned to a topic is decided according to:
• The importance of the topic in measuring the particular ability
• The importance of the topic in the context of the overall assessment

The weight assigned to a topic reflects its relative importance and helps define the marks to be allocated to each test item. The blueprint is developed to ensure that:
• The weight given to each topic in each test is appropriate, so that important topics are not neglected; this contributes to the validity of the test.
• The abilities tested are appropriate; for example, there are sufficient questions requiring application and understanding of logical reasoning.
• The weights of topics and abilities are consistent from test to test; this contributes to the reliability of the test.
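To make the weighting mechanism concrete, the following minimal sketch shows one way topic weights could translate into per-topic item counts. It is an illustration in Java rather than CLiKS code; the class name, method name and example SOA labels are all hypothetical.

    import java.util.LinkedHashMap;
    import java.util.Map;

    // Minimal blueprint sketch: topic weights decide how many items of a
    // fixed total are authored for each topic/SOA. Names are illustrative.
    public class BlueprintSketch {
        static Map<String, Integer> itemsFor(Map<String, Double> topicWeights, int totalItems) {
            double sum = topicWeights.values().stream().mapToDouble(Double::doubleValue).sum();
            Map<String, Integer> counts = new LinkedHashMap<>();
            for (Map.Entry<String, Double> e : topicWeights.entrySet()) {
                // Round each topic's share of the total item count. A real
                // blueprint would reconcile rounding so counts sum exactly.
                counts.put(e.getKey(), (int) Math.round(totalItems * e.getValue() / sum));
            }
            return counts;
        }

        public static void main(String[] args) {
            Map<String, Double> weights = new LinkedHashMap<>();
            weights.put("SOA-1: recall of terminology", 2.0);
            weights.put("SOA-2: application of rules", 5.0);
            weights.put("SOA-3: analysis of cases", 3.0);
            System.out.println(itemsFor(weights, 40)); // {SOA-1=8, SOA-2=20, SOA-3=12}
        }
    }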
Final Blueprint

A Test Configuration Table is derived from the blueprint by applying the testing conditions. Testing conditions, such as the number of items in a test, the time allowed, the maximum marks and the marks assigned to each test item, should be determined after careful consideration. The table specifies:
• The randomization strategy; randomization can be based on item difficulty, exam objectives, or a combination of both
• Item scoring details
• Negative scoring (yes/no)
• Time allocation
• Number of questions
• Cut score

Test Creation

Test creation follows blueprint creation. It involves the actual writing of the test items and is done by the NIIT Test Creation Team. To do this, the team:
• Identifies the difficulty level for each identified SOA, based on Bloom's Taxonomy of cognitive abilities
• Identifies the item type for each SOA, based on the analysis and the difficulty level
• Creates each test item according to the NIIT Test Item Creation Standards and Guidelines, which are based on sound instructional design principles and correct use of language

Reviewing

After the items have been authored, each item goes through a series of rigorous reviews to eliminate errors, ambiguity and bias of any kind.

NIIT Review

After item creation, items are reviewed and fixed in the NIIT Test Review and Fixes phase. Each test item is checked against various parameters to ensure that the right ability is tested with the right test item; only items that clear the review process are used in a test. Reviews are of the following types:
• ID Review: ensures items are in accordance with instructional design principles
• Language Review: ensures clarity of language
• Technical Review: ensures items are technically correct

Review by Client

The NIIT review phase is followed by a review by the client, in which the program chair reviews the items and identifies any changes. Fixes (if any) suggested by the client are made by the NIIT Test Creation Team, after which the test is ready for delivery.

Field Testing

Once the items are ready for deployment, they are put through a field test in which a statistically significant number of test-takers, representative of the final test-taker audience, respond to all the items in a controlled environment. The results collected from this round of testing are subjected to statistical analysis to assess the difficulty and discrimination of each item and the performance of its distractors. Poor-performing items are modified or dropped.
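As an indication of the kind of analysis performed on field-test data, the sketch below computes two standard classical-test-theory summaries from a matrix of scored responses: a difficulty index (the proportion of correct responses) and a simple upper-lower discrimination index. The method names are hypothetical, and the exact statistics NIIT applies may differ.

    import java.util.Arrays;

    // Field-test sketch: responses[i][j] is 1 if candidate i answered item j
    // correctly, 0 otherwise. Candidates are assumed sorted by total score.
    public class FieldTestStats {
        // Difficulty index p: proportion of candidates answering the item correctly.
        static double difficulty(int[][] responses, int item) {
            return Arrays.stream(responses).mapToInt(r -> r[item]).average().orElse(0);
        }

        // Upper-lower discrimination: p(top 27%) minus p(bottom 27%).
        static double discrimination(int[][] responses, int item) {
            int n = responses.length, k = Math.max(1, (int) Math.round(n * 0.27));
            double top = 0, bottom = 0;
            for (int i = 0; i < k; i++) {
                top += responses[n - 1 - i][item];   // highest total scores at the end
                bottom += responses[i][item];        // lowest total scores at the start
            }
            return (top - bottom) / k;
        }

        public static void main(String[] args) {
            int[][] r = { {0, 0}, {0, 1}, {1, 1}, {1, 1} }; // 4 candidates, 2 items
            System.out.printf("item 0: p=%.2f d=%.2f%n", difficulty(r, 0), discrimination(r, 0));
        }
    }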
Item Maintenance

Over time, based on exposure and on how well each item has performed in tests, some items need to be retired periodically and replaced by new ones. This is an ongoing activity.
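Retirement rules of this kind can be expressed as simple predicates over item statistics, as in the sketch below. The thresholds shown (an exposure cap and a discrimination floor) are purely illustrative assumptions, not NIIT's actual criteria.

    // Item-maintenance sketch: flag items for retirement once they are
    // over-exposed or shown to discriminate poorly.
    public class ItemMaintenance {
        record ItemStats(String id, int exposures, double discrimination) {}

        static final int MAX_EXPOSURES = 5000;         // hypothetical exposure cap
        static final double MIN_DISCRIMINATION = 0.15; // hypothetical quality floor

        static boolean shouldRetire(ItemStats s) {
            return s.exposures() > MAX_EXPOSURES || s.discrimination() < MIN_DISCRIMINATION;
        }

        public static void main(String[] args) {
            System.out.println(shouldRetire(new ItemStats("Q-101", 6200, 0.32))); // true: over-exposed
            System.out.println(shouldRetire(new ItemStats("Q-102", 800, 0.05)));  // true: poor discriminator
            System.out.println(shouldRetire(new ItemStats("Q-103", 800, 0.40)));  // false
        }
    }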
Hosted Online Assessment Engine

NIIT's CLiKS Online Assessment Engine is a comprehensive web-based testing and item banking solution that supports the creation, development, delivery and management of online assessments. The entire solution is hosted at NIIT's Data Center facility. Because CLiKS is an open and customizable framework, it can be implemented quickly and tailored to meet the specific business needs of any customer.

The CLiKS Assessment Engine offers the following features:
• Test Item Authoring: workflow-based question authoring with a built-in multi-level review mechanism.
• Test Configuration: generates exams based on specifications; it interfaces with Authoring for the specification and sends the generated item list back to Authoring.
• Test Delivery: launches an individual candidate's test using the test pack allocated to that candidate; it interfaces with Authoring and sends the results in a standard format.
• Performance Reporting: interfaces with all the components above and generates various standard and custom reports for analysis.
• Test and Candidate Administration: schedules exams for individuals or batches, along with proctor keys.

Authoring and Question Bank Maintenance

A question consists of a stem, which presents the question to the candidate, along with options; the options depend on the question type. Instructional design experts hold different theories about how to evaluate or test a user, so individual experts arrive at different designs for the questions they would like to ask. The solution supports the various question types described later, which are exhaustive enough to meet most instructional design requirements.

For each question, the author can add feedback. Feedback is entered at the option level: when the learner selects a wrong option, the feedback is shown to improve the learning experience. Feedback can be disabled for certification tests.

The question editor provides formatting features such as bold, italics, underline, text alignment, ordered (sequence) lists, unordered (bulleted) lists, tables, font size, font face, font color, superscript and subscript. In addition, special symbols, images, hyperlinks, audio, and video clips can be attached to a question. The system stores the stem and options in encrypted form to protect the data from unauthorized access.

Some parameters are common to all questions, including positive and negative marking, difficulty levels, hints, and expiry dates. The number of permissible attempts for a question is also configurable.

The assessment engine provides ready templates for the following question types:
• Multiple Choice Single Select
• Multiple Choice Multiple Select
• True and False
• Fill in the Blanks (text and numeric)
• Match the Following
• Free Text/Essay Type (subjective response)
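The sketch below models the authoring concepts just described: a stem, typed options, option-level feedback, and the common parameters such as marks, hints and permitted attempts. The class shapes and field names are hypothetical, and the encryption of the stored stem and options is indicated only by a comment.

    import java.util.List;

    // Authoring sketch: one question with option-level feedback and the
    // common parameters described above. Field names are illustrative.
    public class QuestionSketch {
        enum QuestionType { MCQ_SINGLE, MCQ_MULTI, TRUE_FALSE, FILL_IN, MATCH, ESSAY }

        record Option(String text, boolean correct, String feedback) {}

        record Question(String stem,              // stored encrypted in the real system
                        QuestionType type,
                        List<Option> options,
                        double positiveMarks,
                        double negativeMarks,
                        int difficultyLevel,
                        String hint,
                        int maxAttempts) {}

        public static void main(String[] args) {
            Question q = new Question(
                "Which review ensures items follow instructional design principles?",
                QuestionType.MCQ_SINGLE,
                List.of(new Option("ID Review", true, "Correct."),
                        new Option("Language Review", false, "Language review checks clarity of language."),
                        new Option("Technical Review", false, "Technical review checks technical accuracy.")),
                1.0, 0.25, 2, "Think of the three review types.", 1);
            System.out.println(q.type() + ": " + q.stem());
        }
    }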
These question types can be presented to the user in different presentation styles; the assessment engine maintains two pieces of information for each question, the type of question it represents and the corresponding presentation format. Additional question types can easily be incorporated into the engine, depending on the requirements of the organization or university.

Practice and Certification Tests

A test can be defined as a practice test or a certification test. Practice tests enable users to check their understanding of the subject and pursue remediation depending on the feedback; feedback is displayed for each question in a practice test. Certification is an acknowledgment of the skills an individual possesses: it involves evaluating those skills and providing a result along with feedback.

Tests can be configured in both static and randomized modes. In a static test, all candidates taking the test get the same set of questions; in a randomized test, the question set presented to each candidate is unique. The assessment engine picks the questions for a test from a large question bank, based on the test configuration.

The system allows sections to be created within a test. The sequence within a section can be predetermined, along with specifications for the distribution of questions and online analysis based on statistics. Sections can also be configured so that a user is, or is not, allowed to move ahead to the next section without successfully completing the current one.

Review / Workflow

Content is always reviewed before it is released for publication. CLiKS provides a Workflow module to facilitate the online review and correction of content before it is published. When a user creates a piece of content, such as an item or a test, the engine requires it to be reviewed and approved through a defined, configurable workflow process. After initial creation, the content follows a specified path through the levels of review and editing. At each level, particular roles have specific jobs to perform on the item: reviewers can send an item back to its creator for changes, or forward it to the next-level reviewer, until final approval.

The Workflow module supports the "three Rs": Routes, Rules, and Roles.
• A route is the path that content takes while undergoing review. The path has levels, which are assigned to appropriate roles.
• Roles are the system roles assigned to act upon the content.
• Rules are the conditions specified while setting up a workflow cycle; they define the decisions and actions a reviewer can take on the content at each level.

Together, these three Rs drive the workflow process in CLiKS.
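A minimal sketch of the route/roles/rules idea follows: content moves along a fixed route of review levels, and a reviewer's decision either advances it toward final approval or sends it back to the creator. The level names and methods are hypothetical, not the CLiKS API.

    // Workflow sketch: a route is an ordered list of review levels; a decision
    // at each level moves content forward or back to the creator.
    public class WorkflowSketch {
        enum Decision { APPROVE, SEND_BACK }

        static final String[] ROUTE = { "Creator", "ID Reviewer", "Language Reviewer", "Program Chair" };

        // Returns the index of the next level after a decision at the current level.
        static int nextLevel(int currentLevel, Decision decision) {
            if (decision == Decision.SEND_BACK) return 0;         // back to the creator
            return Math.min(currentLevel + 1, ROUTE.length - 1);  // forward until final approval
        }

        public static void main(String[] args) {
            int level = 1; // item is with the ID Reviewer
            level = nextLevel(level, Decision.APPROVE);
            System.out.println("Now with: " + ROUTE[level]);      // Language Reviewer
            level = nextLevel(level, Decision.SEND_BACK);
            System.out.println("Now with: " + ROUTE[level]);      // Creator
        }
    }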
Test Configuration & Delivery

The screen shown in the figure below is the launch page for an assessment. In this scenario, the assessment is embedded within a course and its learning plan; it is also possible to create assessments as independent, standalone activities.

Figure: The Course Dashboard Screen

Test Generation and Delivery

When a candidate starts a test, a test configuration from the appropriate test pack is picked at random and assigned to the candidate; thereafter, the test is delivered to that candidate using the same set of questions. The system searches the question pool for questions that match the configuration of the test assigned to the candidate.

The system processes questions in batches from the generated question pool and displays them on screen. The first set of questions is displayed as soon as the testing engine selects it; while the test-taker attempts these questions, the engine keeps selecting further questions and caching them, and subsequent questions are served from this cache. This improves performance in two ways: the test-taker need not wait until the engine has selected all the questions for the test, and the system need not wait for the candidate to attempt the displayed question before fetching the next one from the question pool.
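The batching behaviour described above can be pictured with a producer-consumer sketch: a background worker keeps filling a question cache while the candidate is served from it. The sketch uses the standard Java concurrency utilities and is an illustration of the idea, not delivery-engine code.

    import java.util.List;
    import java.util.concurrent.*;

    // Delivery sketch: prefetch questions in the background so the candidate
    // never waits for the full test to be selected.
    public class PrefetchSketch {
        public static void main(String[] args) throws InterruptedException {
            List<String> pool = List.of("Q1", "Q2", "Q3", "Q4", "Q5", "Q6");
            BlockingQueue<String> cache = new LinkedBlockingQueue<>();
            ExecutorService selector = Executors.newSingleThreadExecutor();

            // Background task: keep selecting questions into the cache.
            selector.submit(() -> {
                for (String q : pool) {
                    cache.put(q);        // simulate selection into the cache
                    Thread.sleep(50);    // simulate selection/download time
                }
                return null;
            });

            // Foreground: serve the candidate from the cache as items arrive.
            for (int i = 0; i < pool.size(); i++) {
                String q = cache.take(); // blocks only if the cache is empty
                System.out.println("Displaying " + q);
            }
            selector.shutdown();
        }
    }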
Test Instructions

The test starts with a set of instructions. The figure below shows the attributes and parameters of the test as they appear on the instructions screen.

Figure: Instructions for Assessment screen

Question Pages

Each question in the assessment is displayed on its own page, and each question page includes the following buttons:
• Review: review and revise all the questions in the section. Make sure that all questions are marked "Attempted" before pressing the End Assessment button.
• Section List: move to the next section once you have attempted all the questions in the current section.
• Instructions: return to the initial instructions page.
• Previous Question: go back to the previous question.
• Submit Answer: submit your answer and go to the next question.
• Skip Question: skip the current question and go on to the next one. Unattempted questions are marked incorrect, so make sure that you attempt all the questions before clicking End Assessment; you can review and revise all the questions in the section by clicking Review Questions.
• End Assessment: use this once you have reviewed, answered and submitted an answer for every question in the assessment. You can confirm that all the questions have been attempted by clicking the Review Section button.
Figure: The Assessment Screen

Question Types

The assessment engine can incorporate a diverse set of questions, meeting the requirements of most instructional design experts. The following question types are supported through readily available templates:
• Multiple Choice Single Select
• Multiple Choice Multiple Select
• True and False
• Fill in the Blanks (text and numeric)
• Match the Following
• Free Text/Essay Type (subjective response)

Review Questions

The Review Questions page, shown in the following figure, displays all the questions and indicates whether or not an answer has been submitted for each one. The page shows the status of every attempt made, or not made, by the learner, allowing learners to keep track of their progress during the assessment.

Figure: The Questions Review Page on the Assessment Screen
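Behind such a review page is simple status bookkeeping: each question carries an attempt status, and the page summarizes the counts. The sketch below illustrates this with hypothetical status names.

    import java.util.EnumMap;
    import java.util.List;
    import java.util.Map;

    // Review-page sketch: track whether each question was attempted or skipped
    // and summarize the counts the review page would display.
    public class ReviewStatusSketch {
        enum Status { NOT_SEEN, SKIPPED, ATTEMPTED }

        static Map<Status, Integer> summarize(List<Status> statuses) {
            Map<Status, Integer> counts = new EnumMap<>(Status.class);
            for (Status s : Status.values()) counts.put(s, 0);
            for (Status s : statuses) counts.merge(s, 1, Integer::sum);
            return counts;
        }

        public static void main(String[] args) {
            List<Status> test = List.of(Status.ATTEMPTED, Status.SKIPPED, Status.ATTEMPTED, Status.NOT_SEEN);
            System.out.println(summarize(test)); // {NOT_SEEN=1, SKIPPED=1, ATTEMPTED=2}
        }
    }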
Feedback and Hints

The assessment engine enables a test creator to provide feedback on learners' responses; the feedback corrects the learner and reinforces the concept. The engine also enables a test creator to offer hints while the learner is taking a test. In a certification test, it is possible to penalize learners who make use of the hints.

Marks Configuration

Marks are configured at the time of test creation. Learners can be marked either in grades or as a percentage. The test creator can configure negative marks for each wrong response, and full marks or no marks for a particular question; marks can be based on the rendering type of the question or on its difficulty level. This gives the test creator a flexible marking system and discretionary power.

Auto Grading

A test creator can instead select auto grading for a test, in which case marking need not be configured at test creation time; the learner is graded automatically according to the default marks set in the system.
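The marking scheme described above can be summarized in a short scoring sketch: configurable positive marks per question, optional negative marks for wrong responses, and a percentage result. The scheme shown is illustrative, not the engine's actual grading logic.

    import java.util.List;

    // Marking sketch: per-question positive marks, optional negative marks for
    // wrong responses, and a percentage result.
    public class ScoringSketch {
        record Scored(double marks, double negativeMarks, boolean correct) {}

        static double rawScore(List<Scored> responses) {
            double total = 0;
            for (Scored r : responses) {
                total += r.correct() ? r.marks() : -r.negativeMarks();
            }
            return total;
        }

        static double percentage(List<Scored> responses) {
            double max = responses.stream().mapToDouble(Scored::marks).sum();
            // Clamp negative totals at zero before converting to a percentage.
            return 100.0 * Math.max(0, rawScore(responses)) / max;
        }

        public static void main(String[] args) {
            List<Scored> test = List.of(
                new Scored(2, 0.5, true),
                new Scored(2, 0.5, false),  // wrong answer: -0.5
                new Scored(1, 0.0, true));
            System.out.println(rawScore(test));   // 2.5
            System.out.println(percentage(test)); // 50.0
        }
    }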
Importing and Exporting Tests

The assessment engine is IMS QTI compliant, so tests and questions can easily be imported from, and exported to, other standards-compliant systems, promoting reuse of existing content.

Proctoring

In a certification test, candidates take the test in a controlled environment where their identity is co-signed by the invigilating authority assigned to the test. The delivery engine provides an interface for the proctor to co-sign candidates, and proctors are assigned to tests when the test is scheduled. CLiKS supports proctoring through the following mechanisms.

Individual Co-signing
In this form of proctoring, the assessment engine launches a screen to accept the proctor's login credentials as the candidate starts the test; this applies only to tests for which proctoring was enabled during configuration. In another form of individual co-signing, the invigilator is given a unique random key for each examination, which the invigilator can write up or announce to the students in the examination hall.

Mass Co-signing (Group Proctoring)
With this mechanism, a proctor can co-sign a group or batch of students attempting a test from a central location, using an interface provided by CLiKS. The interface displays the students taking the assessment; after the proctor has verified them, all the students can be selected and co-signed together.

Reporting

The Performance Tracker generates a variety of reports for analyzing performance and evaluating the outcomes of student learning and assessment. Administration reports can also be generated for monitoring usage and the key information set up in the system.

Individual Student Performance Chart
This report shows the performance of a single student in graphical format: the student's scores across the various tests in a module. It enables teaching staff to view and compare a student's performance within their own batches.

Comparative Analysis
This report shows, in graphical format, how a batch of students performed on a particular test, making it possible to judge the average scoring ability of the candidates. The report presents scores in blocks of 10 along the x-axis and the percentage of candidates falling in each block along the y-axis.

Individual Student Activity Details
This report lists, in tabular format, the details of the activities of a candidate in a batch. The intended users are teaching staff, who can review the details of the online tests taken by students; the report allows them to choose a student from their batches and view that student's progress.

Performance

CLiKS supports a large number of concurrent users, enables streaming of digital content, supports concurrent streams, and manages bandwidth using defined policies and priorities.

Scheduling and Administration

User Management
CLiKS provides online registration of users and assigns them the desired roles, with the flexibility to assign multiple roles to a user who must undertake the corresponding tasks. CLiKS also supports granting a user permission to a specific function of a different role, rather than to all functions of that role, and permission to an individual function can be explicitly revoked, overriding previously assigned roles. This gives the administrator the flexibility to grant or revoke permissions at the function level.

Test Schedule Management
Tests for candidates taking certification or practice tests are scheduled in advance: the test details, time and duration are provided, along with the list of candidates who will take the test. Candidates log on to the system using their assigned user identification and password; if the current time falls within the stipulated window for the test, a link is displayed for the candidate to start it. Tests can be rescheduled for candidates who are unable to take them at the allotted time.
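The time-window check behind the start link can be sketched as follows; the record and method names are hypothetical.

    import java.time.Duration;
    import java.time.LocalDateTime;

    // Scheduling sketch: the start link is shown only while the current time
    // falls within the stipulated window for the scheduled test.
    public class ScheduleSketch {
        record ScheduledTest(LocalDateTime start, Duration duration) {}

        static boolean canStart(ScheduledTest t, LocalDateTime now) {
            return !now.isBefore(t.start()) && now.isBefore(t.start().plus(t.duration()));
        }

        public static void main(String[] args) {
            ScheduledTest t = new ScheduledTest(LocalDateTime.of(2007, 10, 15, 9, 0), Duration.ofMinutes(90));
            System.out.println(canStart(t, LocalDateTime.of(2007, 10, 15, 9, 30))); // true
            System.out.println(canStart(t, LocalDateTime.of(2007, 10, 15, 11, 0))); // false
        }
    }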
Once configured and published, tests are delivered to candidates by the delivery engine, which receives the candidates' responses and evaluates the results. The results are stored as part of each student's assessment records.

An important feature of the assessment engine is its ability to keep track of individual question papers. A learner can abandon a test midway and resume it at a later date; the engine keeps track of the questions already given to the learner in the incomplete test and of the options the learner had marked.

Caching is another important feature of the online testing engine. Questions and their attributes are downloaded and cached on the local machine while the candidate is answering the first question. This happens in the background, without affecting the display, and improves system performance, as the learner need not wait for questions to download. Test time is measured in real time and excludes download time, so low or high bandwidth does not affect the time allowed for the test.

Extensibility

CLiKS is a component-based system, which allows a CLiKS component to be replaced with any external system available in the market. The system is open to learning, collaboration and knowledge-base tools developed by other vendors. This is achieved through the implementation of open APIs: any external system can interface with the existing components using the APIs provided for each component.

Multilingual

CLiKS supports the multibyte UTF-8 character set, which allows it to be used for most of the world's languages, including Asian languages such as Japanese, Chinese and Korean. The data stored in the database, as well as field labels and error messages, can be multilingual.

High Availability

The scalable architecture of CLiKS allows configurations with more than one application server and web server. In such a configuration, the application remains available to users even when one of the servers is down.

Open API-based Architecture

CLiKS follows an open API-based architecture, which allows easy integration with external systems in an enterprise. The design enforces that programs in a module can directly access only the data of their own module; the data of other modules is accessed through the published APIs. Any other application system can therefore get data from, or pass data to, CLiKS modules by calling the published APIs, enabling rapid integration with the existing applications in an organization. The system is also compatible with LDAP standards, allowing it to be integrated with other standard applications to achieve single sign-on through an enterprise portal.

Ease of Customization

CLiKS allows easy customization based on organizational needs. All static text in the system (screen titles, field labels, error messages and so on) is retrieved from a multilingual resource file. This allows rapid modification of screen names, field names and other UI elements, eliminating the need for re-programming.
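As an illustration of the externalized static text just described, the sketch below uses Java's standard ResourceBundle lookup, in which labels come from per-locale properties files so that UI text can change without re-programming. The bundle name and key are hypothetical.

    import java.util.Locale;
    import java.util.ResourceBundle;

    // Customization sketch: screen titles and labels resolved from per-locale
    // resource files (e.g. Messages_en.properties, Messages_ja.properties).
    public class UiTextSketch {
        public static void main(String[] args) {
            // Assumes Messages_*.properties files on the classpath containing
            // keys such as: screen.title=Assessment Dashboard
            ResourceBundle texts = ResourceBundle.getBundle("Messages", Locale.JAPANESE);
            System.out.println(texts.getString("screen.title"));
        }
    }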
CLiKS is a rule-based system in which many rules can be defined for each function at configuration time. This again offers great flexibility during implementation without any need for re-programming: look-and-feel changes and changes in organizational policies can be effected in a very short time.
Technical overview of the CLiKS Assessment Engine:
• J2EE application server (Pramati)
• Oracle RDBMS
• Scalable, load-balancing-ready, multilingual architecture
• Fully Internet browser based
• Portable across operating systems, including Linux (Red Hat)
• Allows a 3-tiered implementation in which the web server, application server, and database server can be separated by firewalls
Hosting
NIIT offers a fully hosted and managed eLearning environment, providing the production facility and infrastructure appropriate to your technical requirements and strategic business goals. NIIT delivers the scalability, security, and transparency of a hosted eLearning infrastructure without requiring investment in hardware and software evaluation, deployment, and maintenance.
Features:
• 99.5% server uptime guarantee
• 24x7 application and systems monitoring
• System, database, and network administration
• Back-ups
• Redundant, high-resilience Internet connectivity
• Security
• Fully managed staging environment
• Rapid application deployment
• Scalable storage
Application Setup
NIIT takes care of installing the hardware, the software, and the application, removing the need to hire or train additional IT staff to set up the software or configure the hardware.
Availability
NIIT provides 99.5% uptime for your eLearning application, with the ability to continue services uninterrupted.
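As a back-of-the-envelope reading of that figure (our arithmetic, not a contractual term), a 99.5% uptime guarantee allows at most 0.5% downtime:

    0.005 × 365 days × 24 h ≈ 43.8 hours per year, or roughly 3.7 hours per month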
Test Delivery
Physical Test Delivery
NIIT has an extensive and growing network of education and test centers in more than 30 countries around the world.
Online Delivery
NIIT delivers online tests to a number of its education and corporate customers around the world. These tests are scheduled and delivered at locations of the customer's choice and are used for a variety of purposes:
Education
1. Entrance into a program
2. As part of courses, to aid the learning process
3. Ongoing evaluation in various courses
4. Final exams/graduation
Corporate
1. Hiring new employees
2. Assessing the impact of training programs
3. Pre-promotion assessment of abilities and behavioral aspects
Testing Centers
NIIT is setting up a network of dedicated testing centers across India to conduct tests in large numbers and manage scale for its customers. Besides providing large facilities in various cities, each capable of conducting over 1,000 tests per day, these centers also support pre-assessment screening and post-test interviewing. The testing centers can also conduct various new types of assessments, including automatic assessment of language, voice, and accent skills through patent-pending applications developed by NIIT.
Other Unique Assessment Types
NIIT continuously invests in research and development to improve the effectiveness of computer-assisted assessments. In a significant change, NIIT relies less on multiple-choice questions, which test only the general abilities or theoretical knowledge of test takers and not their suitability for a specific job role. NIIT now offers several new types of assessments that significantly improve the effectiveness of the assessment process. Some examples of new item/assessment types are:
• Computer screen simulations
• Computer-assisted role-plays
• Automated voice and accent assessment
• Online programming skills testing
Proctoring
NIIT provides a proctored environment for conducting high-stakes assessments in its dedicated testing centers. These centers are equipped with closed-circuit cameras and physical proctoring by trained proctors.
Psychometric Analysis
Psychometric analysis is an integral part of NIIT's assessment solution. It is performed at two levels:
• After field testing: to evaluate item performance and to (re)calibrate items for difficulty. This forms an integral part of the item authoring process.
• On an ongoing basis: to continuously evaluate the performance of tests and of individual items within the tests.
Measurement Theories and Models
NIIT uses a variety of models for psychometric analysis. The model is selected in close discussion with the customer and depends on the nature of the tests, the types of items, and the number of test takers. Popular models include:
• Classical Test Theory (CTT)
• Item Response Theory (IRT)
  o Rasch Model (one-parameter logistic model, or 1PLM)
  o Two-Parameter Logistic Model (2PLM)
  o Three-Parameter Logistic Model (3PLM)
Classical Test Theory
Item Analysis
During test construction, test developers need to determine the effectiveness of each individual item in the test. This process of evaluating each item is called item analysis. Based on the information item analysis provides about an item's effectiveness, items can be modified or eliminated from the final test, ultimately ensuring the reliability and validity of the test. Three important item functions evaluated through item analysis are:
1. Item Difficulty
2. Item Discrimination
3. Distractor Efficiency
Item Difficulty Analysis
Item difficulty is a measure of the overall difficulty of a test item. In CTT, it is represented as the percentage of test takers who answer the item correctly. An easy item is one answered correctly by a majority of test takers; conversely, a difficult item is answered correctly by only a few. Typically, an effective test has a balanced distribution of items of varying degrees of difficulty. With a difficulty value assigned to each item, the test developer can easily select items of varying difficulty levels for inclusion in the final test.
Item Discrimination Analysis
The most important function of a test is to differentiate between test takers in terms of their ability, knowledge, skills, and personality. It is therefore crucial that the building blocks of the test, the items, be able to differentiate, say, between high and low performers on an arithmetic test. Item discrimination analysis evaluates an item with respect to its ability to discriminate between people of varying abilities or traits. With a definite discrimination index for each item, the test developer can select the items with higher discrimination indices for the test.
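As a rough sketch of these two statistics (textbook definitions, not NIIT's proprietary tooling): item difficulty is the proportion of correct responses, and the extreme-group discrimination index is that proportion among the top scorers minus the same proportion among the bottom scorers.

    import java.util.Arrays;
    import java.util.Comparator;

    class ItemAnalysis {

        // Difficulty (p-value): fraction of test takers answering the
        // item correctly.
        static double difficulty(boolean[] correct) {
            int right = 0;
            for (boolean c : correct) if (c) right++;
            return (double) right / correct.length;
        }

        // Extreme-group discrimination: p(upper 27%) - p(lower 27%),
        // with groups formed by total test score.
        static double discrimination(boolean[] correct, double[] totalScore) {
            Integer[] idx = new Integer[correct.length];
            for (int i = 0; i < idx.length; i++) idx[i] = i;
            Arrays.sort(idx, Comparator.comparingDouble(i -> totalScore[i]));

            int g = Math.max(1, (int) Math.round(0.27 * correct.length));
            double low = 0, high = 0;
            for (int i = 0; i < g; i++) {
                if (correct[idx[i]]) low++;                       // lowest scorers
                if (correct[idx[correct.length - 1 - i]]) high++; // highest scorers
            }
            return (high - low) / g;
        }
    }

An item answered correctly by most high scorers but few low scorers yields an index near 1; an index near 0 (or negative) flags an item that fails to discriminate.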
The Point-Biserial and Extreme Group Methods are commonly used approaches to arrive at the discrimination index for individual items. Besides detailed reports on the discriminative power of individual items, interpretative comments on the psychometric implications of each item's discrimination index are shared with test developers to facilitate the test review process.
Distractor Efficiency Analysis
In tests that use multiple-choice items, the incorrect answer choices to a question (distractors) play an important role. A "good" distractor, which is unambiguously incorrect and yet can confuse the less knowledgeable test taker, adds to the discrimination value of the question. The effectiveness of distractors is analyzed by measuring the distribution of responses across all the distractors. Distractors that are not selected at all, or not often enough, may be discarded, modified, or replaced to improve the efficiency of the item. The Point-Biserial Method is used to compare the group that chooses a distractor with the group that chooses the correct option. A distractor efficiency matrix is prepared for all items in a test, and recommendations are provided to the item developers.
Internal Consistency Analysis
Items in a test must have internal consistency in measuring the proposed construct or variable. That is, items chosen for a test designed to measure a particular ability or trait must assess only that ability or trait and therefore correlate highly among themselves and with the test as a whole. Information from such analysis is essential for understanding the internal structure of the test and for deciding whether its quality needs further enhancement. With high internal consistency indices, the test developer can confidently rely on the test's ability to assess the proposed ability or trait. The internal consistency of a test is measured with different statistical tools; common ones are Cronbach's Alpha, the KR-20 coefficient, and the Spearman-Brown coefficient. (Sketches of the point-biserial and Cronbach's Alpha computations appear at the end of this section.)
Test Reliability Analysis
A good test needs to be consistent in its performance. That is, the same test given to a candidate today should produce comparable results when the candidate takes it again a few weeks or months later. Similarly, two or more parallel tests that assess the same ability or trait must show similar results. In the psychometric analysis of ability, personality, and skills tests, two methods are used to evaluate the reliability of tests: Test-Retest Reliability Analysis and Alternate-Form (or Split-Half) Reliability Analysis.
Test Validity Analysis
It is essential that a test measure what it was originally intended to measure. At various stages of its development, a test needs to be evaluated for its validity, which is assessed in several ways: face validity, content validity, concurrent validity, predictive validity, and construct validity.
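The two computations referenced above follow standard formulas from the psychometric literature; the sketch below is a minimal illustration, not NIIT's proprietary implementation. The point-biserial correlation is (M1 - M0) / s × sqrt(p(1 - p)), and Cronbach's Alpha is (k / (k - 1)) × (1 - Σ item variances / total variance).

    class ReliabilityStats {

        // Population variance of an array of scores.
        static double variance(double[] x) {
            double mean = 0;
            for (double v : x) mean += v;
            mean /= x.length;
            double var = 0;
            for (double v : x) var += (v - mean) * (v - mean);
            return var / x.length;
        }

        // Point-biserial correlation between a dichotomous item and the
        // total score. Assumes the item has both correct and incorrect
        // responses; otherwise the result is undefined (NaN).
        static double pointBiserial(boolean[] correct, double[] total) {
            int n = correct.length, n1 = 0;
            double m1 = 0, m0 = 0;
            for (int i = 0; i < n; i++) {
                if (correct[i]) { m1 += total[i]; n1++; }
                else m0 += total[i];
            }
            m1 /= n1;
            m0 /= (n - n1);
            double p = (double) n1 / n;
            return (m1 - m0) / Math.sqrt(variance(total)) * Math.sqrt(p * (1 - p));
        }

        // Cronbach's Alpha; scores[i][j] = score of test taker i on item j.
        static double cronbachAlpha(double[][] scores) {
            int n = scores.length, k = scores[0].length;
            double[] total = new double[n];
            double itemVarSum = 0;
            for (int j = 0; j < k; j++) {
                double[] item = new double[n];
                for (int i = 0; i < n; i++) {
                    item[i] = scores[i][j];
                    total[i] += scores[i][j];
                }
                itemVarSum += variance(item);
            }
            return (k / (k - 1.0)) * (1 - itemVarSum / variance(total));
        }
    }

For distractor analysis, the same pointBiserial method can be applied with "chose this distractor" in place of "answered correctly"; a strongly negative correlation is what a well-behaved distractor should show.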
Standardization and Norm Setting
The process of standardizing a test ensures that the test is representative of its target audiences. It enables the test to be administered and scored under uniform conditions so that it produces comparable results across different situations and target audiences. As part of the process, norms and benchmarks for different groups and situations are set for an unbiased comparison of individual scores on the test. According to the specific requirements of the customer, the team of psychometricians creates standard rules for the administration, scoring, and interpretation of the test. Standardized scores, such as percentiles, T-scores, and STEN scores, together with group-specific comparison norms or benchmarks, are developed for accurate interpretation of individual test scores.
Item Response Theory (IRT)
Item Response Theory takes a different approach from CTT in analyzing the psychometric properties of a test and its items. While CTT evaluates a test and its items against a normative population on which they are psychometrically analyzed and standardized, IRT analyses are not based on such relative measures. IRT uses probabilistic models to arrive at the psychometric properties of test items. It assumes that latent variables, such as ability and personality traits, have an underlying measurement scale on which individual items, tests, and test takers can be compared. IRT is a statistical procedure that models item responses with certain parameters to determine the proficiency level of an examinee. It computes an estimated item characteristic curve (ICC) for each test item and can use one to three parameters to specify the item response model. IRT has the following advantages over classical item analysis:
• Item parameter estimates are independent of the group of examinees selected from the population for whom the test was designed.
• Examinee ability estimates are independent of the particular choice of test items used from the population of items that were calibrated.
• The precision of the ability estimates is known.
One-Parameter Logistic Model
This IRT model uses only the difficulty parameter for its analysis; it assumes a fixed value for the discrimination parameter. Analysis results include item-specific information in the form of an Item Characteristic Curve as well as test-specific information in the form of a Test Characteristic Curve. The location of the curve on the latent ability scale represents the difficulty of the item.
Two-Parameter Logistic Model
The two-parameter IRT model uses both the difficulty and the discrimination parameters for psychometric analysis. The resulting Item Characteristic Curve conveys the difficulty level as well as the discrimination power of the test item: the location of the curve on the ability scale represents item difficulty, while the steepness of the curve indicates the item's discrimination index.
Three-Parameter Logistic Model
In addition to the difficulty and discrimination parameters, the three-parameter model takes a third dimension into consideration: the guessing parameter, which indicates the probability of getting the item correct by guessing alone. The Item Characteristic Curve resulting from this analysis therefore carries information about item difficulty, item discrimination, and "guessability".
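For reference, the three models are nested versions of the standard logistic ICC from the IRT literature (a general formula, not a CLiKS-specific one):

    P(\theta) = c + \frac{1 - c}{1 + e^{-a(\theta - b)}}

where theta is the examinee's latent ability, b the difficulty (location) parameter, a the discrimination (slope) parameter, and c the guessing (lower-asymptote) parameter. Setting c = 0 gives the two-parameter model, and additionally fixing a gives the one-parameter (Rasch) model.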
Tools and Technology
The Assessment Practice of NIIT has built its own proprietary software tools for performing statistical analysis based on the various IRT models.
Other Services
NIIT provides a range of other assessment-related services to its customers around the world.
Customization and Integration
For medium to large implementations of the CLiKS Assessment Engine, NIIT provides customization of the engine for its customers, covering:
• User interface
• Functionality
• Reports
Since CLiKS uses open APIs and supports industry standards, it is also possible to interface or integrate CLiKS with customers' internal systems. Single Sign-On, ERP, and client HR systems are some examples of applications with which CLiKS may be interfaced (a minimal LDAP authentication sketch appears at the end of this section).
Technical Support
NIIT provides technical support to its customers and test delivery partners to help resolve technical queries.
Helpdesk
Since many test takers are still taking online tests for the first time, they may need some hand-holding. NIIT operates 24x7 toll-free helpdesks that provide this service on behalf of our test sponsors.
Mentoring
When tests are used as a practice or preparation mechanism, test takers need to discuss their performance with someone familiar with both the test items and the subject area. NIIT provides this service to several of its customers through phone, email, and chat.
Test-Process Auditing
Some of our customers use our technology to deliver certification tests through a network of partners or franchisees. In such cases, NIIT audits the test delivery process through scheduled visits as well as through mystery shoppers at test delivery locations, helping our customers maintain a high level of integrity in their testing process.
Pre-test Screening
For certain kinds of tests, e.g., pre-hire assessments, test sponsors may have a set of criteria for screening candidates' eligibility for the test. NIIT provides both manual and automated screening options.
Scheduling and Administration
When physical test delivery (for proctored tests) is a constraint, tests need to be scheduled and administered in a manner that optimizes the availability of all resources. NIIT provides services for this task.
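As an illustration of the LDAP-based Single Sign-On integration mentioned under Customization and Integration, the sketch below performs a standard JNDI simple bind against a directory server. The host, port, and DN layout are hypothetical examples; the actual integration points of CLiKS are defined by its published APIs.

    import java.util.Hashtable;
    import javax.naming.Context;
    import javax.naming.NamingException;
    import javax.naming.directory.InitialDirContext;

    class LdapAuthenticator {

        // Returns true if the directory server accepts the user's
        // credentials via a "simple bind". Server URL and DN layout
        // below are illustrative placeholders.
        static boolean authenticate(String uid, String password) {
            Hashtable<String, String> env = new Hashtable<>();
            env.put(Context.INITIAL_CONTEXT_FACTORY,
                    "com.sun.jndi.ldap.LdapCtxFactory");
            env.put(Context.PROVIDER_URL, "ldap://directory.example.com:389");
            env.put(Context.SECURITY_AUTHENTICATION, "simple");
            env.put(Context.SECURITY_PRINCIPAL,
                    "uid=" + uid + ",ou=people,dc=example,dc=com");
            env.put(Context.SECURITY_CREDENTIALS, password);
            try {
                new InitialDirContext(env).close();
                return true;            // bind succeeded
            } catch (NamingException e) {
                return false;           // bad credentials or server error
            }
        }
    }

In a portal-based Single Sign-On arrangement, the portal would typically perform this bind once and pass an authenticated identity on to CLiKS, rather than CLiKS prompting for credentials again.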