Internship Report
IIT GUWAHATI
Project report on Quality Assurance of Virtual Labs
Submitted by
Hrishikesh Malakar
B.Tech, Computer Science and Engineering,
Tezpur University
Mentored By
Dr. Santosh Biswas
Associate Professor, Computer Science and Engineering,
IIT Guwahati
Duration: 1st June - 15th July 2016
CERTIFICATE
This is to certify that the work contained in this project entitled “Quality Assurance of Virtual
Labs" is a bonafide work of Hrishikesh Malakar, carried out in the Department of Computer
Science and Engineering, Indian Institute of Technology Guwahati under my supervision and
that it has not been submitted elsewhere for a degree.
Supervisor: Dr. Santosh Biswas
Associate Professor,
Department of Computer Science & Engineering,
Indian Institute of Technology Guwahati, Assam.
July 2016, Guwahati
ACKNOWLEDGEMENTS
I would like to express a deep sense of thanks and gratitude to my supervisor Dr. SANTOSH BISWAS for giving me the opportunity to do an internship under his guidance. During the project I got a chance to improve my practical skills beyond the limits of laboratories. I learned many concepts of computer science, and this project was really helpful for me.
I would also like to thank Mr. HRISHIKESH BARUAH and Mr. BIJU DAS, who helped me gain a better understanding of the project and supported me when I faced challenges. This work would not have been possible without their support and valuable suggestions.
I also wish to thank everyone in the Computer Science and Engineering Department who created such a lively atmosphere that it was always exciting to go to the department.
I have no words to express my sincere gratitude to my parents, who have shown me this world and supported me in every way.
Finally, I would like to express my sincere thanks to all my friends and others who helped me directly or indirectly during this project work.
CONTENTS
1 INTRODUCTION
1.1 Virtual Labs
1.2 Manual Testing of Virtual Labs
1.3 Automated Testing of Virtual Labs
1.4 Hosting Process of Virtual Labs
2 Quality Assurance of Virtual Labs
2.1 Manual Testing-Test Plan
2.1.1 Testing Objectives
2.1.2 Test Requirements
2.1.2.1 System Testing
2.1.2.2 Integration Testing
2.1.2.3 User Interface Testing
2.1.3 Tools Used
2.1.4 Development of test cases
2.1.5 Definition of test cases
2.1.6 Structure of test cases
2.1.7 Description of various fields of test cases
2.1.8 Location of test cases
2.1.9 Test reports
2.1.10 Structure of test reports
3 Automated Testing of Virtual Labs
3.1 About the modules
3.2 Description of the script
3.2.1 Link testing
3.2.2 Spell checking
3.3 Snapshot of a working instance of the script
4 Conclusion
5 References
Chapter 1
INTRODUCTION
1.1 Virtual Labs
Virtual Labs is a mission mode project initiated by the Ministry of Human
Resource Development (MHRD). The objective of this project is to provide a
laboratory learning experience to students who do not have access to adequate
laboratory infrastructure. Currently there are around 150 labs which have been
developed by various institutes. A streamlined software development life cycle
process followed for the development of these labs ensures high-quality labs. The
integration process of Virtual Labs described here defines the development, quality
assurance and hosting practices followed by the developers (open source
community) of the Virtual Labs project. It aims at delivering responsive, open-source
and device-independent labs, helping us in our pursuit of excellence.
1.2 Manual Testing of Virtual Labs
Manual testing is a testing process that is carried out manually in order to find
defects without the use of tools or automation scripting. A test plan document is
prepared that acts as a guide to the testing process, in order to achieve complete test
coverage.
Tools used: Emacs 24.4.2, GitHub 2.1.4, org and HTML formats
1.3 Automated Testing of Virtual Labs:
Automated testing is a testing process that is carried out by running a script, first
to track down whether any link is up or down and later to check for spelling
errors on each page.
Tools used: Python 3.5.2, Selenium 2.53.6
1.4 Hosting Process of Virtual Labs:
A virtual lab is hosted by the VLEAD team: once the lab is 'Approved' by
the IIIT-H QA team, the release engineer carries out the hosting process using
the Auto Deployment Service (ADS).
Chapter 2
Quality Assurance of Virtual Labs
This section captures the Quality Assurance (QA) process to be carried out by the
respective lab developers and the IIIT-H QA team. Every QA process starts with a
test plan followed by the creation of test cases.
2.1 Manual Testing-Test Plan
This section describes the plan that would be followed by the IIIT-H QA team for
testing the Virtual Labs. The test plan would cover all the requirements of the
Virtual Labs; its objectives and scope are detailed in the subsection below.
2.1.1 Testing Objectives
It supports the following objectives:
 Identification of the existing project information and the software
components to be tested.
 Specifying the recommended high level testing requirements.
 Recommendation and description of the testing strategies to be employed.
 Specifying the deliverable elements of the test activities.
This test plan would apply to the integration and system tests that would be
conducted on the Virtual Labs Releases. Testing would be conducted as per the
black box testing techniques.
2.1.2 Test requirements
The list below identifies the different levels (functional requirements) of the testing
that would be performed.
2.1.2.1 System Testing
The goal of system testing would be to verify that Virtual Labs works as per user
expectations. This type of testing is based upon black box techniques, that is,
verifying the application by interacting with it and analyzing the output (results).
Identified below is an outline of the testing process:
 Test Objectives: Verification of the working of the Virtual Labs home
page and the links to the participating institutes.
 Techniques: Using positive and negative data, the following would be
verified:
1. Occurrence of the expected results when positive data is used.
2. The appropriate error/warning messages displayed when
negative data is used.
 Completion Criteria:
1. All planned tests should be executed.
2. All identified defects should be addressed.
2.1.2.2 Integration Testing
The goal of integration testing would be to verify that Virtual Labs fulfills the end
users' expectations from the look-and-feel point of view. A detailed description of
what would be tested in this category is listed below:
 Different labs and experiments would be verified for simulator,
theory, reference and usability.
 Usability is defined as the extent to which an application is
understood, easy to operate and attractive to the users under
specified conditions.
2.1.2.3 User Interface Testing
User Interface testing verifies a user's interaction with the software. The goal is
to verify the details of the functioning of all the labs and experiments which are
hosted under the Virtual-Labs organisation.
2.1.3 Tools Used
The following tools have been used for manual testing:
 Test Design - Emacs 24.4.1
 Defect Tracking - Github 2.1.4
 Functional Testing - Manual
 Test Report and Statistics - org & html format
 Project Management - Microsoft Project, Microsoft Word, Microsoft Excel
2.1.4 Development of test cases
This section describes the overall development of the test cases, including their
definition, structure, owner, type and location.
2.1.5 Definition of test cases
A test case is a set of conditions under which a test engineer will determine
whether an application, software system or one of its features is working as it was
originally established to do. A test case is usually a single step, or occasionally a
sequence of steps, to test the correct behaviour/functionality and features of an
application. For Virtual Labs, a test case would be a file listing all the steps to be
carried out by the test engineers. Every test case should follow a defined structure
encapsulating all the testing conditions necessary for the QA process.
2.1.6 Structure of test cases
The structure of the test cases would be the same across all the testing levels and the
labs. The naming convention to be followed for the test case file would be -
experimentname_XX_feature_priority.org
(For example: NumericalRepresentation_01_Usability_smk.org)
 Experiment Name : This part of the test case filename should represent the
name of the experiment.
 XX : This part of the test case filename should be serial number of the test
case.
 Feature : This part of the test case filename should represent the name of
the tested feature.
 Priority : This part of the test case filename should represent the level of
(business) importance assigned to an item. Priority assigned to a test case
file could be of different types as given below :
o p1 : These would be the highest level of business importance
assigned to the test cases. These would be the test cases executed first
in each build, identified and assigned by IIIT-H QA team in
conjunction with domain level testing team.
o p2 : These would be next to p1 test cases in terms of the business
importance assigned to them.
o smoke test (smk) : These would be a subset of all defined/planned
test cases that cover the main functionality of a component or system.
A smoke test would ascertain that the most crucial functions of a
program work, without being concerned with finer details.
Fig 0.1: A sample test case
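As an illustration of the naming convention above, the following Python sketch validates and splits such a filename. The helper name and the regular expression are ours, for illustration only, and are not part of the actual QA tooling:

```python
import re

# Hypothetical helper: validate and split a test-case filename of the form
# experimentname_XX_feature_priority.org described above.
NAME_RE = re.compile(
    r"^(?P<experiment>[A-Za-z]+)_"   # experiment name
    r"(?P<serial>\d{2})_"            # two-digit serial number (XX)
    r"(?P<feature>[A-Za-z]+)_"       # tested feature
    r"(?P<priority>p1|p2|smk)"       # priority label
    r"\.org$"
)

def parse_test_case_name(filename):
    """Return the filename's parts as a dict, or None if it is malformed."""
    match = NAME_RE.match(filename)
    return match.groupdict() if match else None
```

For the example filename above, parse_test_case_name("NumericalRepresentation_01_Usability_smk.org") yields the experiment name, serial number, feature and priority as separate fields, while a non-conforming name yields None.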
2.1.7 Description of Various Fields of Test Cases
 Author : This field should be the name of the author. It could be from the
IIIT-H QA team or from the development team.
 Date Created : This field should be the date of creation of a test case by the
test engineer/developer.
 Environment : This field would describe the environmental setup under
which the testing of a lab would be performed.
 Objective : This field would define the objective of the created test case.
 Pre conditions : This field would list the conditions that should be satisfied
before a test case is executed by the test engineer.
 Post conditions : This field would generally represent the state which
would be obtained after a test case is executed successfully. In some special
cases it would list the steps to be performed to get the system back to its
initial state.
 Test Steps : This field would list the steps to be carried out to execute a test
case.
 Expected result : This field would detail the ideal result expected by the
end user.
 Reviews/Comments : This field would express the comments of the
reviewer of the test cases.
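Putting the fields above together, a test-case file could look like the following skeleton. All values shown are illustrative placeholders, not taken from an actual lab:

```org
* Author : <IIIT-H QA team member or developer>
* Date Created : <date of creation>
* Environment : <browser and operating system used>
* Objective : Verify the usability of the experiment page
* Pre conditions : The experiment page is reachable
* Post conditions : The browser is back on the lab home page
* Test Steps :
  1. Open the experiment page.
  2. Perform the interaction under test.
* Expected result : The page behaves as the end user expects
* Reviews/Comments : <reviewer's comments>
```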
2.1.8 Location of test cases
Every lab has its own repository in GitHub under the Virtual-Labs organisation.
The test cases would also be located in the same repository. The test cases
directory would be at the same level as that of README.txt, src, scripts and
release-notes.
2.1.9 Test Reports
Test reports are generated at the end of the testing process of each lab. A report
would contain a consolidated list of the executed test cases, a boolean result and links to
the defects raised against them. Two important details to be noted here are:
 A test case is said to pass only when all its test steps pass.
 A composite test, which consists of a set of test cases, is said to pass only
when all its individual test cases pass.
2.1.10 Structure of a test report
This section describes the structure of a test report.
Fig 0.2: Structure of a test report
Various Fields of a Test report
 Lab Name : This would be the name of the tested lab repository.
 GitHub URL : This would be the GitHub URL for the lab repository
which would hold the test cases and the filed defects.
 Commit id : This would be the commit id against which the testing
happened.
 Experiment Name : This would be the name of the experiment of the
tested lab.
 Test Case : This would be the name of the test case created and tested for
the experiment as shown in the table above.
 Pass/Fail : This field would depict whether the test case for the tested
experiment passed or failed.
 Severity : This field would indicate the severity of the defect.
 Defect Link : This field in the table would be the hyperlink to the
corresponding issue in GitHub for a failed test case.
Severity of defects
At any given time a defect should only have one of the severity levels as described
below. To change the severity of a defect, the existing label of the defect should be
unchecked and the new severity label should be checked.
S1 : This label indicates that the defect affects critical functionality or critical
data. There are no workarounds to get to this functionality. Example:
Prevention of user interaction, corruption of database, unfaithful to the
semantics of interaction and redirection to an error page.
S2 : This label indicates that the defect affects major functionality or major
data. It could have a workaround but is not obvious and is difficult, like broken
links and a field view being inconsistent with its specifications. Example: In a
form if there is a field which is editable but it is not allowing the user to edit it.
S3 : This label indicates that the defect affects minor functionality or
non-critical data. It could have an easy workaround. Example: visual
imperfections like spelling and grammar, alignment, inconsistent
terminology, colour, shapes and fonts (CSS properties).
Chapter 3
Automated Testing of Virtual Labs
This section describes how the script works in testing the lab "Creative Design,
Prototyping & Experiential Simulation In Human Computer Interaction (HCI)" of
IIT Guwahati. Automated testing of the lab is done in two steps:
 Link Testing: Here we test the working of each link using Selenium 2.53.6 with
Python 3.5.2.
 Spell Checking: Here we check the correctness of the spelling on each page
of the lab with the help of the PyEnchant library.
3.1 About the Modules:
Selenium
Selenium is a set of different software tools, each with a different approach to
supporting test automation. Most Selenium QA engineers focus on the one or
two tools that best meet the needs of their project; however, learning all the
tools gives many different options for approaching different test
automation problems. The entire suite of tools results in a rich set of testing
functions specifically geared to the needs of testing web applications of all
types.
These operations are highly flexible, allowing many options for locating UI
elements and comparing expected test results against actual application
behavior. One of Selenium's key features is its support for executing tests
on multiple browser platforms.
For our testing purposes we have used Selenium with Python 3.5.2.
PyEnchant
PyEnchant is a spellchecking library for Python, based on the excellent
Enchant library. PyEnchant combines all the functionality of the underlying
Enchant library with the flexibility of Python and a nice "Pythonic" object-oriented
interface. It also aims to provide some higher-level functionality
than is available in the C API.
By default PyEnchant comes with various dictionaries: en_GB (British
English), en_US (American English), de_DE (German) and fr_FR (French).
3.2 Description of the script
Setting up the environment: the function setUpClass(cls) does this. It initializes a
driver object with the functionality of the Chrome WebDriver.
3.2.1 Link Testing
fig 0.3: Code Snapshot
Writing test cases:
There are two test cases in the script: test_title(self) and test_link(self).
Usage of functions:
 driver.get('url'): navigates to the specified URL.
 assertIn("Virtual Lab", driver.title): this assertion checks that the string
"Virtual Lab" occurs in the page title, and fails the test otherwise.
fig 0.4 : Code Snapshot
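The overall structure of the test class can be sketched as follows. To keep the sketch self-contained and runnable without a browser, a hypothetical FakeDriver stands in for selenium.webdriver.Chrome(); apart from that substitution, this is our reconstruction of the script's shape, not its actual code:

```python
import unittest

class FakeDriver:
    # Stand-in for selenium.webdriver.Chrome() so the sketch runs anywhere.
    title = "Virtual Lab @ IIT Guwahati"
    def get(self, url):
        self.url = url      # the real driver would load the page here
    def quit(self):
        pass

class LabTest(unittest.TestCase):
    @classmethod
    def setUpClass(cls):
        # In the real script: cls.driver = webdriver.Chrome()
        cls.driver = FakeDriver()

    @classmethod
    def tearDownClass(cls):
        cls.driver.quit()

    def test_title(self):
        self.driver.get("http://vlab.co.in/")
        # assertIn fails the test unless "Virtual Lab" occurs in the title.
        self.assertIn("Virtual Lab", self.driver.title)
```

setUpClass/tearDownClass run once for the whole class, so a single browser session is shared by all the test cases instead of being restarted for each one.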
Some functions of the selenium-python module to interact with the web browser:
 element = driver.find_element_by_css_selector('body')
Selects the 'body' element, located by its CSS selector.
 element = driver.find_element_by_link_text("SomeText")
This is a common way of visiting links by identifying them by their text.
We can then call element.click() to click on the link found by its text.
 element = driver.find_element_by_xpath("//xpath")
This is the most popular way of getting a link, using the XPath query
language.
In figure 0.4 we have maintained a 2-D list of the links of the lab, over which we
iterate, applying the functions of the selenium-python module to test the links.
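For comparison, the up/down check itself can also be done with only the Python standard library. This is not the report's Selenium script; `link_status`, `link_is_up` and the injectable `opener` parameter are our own illustrative names:

```python
from urllib.request import urlopen
from urllib.error import HTTPError, URLError

def link_status(url, opener=urlopen):
    """Return the HTTP status code for url, or None if it is unreachable."""
    try:
        with opener(url) as response:
            return response.getcode()
    except HTTPError as err:
        return err.code          # server answered, but with an error status
    except URLError:
        return None              # DNS failure, refused connection, etc.

def link_is_up(url, opener=urlopen):
    # A link counts as "up" when it answers with a non-error status code.
    code = link_status(url, opener)
    return code is not None and code < 400
```

Passing the opener as a parameter keeps the status logic testable without touching the network.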
3.2.2 Spell Checking
fig 0.5: Code Snapshot
Here we get the text of the page by its CSS selector and then pass it to the function
check(webtext) of the module spellCheck. A snapshot of the code of the spellCheck
module is given below.
 my_dict is a dictionary object that combines an English dictionary with
a user-defined word list, "mywords.txt".
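The check(webtext) function can be sketched as below. To keep the sketch free of the PyEnchant dependency, the dictionary is passed in as a parameter; in the real script it would be my_dict = enchant.DictWithPWL("en_US", "mywords.txt"), and any object with a check(word) method works here. The tokenizing regular expression is our own simplification:

```python
import re

def check(webtext, dictionary):
    """Return the misspelled words found in webtext.

    dictionary is any object with a check(word) -> bool method, e.g.
    enchant.DictWithPWL("en_US", "mywords.txt") in the real script.
    """
    words = re.findall(r"[A-Za-z']+", webtext)   # crude word tokenizer
    return [w for w in words if not dictionary.check(w)]
```

Because the dictionary is injected, the function can be exercised with a plain word set before being wired to PyEnchant.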
3.3 Snapshot of a working instance of the script
Chapter 4
Conclusion
During our project "Quality Assurance of Virtual Labs" we learned about the various processes
of manual and automated testing. We got the chance to become familiar with tools like GitHub
and Emacs, and we also got to try our hands at advanced tools for automation like Selenium
and spell-checking tools like PyEnchant. Moreover, this project also taught us the importance of
virtual labs across our nation.
References
http://pythonhosted.org/pyenchant/tutorial.html
http://selenium-python.readthedocs.io/getting-started.html#simple-usage
https://www.gnu.org/software/emacs/documentation.html
http://vlab.co.in/institute_detail.php?ins=007
Hrishikesh_iitg_internship_report

Más contenido relacionado

La actualidad más candente

Testing and Mocking Object - The Art of Mocking.
Testing and Mocking Object - The Art of Mocking.Testing and Mocking Object - The Art of Mocking.
Testing and Mocking Object - The Art of Mocking.Deepak Singhvi
 
QTP Tutorial
QTP TutorialQTP Tutorial
QTP Tutorialpingkapil
 
Software Quality Assurance
Software Quality AssuranceSoftware Quality Assurance
Software Quality AssuranceVikash Mishra
 
Introduction to testing with MSTest, Visual Studio, and Team Foundation Serve...
Introduction to testing with MSTest, Visual Studio, and Team Foundation Serve...Introduction to testing with MSTest, Visual Studio, and Team Foundation Serve...
Introduction to testing with MSTest, Visual Studio, and Team Foundation Serve...Thomas Weller
 
Mutation Testing Workshop at Ericsson, Kista, Sweden
Mutation Testing Workshop at Ericsson, Kista, SwedenMutation Testing Workshop at Ericsson, Kista, Sweden
Mutation Testing Workshop at Ericsson, Kista, SwedenSTAMP Project
 
Software testing quiz questions and answers
Software testing quiz questions and answersSoftware testing quiz questions and answers
Software testing quiz questions and answersRajendraG
 
St & internationalization
St & internationalizationSt & internationalization
St & internationalizationSachin MK
 
Importance of Testing in SDLC
Importance of Testing in SDLCImportance of Testing in SDLC
Importance of Testing in SDLCIJEACS
 
PROPOSING AUTOMATED REGRESSION SUITE USING OPEN SOURCE TOOLS FOR A HEALTH CAR...
PROPOSING AUTOMATED REGRESSION SUITE USING OPEN SOURCE TOOLS FOR A HEALTH CAR...PROPOSING AUTOMATED REGRESSION SUITE USING OPEN SOURCE TOOLS FOR A HEALTH CAR...
PROPOSING AUTOMATED REGRESSION SUITE USING OPEN SOURCE TOOLS FOR A HEALTH CAR...ijseajournal
 
7 stages of unit testing
7 stages of unit testing7 stages of unit testing
7 stages of unit testingJorge Ortiz
 

La actualidad más candente (14)

Testing and Mocking Object - The Art of Mocking.
Testing and Mocking Object - The Art of Mocking.Testing and Mocking Object - The Art of Mocking.
Testing and Mocking Object - The Art of Mocking.
 
QTP Tutorial
QTP TutorialQTP Tutorial
QTP Tutorial
 
Unit test
Unit testUnit test
Unit test
 
Introduction to Parasoft C++TEST
Introduction to Parasoft C++TEST Introduction to Parasoft C++TEST
Introduction to Parasoft C++TEST
 
Software Quality Assurance
Software Quality AssuranceSoftware Quality Assurance
Software Quality Assurance
 
Introduction to testing with MSTest, Visual Studio, and Team Foundation Serve...
Introduction to testing with MSTest, Visual Studio, and Team Foundation Serve...Introduction to testing with MSTest, Visual Studio, and Team Foundation Serve...
Introduction to testing with MSTest, Visual Studio, and Team Foundation Serve...
 
Mutation Testing Workshop at Ericsson, Kista, Sweden
Mutation Testing Workshop at Ericsson, Kista, SwedenMutation Testing Workshop at Ericsson, Kista, Sweden
Mutation Testing Workshop at Ericsson, Kista, Sweden
 
Software testing quiz questions and answers
Software testing quiz questions and answersSoftware testing quiz questions and answers
Software testing quiz questions and answers
 
St & internationalization
St & internationalizationSt & internationalization
St & internationalization
 
Importance of Testing in SDLC
Importance of Testing in SDLCImportance of Testing in SDLC
Importance of Testing in SDLC
 
Chap2
Chap2Chap2
Chap2
 
PROPOSING AUTOMATED REGRESSION SUITE USING OPEN SOURCE TOOLS FOR A HEALTH CAR...
PROPOSING AUTOMATED REGRESSION SUITE USING OPEN SOURCE TOOLS FOR A HEALTH CAR...PROPOSING AUTOMATED REGRESSION SUITE USING OPEN SOURCE TOOLS FOR A HEALTH CAR...
PROPOSING AUTOMATED REGRESSION SUITE USING OPEN SOURCE TOOLS FOR A HEALTH CAR...
 
Unit Testing
Unit TestingUnit Testing
Unit Testing
 
7 stages of unit testing
7 stages of unit testing7 stages of unit testing
7 stages of unit testing
 

Similar a Hrishikesh_iitg_internship_report

Qa case study
Qa case studyQa case study
Qa case studyhopperdev
 
Tools for Software Verification and Validation
Tools for Software Verification and ValidationTools for Software Verification and Validation
Tools for Software Verification and Validationaliraza786
 
Open Source Software Testing Tools
Open Source Software Testing ToolsOpen Source Software Testing Tools
Open Source Software Testing ToolsVaruna Harshana
 
Automation Testing of Web based Application with Selenium and HP UFT (QTP)
Automation Testing of Web based Application with Selenium and HP UFT (QTP)Automation Testing of Web based Application with Selenium and HP UFT (QTP)
Automation Testing of Web based Application with Selenium and HP UFT (QTP)IRJET Journal
 
SOFTWARE VERIFICATION & VALIDATION
SOFTWARE VERIFICATION & VALIDATIONSOFTWARE VERIFICATION & VALIDATION
SOFTWARE VERIFICATION & VALIDATIONAmin Bandeali
 
Manual Testing Interview Questions & Answers.docx
Manual Testing Interview Questions & Answers.docxManual Testing Interview Questions & Answers.docx
Manual Testing Interview Questions & Answers.docxssuser305f65
 
Lecture 08 (SQE, Testing, PM, RM, ME).pptx
Lecture 08 (SQE, Testing, PM, RM, ME).pptxLecture 08 (SQE, Testing, PM, RM, ME).pptx
Lecture 08 (SQE, Testing, PM, RM, ME).pptxSirRafiLectures
 
Real Time software Training in Nagercoil
Real Time software Training in NagercoilReal Time software Training in Nagercoil
Real Time software Training in Nagercoiljclick2
 
Software testing
Software testingSoftware testing
Software testingRavi Dasari
 
Test driven development and unit testing with examples in C++
Test driven development and unit testing with examples in C++Test driven development and unit testing with examples in C++
Test driven development and unit testing with examples in C++Hong Le Van
 
Software Test Automation - Best Practices
Software Test Automation - Best PracticesSoftware Test Automation - Best Practices
Software Test Automation - Best PracticesArul Selvan
 

Similar a Hrishikesh_iitg_internship_report (20)

hp_alm.docx
hp_alm.docxhp_alm.docx
hp_alm.docx
 
Qa case study
Qa case studyQa case study
Qa case study
 
Tools for Software Verification and Validation
Tools for Software Verification and ValidationTools for Software Verification and Validation
Tools for Software Verification and Validation
 
Prasanth_Pendam_QA_9.5 Years
Prasanth_Pendam_QA_9.5 YearsPrasanth_Pendam_QA_9.5 Years
Prasanth_Pendam_QA_9.5 Years
 
Open Source Software Testing Tools
Open Source Software Testing ToolsOpen Source Software Testing Tools
Open Source Software Testing Tools
 
Automation Testing of Web based Application with Selenium and HP UFT (QTP)
Automation Testing of Web based Application with Selenium and HP UFT (QTP)Automation Testing of Web based Application with Selenium and HP UFT (QTP)
Automation Testing of Web based Application with Selenium and HP UFT (QTP)
 
13090016_vectorcast.ppt
13090016_vectorcast.ppt13090016_vectorcast.ppt
13090016_vectorcast.ppt
 
Quality Assurance Process
Quality Assurance ProcessQuality Assurance Process
Quality Assurance Process
 
Test plan
Test planTest plan
Test plan
 
SOFTWARE VERIFICATION & VALIDATION
SOFTWARE VERIFICATION & VALIDATIONSOFTWARE VERIFICATION & VALIDATION
SOFTWARE VERIFICATION & VALIDATION
 
Manual Testing Interview Questions & Answers.docx
Manual Testing Interview Questions & Answers.docxManual Testing Interview Questions & Answers.docx
Manual Testing Interview Questions & Answers.docx
 
Lecture 08 (SQE, Testing, PM, RM, ME).pptx
Lecture 08 (SQE, Testing, PM, RM, ME).pptxLecture 08 (SQE, Testing, PM, RM, ME).pptx
Lecture 08 (SQE, Testing, PM, RM, ME).pptx
 
Upstream testing.
Upstream testing.Upstream testing.
Upstream testing.
 
Real Time software Training in Nagercoil
Real Time software Training in NagercoilReal Time software Training in Nagercoil
Real Time software Training in Nagercoil
 
T0 numtq0nje=
T0 numtq0nje=T0 numtq0nje=
T0 numtq0nje=
 
Software testing
Software testingSoftware testing
Software testing
 
Test driven development and unit testing with examples in C++
Test driven development and unit testing with examples in C++Test driven development and unit testing with examples in C++
Test driven development and unit testing with examples in C++
 
Testing
TestingTesting
Testing
 
Software Test Automation - Best Practices
Software Test Automation - Best PracticesSoftware Test Automation - Best Practices
Software Test Automation - Best Practices
 
6. Testing Guidelines
6. Testing Guidelines6. Testing Guidelines
6. Testing Guidelines
 

Hrishikesh_iitg_internship_report

  • 1. Internship Report IIT GUWAHATI Project report on Quality Assurance of Virtual Labs Submitted by Hrishikesh Malakar B.Tech, Computer Science and Engineering, Tezpur University Mentored By Dr. Santosh Biswas Asso. Professor, Computer Science and Engineering, IIT Guwahati Tezpur University Duration:1st June - 15th July 2016
  • 2. CERTIFICATE This is to certify that the work contained in this project entitled “Quality Assurance of Virtual Labs" is a bonafide work of Hrishikesh Malakar, carried out in the Department of Computer Science and Engineering, Indian Institute of Technology Guwahati under my supervision and that it has not been submitted elsewhere for a degree. Supervisor: Dr. Santosh Biswas Associate Professor, July, 2016 Department of Computer Science & Engineering, Guwahati. Indian Institute of Technology Guwahati, Assam.
  • 3. ACKNOWLEDGEMENTS I would like to express a deep sense of thanks and gratitude to our supervisorDr. SANTOSH BISWAS,for giving me the opportunity to do an internship under his guidance.During the project I got a chance to improve my practical skills beyond the limits of laboratories. I learned a lot of concept of computer science and this project was really helpful for me. I would also like to thank Mr.HRISHIKESH BARUAH and Mr. BIJU DAS, who helped me in better understanding of project and helped me when I faced challenges.This work would not have been possible without his support and valuable suggestions. I also wish to thank everyone in the Computer Science and Engineering Department who created such a lively atmosphere that it was always exciting to go to the department. I have no words to express our sincere gratitude to our parents who have shown us this world and for every support they have given us. Finally, I would like to express my sincere thanks to all my friends and others who helped me directly or indirectly during this project work.
  • 4. CONTENTS 1 INTRODUCTION 1 1.1 Virtual Labs 1 1.2 Manual Testing of Virtual Labs 1 1.3 Automated Testing of Virtual 1 1.4 Hosting Process of Virtual Lab 1 2. Quality Assurance of Virtual Labs 2 2.1 Manual Testing-Test Plan 2 2.1.1 Testing Objectives 2 2.1.2 Test Requirements 3 2.1.2.1 System Testing 3 2.1.2.2 Integration Testing 3 2.2.2.3 User Interface Testing 3 2.1.3 Tools Used 4 2.1.4 Development of test cases 4 2.1.5 Definition of test cases 4 2.1.6 Structure of test cases 4 2..1.7 Description of various fields of test cases 6 2..1.8 Location of test case 6 2.1.9 Test reports 6 2.1.10 Structure of test reports 7 3 Automated Testing of Virtual Lab 9 3.1 About the modules 9 3.2 Description of the script 10 3.2.1 Link testing 10 3.2.2 Spelling Checking 11 3.3 Snapshot of working instance of the script 12 4 Conclusion 13 5 References 14
  • 5. Chapter 1 INTRODUCTION 1.1 Virtual Labs Virtual Labs is a mission mode project initiated by the Ministry of Human Resources and Development (MHRD). The objective of this project is to provide laboratory learning experience to the students who do not have access to adequate laboratory infrastructure. Currently there are around 150 labs which have been developed by various institutes. A streamlined software development life cycle process followed for the development of these labs ensure high quality labs. The integration process of Virtual Labs described here defines the development, quality assurance and hosting practices followed by the developers (open source community) of the Virtual Labs project. It aims at delivering responsive, open- source and device-independent labs, helping us in our strive for excellence. 1.2 Manual Testing of Virtual Labs Manual testing is a testing process that is carried out manually in order to find defects without the usage of tools or automation scripting. A test plan document is prepared that act as a guide to the testing process in order to have the complete test coverage. Tools used: Emacs 24.4.2,Github 2.1.4 ,org and html format 1.3 Automated Testing of Virtual Labs: Automated testing is a testing process that is carried out by running a script in order to track down whether any link is up or down and later to check for spelling error is each page. Tools used: Python 3.5.2, Selenium 2.53.6 1.4 Hosting Process of Virtual Labs: A virtual lab is hosted by the VLEAD team once the lab is Once 'Approved' from the IIIT-H QA team, the release engineer would carry out the hosting process using the Auto Deployment Service (ADS). 1
  • 6. Chapter 2 Quality Assurance of Virtual Lab This section captures the Quality Assurance (QA) process to be carried out by the respective lab developers and the IIIT-H QA team. Every QA process starts with a test plan followed by the creation of test cases. 2.1 Manual Testing-Test Plan This section describes the plan that would be followed by the IIIT-H QA team for testing of the Virtual Labs . The test plan would test all the requirements of the Virtual Labs. It supports the following objectives:  Identification of the existing project information and the software components to be tested.  Specifying the recommended high level testing requirements.  Recommendation and description of the testing strategies to be employed.  Specifying the deliverable elements of the test activities. This test plan would apply to the integration and system tests that would be conducted on the Virtual Labs Releases. Testing would be conducted as per the black box testing techniques. 2.1.1 Testing Objectives It supports the following objectives:  Identification of the existing project information and the software components to be tested.  Specifying the recommended high level testing requirements.  Recommendation and description of the testing strategies to be employed.  Specifying the deliverable elements of the test activities. This test plan would apply to the integration and system tests that would be conducted on the Virtual Labs Releases. Testing would be conducted as per the black box testing techniques. 2
2.1.2 Test Requirements
The list below identifies the different levels (functional requirements) of the testing that would be performed.

2.1.2.1 System Testing
The goal of system testing would be to verify that Virtual Labs works as per user expectations. This type of testing is based upon black box techniques, that is, verifying the application by interacting with it and analyzing the output (results). Identified below is an outline of the testing process:
- Test Objectives: Verification of the working of the Virtual Labs home page and the links to the participating institutes.
- Techniques: Using positive and negative data, the following would be verified:
  1. Occurrence of the expected results when positive data is used.
  2. Display of the appropriate error/warning messages when negative data is used.
- Completion Criteria:
  1. All planned tests should be executed.
  2. All identified defects should be addressed.

2.1.2.2 Integration Testing
The goal of integration testing would be to verify that Virtual Labs fulfills the end users' expectations from the look-and-feel point of view. A detailed description of what would be tested in this category is listed below:
- Different labs and experiments would be verified for simulator, theory, reference and usability.
- Usability is defined as the extent to which an application is understood, easy to operate and attractive to the users under specified conditions.

2.1.2.3 User Interface Testing
User interface testing verifies a user's interaction with the software. The goal is to verify the details of the functioning of all the labs and experiments which are hosted under the Virtual-Labs organisation.
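The positive/negative data technique described above can be sketched in code. The validator below is purely hypothetical (the report does not name a specific form field); it only illustrates how a black box tester would feed both kinds of data and check the response against the expected result or error message.

```python
# Illustrative sketch of black box testing with positive and negative
# data. validate_voltage is a hypothetical simulator input check, not
# part of the Virtual Labs codebase.

def validate_voltage(value):
    """Hypothetical simulator input: accepts a number between 0 and 230."""
    try:
        v = float(value)
    except ValueError:
        return "Error: input must be a number"
    if not 0 <= v <= 230:
        return "Error: voltage out of range"
    return "OK"

# Positive data: the expected result occurs.
assert validate_voltage("120") == "OK"

# Negative data: the appropriate error/warning message is displayed.
assert validate_voltage("abc") == "Error: input must be a number"
assert validate_voltage("500") == "Error: voltage out of range"
```

The tester never looks inside `validate_voltage`; only the inputs and the observed outputs matter, which is exactly the black box completion criterion above.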
2.1.3 Tools Used
The following tools have been used for manual testing:
- Test Design - Emacs 24.4.1
- Defect Tracking - GitHub 2.1.4
- Functional Testing - Manual
- Test Reports and Statistics - org & html formats
- Project Management - Microsoft Project, Microsoft Word, Microsoft Excel

2.1.4 Development of Test Cases
This section describes the overall development of the test cases, including their definition, structure, owner, type and location.

2.1.5 Definition of Test Cases
A test case is a set of conditions under which a test engineer will determine whether an application, software system or one of its features is working as it was originally established to do. A test case is usually a single step, or occasionally a sequence of steps, to test the correct behaviour/functionality and features of an application. For Virtual Labs, a test case would be a file listing all the steps to be carried out by the test engineers. Every test case should follow a defined structure encapsulating all the testing conditions necessary for the QA process.

2.1.6 Structure of Test Cases
The structure of the test cases would be the same across all the testing levels and labs. The naming convention to be followed for a test case file would be experimentname_XX_feature_priority.org (for example: NumericalRepresentation_01_Usability_smk.org).
- Experiment Name: This part of the test case filename should represent the name of the experiment.
- XX: This part of the test case filename should be the serial number of the test case.
- Feature: This part of the test case filename should represent the name of the tested feature.
- Priority: This part of the test case filename should represent the level of (business) importance assigned to an item. The priority assigned to a test case file could be of the following types:
  - p1: These would be the test cases with the highest level of business importance. They would be executed first in each build, identified and assigned by the IIIT-H QA team in conjunction with the domain level testing team.
  - p2: These would be next to p1 test cases in terms of the business importance assigned to them.
  - smoke test (smk): These would be a subset of all defined/planned test cases that cover the main functionality of a component or system. They would ascertain that the most crucial functions of a program work, without bothering with finer details.

Fig 0.1: A sample test case
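The naming convention above is mechanical enough to be split programmatically. The helper below is an illustrative sketch, not part of the report's tooling; it assumes the parts are separated by single underscores, as in the given example.

```python
# Illustrative sketch: parsing the test case naming convention
# experimentname_XX_feature_priority.org into its four parts.
# Not part of the Virtual Labs tooling.

def parse_test_case_filename(filename):
    """Split a test case filename into experiment, serial, feature, priority."""
    stem = filename.rsplit(".org", 1)[0]
    # rsplit from the right, so an experiment name containing
    # underscores still ends up intact in the first part.
    experiment, serial, feature, priority = stem.rsplit("_", 3)
    return {"experiment": experiment, "serial": serial,
            "feature": feature, "priority": priority}

parts = parse_test_case_filename("NumericalRepresentation_01_Usability_smk.org")
# parts["priority"] == "smk", i.e. this file is a smoke test
```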
2.1.7 Description of the Various Fields of Test Cases
- Author: This field should be the name of the author. It could be someone from the IIIT-H QA team or from the development team.
- Date Created: This field should be the date of creation of a test case by the test engineer/developer.
- Environment: This field would describe the environmental setup under which the testing of a lab would be performed.
- Objective: This field would define the objective of the created test case.
- Pre-conditions: This field would list the conditions that should be satisfied before a test case is executed by the test engineer.
- Post-conditions: This field would generally represent the state obtained after a test case is executed successfully. In some special cases it would list the steps to be performed to get the system back to its initial state.
- Test Steps: This field would list the steps to be carried out to execute a test case.
- Expected Result: This field would detail the ideal result expected by the end user.
- Reviews/Comments: This field would express the comments of the reviewer of the test cases.

2.1.8 Location of Test Cases
Every lab has its own repository on GitHub under the Virtual-Labs organisation. The test cases would also be located in the same repository. The test cases directory would be at the same level as README.txt, src, scripts and release-notes.

2.1.9 Test Reports
Test reports are generated at the end of the testing process of each lab. A report would contain a consolidated list of the executed test cases, a boolean result for each, and links to the defects raised against them. Two important details to be noted here are:
- A test case is said to pass only when all its test steps pass.
- A composite test, which consists of a set of test cases, is said to pass only when all its individual test cases pass.
2.1.10 Structure of a Test Report
This section describes the structure of a test report.

Fig 0.2: Structure of a test report

Various fields of a test report:
- Lab Name: This would be the name of the tested lab repository.
- GitHub URL: This would be the GitHub URL of the lab repository which would hold the test cases and the filed defects.
- Commit id: This would be the commit id against which the testing happened.
- Experiment Name: This would be the name of the experiment of the tested lab.
- Test Case: This would be the name of the test case created and tested for the experiment, as shown in the table above.
- Pass/Fail: This field would depict whether the test case for the tested experiment passed or failed.
- Severity: This field would indicate the severity of the defect.
- Defect Link: This field would be the hyperlink to the corresponding GitHub issue for a failed test case.
Severity of Defects
At any given time a defect should have only one of the severity levels described below. To change the severity of a defect, the existing label of the defect should be unchecked and the new severity label should be checked.
- S1: This label indicates that the defect affects critical functionality or critical data, with no workaround to reach that functionality. Example: prevention of user interaction, corruption of the database, unfaithfulness to the semantics of interaction, or redirection to an error page.
- S2: This label indicates that the defect affects major functionality or major data. A workaround could exist but is not obvious and is difficult, e.g. broken links or a field view being inconsistent with its specification. Example: a form field that should be editable but does not allow the user to edit it.
- S3: This label indicates that the defect affects minor functionality or non-critical data and could have an easy workaround. Example: visual imperfections like spelling and grammar, alignment, inconsistent terminology, colour, shapes and fonts (CSS properties).
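The uncheck-then-check rule — exactly one severity label at a time — can be expressed compactly with set operations. This is an illustrative sketch, not part of the project's GitHub workflow; the function name and label set are assumptions.

```python
# Illustrative sketch (hypothetical helper): swapping severity labels on
# a defect so that exactly one of S1/S2/S3 is set at any time, mirroring
# the uncheck-then-check rule above.

SEVERITY_LABELS = {"S1", "S2", "S3"}

def set_severity(defect_labels, new_severity):
    """Return the label set with any old severity removed and the new one added."""
    if new_severity not in SEVERITY_LABELS:
        raise ValueError("unknown severity label: " + new_severity)
    return (defect_labels - SEVERITY_LABELS) | {new_severity}

labels = {"bug", "S3"}
labels = set_severity(labels, "S1")   # S3 is unchecked, S1 is checked
# labels is now {"bug", "S1"}
```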
Chapter 3 Automated Testing of Virtual Labs

This section describes how the script works in testing the lab "Creative Design, Prototyping & Experiential Simulation In Human Computer Interaction (HCI)" of IIT Guwahati. Automated testing of the lab is done in two steps:
- Link Testing: Here we test the working of the links using Selenium 2.53.6 with Python 3.5.2.
- Spell Checking: Here we check the correctness of the spelling on each page of the lab with the help of the PyEnchant library.

3.1 About the Modules

Selenium
Selenium is a set of different software tools, each with a different approach to supporting test automation. Most Selenium QA engineers focus on the one or two tools that best meet the needs of their project; however, learning all the tools gives many different options for approaching test automation problems. The entire suite of tools results in a rich set of testing functions specifically geared to the needs of testing web applications of all types. These operations are highly flexible, allowing many options for locating UI elements and comparing expected test results against actual application behaviour. One of Selenium's key features is the support for executing tests on multiple browser platforms. For our testing purpose we have used Selenium with Python 3.5.2.

PyEnchant
PyEnchant is a spellchecking library for Python, based on the excellent Enchant library. PyEnchant combines all the functionality of the underlying Enchant library with the flexibility of Python and a nice "Pythonic" object-oriented interface. It also aims to provide some higher-level functionality than is available in the C API. By default PyEnchant comes with various dictionaries: en_GB (British English), en_US (American English), de_DE (German) and fr_FR (French).
3.2 Description of the Script

Setting up the environment: The class method setUpClass(cls) does this. It initialises a driver object with the functionality of the Chrome webdriver.

3.2.1 Link Testing

fig 0.3: Code snapshot

Writing test cases: There are two test cases in the script - test_title(self) and test_link(self).

Usage of functions:
- driver.get('url'): redirects us to the specified URL.
- assertIn("Virtual Lab", driver.title): this assertion checks that its first argument occurs in its second, i.e. that the page title contains "Virtual Lab"; the test fails otherwise.

fig 0.4: Code snapshot
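The structure described above can be sketched as a unittest test class. The real script initialises selenium.webdriver.Chrome() in setUpClass; in this sketch a stub driver with the same get()/title interface stands in, so the flow can be shown (and run) without a browser. All class names and URLs here are illustrative assumptions, not the report's actual code.

```python
# Minimal sketch of the link-testing script's unittest structure.
# StubDriver stands in for selenium.webdriver.Chrome(); the real driver
# exposes the same get() method and title attribute used below.
import unittest

class StubDriver:
    """Stand-in for a Selenium webdriver: get() a URL, expose its title."""
    pages = {"http://vlab.example/index.html": "Virtual Lab | HCI"}

    def get(self, url):
        # A real webdriver would load the page; here we just look up
        # a canned title, defaulting to a not-found marker.
        self.title = self.pages.get(url, "404 Not Found")

class LabLinkTest(unittest.TestCase):
    @classmethod
    def setUpClass(cls):
        # In the real script: cls.driver = webdriver.Chrome()
        cls.driver = StubDriver()

    def test_title(self):
        self.driver.get("http://vlab.example/index.html")
        self.assertIn("Virtual Lab", self.driver.title)
```

Running the class with `unittest.main()` would execute test_title and report pass/fail, just as the report's script does against the live lab pages.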
Some functions of the selenium-python module used to interact with the web browser:
- element = driver.find_element_by_css_selector('body'): selects the 'body' element, locating it by a CSS selector.
- element = driver.find_element_by_link_text("SomeText"): a common way of visiting a link, identifying it by its text. We can then call element.click() to click on the link found by its text.
- element = driver.find_element_by_xpath("//xpath"): the most popular way of getting a link, using the XPath query language.

In figure 0.4 we maintain a 2-D list of the links of the lab, which we iterate over, applying the functions of the selenium-python module to test the links.

3.2.2 Spell Checking

fig 0.5: Code snapshot

Here we get the text of the page via the CSS selector and then pass it to the function check(webtext) of the spellCheck module. A snapshot of the code of the spellCheck module is given on the next page.
- my_dict is a dictionary object that combines the English dictionary with a user-defined word list, "mywords.txt".

3.3 Snapshot of a Working Instance of the Script
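The check(webtext) logic can be sketched as follows. PyEnchant would supply the dictionary object (its DictWithPWL class combines a language dictionary such as en_US with a personal word list file like mywords.txt); in this sketch a plain Python set of known words stands in for it so the code runs without the library. The word lists and function body are illustrative assumptions, not the report's actual spellCheck module.

```python
# Sketch of the spellCheck module's check(webtext) logic, with a plain
# set of known words standing in for PyEnchant's dictionary object.
import re

KNOWN_WORDS = {"virtual", "labs", "simulator", "theory", "usability"}
USER_WORDS = {"vlead", "pyenchant"}          # stands in for mywords.txt
my_dict = KNOWN_WORDS | USER_WORDS

def check(webtext):
    """Return the words in webtext not found in the combined dictionary."""
    words = re.findall(r"[A-Za-z]+", webtext)
    return [w for w in words if w.lower() not in my_dict]

misspelt = check("Virtual Labs simulatr usability PyEnchant")
# misspelt == ["simulatr"]
```

In the real script the page text comes from the body element selected via Selenium, and each flagged word is reported as a potential spelling error in that page.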
Chapter 4 Conclusion

During our project "Quality Assurance of Virtual Labs" we learned about the various processes of manual and automated testing. We got the chance to become familiar with tools like GitHub and Emacs, and we also got to try our hands at advanced tools for automation like Selenium and spell checking tools like PyEnchant. Moreover, this project also taught us the importance of virtual labs across our nation.