In this webinar, XBOSoft's VP of Engineering discusses some of the challenges that he and his team have faced in the areas of mobile test automation and mobile usability testing. He'll discuss how to gain the best platform coverage, when to use automation, when not to, and when to use shared cloud services versus emulators and real devices.
2. XBOSoft, Inc. All Rights Reserved.
Expectations
• I won’t read the slides
• Lots of material
• You can have it, just email me
– ed.curran@xbosoft.com
Meet Our Speakers
Philip Lew
CEO and Founder, XBOSoft
Relevant specialties and
passions
• Software quality process, evaluation,
measurement and improvement
• Software quality in use / UX design
• Mobile User Experience and usability
• Cycling and travel
XBOSoft Team
Ed Curran
VP of Engineering
• Over 25 years of experience: technology;
international; BSEE/MBA
• Much of his career dedicated to the wireless
industry and software development
• Developer of performance management
monitoring systems for wireless networks for
use in capacity planning
• VzW Representative for Verizon Quality Council
• QuEST Forum / TL9000 Speaker
• iOS Application Designer/Developer
Meet Our Speakers
XBOSoft Team
XBOSoft
Dedicated to Software Quality Improvement
Founded in 2006
We speed products to market with our expertise:
• Software QA consulting
• Software testing
Global team with offices in San Francisco and
Beijing
“Thorough, accurate, and fast”
House Rules
▪ Participants other than the speakers are muted
▪ Questions via the GoToWebinar control on the
right side of your screen or through Twitter
@XBOSoft
▪ Questions may be asked throughout the webinar -
we’ll try to answer them at the end
▪ You will receive info on the recording after the
webinar
Agenda
1. Mobile Vs. Desktop Platforms
– Fundamental differences between the mobile and desktop
platforms
– Foundation to further explore mobile software best
practices for design and test
2. Designing/Testing Mobile Applications
– Specific design issues associated with the creation of a
great user experience (UX)
– Localization
– Testing and automation considerations inherent to
complex, mobile devices
Mobile vs. Desktop Platforms
1.1 - Similarities
Leverage Platform Development If Similar
• Across Web Interfaces
– HTML/Responsive Web Design, when possible …
• Across Data Connections
– APIs, when possible ...
• Across Similar Code Bases
– Same application, different platforms, assisted
by IDE (e.g., Xcode)
Mobile Vs. Desktop Platform
1.2 Differences
… Leveraging similarities provides a simple
solution, but...
• Tendency to accept similarities
– Fails to address subtle UX requirements
• Inherent/platform-specific
– Functional/application-specific
… Testers need to be aware of these differences
Mobile vs. Desktop Platform
1.2 Differences
1. Platform proliferation
2. Location
3. Photography
4. Notifications
5. Content
6. Research (search) tasks
7. Chat
8. Email
Mobile Vs. Desktop Platform Differences
1.2.1 - Platform Proliferation
"Android Fragmentation Visualized (August 2015) - SlideShare." 7 Aug. 2015,
http://www.slideshare.net/FrdricZugaj/android-fragmentation-visualized-august-2015.
• Over 18,000 Unique Devices Detected
• Android a Far Bigger Problem Than iOS
• Fragmentation a Strength and Weakness for
Android*
– Access to large variety of devices offers broad,
global reach
– Headache for developers
* "ANDROID FRAGMENTATION VISUALIZED (AUGUST 2015) DEVICE ...."
https://opensignal.com/assets/pdf/reports/2015_08_fragmentation_report.pdf.
Mobile Vs. Desktop Platform Differences
1.2.1 - Platform Proliferation (cont.)
Screen Size/Aspect Ratio Variation Challenges
• Large Numbers of Android Displays (left)
• Relatively Small Variety of iOS Displays (right)
• Developers Forced to Prioritize
• Testers Need to Accommodate
Mobile Vs. Desktop Platform Differences
1.2.1 - Platform Proliferation (cont.)
• Mobile Environment
– Mobiles accessed 150 times a day
– Mobiles constantly changing location and
conditions
• Location Based Services
– Unique “mobile-only” location-based services
– Added Services Dimension
• Navigation, inventory control, restaurant reviews,
etc.
• Ambient Conditions
– Adjust to varying lighting conditions
Mobile Vs. Desktop Platform Differences
1.2.2 - Location
• Enabling
– Mobile cameras always with you
– No real desktop equivalent for photography
• Mobile Cameras Utilized 10x Over Traditional
Cameras
– Photos
– Scanning
Mobile Vs. Desktop Platform Differences
1.2.3 - Photography
Mobile Vs. Desktop Platform Differences
1.2.3 - Photography (cont.)
• Applications - New Social Paradigm
– Rapidly take pictures
– Quick editing (e.g., Instagram and Facebook
Filters) and sharing
• Requirements for UI to be Supported
– Rapid access to take photos
– Rapid storage
– Editing photos must be seamless and simple
Mobile Vs. Desktop Platform Differences
1.2.3 Photography (cont.)
Mobile Vs. Desktop Platform Differences
1.2.4 Notifications
• Notification Implementation Approach
…Directly affects overall UX
• Potentially Large Number of Notifications
Received by Mobile
– And desktops associated with mobile
– Receipt of notifications at inappropriate times and
locations
– Give Users a choice that is easy to find and change
… Mobile Devices - Always with you
• Ideal for Digesting Content During Idle Times
• UI Designed for Fast Access
– During short time availability (e.g., in line at
grocery store)
– Large buttons and gestures
Mobile Vs. Desktop Platform Differences
1.2.5 - Content
Mobile Vs. Desktop Platform Differences
1.2.6 - Mobile-Based Research (search)
• Limitations
– Small Screen sizes
– Input Difficult
• Benefits and Customizations
– Location component
– Application UI
• Minimize screen input
• Intelligent, automatic access of location info
WITHOUT user intervention
• Customization (if desired)
Mobile Vs. Desktop Platform Differences
1.2.6 - Mobile-Based Research (search)
• Limitations
– Input limitations
• Benefits and Customizations
– Always with you
– Ecosystem of apps can exist within Chat app
• Mini Apps (e.g., WeChat, Facebook)
– Application UI
• Simple and minimal
• Use of canned responses
• Short, multiple interactions on the go
Mobile Vs. Desktop Platform Differences
1.2.7 - Chat Applications
• Limitations
– Input methods
– Viewing screen size
• rendering compromises
– Reduced feature set
• Benefits and Customizations
– Always with you
– Customized UI
• Autocorrect has dramatically improved, reducing
fatigue and increasing accuracy
• Use of canned/customized responses
Mobile Vs. Desktop Platform Differences
1.2.8 - Email
2. Challenges in Designing and
Testing Mobile Apps
2.1 UX Criticality
2.2 Localization
2.3 Testing and Automation
Challenges in Designing and Testing Mobile Apps
2.1 - User Experience Criticality
• UX Essential to Mobile Application
• Development is a Team Effort
– Design -> Coding -> Testing
• Not Limited to Discrete Functional Capability
• Must Meet Demands of User Expectations for
Overall UX
• Maintain Key Tenets of Brand’s Value
• Traditional Business Perspective Applies…
– It is cheaper and less time-consuming to keep
your current customers than acquire new ones
• … and It Is Amplified
– Level of competition for Applications greater
– App Stores simplify finding alternate solutions
– Opinions can be formed in seconds
Challenges in Designing and Testing Mobile Apps
2.1 - User Experience Criticality (cont.)
• Localytics:
– 25% of applications will be used ONCE.
– 33% of applications fully abandoned after 1
month
– 70-80% Fully Abandoned after 3 months.
• Dimensional Research
– 80% of app users will attempt to use a
problematic application less than 4 times
"Mobile Apps: What's A Good Retention Rate? | Localytics." 21 Mar. 2017,
http://info.localytics.com/blog/mobile-apps-whats-a-good-retention-rate.
"Whitepaper: Failing to meet mobile app user expectations | ITProPortal (REF: Demensional Research) ." 30 Jan. 2016,
http://www.itproportal.com/2016/01/30/whitepaper-failing-meet-mobile-app-user-expectations/. Accessed 21 Apr. 2017.
Challenges in Designing and Testing Mobile Apps
2.1 - User Experience Criticality (cont.)
Challenges in Designing and Testing Mobile Apps
2.1 - User Experience Criticality (cont.)
Travel and Lifestyle retain the most users on average, with a 23% retention rate post 90 days.
UX Dependencies
• UI
– First thing new user will see
– Responsive and Purpose-Built
– Provide expected results with few obstacles
– No confusion when navigating
• Annoying User Experiences Lead to Abandonment
… “Simple but Mediocre” will trump
“Comprehensive but Difficult”
Challenges in Designing and Testing Mobile Apps
2.1 - User Experience Criticality (cont.)
… Data Speed Challenges for Mobile
Applications
• Operate in Widely Varying RF Environments.
• Deal gracefully with intermittent data connections
– Intelligent Caching
– Moving between wireless interfaces (WiFi, Cellular,
Bluetooth)
• UI Optimized to Maximize UX (... Illusion)
– Slow download speeds masked with progress screens
– Inform User of progress
– Distract User from slow progress
User Experience Criticality
2.1.1 - Performance Issues
• Overall Performance Measurements and Monitoring
– Data Throughput
– Screen Delays
– Battery Consumption
– Database Performance
– Memory Utilization (leaks, abandoned memory)
• Mobile Performance Tool Examples
– Xcode’s UI Test infrastructure (previously Instruments)
(Apple): iOS
– Monkop (Monkop, Inc.): Android, iOS
User Experience Criticality
2.1.1 - Performance Issues (cont.)
* "The Thumb Zone: Designing For Mobile Users – Smashing Magazine." 19 Sep. 2016,
https://www.smashingmagazine.com/2016/09/the-thumb-zone-designing-for-mobile-users/.
32. XBOSoft, Inc. All Rights Reserved.
Mobile Device Unique
• Always with the person
• Typically held in one hand
• Biometrics availability
… More a personalized extension of user, less an
independent device
How many of you would loan your phone to a friend?
User Experience Criticality
2.1.2 Design Controls Based on Hand Position
… 50% of People Hold Phone in 1 hand, using
thumb to drive interface. *
… 75% Interactions on mobile are thumb driven. **
* "Design for Fingers, Touch, and People, Part 1 :: UXmatters." 6 Mar. 2017,
http://www.uxmatters.com/mt/archives/2017/03/design-for-fingers-touch-and-people-part-1.php. Accessed 19 Apr. 2017.
** "How We Hold Our Gadgets · An A List Apart Article." 3 Nov. 2015, https://alistapart.com/article/how-we-hold-our-gadgets.
Accessed 19 Apr. 2017.
User Experience Criticality
2.1.2 Design Controls Based on Hand Position
(continued)
Thumb-zone mapping for left- and right-handed users. *
* "The Thumb Zone: Designing For Mobile Users – Smashing Magazine." 19 Sep. 2016,
https://www.smashingmagazine.com/2016/09/the-thumb-zone-designing-for-mobile-users/.
2.1.2 - Design Controls Based on Hand Position
(continued)
• UX Formulated by Software and Hardware,
Together
• Physical Home Button Easily Accessible
• Virtual Buttons and Controls Easily Reachable
• Haptic Feedback Helping Blur Lines Between
Virtual and Real
User Experience Criticality
2.1.2 - Design Controls Based on Hand Position
(continued)
• Less Defined Interactions
– Thumb Driven inherently less accurate
– Swiping gestures, pressing Icons
• Text Input
– “Small Gesture” challenge
– Capacitive Screens w/correction algorithms
– Still imperfect and tedious
• Should be kept to a minimum
• Employ autocomplete for forms
User Experience Criticality
2.1.3 - Design to Larger Gestures
Text Size
• Do NOT Maximize Information On Small Screen
• Text Should Be Easily Readable
– in Mobile Environments
• e.g., on a Bus, in a car
– When Displaying User-facing screen elements
• e.g. table column headings, buttons, help menus
User Experience Criticality
2.1.4 - Design For Visual Clarity
Contrast
• Flexibility in Contrast Can Be
Aesthetically Pleasing
… however…
• Users With Marginal Vision
Should NOT Have to Struggle.
• Contrast Setting Guidelines
Available From www.w3.org
"The Logical Blog by IconLogic: eLearning : Three Tips For Better Color ...." 7 Mar. 2017,
http://iconlogic.blogs.com/weblog/2017/03/elearning-three-tips-for-better-color-usage.html.
User Experience Criticality
2.1.4 - Design For Visual Clarity
(continued)
• Purpose-Driven
– Clear view to primary action
– Secondary actions should be added, but only if necessary.
• All Actions Should Be Intuitive as Possible
– Easier to learn: minimal memorization required
– Easier to use
– Easier when expanding functionality
User Experience Criticality
2.1.5 - Design for Simplicity
Design to leverage Context-Aware Sensors
• Expanding Proliferation of Sensors
• Development Toolkits Support Coding and Easy Access
• Eliminates Need for Users to Manually Input Data
– e.g., GPS Application, where User’s location immediately
known
– Fingerprint and Face ID
User Experience Criticality
2.1.5 - Design for Simplicity
(continued)
Sensor Proliferation
"OpenSignal Mobile Sensors - OpenSignal." https://opensignal.com/sensors/library/.
2.1.5 - Design for Simplicity (continued)
• All Actions Should Be Intuitive if at All Possible
– Easy to learn with minimal memorization
– Easy to use
– Eliminates the need for Reference/Help files
• Utilize Well-Established Norms for the Mobile OS
– iOS: Human Interface Guidelines (Apple)*
– Android: Design | Android Developers **
* "Design Principles - Overview - iOS Human Interface Guidelines."
https://developer.apple.com/ios/human-interface-guidelines/.
** "Design | Android Developers." https://developer.android.com/design/index.html. Accessed 10 Apr.
2017.
User Experience Criticality
2.1.6 - Navigation Should be
Self-Evident
• Development for Multiple Platforms
• User Expectations
– Functions will be duplicated on desktop and mobile
application
• Design Considerations
– UX principles should be duplicated where possible
– Aggressively modify function to support mobile UI where
appropriate
– Developers should abandon desktop functions
inappropriate for the mobile environment
User Experience Criticality
2.1.7 - Seamless Experience Across Devices
Spotify Application on Various Devices
2.1.7 - Seamless Experience Across Devices
• Simple Applications
– Manual Testing
• Test for look and feel
• Usability
• Serious/Complex Applications
– Software Testing QA Team integral part of SDLC.
– Manual Testing cannot be avoided for many areas
– Automation of certain functional areas
• Only done after careful consideration
User Experience Criticality
2.1.8 - UX Testing Challenges
Manual UX Review
• Assess Right-Hand Versus Left-Hand Usability
• Screen Interaction (Small Items)
• Visual Clarity
– i.e., contrast of displayed information under various
lighting conditions
• General Navigation Through Application
– Adherence to appropriate device guidelines
– Confirmation of cross-platform unified experience
User Experience Criticality
2.1.8 - UX Testing Challenges (cont.)
UX Evaluations Via Manual Methods
• Well Planned Tests
• “Hallway Tests”
• Coordination With Other SDLC Stakeholders
User Experience Criticality
2.1.8 - UX Testing Challenges (cont.)
• Localization Challenges
– Google Play Store, iOS App Store
– Distribution of Application Globally is easy
– Creating Suitable Application for a Market is hard
• Application Market Acceptance
– Local Language
– Market-specific Images
Challenges in Designing and Testing Mobile Apps
2.2 - Localization
… 90% of all mobile activity takes place in Apps,
not browsers
… App Localization will increase downloads
on average well over 100% *
"There's a Language for That: Translating Mobile Apps And Content."
http://www.demandgenreport.com/features/demanding-views/there-s-a-language-for-that-translating-mobile
-apps-and-content.
Challenges in Designing and Testing Mobile Apps
2.2 - Localization (cont.)
Price Paid for Not Localizing
Critical Localization Considerations
• Assets
– Images, tutorials, other content
• Layout Flexibility - to ensure that text will autosize
– Chinese requires less space than English
– Spanish, French, German take up more space than
English
– iOS/Android Toolkits help with dates, times, currency
Challenges in Designing and Testing Mobile Apps
2.2 - Localization (cont.)
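One way to exercise the layout-flexibility point above before real translations exist is pseudo-localization: pad each string to a target language's typical expansion factor and see whether the layout still holds. A rough sketch; the expansion factors below are common rules of thumb, not authoritative figures.

```python
# Approximate expansion factors relative to English (rules of thumb).
EXPANSION = {
    "de": 1.35,   # German tends to run longer than English
    "fr": 1.20,
    "es": 1.25,
    "zh": 0.70,   # Chinese usually needs less horizontal space
}

def pseudo_localize(text, lang):
    """Return a stand-in string padded (or shortened) to the target
    language's typical length, to smoke-test layout autosizing."""
    factor = EXPANSION.get(lang, 1.0)
    target_len = max(1, round(len(text) * factor))
    if target_len <= len(text):
        return text[:target_len]
    return text + "~" * (target_len - len(text))
```

Running UI tests against pseudo-localized builds surfaces truncation and overlap defects long before translators deliver final strings.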
Critical Localization Considerations (cont.)
Translations
• What to Localize?
– Entire application
– Help files, menus, other references
– Regions to support
• Native Translations by Subject Matter
Experts
– Must understand context
Challenges in Designing and Testing Mobile Apps
2.2 - Localization (cont.)
Testing and Review of Localization Data
• Reviewer
– must know the language
– must know subject and context.
• Compatibility Testing
– Accurate Rendering
• Use of device pools to test application on target devices
– e.g., Sauce Labs, BrowserStack, AWS Device Farm
– Automation of Compatibility tests (covered in next section)
Challenges in Designing and Testing Mobile Apps
2.2 - Localization (cont.)
2.3 Testing and Automation
1. Wireless Interfaces
2. API Integration
3. Test Tools
4. Device Diversity
5. What to Automate
6. Mobile Test Lab
7. Continuous Integration
8. Mobile Data Security
"• Mobile and multi-channel application testing challenges 2013-2017 ...."
https://www.statista.com/statistics/500630/worldwide-mobile-and-multichannel-application-testing-challenges/.
Statista - Challenges in Testing Mobile & Multi-Channel (Mobile, Social, Traditional)
Mobile Testing Landscape is Changing
• Different Applications, be they web apps or native, require
different testing tools and approaches
• Numerous device interfaces that need to be supported
• Less and Less Time to Test.
* "World Quality Report 2016 - Sogeti.", p.9, 10
https://www.sogeti.com/globalassets/global/downloads/testing/wqr-2016-2017/wqr_2016-17_final_secure.
Challenges in Designing and Testing Mobile Apps
2.3 - Testing and Automation
Mobile Devices Operate in Complex Environment
• Data Feeding From a Variety of Air Interfaces
– Wireless - High Speed
• Indoors: WiFi
• Outdoors: Cellular (e.g., 3G, 4G, 5G)
• Device Moves Automatically Between Services
– behind the scenes
• Data Handling
– defaults set to avoid cellular service fees when possible
– elegant handling of data interruptions
Testing and Automation
2.3.1 - Wireless Interfaces
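Elegant handling of data interruptions usually means retrying with growing delays rather than hammering a flaky connection. A minimal sketch of capped exponential backoff; the timing constants are illustrative defaults, not a standard.

```python
def backoff_delays(base=0.5, factor=2.0, cap=8.0, attempts=5):
    """Compute retry delays for an intermittent connection:
    exponential growth, capped at `cap` seconds.
    Constants are illustrative defaults, not a standard."""
    delays = []
    delay = base
    for _ in range(attempts):
        delays.append(min(delay, cap))  # never wait longer than cap
        delay *= factor
    return delays
```

A test harness can assert the schedule directly, which is also a handy way to verify an app's retry policy without a real network.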
API Considerations
• API Integration to External Services Exploding
– Increased Application development speed
• Widespread Use
• Prone to Errors (e.g., Customized
Implementations)
Testing and Automation
2.3.2 - API Integration
Underestimation of Testing Challenges
• Some APIs are well established (less risk)
– e.g., music services
• Many APIs Unstable
– More eclectic sets of devices (e.g., IoT) feeding different
information to mobile applications
– Immature API development
• Care Must Be Taken
– Proper evaluation and testing of less mature APIs
– Cooperation between both Developer and API service
provider(s)
Testing and Automation
2.3.2 - API Integration (cont.)
Tool Selection
• Many Things to Consider
– Cost (Open Source/Free vs. Paid)
– OS coverage (e.g., Android)
• Cost Is a Factor … but Not the Most Important
– Will the Application Do the Job?
– Is the Application Supported (native vs. web)?
Testing and Automation
2.3.3 - Testing Tools
Top 10 Mobile Testing Tools for
2017 for Android and iOS *
Kobiton
Appium Studio
Appium
Robotium
Selendroid
MonkeyRunner
Calabash
Frank
KIF
MonkeyTalk
* "Top 10 Mobile Testing Tools In 2017 for Android & iOS - Guru99." 27 Oct. 2017,
https://www.guru99.com/mobile-testing-tools.html. Accessed 4 Dec. 2017.
XBOSoft Preference
• Standardized on Appium
– XBOSoft holds no Stake in Appium!
– Appium is Open Source
• Answers two questions for us
– Does the Job of Testing Native Mobile Applications
– Supports the Applications (iOS and Android)
Testing and Automation
2.3.3 - Testing Tools (cont.)
Appium Highlights
• No recompilation / modification to App
– Using vendor-provided automation frameworks -
XCUITest / UIAutomator
• Supports most languages/frameworks for writing
tests
– Wrapping vendor-provided frameworks into one API
(e.g., Webdriver API)
Testing and Automation
2.3.3 - Testing Tools (cont.)
Appium Highlights (cont.)
• No Need To Reinvent the Wheel
• Webdriver for Mobile Automation Is a Standard
– Appium has extended the protocol with extra API
methods useful for mobile automation, instead of
creating anything new.
• Open Source
Testing and Automation
2.3.3 - Testing Tools (cont.)
Appium Architecture (slide diagram, shown as a flow)
• Test Script → Selenium WebDriver → JSON Wire Protocol
– Automation commands are sent in the form of JSON via HTTP requests
• Appium server invokes the vendor-provided automation frameworks
– XCUITest → iOS device / simulator
– UIAutomator → Android device / emulator
• Appium server logs the results to the console
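The flow above hinges on the JSON Wire Protocol: each automation command travels as a small JSON body in an HTTP request to the server. A sketch of how a client might assemble a find-element command; the session id and locator values are placeholders, and the helper itself is invented for illustration.

```python
import json

def find_element_command(session_id, using, value):
    """Build the HTTP request a WebDriver-style client would send to
    locate an element. The endpoint shape and the "using"/"value"
    body keys follow the JSON Wire Protocol; the session id is a
    made-up placeholder."""
    return {
        "method": "POST",
        "path": f"/session/{session_id}/element",
        "body": json.dumps({"using": using, "value": value}),
    }
```

Because every command is just JSON over HTTP, the same test script can drive an iOS or Android device: Appium translates the request into XCUITest or UIAutomator calls behind the scenes.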
Appium Test Engineering Requirements Support*
• Tests can be written for iOS and Android using same API
(e.g., Selenium WebDriver)
• Write tests in Preferred Language (e.g., Java, C#)
• Automation Support for Hybrids, Native, and Web Apps
• No Source Code Required (unlike Robotium)
• CI Compatible With Jenkins
• Runs on Selenium Grid for Parallel, Multi-Device Tests
• Works With Real Devices or Simulators, Cloud Services
• Cross-Platform Support (Mac, Windows, Linux)
* "Advantages and Disadvantages of appium - Stack Overflow." 6 Feb. 2015,
http://stackoverflow.com/questions/28363221/advantages-and-disadvantages-of-appium/28367587.
"Illustrated Appium Tutorial: 8 Steps | Waverley." 23 Feb. 2017,
https://waverleysoftware.com/blog/appium-tutorial/.
2.3.3 - Testing Tools (cont.)
• Too Many Device Varieties to Test
• Employment of Analytics
– Help Prioritize Device Focus
– Localytics, Google, etc.
• Example Parameters for Market Data Collection
– Screen Size, OS, CPU, RAM Usage, etc
• Data Assists Designers to Prioritize Device
Models
– Helps determine type of support an app will provide. *
• Test Engineers Need to Ensure
– Prioritized device configurations appropriately supported
* "Five types of mobile analytics data and their uses." 10 Jun. 2016,
http://mobilebusinessinsights.com/2016/06/five-types-of-mobile-analytics-data-and-their-uses/.
Testing and Automation
2.3.4 - Device Diversity
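The prioritization step above can be sketched as a greedy coverage pass over analytics data: sort device models by observed usage and keep adding them until the chosen set covers a target share of users. The model names and counts in the test are illustrative, not real market data.

```python
def prioritize_devices(usage_counts, target_coverage=0.8):
    """Pick the smallest set of device models (by observed analytics
    share) that covers the target fraction of users.
    `usage_counts` maps a model name to its session count."""
    total = sum(usage_counts.values())
    chosen, covered = [], 0
    # Greedy: most-used models first until target coverage is reached.
    for model, count in sorted(usage_counts.items(),
                               key=lambda kv: kv[1], reverse=True):
        chosen.append(model)
        covered += count
        if covered / total >= target_coverage:
            break
    return chosen
```

Test engineers can then verify that the returned list matches the device configurations actually present in the lab or cloud device pool.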
• Manual Testing
– Manual testing is essential to determine overall UX
– Look and Feel
– Baselines for tests are established
• Automation
– A necessary, extremely useful tool
– Challenge is what to automate and how to prioritize
Testing and Automation
2.3.5 - What to Automate
• Some Things Should NOT Be Automated
• Some Things CAN Be Automated and Make
Sense to Do So
– Certain established look-and-feel tests that are
repeatedly successful
– Assertions can be applied to ensure tests are within
bounded parameters
Testing and Automation
2.3.5 - What to Automate (cont.)
Based on that, these types of test cases are
good candidates for automation:
• Have been successfully executed manually
• Stable areas, pages that are not likely to
change frequently
• Transactions, workflows that are not likely to
change
2.3.5 - What to Automate (cont.)
Good to Automate
2.3.5 - What to Automate (cont.)
Difficult to Automate
Due to the differences in mobile, these test cases are
difficult to automate; the cost of both automation and
maintenance would be high:
• New features without manual testing
• Complex gestures
• No well-defined expected results
• Switching across apps, interrupts
• Dependencies on the SIM card, camera, etc.
2.3.5 - What to Automate (cont.)
Test Automation Script Guidelines
As with all test automation, we want the
automated scripts for mobile to be
• Small - Easier to understand and fix
• Fast - Parallel execution for faster feedback
• Independent - Can run any subsets in any order
• Repeatable - Tests get the same result every time
• Self-Checking - No human checking (e.g., assertions)
• Reusable - Avoid maintenance nightmares
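A toy example of what "independent, repeatable, self-checking" looks like in practice, using a fake page object in place of a real Appium driver. All names here are invented for illustration; a real suite would swap `FakeLoginScreen` for a page object backed by the driver.

```python
class FakeLoginScreen:
    """Stand-in for an Appium-backed page object (illustrative only)."""

    def __init__(self):
        self.username = ""
        self.password = ""

    def type_username(self, text):
        self.username = text

    def type_password(self, text):
        self.password = text

    def tap_login(self):
        ok = self.username == "demo" and self.password == "secret"
        return "Home" if ok else "Login failed"

def test_valid_login():
    screen = FakeLoginScreen()           # fresh fixture: independent
    screen.type_username("demo")
    screen.type_password("secret")
    assert screen.tap_login() == "Home"  # self-checking, no human eyes

def test_invalid_login():
    screen = FakeLoginScreen()           # no shared state: repeatable
    screen.type_username("demo")
    screen.type_password("wrong")
    assert screen.tap_login() == "Login failed"
```

Each test builds its own fixture and ends in an assertion, so the tests can run in any order, in parallel, and report pass/fail without manual checking.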
• Local and Remote Test Lab Configurations
– Both Have Advantages
• Support for a Large Number of Devices (if required)
can be accomplished by various means:
– Local Real Devices
– Local Simulated Devices
– Remote Real Devices
– Remote Simulated Devices
Testing and Automation
2.3.6 - Mobile Test Lab
Local Device Simulator
• Virtual device
– not actual phone
• Phone emulated through software
• Same OS and software (in large part) as that
loaded on actual mobile hardware
• Performance may vary from real device
– CPU/RAM/Network Speed
• Typically free (for personal use)
Testing and Automation
2.3.6 Mobile Test Lab (cont.)
Testing and Automation
2.3.6 Mobile Test Lab (cont.)
Example of a local Android simulator:
https://www.genymotion.com/ (Android only)
Testing and Automation
2.3.6 Mobile Test Lab (cont.)
Examples of local iOS simulators
(Apple XCode Built-in Simulator list)
Typical Local Software Test Lab
Testing and Automation
2.3.6 - Mobile Test Lab (cont.)
Typical Remote Software Test Environment
Testing and Automation
2.3.6 - Mobile Test Lab (cont.)
Remote Device Farm Server
• Similar to Local Software Test Environment
• Majority of Local Test Configuration Function
Moved to Cloud
• ADVANTAGE: Local Tester Does Not Have to
Physically Manage
– Remote test client software
– Growing Device Pool
Testing and Automation
2.3.6 - Mobile Test Lab (cont.)
Remote “Real” devices
• Typically Most Recently Available Mobile Devices
• Devices Accessed Through Web Browser
– not actually holding phone
– “mirrored image” of real device
Testing and Automation
2.3.6 - Mobile Test Lab (cont.)
Cloud-based real iPhone mirrored in local Firefox (BrowserStack)
What you can expect when using simulators locally or in the cloud, or real devices in the cloud.
Remote simulated device mirrored in local Chrome (AWS Device Farm)
Simulated iPhone running on remote MacOS Desktop
viewed on local Chrome Browser (BrowserStack)
Testing and Automation
2.3.6 Mobile Test Lab (cont.)
Available Devices - BrowserStack Example
Expected Result
Testing and Automation
2.3.6 Mobile Test Lab
(cont.)
Local Simulated Devices in Local Mobile Test Lab
• Advantages
– Supplements real device pool
– Large array of devices available
– Can run automated tests simultaneously on multiple
devices (e.g., Using Appium)
• Issues
– Limited Look-and-Feel automated test applicability
– Requires some maintenance to keep the lab current
– Limited to local computer power
Testing and Automation
2.3.6 Mobile Test Lab (cont.)
Remote Test Lab (Device Farm)
• Advantages
– Large assortment of real and simulated devices
– One URL can be displayed against numerous browsers
– No maintenance required for local devices
– Automated Tests are Supported
• Issues
– Automated Tests can get costly if required to run
simultaneously
– Limited Look-and-Feel automated test applicability
Testing and Automation
2.3.6 Mobile Test Lab (cont.)
Real/Simulator/Cloud Device
When is it best to use which one? (1/3)
• During the initial stages of application development / coding phase – Simulator: fast debugging.
• Sanity, smoke, performance, interoperability, network feasibility, and regression testing – Real Device: a series of popular real devices is always the final gate; real usage in real situations must be covered before final release.
• Web application accessed through a URL – Simulator: it is much easier to copy a URL to a simulator than to type it on a real phone.
• Situation-based application – Real Device: only a real device can answer whether the app is easy to use on the train, while walking down the street, in bright sunlight, or in the rain.
• Feeling of closeness to the real handheld device – Real Device: look and feel, color resolution of the screen, whether the picture is bright under both day and night conditions, and so on.
• Capturing screenshots where defects appear – Simulator: capturing screenshots of issues on a simulator is much easier for manual testing; for automated testing, screen captures can be configured for simulators as well as real devices.
• Battery / exact color / incoming interrupts / camera / dynamic lighting conditions – Real Device: simulators cannot reproduce battery status, the exact color display in sunlight or in the dark, or incoming interrupts such as SMS and calls, and they have no cameras.
• Performance – Real Device: real devices tend to perform faster than the emulator/simulator.
• Memory-related issues – Real Device: a simulator tends to have far more memory available than real devices, which can create misconceptions.
When is it best to use which one? (2/3)
90. XBOSoft, Inc. All Rights Reserved.
90
• Responsive design testing across devices
– Solution: Cloud device
– Why: What matters is screen width and aspect ratio rather than other hardware factors, and a cloud device farm provides that variety
• Cross-browser testing
– Solution: Cloud device
– Why: Safari, Chrome, IE, and Firefox, across versions, operating systems, and screen sizes; the cloud service is designed to manage all of it
• Automated regression test scripts
– Solution: Cloud device
– Why: Make sure the script works on one real device locally, then have the cloud multiply it to 20 or more devices in parallel, none of which you have to maintain
• Devices with lower popularity
– Solution: Cloud device
– Why: Cover the devices on your supported list as completely as possible
• Call center / customer-reported issues
– Solution: Simulator
– Why: Troubleshoot issues on a simulator first, before a real device is available
When is it best to use which one (3/3)
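The "verify locally, then multiply in the cloud" pattern for regression scripts can be sketched as a thread pool fanning one already-verified script out across many remote devices in parallel. The device names and the `regression_script` body here are illustrative stand-ins; a real version would drive the app on each device through your automation tool.

```python
from concurrent.futures import ThreadPoolExecutor

def regression_script(device):
    """Stand-in for the regression script already verified on one local device."""
    # A real script would launch the app on `device` and run the checks.
    return (device, "pass")

def fan_out(devices, max_parallel=5):
    """Run the same verified script on every cloud device, in parallel."""
    with ThreadPoolExecutor(max_workers=max_parallel) as pool:
        return dict(pool.map(regression_script, devices))

if __name__ == "__main__":
    devices = [f"cloud-device-{i}" for i in range(20)]
    results = fan_out(devices)
    passed = sum(1 for r in results.values() if r == "pass")
    print(f"{passed} of {len(results)} devices passed")
```

Capping `max_parallel` matters in practice because, as noted above, simultaneous cloud sessions are usually what drives the cost up.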
91. XBOSoft, Inc. All Rights Reserved.
Automated Mobile Application Testing
• Appropriate Tests Are Created
– Various functional areas
– Projects run in parallel to save time
• Tests Are Scheduled and Run
Testing and Automation
2.3.7 Continuous Integration
91
92. XBOSoft, Inc. All Rights Reserved.
Continuous Integration
• Kicked off immediately after the latest application
build completes
• Typically runs smoke tests
– To exercise critical test areas for proper configuration
and operation
– Done prior to the detailed test run
• Local test environment
• Database initiation and access
• Network and device access
• General application testing of major functional areas
92
Testing and Automation
2.3.7 - Continuous Integration (cont.)
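The smoke-check areas listed above (local environment, database, network/device access) translate naturally into a small suite that CI runs right after each build, failing fast before the detailed test run. The checks below are stubbed placeholders for real environment probes; the function names are this sketch's own, not any CI tool's API.

```python
def check_local_environment():
    # Stand-in: verify required tools and config files exist on the test host.
    return True

def check_database():
    # Stand-in: initialize the test database and confirm a connection.
    return True

def check_network_and_devices():
    # Stand-in: confirm the device farm or local devices are reachable.
    return True

SMOKE_CHECKS = [check_local_environment, check_database, check_network_and_devices]

def run_smoke_suite():
    """Run every smoke check; an empty result means the build is safe
    for the detailed test run to proceed."""
    return [check.__name__ for check in SMOKE_CHECKS if not check()]

if __name__ == "__main__":
    failed = run_smoke_suite()
    print("smoke suite:", "PASS" if not failed else f"FAIL: {failed}")
```

Wiring this script into the build pipeline so it runs on every commit is what delivers the "rapid feedback rather than waiting until next day" benefit described on the next slide.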
93. XBOSoft, Inc. All Rights Reserved.
Continuous Integration Benefits
• Well Suited for Agile Development Teams
• Can Initiate Test Process Directly After Latest
Application Build
• Rapid Feedback
– rather than waiting until next day
… Identified Problems Can Be Quickly Addressed
93
Testing and Automation
2.3.7 - Continuous Integration (cont.)
94. XBOSoft, Inc. All Rights Reserved.
Mobile Data Security Considerations
• Extremely Important, Specialized
• Open-Source as Well as Commercial Tools
Available
–OWASP (Open Web Application Security Project)
• Certain Security Tests can be Performed by
Performance Tools
–e.g., JMeter DDOS, Monkop
Testing and Automation
2.3.8 Mobile Data Security
NOTE: Due to the depth and nature of mobile software security,
XBOSoft covers this topic in a separate presentation.
94
95. XBOSoft, Inc. All Rights Reserved.
Mobile Testing: Challenges and Solutions
Summary (1/3)
Differentiation of Requirements for
Desktop/Notebook Computers Versus Those for
Mobile Devices
• Design Parallels should be leveraged wherever
possible
• Use caution with time-saving techniques
– e.g., using web apps to gain a broad platform footprint rather
than platform-specific native applications that
optimize the UX.
95
96. XBOSoft, Inc. All Rights Reserved.
The Needs of the End-User of the Application Must
Be Served, First and Foremost
• Which devices will be supported?
• Which sensors will be accessed?
• Where will the app be used?
• Can the application be accessed easily, with timely data
presentation?
• How will the user be holding the device (which hand)?
• Can elegant handling of data across multiple data services
be supported? (e.g., 4G vs. WiFi)
Mobile Testing: Challenges and Solutions
Summary (2/3)
96
97. XBOSoft, Inc. All Rights Reserved.
… Mobile Design Considerations are
Complex
With more mobile devices than ever, plotting out
your test strategy is critical
• Resource limitations are amplified
– Time and Money (test environments)
• Leverage test automation
• Leverage ‘combo’ lab depending on needs
Mobile Testing: Challenges and Solutions
Summary (3/3)
97