
An AUTOMATED TESTING INSTITUTE Publication - www.automatedtestinginstitute.com

Automated Software Testing MAGAZINE
July 2012 $8.95

Addressing Flaws & Technology


Monitoring an AJAX Cloud-Based Web Application: Using Selenium and Nagios

Addressing The Flaws in Test Automation: Identify common mistakes made in test automation


Altering Your Lifecycle For Mobile: Planning a Mobile Test Automation Strategy

Impenetrable Systems: Automate a Penetration Test in 3 Simple Steps

The KIT Is Koming!

TestKIT Conference 2012

Register Now!
October 15-17, 2012, BWI Airport Marriott, Linthicum, MD
Tutorials, Concurrent Presentations, TABOK Certification Training & Exam, Discussion Forums & Networking, 4th ATI Automation Honors, Keynotes

www.testkitconference.com

Automated Software Testing
July 2012, Volume 4, Issue 2

Contents
Flaws & Technology
By understanding current shifts and trends in technology, one can make reasonable assumptions about where testing and test automation are headed, and can thus chart a course to move in that direction. It is important, however, to move into the future without all of the same mistakes from the past. This issue focuses on moving into the realm of new technology without all of the same old flaws in our approaches.

Features

Monitoring a cloud-based application using Selenium and Nagios 12


This article reveals the importance of ensuring application availability and correctness. Read as the author describes experiences with integrating Selenium tests that mimic user activity into a Cloud-based production monitoring framework based on Nagios. By Viktor Doulepov

Addressing the Flaws in Test Automation 18

If repeating the same action while expecting different results is the definition of insanity, then automators are often insane! Read this article for help escaping the crazy cycle by avoiding common automation flaws. By Clinton Sprauve

Planning a Mobile Test Automation Strategy That Works 24

Mobile offers an array of unique challenges for testing, and in addition, changes test automation from a nice-to-have to a must-have. Read this article to learn how to implement a winning mobile test automation strategy. By Yoram Mizrachi

Columns & Departments

Editorial 4
Look Into My Crystal Ball: A discussion of how a successful future is dependent on learning from the past and preparing for the future.

TestKIT Tip 8
Impenetrable Systems: Learn how to automate a simple penetration test in 3 steps.

Open Sourcery 10
Cloud Services Wrapped Around Open Source: A look at some up-and-coming Cloud services with a reliance on open source test tools.

TestKIT Conference Schedule 22

Updates to previous issues.

I BLog To U 34
Read featured blog posts from the web.

Authors and Events
Learn about AST authors and upcoming events.

Go On A Retweet 36
Read featured microblog posts from the web.

ATI Is Globally Hot! 38
Where in the World is ATI? A look at the international impact of ATI.

Local Chapter News 40
The latest from the local chapters.

Editorial

Look Into My Crystal Ball


by Dion Johnson
I recently delivered a keynote presentation at an ATI sister event called Test Automation Day, held in Rotterdam, Netherlands. The presentation, in accordance with the conference theme, was entitled "The Future of Test Automation." To begin the presentation I reminded the audience that software test automation is a broad discipline that affects many different groups and individuals, all with different goals and objectives to be met. Then I cautioned that predictions based on too many generalizations will ultimately alienate much of the community, which would in effect render those predictions useless. Without some predictions, however, I wouldn't have anything to present on! So I took a crack at providing some useful predictions, based on what I felt were key indicators and trends that supported those predictions, which would allow participants to judge for themselves how accurate and relevant those predictions are. The three indicators used were: Historical Evolution of Automation, Technological Shifts, and Process & Methodology Effects.

Mobile, Virtualization and The Cloud... The rate of speed with which these words are becoming cliché is only being outpaced by the rate at which the technology represented by these words is rising in significance.
The indicator that probably received the most focus was the one that addressed technological shifts. Before actually discussing current shifts and trends, I invoked the Volume 1, Issue 2 article by Linda Hayes entitled "The Evolution of Automated Software Testing" in order to reveal how past technology trends provoked responses from the test automation discipline. I then discussed current technological trends and revealed evidence for what we can expect as a response from test automation. Even if you weren't at the presentation, I bet you could guess that the primary technological shifts discussed were Mobile, Virtualization and The Cloud. There is just no escaping these popular buzzwords. The rate of speed with which these words are becoming cliché is only being outpaced by the rate at which the technology represented by these words is rising in significance. You are therefore advised to learn as much about these technologies as possible and prepare yourself as testing and test automation become increasingly intertwined with them.

This issue of the AST Magazine is dedicated to aiding you in your pursuit of knowledge in the aforementioned areas. The first feature, entitled "Monitoring a cloud-based AJAX web application using Selenium and Nagios" by Viktor Doulepov, describes one team's experience with integrating Selenium into a Cloud-based production monitoring framework based on Nagios. Next, "Addressing The Flaws in Test Automation" is a featured article by Clinton Sprauve that discusses common flaws that repeatedly plague automation regardless of the technology involved. Understanding these issues will help us move into the future without all of the mistakes of the past. Finally, the "Planning a Mobile Test Automation Strategy That Works" article by Yoram Mizrachi tackles the mobile test automation challenge head on.

ATI Local Chapter Program


ATI's Local Chapter Program is established to help better facilitate the grassroots, global discussion around test automation. In addition, the chapter program seeks to provide a local base from which the needs of automation practitioners may be met. The program aims to:

Enhance the awareness of test automation as a discipline that, like other disciplines, requires continuous education and the attainment of a standard set of skills

Help provide comprehensive, yet readily available resources that will aid people in becoming more knowledgeable and equipped to handle tasks related to testing and test automation

Offer training and events for participation by people in specific areas around the world

Start a Local Chapter Today


Email contact(at)automatedtestinginstitute.com to learn more

ATI - Meeting Local Needs In Test Automation

Authors and Events


Who's In This Issue?
Victor Dulepov is a QA Lead in the IBM Solutions Group at Axmor Software Inc. With more than 10 years of IT experience, he has contributed to numerous large- and middle-scale enterprise projects developed for customers from the U.S. and other countries as a business analyst, test designer, tester and configuration manager. He is an NQA Certified Internal Quality Auditor and an ISTQB Certified Tester. His current main focus is on software development, process improvements, and on providing self-monitoring and health control for production deployments, filling the gap between development and operations. Victor can be reached at vicd@axmor.com.

Automated Software Testing
A PUBLICATION OF THE AUTOMATED TESTING INSTITUTE

Managing Editor: Dion Johnson
Contributing Editors: Donna Vance, Edward Torrie
Director of Marketing and Events: Christine Johnson

Clinton Sprauve is senior product specialist for Silk Testing Solutions at Micro Focus. Sprauve has more than 15 years of experience in the software quality assurance industry. Previously he was the senior product marketing manager for Silk Testing Solutions at Borland Software and Segue Software, and served as a senior technical sales engineer for both companies. Clint also has been an independent consultant, specializing in test management and test automation.

CONTACT US
AST Magazine: astmagazine@automatedtestinginstitute.com
ATI Online Reference: contact@automatedtestinginstitute.com

ATI and Partner Events


October 15-17, 2012
TestKIT Conference
http://testkitconference.com/

Yoram Mizrachi is a mobility veteran with a wealth of experience in mobile quality, networking, security, and telecommunications. Yoram founded Perfecto Mobile after serving as CTO of the Comverse Mobile Data Division. In this capacity, he encountered a variety of technological aspects in mobile applications, WAP, and location-based services. In 1999, Yoram was the founder and CTO of Exalink, which was later acquired by Comverse. Even earlier experience included several technology-related positions in the communication and cryptography fields. With hands-on experience in enterprise mobile quality, Yoram is an established thought leader in the industry, presenting and writing for many events and publications.

Fourth Quarter 2012


ATI Europe Automation Training
contact(at)automatedtestinginstitute.com

The Automated Software Testing (AST) Magazine is an Automated Testing Institute (ATI) publication. For more information regarding the magazine visit http://www.astmagazine.automatedtestinginstitute.com


The KIT is Coming!

October 15-17, 2012

http://www.testkitconference.com


TestKIT Tip

Impenetrable Systems
Automate a Penetration Test in 3 Simple Steps

TestKIT is the name used by ATI for describing one's testing toolkit. A TestKIT is filled with knowledge, information and tools that go with us wherever we go, allowing our projects and organizations to quickly reap the benefits of the practical elements that we've amassed. This section provides tips to add to your TestKIT.

Security concerns are paramount to software quality considerations these days, which has given rise to increased attention to security testing. One of the most common types of security testing is penetration testing, which is defined as a test method that simulates an attack on a computer system or application in order to identify vulnerabilities and how those vulnerabilities may be exploited. There are several certification programs that focus on information and systems security, including the CISSP (Certified Information Systems Security Professional) and the SANS GIAC (Global Information Assurance Certification) Certified Penetration Tester. In addition, there are several commercial and open source tools that may be employed to aid in penetration testing, including tools known as Metasploit, Wireshark and BackTrack. Testers who are heavily engaged in security testing typically use some specialized security testing tool, but it is also possible to use a functional automated test tool to do some of the most basic forms of penetration testing. All that is required is a basic understanding of some fundamental penetration testing techniques such as cross-site scripting or SQL injection. Both of these techniques involve replacing normal application input strings with strings of data that are meant to fool the system into providing a user with data and/or access that the user should not be provided. This article focuses on a simple SQL injection penetration test that can be employed via virtually any functional automated test tool. The test involves three steps:

1. Construct SQL Fragment
2. Add to Field and Submit
3. Verify No Inappropriate System Access or Data Granted

Step 1: Construct SQL Fragment


This is the most involved step in the process and requires a basic understanding of SQL and of common vulnerabilities found in application data processing algorithms. For example, one vulnerability commonly found in software occurs when that software fails to appropriately process data inputs that include escape characters prior to those escape characters being added to a query. Look at the sample application screen in Figure 1. This illustration represents a common application login screen that grants access to users who exist in the application's user database table, a table we'll call user_accounts. The login screen validates an access request triggered by clicking the Login button by taking the entered Username and Password and placing them in a dynamically constructed query that appears as follows:
SELECT userID FROM user_accounts WHERE userField = '" + Username + "' AND passField = '" + Password + "'

Figure 1: Login Screen



So entering "Michael_Jerome" into the Username field and "h_pass" into the Password field would result in the construction of the following query:
SELECT userID FROM user_accounts WHERE userField = 'Michael_Jerome' AND passField = 'h_pass'
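For context, this kind of vulnerable construction usually comes from plain string concatenation in the application code. Below is a minimal, hypothetical Java/JDBC sketch of such a flawed login check; it is not from the article, and binding the inputs through a parameterized PreparedStatement instead would close the hole.

import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class LoginDao {
    // VULNERABLE: user input is concatenated directly into the SQL text.
    // Using conn.prepareStatement("... WHERE userField = ? AND passField = ?")
    // with bound parameters would prevent the injection discussed below.
    public boolean isValidUser(Connection conn, String username, String password)
            throws SQLException {
        String query = "SELECT userID FROM user_accounts WHERE userField = '"
                + username + "' AND passField = '" + password + "'";
        try (Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery(query)) {
            return rs.next(); // any returned row "validates" the user
        }
    }
}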

As long as a userID is returned by the query, the user is validated and allowed into the application. Otherwise, a login error is presented. If the application is not designed to handle special character inputs, including single quotes (') and dashes (-), a user is left with an open door to accessing the application without a valid login, by dynamically manipulating the backend query. Manipulation may occur by entering a SQL fragment similar to the following:

anytext' OR 'a'='a' --

The fragment may need to change based on the type of database being used by the application. For example, the -- characters represent what is known as a comment in some databases. Others may represent a comment using /*.


Step 2: Add to Field and Submit

Once the statement has been constructed, the next step is to have your automated tool enter it into the appropriate field, which in this situation is the Username field, then click the Login button. For example, if the application under test is a Swing application that is being automated using Java and the Abbot automated test library, a small portion of the script may appear as shown in Figure 2. By entering our SQL fragment into the Username field and clicking the Login button, an application that is vulnerable to SQL injection attacks will dynamically construct and submit the following query to the database:

SELECT userID FROM user_accounts WHERE userField = 'anytext' OR 'a'='a' -- ' AND passField = 'h_pass'

Note that the constructed query has been dynamically changed from its original form, to become a query that will always return data from the database, even without a valid username and password. The WHERE clause forces the query to always return the first record in the user_accounts table because, although there is probably no 'anytext' value in the userField of the table, 'a'='a' is always true; the WHERE clause only requires one of the two conditions joined by OR to be true. What about the AND clause, you ask? Since comment characters (--) were introduced, the portion of the query following the comment characters, with the AND clause that checks for the password, is ignored and thus rendered irrelevant.

Figure 2: Swing Automation Code Snippet
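Figure 2 itself, an Abbot/Swing code snippet, is not reproduced here. As a stand-in, the following is a minimal sketch of Steps 2 and 3 against a hypothetical web version of the same login screen, using Selenium WebDriver with JUnit; the URL, element IDs and error-detection heuristics are all assumptions, not the article's original code.

import static org.junit.Assert.assertFalse;

import org.junit.Test;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.firefox.FirefoxDriver;

public class SqlInjectionLoginTest {

    // The Step 1 fragment; the trailing -- comments out the password check
    private static final String FRAGMENT = "anytext' OR 'a'='a' --";

    @Test
    public void loginRejectsSqlInjection() {
        WebDriver driver = new FirefoxDriver();
        try {
            driver.get("http://localhost:8080/login");          // hypothetical URL

            // Step 2: enter the fragment into the Username field and submit
            driver.findElement(By.id("username")).sendKeys(FRAGMENT);
            driver.findElement(By.id("password")).sendKeys("h_pass");
            driver.findElement(By.id("login")).click();

            // Step 3: verify no access was granted and no raw SQL error leaked
            String page = driver.getPageSource().toLowerCase();
            assertFalse("Should not be logged in",
                    driver.getCurrentUrl().contains("/home"));   // hypothetical landing page
            assertFalse("No raw SQL error should be shown",
                    page.contains("sql syntax") || page.contains("odbc"));
        } finally {
            driver.quit();
        }
    }
}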

Step 3: Verify No Inappropriate System Access or Data Granted

The final step in the process is to use your automated tool to ensure you are not granted access to the application, and that you are not presented with SQL-based errors. SQL-based errors, even in the absence of full access to the application under test, may be a sign that the application is vulnerable to SQL injection attacks.

Wait! There's More!

Need more information about Abbot? Visit the following site: http://abbot.sourceforge.net/doc/overview.shtml

Contribute Content Today
Community Comments Box, Announcements & Blog Posts, Automation Events

As a registered user you can submit content directly to the site, providing you with content control and the ability to network with like-minded individuals. Learn more today at http://www.about.automatedtestinginstitute.com




Open Sourcery

Cloud Services Wrapped Around Open Source Test Tools


Crowdamation

Crowdamation is an offering that doesn't just focus on any single tool, but rather on the true power behind automation: the people. The Automated Testing Institute (ATI) is partnering with Quilmont LLC to jointly deliver this new crowdsourcing platform that will help to expand the reach of test automation implementation. "Practitioners across the globe are looking for increased opportunities to exercise their test automation expertise, while organizations and enterprises are looking for the ability to quickly implement test automation against multiple platforms and/or on an as-needed basis," said Dion Johnson, chief advisor to ATI. The impending automated testing crowdsource platform and community will offer a common, tool-neutral project space for automated test development and collaboration. It will offer the flexibility to use a tool of choice (open source and commercial), have teams operate out of different locations, and address the challenges of different platforms introduced by mobile and other technologies, all while still maintaining and building a cohesive, standards-driven automated test implementation that is meant to last.

Everyone seems to have their heads in the Cloud these days, and open source is no exception. There have been many interesting things going on with respect to services built around open source test tools, so read on as ATI provides a peek into some of these occurrences.

BlazeMeter

BlazeMeter is a provider of a self-service, load testing Apache JMeter cloud. As a reader of the Automated Software Testing Magazine, you're probably also a follower of the ATI Automation Honors, and are well aware of JMeter's dominance in the Best Open Source Performance Test Tool categories over the years. BlazeMeter has sought to capitalize on the community's demand for JMeter through a cloud-based service aimed at simplifying the deployment and increasing the scalability of the tool. "JMeter is an excellent automation tool and has already had more than a million downloads this year, but it is challenging to deploy and is often limited in terms of scalability for the requirements of enterprise and high-traffic websites," says Girmonsky (1). BlazeMeter is fairly new, but has already been pretty busy making announcements, such as the release of a module for quickly launching high volume load tests against Drupal websites, and the deployment of services to efficiently load test complex, rich Facebook applications.

Netflix

Once known for its postal service-based offering that centered around snail-mailing DVDs to subscriber mailboxes, Netflix is now becoming better known for its cloud-based, on-demand streaming media service. Given the Netflix business model's heavy reliance on service availability and reliability, it was incumbent upon them to produce some way to ensure these quality attributes were rated highly. Automated tools to the rescue! Beginning with a tool called Chaos Monkey, Netflix has created an army of monkey tools that they've dubbed their "simian army." Chaos Monkey is a tool that randomly disables virtual machines to ensure the system as a whole can continue with no customer impact. In addition, there is a Latency Monkey, Conformity Monkey, Doctor Monkey, Janitor Monkey, Security Monkey and a 10-18 Monkey.

These monkeys have been used to great effect by Netflix, with the promise that there will be more to come. But more to come for whom? Apparently, there's more to come for the world, given that Netflix now plans to open source these tools over the next few months. According to Adrian Cockcroft, the Director of Cloud Architecture at Netflix, they plan on releasing "pretty much all of our platform, including the Monkey infrastructure," over the rest of this year. (5)

References
1. http://www.networkcomputing.com/end-to-end-apm-tech-center/232300034
2. http://www.msnbc.msn.com/id/48197228/ns/business-press_releases/t/blazemeter-releases-holy-grail-cloud-testing-open-source-drupal/#.UAbsslIR4W0
3. http://www.marketwire.com/press-release/blazemeter-launches-self-service-performance-load-testing-for-facebook-applications-1662528.htm
4. http://techblog.netflix.com/2011/07/netflix-simian-army.html
5. http://www.wired.com/wiredenterprise/2012/04/netflix_monkeys/


4th Annual ATI Automation Honors

Celebrating Excellence in the Discipline of Software Test Automation. Voting Opens Soon!

www.atihonors.automatedtestinginstitute.com

Monitoring a cloud-based AJAX web application using Selenium and Nagios


By Viktor Doulepov
Monitoring basic health parameters of your production server, such as memory and CPU utilization or the availability of specific ports, is a routine task. However, this is not enough when you need to know whether your business application is actually available to clients, and is operating correctly. This article describes our experience with integrating Selenium tests that mimic user activity into a production monitoring framework based on Nagios. Some familiarity with Selenium and Nagios is expected.


Using Selenium and Nagios, we have built a stable, small-footprint and low-cost solution for monitoring the availability and basic functionality of an AJAX web application.


Background
Monitoring Basics
Typical monitoring solutions allow you to observe what is happening with the devices and hosts on your network. They also provide you with early warnings for critical parameters such as memory consumption, CPU utilization, free disk space, and the availability and response times of specific ports on the nodes (e.g., by pinging the nodes in question). Most of the available solutions allow you to perform SNMP checks over the managed devices within your network. However, out-of-the-box setups usually focus on low-level checks. Therefore, if you need to verify that the records in your production database are up-to-date, or that your running application allows users to log in and view their reports, you might spend noticeable effort on customization.


AJAX
The advent of AJAX applications introduced another issue: simple tools like wget, curl or even JMeter became inadequate for verifying web applications. A test run is no longer a mere sequence of HTTP requests that can be simply determined and programmed in advance. The checking tool should now be context-aware; that is, able to dynamically determine the presence of UI elements on the browser screen, and then to operate them according to your test scenarios. Luckily, you can achieve this with freely available or commercial testing tools. To name a few: Selenium, WATIR, HP QuickTest Professional, SmartBear Software's TestComplete, and IBM's Rational Functional Tester. HTMLUnit currently also provides a certain level of AJAX support.


What about the cloud?

Virtualization and moving a setup to the cloud imposes a set of restrictions. First, you are most likely to pay for resources (i.e., memory/disk space/CPU) by volume, so you should thoroughly assess the performance and footprint of your solution. In the very beginning of your journey, you probably would not want a dedicated monitoring node, but rather would be satisfied with the service running on one of the production nodes. Second, you should take into account a modular approach. Depending on available horsepower, you can either keep all of your tools on a single node, or spread them throughout your network. For example, the head of your monitoring solution can be on one node, the test drivers on another node, the test executors on yet another node and so on. The modular approach is beneficial for situations where you want to evenly spread load amongst your nodes. Last but not least, you should try to keep your tests external to your solution. You need to be sure your application is available to users worldwide, and not only within its cloud subnet.

Case Description

In our case, we had a GWT-based retail sales web console deployed in the cloud, and it needed regular monitoring for availability. A simplified deployment outline is provided in Figure 1 (the nodes are RHEL 5.5).

Figure 1: Deployment outline of the Application-Under-Test



The limitations for the monitoring solution were as follows:

Fast deployment and update cycle
Minimal maintenance efforts
Low footprint (disk, memory) and low CPU consumption
No code recompilation or redeployment for non-major UI changes

Nagios was already set up for monitoring basic health parameters of the nodes. Based on our previous experience with Selenium, as well as its ability to wrap tests into JUnit tests, it was the obvious candidate for the test driver/test executor. Initially we were considering the possibility of delegating application availability monitoring to external paid services (such as BrowserMob or Saucelabs; both are viable options if you want a purely separate monitoring solution running outside of your environment). In that case, however, we would still have to develop a complete Selenium test suite on our own. After estimating the advantages over in-house execution and the related costs, we dropped the idea.

#!/bin/sh
# Runs a Selenium/JUnit smoke test provided in SEL_TEST
echo "Selenium/JUnit smoke test"

# Selenium test class
SEL_TEST=com.companyname.tests.selenium.smoketest
# Selenium home directory
SEL_HOME=/usr/local/selenium
# Selenium 2 JARs directory
SEL2_JARS=$SEL_HOME/selenium-2.0b1
# Selenium RC host
SEL_RC_HOST=10.162.42.12

# Remote cleanup call via NRPE:
CLEANUP_CMD="/opt/nagios/libexec/check_nrpe -H $SEL_RC_HOST -c selenium_clean"

# Call remote cleanup before starting tests
$CLEANUP_CMD

cd $SEL_HOME

# Test run with a Selenium client. Set proper Selenium JAR names
# and have Java in your path.
java -cp $SEL_HOME/junit-4.8.2.jar:$SEL2_JARS/selenium-java-2.0b1.jar:$SEL2_JARS/selenium-server-standalone-2.0b1.jar:. org.junit.runner.JUnitCore $SEL_TEST

# We do some housekeeping after the Selenium/JUnit test run completes,
# so let's save the JUnit exit code to a variable:
JUNITEXITCODE=$?
echo "JUnit exit code is $JUNITEXITCODE"

# Call remote cleanup after completing tests
$CLEANUP_CMD

# Now exit with the saved JUnit exit code.
# Nagios will use it to judge whether the Selenium/JUnit test run was OK.
exit $JUNITEXITCODE

Figure 2: Main shell script (Selenium test suite launched on core node)

Selenium Test Suite


Initially we planned to utilize the existing selenium_check plugin for Nagios, but found two issues preventing us from doing so. First, it turned out that we would have had to recompile our test suite frequently. Without modifications, the plugin would only accept basic tests compiled from exported Selenium IDE runs. Externalizing parameters (mainly locators) and making other useful changes to the tests while keeping them runnable by the plugin proved to be time-consuming.

#!/bin/sh
# Script for cleaning up immediately after/before a Selenium test run
# on a local Selenium RC server.
# Will kill Firefox processes and remove temporary dirs
# (Selenium RC often fails to do so).
# The user running this script should be given appropriate sudo permissions
# in your /etc/sudoers file.
sudo pkill -f firefox
echo "Kill firefox exit code: $?"
sudo rm -rf /tmp/customProfileDir* &
echo "Remove firefox customProfileDir dirs exit code: $?"
sudo rm -rf /tmp/seleniumSslSupport* &
echo "Remove seleniumSslSupport dirs exit code: $?"
echo "DONE..."

Figure 3: Remote cleanup shell script (selenium_clean, called through NRPE from the previous script)


July 2012

Second, the layout of the plugin implied that an embedded Selenium RC server is started along with the test suite each time the test runs. This was undesired due to load/performance issues. We wanted maximum modularity in order to evenly distribute the load between rather modest cloud nodes (e.g., test driver on one node, test executor on another), so a dedicated instance of Selenium RC on a separate node was preferable. Besides, launching the RC server each time would noticeably increase test execution times. Our cost estimates for introducing all of the required changes to the original plugin were too high.

We ended up implementing a console Selenium/JUnit test suite using a rigid scenario (i.e., login, do something meaningful, log out). All the locators and all configuration parameters were stored externally in a single text file. This concept is close to Selenium's Page Object pattern, but all of the required changes are made in the text configuration file at the deployment location. Therefore, in most cases you do not have to recompile and redeploy the test suite.
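To make the externalized-parameters approach more concrete, here is a trimmed, hypothetical sketch of such a console Selenium/JUnit test (not the team's actual suite): a rigid login/verify/logout scenario in which every locator and connection setting is read from an external properties file, so most UI changes only require editing that file. The file path, property keys and locator values are invented.

import static org.junit.Assert.assertTrue;

import java.io.FileInputStream;
import java.util.Properties;

import org.junit.After;
import org.junit.Before;
import org.junit.Test;

import com.thoughtworks.selenium.DefaultSelenium;
import com.thoughtworks.selenium.Selenium;

public class SmokeTest {

    private Properties cfg;
    private Selenium selenium;

    @Before
    public void setUp() throws Exception {
        // All locators and connection settings live in an external text file,
        // so UI changes rarely force a recompile/redeploy of the suite.
        cfg = new Properties();
        cfg.load(new FileInputStream("/usr/local/selenium/smoketest.properties"));
        // Dedicated Selenium RC instance on a separate node (default port assumed)
        selenium = new DefaultSelenium(cfg.getProperty("rc.host"), 4444,
                "*firefox", cfg.getProperty("app.url"));
        selenium.start();
    }

    @Test
    public void loginDoSomethingMeaningfulLogout() {
        selenium.open("/");
        selenium.type(cfg.getProperty("locator.username"), cfg.getProperty("test.user"));
        selenium.type(cfg.getProperty("locator.password"), cfg.getProperty("test.password"));
        selenium.click(cfg.getProperty("locator.login"));
        // AJAX application: wait for a UI element to appear instead of a page load
        selenium.waitForCondition("selenium.isElementPresent(\""
                + cfg.getProperty("locator.dashboard") + "\")", "30000");
        assertTrue(selenium.isElementPresent(cfg.getProperty("locator.dashboard")));
        selenium.click(cfg.getProperty("locator.logout"));
    }

    @After
    public void tearDown() {
        selenium.stop();
    }
}

A failing assertion makes org.junit.runner.JUnitCore exit non-zero, which the wrapper script in Figure 2 passes straight to Nagios.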

Setting Up Selenium RC

To keep the test run times short and to increase the stability of the environment, we decided to set up the Selenium RC server as a dedicated service, the main reason being noticeable memory and CPU consumption by the browser under RC control. Selenium RC needs X11 for running in graphic mode (this is required to host the browser in which all operations on the application under test are performed). However, the cloud nodes were provided headless, with no X11 server by default. So we first added a virtual display using XVFB. It was configured to run as a service; we did not need to start it manually each time. Selenium RC was also set up as a service, starting after XVFB in the init.d sequence and using the provided virtual display. We set up Firefox 3.6 on the same node, and explicitly configured our test suite to use it for the test run.

Another typical problem we had to deal with was Firefox accidentally locking up after completing the test suite, and then blocking its temporary profile. In the simplest case, this led to an inability to run subsequent tests. However, if the test suite is configured to generate new Firefox profiles automatically, we could end up with a quick memory leak (assume you spend 200MB to 300MB on each instance of Firefox). Gradual disk space leakage due to multiplying profiles also occurs in this case. Thus, before starting another test run it was important to check for hanging browser instances, kill their processes and clean up obsolete temporary browser profiles.

The cleanup procedure was implemented as a separate NRPE command called from the main shell script running the test suite (immediately before and after the test run). Please refer to the code snippets in Figures 2 and 3.

Integrating the test suite with Nagios


Nagios' concept of interacting with its plugins is truly simplistic: it will accept anything that is:

Runnable from the command line;
Producing console output; and
Giving proper exit codes (0 for success, 1 for warning, 2 for critical, and 3 for unknown).
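Because the contract is just "output plus exit code," a custom check can be written in anything. As a hypothetical illustration (not part of the original article), a bare-bones Java check might look like this; the thresholds and the checked condition are invented.

// A minimal, hypothetical Nagios-style check: print one status line
// (Nagios shows the first line as the service status) and exit with
// the matching code. Pure JDK; the free-memory condition is illustrative.
public class MinimalNagiosCheck {

    public static void main(String[] args) {
        long freeMb = Runtime.getRuntime().freeMemory() / (1024 * 1024);
        if (freeMb > 100) {
            System.out.println("OK - " + freeMb + "MB free");
            System.exit(0);   // 0 = OK
        } else if (freeMb > 50) {
            System.out.println("WARNING - " + freeMb + "MB free");
            System.exit(1);   // 1 = warning
        } else {
            System.out.println("CRITICAL - " + freeMb + "MB free");
            System.exit(2);   // 2 = critical
        }
    }
}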


We did not really need to accumulate performance statistics and other tricky features; a result in terms of passed/failed was sufficient. Hence, we would be safe even with a direct call like this as the Nagios command definition:
java -cp $PATH_TO_JUNIT_JAR:$PATH_TO_SELENIUM_CLIENT_JAR:$PATH_TO_SELENIUM_TESTSUITE_JAR org.junit.runner.JUnitCore $SELENIUM_TEST_CLASSNAME

We wrapped the execution of the Selenium test suite, which is actually a JUnit test run, into a shell script. This script performs some remote cleanup before and after the run, and passes the effective exit code of the test suite to Nagios (see Figure 2). The first line of the script's console output is used as the status message on the Nagios web summary page. The whole output is printed on the details page of the service check. This offers a convenient way of reviewing stack traces of exceptions in case any occurred during the test run. For a successful run, Nagios will report something similar to the chart shown in Figure 4. The illustrations in Figures 6 and 7 represent the final wiring of our monitoring solution.

Figure 4: Selenium Check Status in the Nagios List of Services

Figure 5: Detailed output of the Selenium check in Nagios

Figure 6: Overall monitoring layout

Summary
Using Selenium and Nagios, we have built a stable, small-footprint and low-cost solution for monitoring the availability and basic functionality of an AJAX web application. The maintenance efforts are minimized thanks to self-cleaning and the externalization of the frequently changed test suite parameters in a text configuration file.

Figure 7: Selenium checks layout


References
1. Nagios - http://www.nagios.org/
2. Nagios plugins - http://nagiosplugins.org/
3. NRPE (Nagios Remote Plugin Executor) - http://exchange.nagios.org/directory/Addons/Monitoring-Agents/NRPE-2D-Nagios-Remote-Plugin-Executor/details
4. Selenium - http://seleniumhq.org/
5. Selenium RC - http://seleniumhq.org/projects/remote-control/
6. check selenium plugin - http://devops-abyss.blogspot.com/2010/06/selenium-and-nagios.html
7. Alan Richardson's Selenium Simplified - http://www.compendiumdev.co.uk/selenium/
8. Web application monitoring using Nagios and HTMLUnit - http://otacres.wordpress.com/2010/09/10/web-application-monitoring-using-nagios-and-htmlunit/


Addressing the Flaws in Test Automation

By Clinton Sprauve

It's often said that repeating the same action over and over and expecting different results is the definition of insanity. Yet oddly, this seems to have become the standard for many organizations implementing software test automation.

Record & Playback. Single Point of Failure. Wrong Framework.

Mainstream, commercial test automation tools have been around since the late 1980s. Open source test automation frameworks and tools like FIT, FitNesse, Selenium, White, Sahi, and Cucumber are commonplace. Every up-and-coming development methodology puts a strong emphasis on testing. In fact, test automation is no longer simply considered a nice-to-have; it's quickly becoming a requirement. Why, then, do most development organizations still fail to get it right? The answer is bad practices across the spectrum of development and testing methodologies. Whether it's traditional waterfall or Agile, the general community still isn't getting it. So where are we going wrong?

It's (not) So Simple

It's often assumed that the record-and-playback nature of the tools means that anyone who can use a mouse can do test automation. This assumption is due in part to the industry's heavy reliance on subjective criteria (such as ease of use or user interface) more than objective criteria of test automation software (such as built-in support for the technologies of the application under test, the ability to re-factor test code, etc.). The reality, however, is that test automation at its core is software development. Assuming that you can send the intern to test automation training and then be able to build efficiencies into your software development process is ridiculous. Quite the contrary, businesses must take the time to build a test automation team of people who understand software development and programming concepts. The testers don't need to be seasoned developers, but they must have the wherewithal to understand, develop, and maintain test automation code.

Record and Playback

This relates heavily to the previous assumption that anyone can build test automation. Record and playback is popular because it makes team members feel productive, even though it's nothing more than an exercise in futility. For example, let's say you were able to record and replay your morning commute. What happens when there is a wreck, or traffic, or a four-way stop? Record and playback lets your team members create throwaway test scripts. The script doesn't work? Don't worry, launch the recorder and start over. An item in the list was moved from #5 to #29? No problem, launch the recorder and start over. And so on. Once again, this shows the importance of taking the time to build a test automation team of people who understand software development concepts. If automation were simple, there would be no need for test automation specialists.

Job Security Test Frameworks

There is occasionally a downside to having a dedicated test automation specialist, or having just one person responsible for building and maintaining a framework for an army of testers. If that person leaves the company or is absent for a period of time, who maintains the framework? Companies should never rely on a single engineer to develop and maintain something that another technical team member or new hire can't quickly understand or easily pick up. It is imperative that your framework is simple enough to maintain and well documented, so that the company and team don't lose the efficiencies created by the framework in the first place.

Keyword-Driven Testing

Some companies have explored the idea of Keyword-Driven Testing (KDT), which involves building a code library of table-based functions/action words so anyone can help automate application testing. Now the entire team can help us automate! It's a nice thought, but let's explore the pros and cons of this approach. First, KDT requires less technical expertise to create test automation and involves Business Analysts (BAs) and Subject Matter Experts (SMEs) in the test automation process, while still allowing the automation engineers to do the heavy lifting. It also simplifies the link between testing and requirements specifications. On the flip side, KDT can actually increase the amount of maintenance for test automation efforts, rather than reduce it. For example, imagine that someone from payroll is brought in to test the new accounting application. The employee has been trained on the framework and is presumably ready to go. While testing the app, an error occurs: "Object xyz failed to initialize. Shutting down." The test automation guru must then get involved, further complicating a process that a properly trained professional could have handled without assistance. In this sense, KDT can involve SMEs, BAs and testers in the wrong way. The intentions are good, but it's setting a trap for failure. All that said, KDT is not necessarily a bad thing. The problem is not how it is implemented, but for whom it is implemented. Again, this goes back to the previous assertion that most people think test automation is so simple that anyone can do it. The entire team does not need to be involved in the test automation process. A better approach is to utilize those on the team that have the technical expertise to develop, maintain, and execute a keyword-driven framework.
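To illustrate what sits under a keyword-driven framework, here is a minimal, invented Java sketch of the table-driven dispatch involved: each row of a test table names an action word plus its arguments, and the engine maps the word to real automation code that engineers maintain. The keywords, the App interface and the error messages are all made up for illustration.

import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.BiConsumer;

public class KeywordEngine {

    // Maps an action word from the test table to the code that performs it.
    private final Map<String, BiConsumer<App, List<String>>> actions = new HashMap<>();

    public KeywordEngine() {
        actions.put("login", (app, args) -> app.login(args.get(0), args.get(1)));
        actions.put("open", (app, args) -> app.open(args.get(0)));
        actions.put("verifyText", (app, args) -> {
            if (!app.pageContains(args.get(0))) {
                throw new AssertionError("Missing text: " + args.get(0));
            }
        });
    }

    // Runs a test table: each row is a keyword followed by its arguments.
    public void run(App app, List<List<String>> table) {
        for (List<String> row : table) {
            BiConsumer<App, List<String>> action = actions.get(row.get(0));
            if (action == null) {
                throw new IllegalArgumentException("Unknown keyword: " + row.get(0));
            }
            action.accept(app, row.subList(1, row.size()));
        }
    }

    /** Stand-in for the application driver an automation engineer maintains. */
    public interface App {
        void login(String user, String password);
        void open(String page);
        boolean pageContains(String text);
    }
}

When the driver code behind a keyword breaks, the error surfaces to whoever is running the table, which is exactly why the article argues that KDT still needs technically skilled owners.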

Now the entire team can help us automate! It's a nice thought, but let's explore the pros and cons of this approach.

Remember, test automation is software development, and it is not easy. Building efficiencies into the development process is a difficult undertaking in itself. However, repeating the same mistakes will keep test automation on the crazy cycle of software development. Don't look for the ultimate panacea for test automation; rather, look for a practical, realistic approach to building a robust and reusable automation library that will deliver true ROI.


Schedule At A Glance

TestKIT Conference
(Sign up today - www.testkitconference.com)

Tracks:
Test Automation (TA)
Agile Testing (AG)
Performance Testing & Security (PS)
Automated Tools & Implementation (AT)
Mobile, Virtualization & The Cloud (MC)
Frameworks & Methodologies (FM)
Test Management, Teams & Communications (TM)

Monday, October 15, 2012

7:00am - 8:00am: Conference & Tutorial Registration & Continental Breakfast
8:00am - 8:30am: General Session: Welcome To TestKIT!
8:45am - 11:45am: Tutorial Morning Sessions
TUT1: Agile Functional Test Automation (Morning Session), Linda Hayes, Worksoft, Inc.
TUT2: Production Performance Testing in the Cloud, Dan Bartow, SOASTA
TUT3: Free and Cheap Test Tools, Randy Rice, Rice Consulting Services, Inc.
TBK: Test Automation Body of Knowledge Training - Day 1 (Morning Session)
12:00pm - 1:00pm: Lunch
1:00pm - 2:00pm: Vendor Exhibition
2:00pm - 5:00pm: Tutorial Afternoon Sessions
TUT1: Agile Functional Test Automation (Afternoon Session), Linda Hayes, Worksoft, Inc.
TUT4: Preparing for the CISSP, James Hanson, Helm Point Solutions, Inc.
TUT5: Transitioning to Agile Testing - The Mind of the Agile Tester, Bob Galen, iContact
TBK: Test Automation Body of Knowledge Training - Day 1 (Afternoon Session)
Tuesday, October 16, 2012

7:00am - 9:00pm: Conference Registration
7:00am - 8:00am: Continental Breakfast
8:00am - 9:00am: Keynote Presentation: Test Team Leadership: Yes, There's a Place for it in Agile, Bob Galen, iContact
9:15am - 10:15am: Breakout Session Group 1
TA1: How To Become An Automation Entrepreneur, Linda Hayes, Worksoft, Inc.
AT1: Agile Automated Testing with Open Source Tools and BDD: Cucumber, Specflow, WatiN and Selenium 2.0, Patricia Coronel, Huddle Group SA
TM1: Moneyball and the Science of Building Great Testing Teams, Peter Varhol, Seapine Software
FM1: The Pyramid Approach To Selecting an Automated Test Tool, Bernd Beersma, Squerist
TBK: Test Automation Body of Knowledge Training - Day 2 (Morning Session)
10:30am - 11:30am: Breakout Session Group 2
TA2: Open Source or Proprietary? A Roadmap to Test Automation at a Government Agency, Andrew Gillis, Virginia Workers' Compensation Commission
AT2: Support for the Unsupported: Extending QTP's Ability to Interact with Third Party Controls, Jeff Downs, LexisNexis
TM2: The Software is Ready... No It's Not, Peter Varhol, Seapine Software
FM2: Insight to Designing an Automated Framework, Robert Mastrostefano, Booz Allen Hamilton
TBK: Test Automation Body of Knowledge Training - Day 2 (Morning Session continued)
11:45am - 1:15pm: Lunch and Keynote Presentation: Keynote, Linda Hayes, Worksoft, Inc.
1:30pm - 2:30pm: Breakout Session Group 3
TA3: Test Automation Patterns, Seretta Gamba, Steria Mummert ISS GmbH
AG1: Agile Testing: Facing the Challenges Beyond the Easy Context, Bob Galen, iContact
PS1: Production Performance Testing in the Cloud, Dan Bartow, SOASTA, Inc.
MC1: Open Source or Commercial Mobile Platform: Which is Right For My Testing Team?, Patrick Quilter, Quilmont
TBK: Test Automation Body of Knowledge Training - Day 2 (Afternoon Session)

2:45pm - 3:45pm: Breakout Session Group 4
TA4: Improving Automation Execution By Distributed Testing, Arul Murugan Mani, Cognizant Technology Solutions
AG2: Agile and Exploratory Testing For Test Automation Design, Randy Rice, Rice Consulting Services, Inc.
PS2: Performance Test Environments On Demand with Service Virtualization, Wayne Ariola, Parasoft
MC2: Best Practices and Case Studies In Selecting a Mobile Testing Solution For Your Enterprise, Yoram Mizrachi, Perfecto Mobile
TBK: Test Automation Body of Knowledge Training - Day 2 (Afternoon Session continued)
3:45pm - 4:30pm: Vendor Exhibition
4:30pm - 5:30pm: Breakout Session Group 5
TA5: Success with Automated Regression Test Using QC/QTP and BPT, Ane Clausen, Alm.Brand
AG3: Ride the Lightning: Conducting QA in an Ultra Fast Environment, Jeff Perlin, Videology
PS3: Getting Ready To Meet DoD 8570, Information Assurance Workforce Improvement Program Requirements, James Hanson, Helm Point Solutions, Inc.
MC3: Automation with Virtualization, Don Goodman, Mandiant
TBK: Test Automation Body of Knowledge Training - Day 2 (Afternoon Session continued)
6:00pm - 8:00pm: Dinner Reception: 4th Annual ATI Automation Honors Awards Ceremony

Wednesday, October 17, 2012

7:00am - 8:00am: Continental Breakfast
8:00am - 9:00am: Keynote Presentation: Keynote, Dan Bartow, SOASTA, Inc.
9:15am - 10:15am: Breakout Session Group 6
AT3: Java Multithreading for Test Tools, Robert Wimsatt, Sotera Defense Solutions, Inc.
AG4: Using Acceptance Tests To Drive Development, Quality and Faster Releases, Arin Sime, AgilityFeat
FM3: Will Your Automation Be Running in 10 Years? (Part 1), Simon Mills, Ingenuity System Testing Services Ltd.
TM3: Marketing You 101: Harnessing Social Media, From Interview to Offer, Christine Keady, TrustedQA, Inc.
TBK: Test Automation Body of Knowledge Exam (Morning)
10:30am - 11:30am: Breakout Session Group 7
AT4: UI Test Automation with Jemmy, Alexandre Iline, Oracle
AG5: Orthogonal Arrays, Model-based Automation and Other Techniques for Testing in an Agile Lifecycle, Anastasios Kyriakipoulos, Tricentis
FM4: Will Your Automation Be Running in 10 Years? (Part 2), Simon Mills, Ingenuity System Testing Services Ltd.
PS4: Penetration Testing Demystified, Edward Bonver, Symantec Corporation
TBK: Test Automation Body of Knowledge Exam (Morning)
11:45am - 12:45pm: Lunch
1:15pm - 2:15pm: Discussion Forum
2:15pm - 2:30pm: Afternoon Break
2:30pm - 3:30pm: Breakout Session Group 8
AT5: Tcl/Tk For Testing, Robert Wimsatt, Sotera Defense Solutions, Inc.
MC4: Accelerate Parallel Development with Service Virtualization, Wayne Ariola, Parasoft
FM5: Getting It Right The First Time, Nick Olivo, SmartBear
TBK: Test Automation Body of Knowledge Exam (Afternoon)
3:45pm - 4:45pm: Breakout Session Group 9
AT6: Automation of WPF Applications - An Extensibility Approach, Arul Murugan Mani, Cognizant Technology Solutions
MC5: Test Automation Empowered by Crowdtesting, Peter Kartashov, Bugpub
FM6: Automated Testcase Generation and Execution from Models, Dr. Dharmalingam Ganesan, Fraunhofer Center for Experimental Software Engineering
TBK: Test Automation Body of Knowledge Exam (Afternoon)
4:30pm - 5:00pm: General Session: TestKIT Closeout

Visit http://www.testkitconference.com for available speaker bios and session descriptions.

Planning a Mobile Test Automation Strategy That Works

By Yoram Mizrachi

Employing the ACE Strategy: A - Automated QA; C - Cloud-based Platform; E - Existing ALM

A testing solution should accommodate multiple platforms.

Customers expect their banks to be accessible from their mobile devices. They use their mobiles to book flights, shop and perform most of the actions traditionally associated with desktops. Enterprise employees expect to use their internal mail and additional applications on their mobile devices. To remain competitive, enterprises are mobilizing their systems and providing instant, reliable access to their services. Organizations must have a mobile presence; there is no choice about it.

The immediate availability of information access and communication has become a standard. This is especially prominent in financial institutions (insurance and banking), health, retail, and travel-service providers. In order to keep up with market needs and to stay relevant, enterprises are rushed into the mobile industry without appropriate planning and quality assurance. This in turn results in poorly developed applications that lack proper quality and support.

Mobile device dependency means always staying connected, anywhere, anytime. NPD In-Stat research forecasts that proximity mobile payment transactions will approach 9.9 billion in 2016, up from 1.1 billion in 2012, nearly a ten-fold increase. Without a mobile presence, enterprises do not exist.

The Mobile Challenges

Understanding mobile challenges requires understanding the mobile market. The mobile market is rapid, fragmented, and localized. The exciting competition occurring among device platforms is improving the market dynamic for consumers; at the same time, however, it is creating very difficult working grounds for application developers.

Rapid mobile market changes

Due to the ongoing battle for mobile market dominance, changes occur rapidly. These changes are dynamic and take place on a monthly basis. More than 200 Android devices were launched last year alone. These go hand in hand with the many operating systems and form factors that contribute to this ever-growing market. In addition to all of this, devices connect to various networks. In some cases, the devices cannot be shipped out of their target geography due to security regulations. As a whole, the mobile application development environment contains various combinations of platforms, networks, and operating systems (see Figure 1). The large number of devices making up the mobile market is ever growing, with additional devices constantly being introduced. Figure 2 shows the mobile device market dynamics, including Android, iOS, Windows, BlackBerry, and other device releases in a single year. For example, Android introduced three major OS versions, alongside new devices such as tablets, all in one year. This also includes several minor versions, which in many cases incorporated major changes to the device functionalities. These new devices and OS versions must be supported, developed and tested by an application to keep it relevant.

Figure 1: The Mobile Environment

Figure 2: Mobile phone market dynamics



Multiple devices, platforms, and operating systems

"Here today, gone tomorrow" is probably the best way to characterize the pace of change in the mobile market. It's safe to say that at least 30 percent of the popular handsets and tablets today will become outdated and irrelevant in the next few months. The mobile market is extremely dynamic, unpredictable and fragmented. The numerous operating systems and the multitude of platform versions, networks, hardware, and form factors make it challenging to maintain mobile application quality.

Take a look at OS versions. New devices contain the latest or near-latest OS versions, and usually automatically upgrade to the newest available OS version, replacing the older one. There are no guarantees that an application developed against an older OS version will function properly with a newly introduced OS version; enterprises have no choice but to conform to this pace, and continuously develop and test version updates for their applications. Figure 3 provides a mere 6-month timeframe of the Android OS version updates. During January 2011, the Android 2.2 OS version was leading approximately half of the Android mobile market. A few months later, Android 2.3.3 took its place. In March 2012, Android 2.3.3 reached over half of the Android mobile market, and is projected to eventually take over the market. This is only one example of the many competing platforms that are available in the market. For a better understanding of the market dynamics, this example should be multiplied by the number of available platforms, including iOS, Android, BlackBerry, and Windows Phone. Taken from StatCounter Global Stats, Figure 4 shows the usage growth rates of the top eight mobile operating systems in North America. Here it can be seen how the mobile market unexpectedly fluctuates, with no defined leader or standards. To ensure the success of an application, all relevant platforms should be covered. Mainly from a performance point of view, mobile networks should also be included in testing.

Figure 3: Android devices/platform versions compatibility (Source: Android Developer, Platform Versions)

Figure 4: Top Mobile OSs in North America (Feb 2010 to Feb 2012) (Source: StatCounter Global Stats)

Adapting the software development cycle to mobile

Mobile application testing simply cannot be served by the traditional development/QA cycle. As stressed previously, the market is extremely dynamic and unpredictable. A tremendous number of customers will instantly adopt newly released mobile devices and OS versions, and connect right away to applications and websites. Although an organization may not be prepared to introduce an updated application version, users expect nothing less than a flawless user experience. In the mobile market, the risk accumulated between product releases is much greater than with traditional software. This leaves no choice but to accelerate the release cycle in order to limit risk exposure. In conclusion, when it comes to mobile, a shorter development cycle is needed, as well as the ability to test an application continuously.

Keep in mind that with all these difficult challenges, mobile is one of the most exciting technological advancements available today.

Figure 5: Shorter mobile development cycles reduce risk (Source: Perfecto Mobile)

Figure 6: Software QA vs. Mobile QA (simplified) (Source: Perfecto Mobile)


Mobile applications undergo a porting process. This creates several different versions of the application with respect to each device. When developing and testing for mobile, short development cycles and continuous QA enable accommodating to the rapid market changes. The two factors measuring the market gap are: what users want (such as new features and functionalities); and what the market offers (such as devices, browsers, and processing power). Increasing the timeframe between versions will increase the application response gap to the market needs. Shortening the release cycle will allow a quicker reaction to the market needs. The gap between market requirements is larger because of various changes and new introduction of technologies and platforms. Shortening this gap will shorten the development cycle, which will require releasing application updates more frequently. This message is a very powerful one for successful mobile applications. Iterative and agile mobile methodologies are more aggressive. Choosing not to quickly release update versions will make an application irrelevant. See the poor application example in Figure 7 with its ratings and user comments.

Automation
is an enabler for success and not a cost reduction tool in testing

Figure 7: Poor App Rating


Source: Perfecto Mobile

Mobile affects all testing


phases

The simplified illustration in Figure 6 depicts the shift from traditional to mobile development. The orange highlights show all of the areas that have been affected by mobile. In short, all of the traditional development activities have remained, with some changes, and new activities such as Interoperability and Compatibility (porting) have been added.
Interoperability is a completely new mobile phenomenon. Browsers do not receive phone calls, but phones do. Similar device events, such as an SMS, can occur at any moment, causing unexpected interrupts to a running application or transaction. For example, an online purchase can be interrupted by an incoming call or dropped because of Wi-Fi issues. Traditional software developers have the luxury of assuming all PCs are basically the same, regardless of the manufacturer, CPU and memory. In mobile this is impossible. The differences between devices are too great and cannot be ignored. Compatibility, also known as porting, is therefore added to the development cycle to validate application/ content compatibility, performance and user experience across devices. As a response to the array of available devices,

Identifying a testing strategy that works

As a response to the array of available devices, a testing solution should accommodate multiple platforms, form factors, and networks. Mobile applications include the traditional internal software versioning, as well as version accommodation for mobile. In the traditional software release cycle, once an application has been tested in the QA phase, it is released to production, and production is updated when the next version is released. When releasing a mobile application version, however, continuous testing is needed because of the ever-growing stream of devices and versions.

Automation testing is a must!

To deliver a short development cycle, the QA timeframe must be shortened. This is a major shift in the delivery concept. As development cycles become shorter, the need for automation in regression testing has turned from nice-to-have into mandatory, particularly when using agile or iterative development methodologies. By automating the functional and regression testing of mobile applications, it is possible to shorten the timeframe and provide an accurate snapshot of the application's state. Automation allows testing on more devices in less time and reduces the requirements gap. The result is a shortened and systematic ALM cycle that allows for continuous QA, better coverage, easier re-creation of problems and substantial cost savings.

Do not underestimate the complexity of launching mobile applications. It is common to experience a cycle of over-optimism following development on the first platform (OS), followed by disillusionment once the second platform and its associated devices are added to the mix. The realities of an extremely dynamic market require a well-planned and methodical approach. In light of this fragmentation, it is highly recommended to adopt a device-agnostic testing approach that allows writing test scripts once and then reusing them on multiple platforms. Script automation should support low-level functionalities, such as key and screen presses, as well as logical abstraction, which enables the execution of virtual functions, such as a login, that are not dependent on a particular device or platform. When planning QA, automation is a must. Contrary to some beliefs, automation is an enabler for success, not a cost-reduction tool in testing. Automating testing enables the use of a single test script across many devices. An example of this is Perfecto Mobile's patented ScriptOnce technology, a comprehensive mobile test automation solution.
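The sketch below illustrates the two-layer idea described above: device-specific drivers expose low-level actions, while a logical "virtual function" such as a login is written once and reused across platforms. All names are invented for the sketch; it is a generic illustration of the abstraction, not ScriptOnce itself.

```python
# Illustrative two-layer scripting: per-device drivers handle low-level
# actions; logical functions are written once and run on any driver.

class AndroidDriver:
    def type_text(self, field, text): print(f"[android] {field} <- {text}")
    def tap(self, element): print(f"[android] tap {element}")

class IOSDriver:
    def type_text(self, field, text): print(f"[ios] {field} <- {text}")
    def tap(self, element): print(f"[ios] tap {element}")

def login(driver, user, password):
    """Virtual function: the identical script runs on any platform."""
    driver.type_text("username", user)
    driver.type_text("password", password)
    driver.tap("login_button")

for driver in (AndroidDriver(), IOSDriver()):
    login(driver, "qa_user", "s3cret")  # one script, many devices
```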


A cloud-based solution enables cost/time-efficient management

Best practices for mobile testing indicate the need to access between 30 and 40 fully functional devices. To keep up with market dynamics, an estimated 10 of those devices will have to be replaced each quarter, and the number of supported devices will grow significantly within the first year of introducing the mobile application to the market. Managing the logistics of these handsets across different geographical locations is a challenging task. Utilizing a cloud-based solution allows an enterprise to avoid the hassle and costs of procuring and managing new devices.
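A minimal sketch of what those logistics look like in code when a cloud handles them: reserve a handset by capability, test, release. The `cloud` client and its `reserve`/`release` calls are invented for illustration; real vendors ship their own SDKs.

```python
# Hypothetical cloud-device workflow: no procurement, no shipping --
# just reserve, test, release. The API names are illustrative only.

from contextlib import contextmanager

@contextmanager
def reserved_device(cloud, **capabilities):
    device = cloud.reserve(**capabilities)  # e.g. os="android", network="live"
    try:
        yield device
    finally:
        cloud.release(device)               # handset returns to the shared pool

# Usage (assuming some `cloud` client object and test suite function):
# with reserved_device(cloud, os="android", os_version="4.0") as device:
#     run_nightly_sanity_suite(device)
```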

Figure 8: Must, Major, and Market devices (simplified). Source: Perfecto Mobile

To address market dynamics, mobile applications need to be developed and tested on multiple platforms. For this reason, it is important to identify between six and eight "must" devices on which to run rigorous sanity and regression testing nightly. To achieve a better representation of the market, it is recommended to extend testing to cover approximately 12 "major" devices during the QA phase; the bulk of the functional and regression testing is performed against these devices. Automation in both of these phases is critical in order to release new applications and functionality to the market in a timely fashion. Regardless of the phase, approximately 30 percent of the devices will need to be replaced each quarter to account for new devices introduced to the market.
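One way to encode the "must/major" tiers just described is as plain configuration that a nightly scheduler reads. The device and suite names below are invented for illustration; only the tier sizes and schedules follow the article.

```python
# Illustrative device-tier matrix. "Must" devices get rigorous nightly
# sanity and regression runs; "major" devices carry the bulk of the
# functional testing in the QA phase. Roughly 30% rotate each quarter.

DEVICE_TIERS = {
    "must": {   # 6-8 devices, tested nightly
        "devices": ["galaxy_s2", "iphone_4s", "nexus_s",
                    "bold_9900", "lumia_800", "xperia_arc"],
        "suites": ["sanity", "regression"],
        "schedule": "nightly",
    },
    "major": {  # ~12 devices, covered during the QA phase
        "devices": ["galaxy_note", "iphone_4", "desire_hd"],  # ...and so on
        "suites": ["functional", "regression"],
        "schedule": "per_qa_cycle",
    },
}

def devices_for(suite):
    """All devices whose tier includes the given suite."""
    return [d for tier in DEVICE_TIERS.values()
            if suite in tier["suites"] for d in tier["devices"]]

print(devices_for("regression"))  # both tiers contribute
```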

Purchasing devices and keeping them on a developer's desk poses security issues, as these devices tend to disappear, and it limits access to a single physical location. To meet enterprise security standards, it is essential that mobile application testing be performed on secure devices that can only be accessed by the organization. A dedicated cloud of devices ensures that the required devices are always available for testing, and that applications in the development process are always secured. This private cloud should also be configurable to comply with the organization's security policies, including firewall requirements and other needs. As a whole, a cloud-based approach:

Enables globally distributed teams to share devices during live testing;
Meets enterprise security measures;
Targets network availability;
Is logistics free.

There is no need to re-invent another ALM environment

Although traditional software and mobile testing have some differences, it is certainly beneficial to continue using existing testing tools and knowledge that are readily available within the organization. Companies around the world invest more than $50 billion per year on application testing and quality assurance, according to Pierre Audoin Consultants (PAC). Rather than reinventing the wheel, it is significantly more effective to utilize existing ALM processes, such as a management console, business logic and high-level scripts, and scripting languages. Using a system that extends, rather than merely integrates with, the existing platform is a cost-efficient and time-saving solution. Such solutions already exist; one example is the Perfecto Mobile MobileCloud for QTP. Since this solution is a QTP extension, it allows the user to connect to the MobileCloud and execute scripts within QTP, using traditional QTP scripting elements. This leverages existing assets, extended to mobile, with a hidden debugging support component. Mobile developers and testers can use the MobileCloud extension to log into specific devices remotely. The MobileCloud for QTP was developed in close collaboration between the Perfecto Mobile and HP development teams. This concept should not be confused with available integrated applications that require working with multiple environments: it is a single environment within a single application, extending the existing ALM to include mobile. There is no exchange of data items between the two applications; the MobileCloud works within QTP, and access and manipulations are not performed on a separate system. Additionally, this integration goes beyond script writing. It is possible to leverage it to include the full range of HP's ALM tools, including Quality Center, LoadRunner and BSM.

Figure 9: HP QTP UI with MobileCloud extension. Source: Perfecto Mobile

To Summarize

Offering an attractive application that remains relevant and available across devices is a challenge. It can be overcome with a combination of a good methodology and tools. The methodology will need to embrace the mobile timeframe, apply a quick and continuous lifecycle, and utilize existing ALM tools. Automating testing is a must: it enables the quick pace mobile demands. A cloud-based approach will help enable collaboration and remove logistical challenges in the effort to keep up with the market's pace. It is recommended to use the following ACE selection criteria:

Automation: Mobile test automation enables a shortened ALM cycle, increases coverage, facilitates re-creation of problems and saves costs.
Cloud-based platform: Cloud-based access to REAL handsets located in live networks helps avoid the hassle and costs of procuring and managing new devices, while facilitating collaboration among distributed teams.
Use Existing ALM Resources: Leverage existing tools, processes and knowledge by extending the current ALM framework to support mobile testing.



Keep in mind that, with all these difficult challenges, mobile is one of the most exciting technological advancements available today. It may be the eye of the storm, but it is also the center of technology.





I BLog To U

Latest From the Blogosphere

Automation blogs are one of the greatest sources of up-to-date test automation information, so the Automated Testing Institute has decided to keep you up-to-date with some of the latest blog posts from around the web. Read below for some interesting posts, and keep an eye out, because you never know when your post will be spotlighted.
Blog Name: Narendra Parihar's Blog
Post Date: March 29, 2012
Post Title: Test Automation Failures
Author: QualitySpreader

Every now and then we keep seeing automation failures. Most of Testers have been part of these failure stories as actors or audience or directors :-) I am sharing top 3 reasons for automation failure in this post which is kind of little modified from my post on blogspot @ http://infominesoftware.blogspot.com/#!/2010/10/why-does-test-automation-fail-everynow.html

Read more at: http://blogs.msdn.com/b/narendra_parihars_blog/archive/2012/03/29/test-automation-failures.aspx

Blog Name: Software Quality Matters
Post Date: June 22, 2012
Post Title: HTML5 Test Automation for Beginners
Author: Goran Begic

Everything you type into browser windows, Web page forms, all the buttons you click, pages you open are remembered together with the order with which you interact with the Application Under Test (AUT). These sequences can be played back as tests, they can also be reviewed, updated, or turned into test scripts. The benefit of this approach is that the automation tool can do everything you can do when testing manually and you don't have to script pre-conditions and other setup.

Read more at: http://blog.smartbear.com/software-quality/bid/174155/HTML5-TestAutomation-for-Beginners


Blog Name: Test This Blog
Post Date: April 12, 2012
Post Title: Test Automation Scrum Meeting Ambiguity
Author: Eric Jacobson

The goal of writing automated checks is to interrogate the system under test (SUT), right? The goal is not just to have a bunch of automated checks. See the difference? Although your team may be interested in your progress creating the automated checks, they are probably more interested in what the automated checks have helped you discover about the SUT.

Read more at: http://www.testthisblog.com/2012/04/test-automation-scrum-meeting-ambiguity.html

Blog Name: 3Qi Labs
Post Date: April 18, 2012
Post Title: Automation Best Practices: Building From Scratch
Author: Admin

This is perhaps the most critical aspect of a good test automation implementation. The decisions you make during the Build phase of the implementation will impact you throughout your automation life-cycle. This means the initial building out phase of a proper test automation implementation requires a number of things, which we will be covering in this section of our blog series: Best Practices for Achieving Automated Regression Testing Within the Enterprise.

Read more at: http://3qilabs.com/2012/04/best-practices-for-achieving-automated-regressiontesting-within-the-enterprise-building-your-test-automation-from-scratch-section-1/


Go On A Retweet

Paying a Visit To The Microblogs

Microblogging is a form of communication based on the concept of blogging (also known as web logging) that allows subscribers of a microblogging service to broadcast brief messages to other subscribers of the service. The main difference between microblogging and blogging is that microblog posts are much shorter, with most services restricting messages to about 140 to 200 characters. Popularized by Twitter, microblogging is also offered by numerous other services, including Plurk, Jaiku, Pownce and Tumblr, and the list goes on and on. Microblogging is a powerful tool for relaying an assortment of information, a power that has definitely not been lost on the test automation community. Let's retreat into the world of microblogs for a moment and see how automators are using their 140 characters.

Twitter Name: AutomatedTester
Post Date/Time: Jun 22
Topic: Browser Memory
"Chrome uses way more memory than Firefox, Opera or Internet Explorer http://prsm.tc/oCUyvb well done to @opera"

Twitter Name: TechWell
Post Date/Time: May 7
Topic: Testing Mobile Apps
"Wherever You Go: Testing Mobile Applications, Part 2 In part 1 of this interview with Jonathan Kohl on mobile test... http://ow.ly/1jwaWt"


Twitter Name: sbarber
Post Date/Time: May 17
Topic: Bugs
"Cartoon Tester: A bug is a bug http://bit.ly/JNWArx"

Twitter Name: alanpage
Post Date/Time: Jul 9
Topic: Whining About Testing
"Re: my testers whining tweet. 1) There are better methods of communication. 2) Testers also seem to whine about *testing* a lot. #imo #ymmv"

Twitter Name: rubytester
Post Date/Time: Apr 25
Topic: Technical Debt
"tech debt is code which a reasonable engineer, in the present, wishes was different http://bit.ly/IpXuL5"

Twitter Name: QATestLab
Post Date/Time: Jun 26
Topic: Testing Career
"Building the Career as a Professional Software Tester http://p.ost.im/p/e3SwTC"


Hot Topics in Automation

Where in the World is ATI?


With ATI Europe and More, ATI is Hot Around the World!
The automation community has spoken loud and clear, and ATI is clearly hot! Anywhere in the world that you can find software testing and test automation, you are likely to hear a reference to the Automated Testing Institute. Here are a few highlights.

ATI Europe

Test Automation Day 2012 was a full-day event focused on test automation. Organized by CKC Seminars and held on Thursday, June 21, 2012 at the WTC in Rotterdam, Netherlands, it was considered an ATI sister event and even featured ATI's own lead advisor, Dion Johnson, as a co-host and keynote speaker. The event focused on the future of test automation, yet had numerous talks pertinent to successful test automation implementation not only in the future, but also in the present. Topics included "Testing of processes within an agile environment using Fitnesse and Selenium," "Model-based testing in action," and "Application Virtualization in Practise: Modelling of the GBA-V."

ATI is also officially announcing its local chapter program, beginning with its first local chapter, based in Europe and appropriately called ATI Europe. The goals of the Local Chapter Program are as follows:

Further the disciplines of testing and test automation;
Increase visibility and awareness of test automation as a distinct discipline that supports the disciplines of software quality and testing;
Enhance the awareness of test automation as a discipline that, like other disciplines, requires continuous education and the attainment of a standard set of skills;

Help meet the needs of the community on a more localized and personal level;
Offer training and events for participation by people in specific areas around the world;
Help provide comprehensive, yet readily available resources that will aid people in becoming more knowledgeable and equipped to handle tasks related to testing and test automation;
Assist in making professional certifications more readily available.

ATI Europe will help achieve these goals in Europe and around the world. You can register as a member of ATI Europe from the general ATI Registration site. In addition, ATI Europe is planning a training event towards the end of 2012, so stay tuned for more.

Photo: Keynote speakers Dion Johnson and Scott Barber talk at Test Automation Day 2012

TABOK in Japan

This is a greeting to the ATI community in Japan. Over the past year, demand for the Test Automation Body of Knowledge (TABOK) and the TABOK Guidebook has dramatically increased in Japan. The TABOK is a tool-neutral skill set designed to help software test automation professionals address automation challenges that are present in the world of software testing. Japan's embrace of the TABOK is not necessarily surprising, given the way that William Edwards Deming, often credited as the father of modern-day quality, was embraced in Japan during the last century. Japan clearly understands the importance of standardization and the identification of critical skills for effectiveness.

Crowdamation
Crowdsourced Test Automation

It will offer the flexibility to use a tool of choice (open source and commercial), have teams operate out of different locations, and address the challenges of different platforms introduced by mobile and other technologies, all while still maintaining and building a cohesive, standards-driven automated test implementation that is meant to last.

It's OK To Follow the Crowd
Inquire at contact@automatedtestinginstitute.com

Local Chapter News

Latest From the Local Chapters


ATI Welcomes the Newest Local Chapter

Newsflash!
ATI welcomes ATI Europe to the ATI family! Led by Andre Boeters, this latest chapter has been established to address the local needs and concerns of the European automation community. Test automation training is being organized and planned by ATI Europe in conjunction with ATI, so stay tuned!
Where in the World is ATI? (continued)

TestKIT 2012 Conference: International Presence

The TestKIT Testing and Test Automation Conference, being held October 15-17, 2012 at the BWI Airport Marriott in Linthicum, MD, provides a platform for attendees to learn strategies, techniques and best practices from peers and leaders in their field relative to security, testing techniques and methodologies, the cloud, test automation, mobile test automation, test tool implementations, open source solutions and more. Although held in the US, this event has a strong international presence. Not only are attendees signing up from abroad, but several speakers hail from outside of the States: in addition to speakers from the US, we will welcome speakers from Buenos Aires, The Netherlands, India, Germany, Russia, Denmark and England. Come network, learn and exchange ideas with like-minded professionals from around the world in an environment that will allow you to build your testkits with concrete takeaways and information that you'll use to move your testing and test automation efforts forward.


Are You Contributing Content Yet?


The Automated Testing Institute relies heavily on the automated testing community in order to deliver up-to-date and relevant content. That's why we've made it even easier for you to contribute content directly to the ATI Online Reference! Register and let your voice be heard today!

As a registered user, you can submit content directly to the site, giving you content control and the ability to network with like-minded individuals.

>> Community Comments Box - This comments box, available on the home page of the site, provides an opportunity for users to post micro comments in real time.

>> Announcements & Blog Posts - If you have interesting tool announcements, or you have a concept that you'd like to blog about, submit a post directly to the ATI Online Reference today. At ATI, you have a community of individuals who would love to hear what you have to say. Your site profile will include a list of your submitted articles.

>> Automation Events - Do you know about a cool automated testing meetup, webinar or conference? Let the rest of us know about it by posting it on the ATI site. Add the date, time and venue so people will know where to go and when to be there.

Learn more today at http://www.about.automatedtestinginstitute.com

http://www.googleautomation.com

Software Test Automation Training


www.training.automatedtestinginstitute.com

Public Courses
Software Test Automation Foundations
Automated Test Development & Scripting
Designing an Automated Test Framework
Advanced Automated Test Framework Development
Mobile Application Testing & Tools

Virtual Courses
Automated Test Development & Scripting
Designing an Automated Test Framework
Advanced Automated Test Framework Development
Mobile Application Testing & Tools
Come participate in a set of test automation courses that address both fundamental and advanced concepts from a theoretical and hands-on perspective. These courses focus on topics such as test scripting concepts, automated framework creation, ROI calculations and more. In addition, these courses may be used to prepare for the TABOK Certification exam.

Training That's Process Focused, Yet Hands On



Public and Virtual Training Available



The KIT is Coming

If you thought ATI's 2011 event was good, wait until you see 2012.
http://www.testkitconference.com
