
Key Technologies and Tools: Jenkins, Docker, Docker Hub, Git, GitHub, Amazon Web Services, Sauce Labs, BlazeMeter, Selenium, Appium, WebDriver, Test Automation, Agile, Waterfall, Rapid Development and Test, Business Driven Testing, Data Driven Testing, JUnit, Test Suites, Java, Maven
This article is aimed at the long-suffering Test Manager: often the unsung hero who, at the last minute and under great pressure, brings it all together, polishing the proverbial and ensuring the delivered product actually meets most of its requirements and will operate as expected. Amongst the day-to-day chaos you have been given the task of finding out what all the fuss is around virtualisation, continuous integration and continuous delivery, and also, if it's any good, can we have one as soon as possible! If this description fits you, we think you will find this article very useful. We describe how we have successfully implemented Test Automation within a continuous build and deployment framework.
A quick Google search will bring you all the definitions you could ever need (there are some good links at the end of this article), so let's think about what you would like, what you can have, what it would cost, and what tools and processes you would use. The list could far exceed the space we have here so, in the interest of brevity, we list a few of the nice-to-haves that come up consistently in our client surveys:
- Wouldn’t it be good if you could test changes as they are developed and automatically deploy or stage the tested code, so there is no mass panic before go-live?
- As a Test Manager I would like a cost-effective and rapid way of setting up the test environment, application and test data, then wiping it all clean and starting afresh on demand
- I would like a common framework that lets me write a test once and run it in multiple ways, say across web and mobile platforms
- My set-up would have to grow and shrink in near real time, and we only pay for what we use
- The return on investment must exceed the cost of set-up
Well, that would be nice, wouldn’t it? It’s probably no surprise that the above is possible; what probably is surprising is that the tooling required to get going is absolutely free and industry standard. That is worth repeating: the tooling is absolutely free!
The above in fact describes just some of the benefits of continuous integration and virtualisation.
Ah, you say, that all sounds great but what does it really mean? Where do I get started?
Let’s take this a step at a time…
The first thing you will need is a platform to act as a backdrop. There are lots of cloud providers competing for your business – we have settled on Amazon Web Services (AWS). AWS is free to get started and will allow you to spin up and dispose of servers at will. You only pay for what you use and can replicate your builds easily. For example, we have created Linux-based servers and Windows boxes. You can log on using a device of your choice – laptops, tablets, etc. – and utilise the full power of the Cloud. If you find your machines lacking in power or storage you can expand at will. This will, of course, lead to higher charges, so if you find after a particularly intense testing effort that you no longer need the horsepower, you can scale back and reduce costs. This is where the 'Elastic' comes from in EC2, Amazon's Elastic Compute Cloud.
The second thing you need is something to orchestrate the end-to-end flow, and that is Jenkins. Jenkins is a continuous integration and continuous delivery application: use it to build and test your software projects continuously. It is a truly powerful tool, so it must be expensive, right? It is free! You might also expect it to be hard to install and configure – in fact the basic implementation is quick and easy. The complexity of job configuration will increase in line with your actual tests; however, there is a wide range of plugins that ease the task of set-up and configuration and cater for nearly everything you can think of. Once you get into the swing of it you will find it hard to resist tinkering, as you can set up a new job in minutes.
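To give a feel for how quick the basic set-up is: Jenkins ships as a single archive that runs anywhere Java does. A minimal command sketch (the mirror URL is the standard Jenkins download location at the time of writing – check the current address before relying on it):

```shell
# Download the latest standalone Jenkins archive and start it;
# the web interface then appears on http://localhost:8080
wget http://mirrors.jenkins-ci.org/war/latest/jenkins.war
java -jar jenkins.war
```

From there, everything else – plugins, jobs, slaves – is configured through the browser.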
What about code control and deployment? We use a combination of GitHub and Docker Hub for our version control and image builds. GitHub is a web-based Git hosting service. It offers all of the distributed revision control and source code management (SCM) functionality of Git and comes with plugins for linking to Jenkins. Docker Hub is a Cloud-based registry service for building and shipping application or service containers. It provides a centralized resource for container image discovery, distribution and change management, user and team collaboration, and workflow automation throughout the development pipeline. Both GitHub and Docker Hub are, you guessed it, free to get started. If you want to make your repositories private you will pay a small fee.
We mentioned images earlier, and in this context we refer to Docker images. Docker allows you to package an application with all of its dependencies and data into a standardized unit for software development. With a single command you can, for example, run a Tomcat server with a baked-in application along with any static and user data. Sound useful? It is! With another command or two you can flatten the environment and pull a new version, allowing a total reset. So, if the Development Team build and push the code, you can extract and test it in a rapid time-frame.
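As a sketch of that "command or two", assuming a hypothetical image name myorg/myapp published by the Development Team:

```shell
# Pull the latest tested bundle and run it (Tomcat + application + data)
docker pull myorg/myapp:latest
docker run -d --name myapp -p 8080:8080 myorg/myapp:latest

# Total reset: throw the container away and start clean from the image
docker rm -f myapp
docker run -d --name myapp -p 8080:8080 myorg/myapp:latest
```

Because the container is disposable, "wipe it all clean and start afresh" really is this cheap.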
The above components allow the software and data bundle to be developed, tested, and changed as required and pushed again. The cycle continues on and on building test coverage as it goes.
In summary so far:
- Developers create code using their tool of choice and push it to the Git repository
- GitHub triggers Docker Hub – we use this to bundle the application and data into a single package for test
- Docker Hub notifies Jenkins that a fresh build is available for test
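The flow above can be sketched as a Jenkins pipeline. This is an illustrative declarative Jenkinsfile, not our exact configuration – the slave label, image name and Maven goal are assumptions, and the Docker Hub notification itself is wired up via a Jenkins plugin:

```groovy
// Sketch: pull the freshly built image, run the tests, publish results
pipeline {
    agent { label 'linux-docker-slave' }       // hypothetical AWS slave label
    stages {
        stage('Pull image') {
            steps { sh 'docker pull myorg/rover:latest' }
        }
        stage('Run tests') {
            steps { sh 'mvn test' }            // JUnit/Selenium suites via Maven
        }
    }
    post {
        always { junit '**/target/surefire-reports/*.xml' }  // publish JUnit results
    }
}
```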
At last, I have mentioned testing! True, the above does start to stray into development and deployment territory, but it is important information to wrap your head around. From a testing perspective it really helps to focus on the Docker image as being the product.
We have built an application, ROVER TEST INTELLIGENCE, which allows rapid comparison and analysis of millions of records in seconds. To test it we need a Tomcat server, a war file containing our application, and a supporting database; a fairly typical bundle for a web-based application. We have one Docker image for the Tomcat server and war file, another for the database, and one for the data – three in total, which suits our development approach. For testing purposes, however, all of these can be treated as a single unit: a change in any of the underlying components triggers the full set of test suites.
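A docker-compose file is one way to treat several images as a single testable unit; this sketch uses hypothetical image names to mirror our three-image split:

```yaml
# Illustrative composition: three images, one testable product
version: "2"
services:
  web:
    image: myorg/rover-web:latest    # Tomcat server with the application war
    ports:
      - "8080:8080"
    depends_on:
      - db
  db:
    image: myorg/rover-db:latest     # supporting database
  data:
    image: myorg/rover-data:latest   # static and user data
```

One command brings the whole bundle up or tears it down, which is exactly the granularity the test suites care about.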
We use Jenkins to control our tests. A Git change triggers a Docker build, which in turn triggers Jenkins to spin up a ‘slave’ machine on AWS and execute the tests. As illustrated, we have two slave machines: Docker-type operations are executed on a native Linux instance, and GUI tests are run on a Windows-based platform. The instances are only active whilst needed, keeping costs to a minimum.
We create tests using the JUnit framework and Selenium WebDriver classes. The code is reusable, and a single script can be executed for web, JMeter and Appium mobile testing, minimising redundancy and duplication.
We also take advantage of some of the services offered by third-party Cloud-based providers, namely Sauce Labs for extensive cross-browser testing and BlazeMeter to scale up performance tests when we really need to crank up the horsepower and perform short-burst, enterprise-level testing. This is done with minimal alteration to the script: configuration is passed in via request parameters. Sauce Labs and BlazeMeter are elastic too, with a free tier account ramping up and down with usage.
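As a minimal sketch of the "write once, configure via parameters" idea: a run-time parameter decides where the browser session will be created, so the same script can target a local Selenium server or a remote provider. The property name and credentials are illustrative assumptions; the Sauce Labs endpoint shown is their standard WebDriver hub address, but check the current documentation before use:

```java
// Resolve the WebDriver endpoint from a run-time parameter so one
// test script can run locally or against a remote Cloud provider.
public class TargetResolver {

    static String gridUrl(String target, String user, String key) {
        switch (target) {
            case "saucelabs":
                // Remote cross-browser grid; credentials are passed in, never hard-coded
                return "https://" + user + ":" + key + "@ondemand.saucelabs.com/wd/hub";
            case "local":
            default:
                return "http://localhost:4444/wd/hub"; // local Selenium server
        }
    }

    public static void main(String[] args) {
        // e.g. mvn test -Dtest.target=saucelabs
        String target = System.getProperty("test.target", "local");
        System.out.println(gridUrl(target, "demoUser", "demoKey"));
    }
}
```

The test body itself never changes; only the endpoint it talks to does, which is what keeps the scripts portable across providers.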
Further, Jenkins can be configured to run on a schedule as well as in response to changes; this allows you to soak test applications, driving out intermittency due to, for example, environmental factors, and to run tests when it’s cheaper – you can actually negotiate for cheap server time! It will also keep you updated by email.
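The schedule itself is just a cron-style expression in the Jenkins job configuration; an illustrative example:

```
# Run the suite every weekday in the small hours, when server time is cheap;
# 'H' lets Jenkins spread the exact start minute to avoid load spikes
H 2 * * 1-5
```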
In summary:
- Jenkins, GitHub and Docker Hub can be used as your automated framework to build, test and deploy your code
- Focus on the Docker image as being the testable product; this can include code, databases, data and even servers
- JUnit and Selenium can be used for writing your reusable automated test scripts
- Test scripts are portable and can be directly utilised by third party Cloud providers to extend testing capabilities in an elastic fashion
- The tooling cost for your initial set-up is zero; you just need to add time and effort
When you get this combination right, it really does liberate you, with less time spent manually testing and more time spent innovating. The traditional test integration phase all but disappears, and non-functional requirements, so often forsaken in an agile context, get tested as part of the deal. The return on investment accumulates as you go, with test coverage increasing at each iteration. Of course, there is a learning curve and a (smaller than you may think) maintenance cost, but we feel the benefits gained are well worth the time and effort.
If you would like us to help you, please get in touch at
info@redhound.net
Tel: +44 (0)800 255 0169
Demos and further reading
What is Rover Test Intelligence? http://redhound.net/rover-test-intelligence/
What are Amazon Web Services? https://aws.amazon.com/
What is Jenkins? https://wiki.jenkins-ci.org/display/JENKINS/Meet+Jenkins
What is GitHub? https://github.com/
What is Docker Hub? https://docs.docker.com/docker-hub/overview/
What is Docker?
What is JUnit? http://www.tutorialspoint.com/junit/junit_test_framework.htm
What is Selenium WebDriver? http://www.seleniumhq.org/
What is Sauce Labs? https://saucelabs.com/
What is Blazemeter? https://www.blazemeter.com/