CI (2)

Stable Selenium Tests

Let’s face it: Selenium tests are slow, brittle and expensive to maintain. But then again, there is no real replacement for browser-based end-to-end tests. And as long as you keep the system test suite small, isolated and concise, there shouldn’t be much to maintain.

On code testing code

Let’s keep in mind that any good system should include the following types of testing:

  • many unit tests to verify that the code honors its contract; how else can you know?
  • some integration tests for important flows through integrated components
  • a few system tests for important end-to-end flows, verifying that the system stands up
  • a little manual testing, using the human eye for things that are hard to detect automatically, like usability

In this post I’m going to focus on the few system tests and how to make them as stable, readable and simple as possible. If you have heard of the page object pattern, data tags and Selenium grids, then this post is redundant for you. But if this is news to you, please read on.

These are the three most important patterns to follow if you’re going to write Selenium tests. Here’s what they do and why they help.

Page Object Pattern and data tags

A page object is a single-purpose class that acts as an API for a test and is responsible for resolving WebElements using CSS selectors, ids or classes. This makes for readable tests. By using a data-test attribute that holds a unique identifier that never changes, we can lock the test onto that attribute no matter which type of element it is or which classes it has. This is preferable to relying on XPath, ids or classes, which are subject to change when the page changes. It prevents tests from breaking because someone on the frontend refactors CSS classes, and leaves only the breakages that mean something is missing, invisible or immovable.
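As a sketch, the data-test lookup can be centralized in one small helper so every page object builds its selectors the same way (the class and method names here are my own, not from any library):

```java
// Hypothetical helper for building CSS selectors that target the stable
// data-test attribute instead of brittle classes, ids or XPath.
public class DataTestSelector {

    // Returns a CSS selector string matching any element whose
    // data-test attribute equals the given identifier.
    public static String byTestId(String id) {
        return "[data-test='" + id + "']";
    }

    public static void main(String[] args) {
        // Pass the result to By.cssSelector(...) when resolving WebElements.
        System.out.println(byTestId("login-button"));
    }
}
```

The frontend can then restyle and restructure freely, as long as the data-test attribute stays put.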


This kind of page object contains the page-specific element names and maps them to variables that make sense for the content. The superclass takes care of initializing the @FindBy-annotated variables lazily, meaning each one is evaluated upon accessing the element. Something like this:
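A minimal sketch of such a page object, assuming Selenium’s PageFactory as the superclass mechanism (the page, field and method names are illustrative):

```java
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.support.FindBy;
import org.openqa.selenium.support.PageFactory;

// Illustrative page object: PageFactory backs the @FindBy fields with lazy
// proxies, so an element is only resolved when the field is accessed.
public class LoginPage {

    @FindBy(css = "[data-test='username']")
    private WebElement username;

    @FindBy(css = "[data-test='password']")
    private WebElement password;

    @FindBy(css = "[data-test='login-button']")
    private WebElement loginButton;

    public LoginPage(WebDriver driver) {
        PageFactory.initElements(driver, this);
    }

    // The page object exposes intent, not WebElements, to the test.
    public void loginAs(String user, String pass) {
        username.sendKeys(user);
        password.sendKeys(pass);
        loginButton.click();
    }
}
```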

A simple grid

A Selenium grid is a server that runs Selenium tests as a service. There are naturally many cloud providers for this use case, but there are some FOSS disruptors in that space as well. I have especially taken a liking to Selenoid, which spawns Docker containers with individual, isolated, headless browsers, making it very suitable for running quick system tests upon a new deploy.

Set up a Selenoid grid locally by using this Vagrantfile if you like; it will speed up and automate the process. I’m going to assume it is available on your localhost from here on.
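If you would rather skip Vagrant, Selenoid can also be started straight from Docker; a minimal sketch, assuming a browsers.json already sits in ./config (image name from the Aerokube project):

```shell
# Start Selenoid with access to the Docker socket so it can spawn
# one browser container per session.
docker run -d --name selenoid \
  -p 4444:4444 \
  -v /var/run/docker.sock:/var/run/docker.sock \
  -v "$PWD/config:/etc/selenoid:ro" \
  aerokube/selenoid:latest-release

# The grid endpoint your tests point at:
# http://localhost:4444/wd/hub
```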

A Java Project

Include the following dependencies in your pom.xml, build.xml or build.gradle to run selenium-java remotely using JUnit 4. We’re going to stay away from large frameworks and simply run Selenium tests as regular unit tests. Use whatever logging library you like and replace my println; logging is not the focus of this article. 😉
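For Maven, a minimal dependency set could look like this (the version numbers are examples; pick current ones):

```xml
<dependencies>
  <dependency>
    <groupId>org.seleniumhq.selenium</groupId>
    <artifactId>selenium-java</artifactId>
    <version>3.14.0</version>
    <scope>test</scope>
  </dependency>
  <dependency>
    <groupId>junit</groupId>
    <artifactId>junit</artifactId>
    <version>4.12</version>
    <scope>test</scope>
  </dependency>
</dependencies>
```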

Here’s a tiny base class that initializes the page objects and creates the remote WebDriver.
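A sketch of such a base class, assuming the grid from the previous section is reachable on localhost (the capabilities and page object field are illustrative, and the failure-recording part is left out):

```java
import java.net.URL;

import org.junit.After;
import org.junit.Before;
import org.openqa.selenium.remote.DesiredCapabilities;
import org.openqa.selenium.remote.RemoteWebDriver;

// Illustrative base class: every test method gets its own remote session,
// which is what lets Selenoid isolate each method in its own container.
public abstract class TestBase {

    protected RemoteWebDriver driver;
    protected LoginPage loginPage; // hypothetical page object from earlier

    @Before
    public void createDriver() throws Exception {
        DesiredCapabilities capabilities = DesiredCapabilities.chrome();
        driver = new RemoteWebDriver(
                new URL("http://localhost:4444/wd/hub"), capabilities);
        loginPage = new LoginPage(driver);
    }

    @After
    public void quitDriver() {
        if (driver != null) {
            driver.quit();
        }
    }
}
```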

Any JUnit test must extend this class and will, in this case, initialize a remote WebDriver for each test. This can also be used to run as many test methods in parallel as there are concurrent containers in the grid. This is when Selenoid shines the brightest, because it creates a lightweight container for each test method, making for wonderfully idempotent and isolated tests. The TestBase will detect whether a test has failed and put a recording of it into the target folder if you’re running Maven.

This results in small, stable and readable tests such as this one.
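Put together, a test can read something like this (the app URL, credentials and assertion are illustrative):

```java
import static org.junit.Assert.assertTrue;

import org.junit.Test;

// Illustrative test: all WebDriver plumbing lives in TestBase, all element
// resolution lives in the page object, so the test only states intent.
public class LoginTest extends TestBase {

    @Test
    public void userCanLogIn() {
        driver.get("http://localhost:8080/login"); // assumed app URL
        loginPage.loginAs("demo", "secret");
        assertTrue(driver.getTitle().contains("Dashboard"));
    }
}
```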

I hope this helps you run better, faster and more stable Selenium-based tests at all levels!


Sonar Comments on BitBucket Pull-Requests

At work we are scrambling to use static code analyzers to improve code quality in general, both from a security perspective and from a standardization perspective. I have worked with Sonar before, but it has almost always been in the background, alone and forgotten by everyone pushing features. Now, those who know me are aware that I prefer early feedback, preferably pre-merge. I like to think of the patch, pull or merge request as the real guard against flinchy developers like myself who don’t have time to run the tests, or to check Sonar for issues that should be fixed while I’m covering that particular code. This article is about resolving that and getting Sonar comments directly on pull requests.


The stack in this case:

  • TeamCity as a build server
  • classic C# as the software platform
  • MSBuild as a build system
  • BitBucket Cloud as the source repository

High level design

This is what it looks like from a high level. A Pull-Request in BitBucket triggers a TeamCity job that, in turn, runs the same pull-request builder build-process as would be done with a regular pre-merge job but with a sonar-analysis in preview-mode and a specific sonar-plugin that is able to post comments.


Things you should probably do before delving into all the configuration:


  • Create a specific user, named e.g. Sonar-Reviewer, and add it to your team


Make sure you build the pull-request trigger from the master branch if the latest release is still pullrequest-20172603195632, since it needs the fix in this PullRequest by yours truly to be able to post the pull-request id to Sonar. Running mvn package with Maven should create the zip you need.



There aren’t actually that many things to set up for this to work.

Configuration in BitBucket

Configuration in Sonar

  • If analysis is protected, create a system user for TeamCity to log in to Sonar

Configure TeamCity

  • Set the JAVA_HOME variable on each agent to where your JRE 8 is
  • Make sure any proxies the agent needs to post through are also specified in the SONAR_SCANNER_OPTS environment variable, either as an agent property or as a build parameter. In my case I had to set env.SONAR_SCANNER_OPTS=-Dhttp.proxyHost=myproxy.tld -Dhttp.proxyPort=1234 in the AGENT_HOME/conf/
  • Configure a pull-request trigger to look like this
  • Make sure your VCS root has the following branch specification: +:refs/heads/*
  • Go to parameters

    • Add the following parameters, it’s of course possible to skip the ones you don’t want configurable
    • Make sure you added the OAuth Client Key and Client Secret from your BitBucket user created earlier
  • Go to build steps

    • Add Sonar Analysis Begin step
    • Set a project key, version and branch as you see fit; they may not be empty, but their values are not important here
    • Give the Sonar Analysis Begin step the following huge parameter list as Additional Command Line Args

/d:sonar.analysis.mode=preview
/d:sonar.bitbucket.repoSlug=YOUR_REPOSITORY
/d:sonar.bitbucket.accountName=YOUR_ORGANIZATION_OR_USER
/d:sonar.bitbucket.oauthClientKey=%sonar.bitbucket.oauthClientKey%
/d:sonar.bitbucket.oauthClientSecret=%sonar.bitbucket.oauthClientSecret%
/d:sonar.bitbucket.pullRequestId=%trigger.pullRequestId%
/d:sonar.bitbucket.minSeverity=%sonar.bitbucket.minSeverity%
/d:sonar.bitbucket.approvalFeatureEnabled=%sonar.bitbucket.approvalFeatureEnabled%
/d:sonar.bitbucket.maxSeverityApprovalLevel=%sonar.bitbucket.maxSeverityApprovalLevel%
/d:sonar.bitbucket.buildStatusEnabled=%sonar.bitbucket.buildStatusEnabled%

Make sure it corresponds to the parameters you added before. Save the build step.

  • Add a MSBuild step with whatever targets you want. Sonar for MSBuild suggests MSBuild.exe /t:Rebuild
  • Add a Sonar Analysis End step with default settings

That’s it!

At this point you should be able to create a pull request, see the job trigger in TeamCity, and have the Sonar plugin work its magic and post any issues introduced by the PR as comments, like this.

I’m especially happy I was able to put this integration in place, seeing as I had no prior C#, Sonar Analysis for MSBuild or TeamCity experience. But it all gets easier with time, and most integrations look similar and require the same kind of tinkering.