Monday, 19 November 2012

21st Century Integration Testing

Everything starts here




So what is it?

A combination of open-source projects, bundled with a user-friendly interface, to bring your integration tests into the 21st century.

As shown in the screenshot above, the framework gives you access to the following:
  • Run Chart - the historic availability of MyApp1
  • Blame Chart - the test(s) that have failed
  • Frontends, Service and Backends - a simple view of where the fault lies and, more importantly, which functionality has been impacted

What are the key benefits?

  • Increased availability
  • Improved productivity
  • Reduced maintenance costs

Who will benefit?



How does it work?

It's quite simple: take your existing (or new) integration tests, add them to this framework and navigate to the monitor page. Sound simple? That's because it is!

Time for a test drive?

Would you buy a car without test driving it? Even though nothing here is for sale, time still costs money, and for anyone to invest time and effort, it has to be worth it.
To show the power of "21st Century Integration Testing", a bundle with both the server and the test cases is provided below. It lets you take it for a test run before deciding whether to set up your own solution.

Please make sure you have the following products installed:

Java - http://www.java.com/getjava/

Firefox - http://www.mozilla.org/en-US/firefox/new/

The provided package is a fully functional, three-layered Bottom-Up Testing setup, including SoapUI and Selenium tests. All the code used to build this package can be found on GitHub.
  1. Download https://github.com/downloads/dunse/try-it-out/21st%20Century%20Integration%20Testing.zip
  2. Unpack and run 1_startServer.bat (or 1_startServer.sh if you run *NIX)
    This should automatically start the browser, if not you can access the monitor page at: http://localhost:18080/ShowMyTests/
  3. Run a test: 2_runTestSuite.bat (2_runTestSuite.sh)
    Some tests are random, so run again to get different results...
    Cuanto (the backend) can be accessed at http://localhost:18080/cuanto/

How do I get started?

Step-by-step instructions on how to setup your own test server and add your own tests can be found here: http://dsysadm.blogspot.com.au/2012/11/getting-started-with-21st-century.html

You can now have your own system up and running in no time!


Not convinced?

Please leave a comment with your thoughts. Constructive criticism is always appreciated!

Getting started with 21st Century Integration Testing


Setup the server

Download and install the software listed below.
Plenty of useful tutorials exist on how to install Java, Tomcat and MySQL on every possible operating system, so please follow one of those. For Cuanto, please follow the instructions in the next two sections.

Installing Cuanto and MySQL JDBC Driver

(Source: INSTALL.html):

Creating the database

Create a database on your SQL server named "cuanto". Typically this is done with a command like "create database cuanto;" when logged onto your SQL server, although you may want to specify additional details depending on any additional database restrictions or options you want to enable.

Customizing cuanto-db.groovy

Edit cuanto-db.groovy with a text editor. You'll see a section like this:
production {
    dataSource {
        username = "my_sql_user"
        password = "my_sql_password"
        driverClassName = "com.mysql.jdbc.Driver"
        url = "jdbc:mysql://my_sql_server:3306/cuanto?autoreconnect=true"
    }
}
Edit the username, password, driverClassName and url to correspond to the correct values for your database credentials, JDBC driver and the JDBC URL for your SQL server. Make sure you edit the "production" section.

Deploying the application

Unzip the WAR into your application server's webapps directory into a subdirectory named "cuanto". Copy the cuanto-db.groovy you customized into the cuanto/WEB-INF/classes directory.


Installing ShowMyTests

Simply copy the downloaded .war file into Tomcat's webapps directory, start Tomcat and navigate to http://<Tomcat_Server_IP>:<Tomcat_Server_Port>/ShowMyTests/.
For example, http://localhost:8080/ShowMyTests/
This should bring up a black screen with the friendly words "ERROR: No TestRuns found":


The server installation is now ready for use! Move on to check out the framework and start running some tests.


Get the framework

Make sure your IDE has Maven integration and supports Git.
Then clone the following Git repositories (remember to tick the "Import all existing projects after clone finishes" option):
git://github.com/dunse/ShowMyTests.git
git://github.com/dunse/my-integration-tests.git

While you wait, read through the next section if you are not familiar with Integration Testing fundamentals.


Integration Testing

As the name indicates, it is used to test the integration between components. It does not matter whether it is one service calling another or an end user interacting with a web application; they are all integrated and should be tested.

For the concept of 21st Century Integration Testing, we adopted Bottom-Up Testing as the most suitable approach.

Bottom-Up Testing is an approach to integration testing where the lowest level components are tested first, then used to facilitate the testing of higher level components. The process is repeated until the component at the top of the hierarchy is tested.
All the bottom or low-level modules, procedures or functions are integrated and then tested. After the integration testing of lower level integrated modules, the next level of modules will be formed and can be used for integration testing. This approach is helpful only when all or most of the modules of the same development level are ready. This method also helps to determine the levels of software developed and makes it easier to report testing progress in the form of a percentage. (Source: Wikipedia)


When creating tests for this purpose, it is important to approach them in a Top-Down fashion, which means:
  1. Identify a User Action on the Frontend Layer, write a test for it and verify it.
  2. Move to the next Layer (Service Layer) and write tests for all calls produced by the User Action.
  3. Finally, move to the Backend Layer and write tests for all calls produced by the Service Layer for the given User Action.

Making the tests generic

Never, ever, ever have any environment-specific details in the test itself. It is easy enough to create a property file and load it at runtime, so there is no reason not to.
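As a minimal sketch (the file name environment.properties, the TestConfig class and the property keys are illustrative assumptions, not part of the framework), environment-specific values could be loaded like this:
public class TestConfig {
    private static final java.util.Properties PROPS = new java.util.Properties();

    static {
        // Load environment-specific values packaged on the classpath.
        try (java.io.InputStream in =
                TestConfig.class.getResourceAsStream("/environment.properties")) {
            PROPS.load(in);
        } catch (Exception e) {
            throw new RuntimeException("Could not load environment.properties", e);
        }
    }

    // A -Dkey=value system property overrides the value from the file.
    public static String get(String key) {
        return System.getProperty(key, PROPS.getProperty(key));
    }
}
A test would then call something like TestConfig.get("crm.endpoint") instead of hard-coding the value.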


Run the provided tests

In your IDE, create a new Run Configuration as:
  • Main class: org.testng.TestNG
  • Program arguments: ./src/main/resources/testng-MyApp1.xml
  • VM arguments:
    -Dcuanto.url=http://localhost:8080/cuanto
    -Dcuanto.projectkey=MyApp1
    -Dcuanto.testrun.create=true
    -Dcuanto.includeConfigDuration=false
    -DcrmId=399
If your Tomcat installation is not running on localhost:8080, update the cuanto.url accordingly.

Example screenshots using Eclipse:

  


When done, run the new configuration and switch back to ShowMyTests.
It should now display something similar to this:



Add your own tests

The easiest way to get started is to reuse the existing classes:
  • OnlineFrontend - Frontend tests, could be login or a purchase using a browser
  • CRMService - Service (SOA) tests, normally WebServices
  • CRMBackend - Backend tests, such as databases or WebServices it depends on
Let's say we want to test a Login User Action.
First, we identify the flow through the application:

As shown above, we have four tests to write, starting at the bottom and moving up.

User DB

We create a new method called "userDB" in CRMBackend, which checks whether the database is functional, preferably by executing a query against it.
@Test(groups = {"MyApp1", "userDB"})
public void userDB() {
  assertTrue("Here we can test if the DB works", true); // Replace this line with your own test
}

Important to note here are the specified "groups". These are used for two purposes:
  1. "MyApp1" - groups all tests that belong to MyApp1. This is useful for reusing the same test code across multiple applications, as sketched below.
  2. "userDB" - it is good practice to include the method in a group with the same name. You will see why when we move on to the Service layer.

Security WS & User WS

In CRMService, add the following code:
@Test(groups = {"MyApp1", "authenticateUser"}, dependsOnGroups = {"userDB"})
public void authenticateUser() { // Part of Security WS
  assertTrue("Here we can test user authentication works", true); // Replace this line with your own test
}

@Test(groups = {"MyApp1", "retrieveUserDetails"}, dependsOnGroups = {"userDB"})
public void retrieveUserDetails() { // Part of User WS
  assertTrue("Here we can test if user details can be retrieved", true); // Replace this line with your own test
}

We use the same principle as for the User DB, with one exception: "dependsOnGroups".
"dependsOnGroups" tells TestNG that both methods depend on the success of "userDB" and will not be executed if "userDB" fails.

This is a key feature which makes your tests rise above the rest.

Login

The final method:
@Test(groups = {"MyApp1", "login"}, dependsOnGroups = {"authenticateUser", "retrieveUserDetails"})
public void login() { // Part of User WS
  assertTrue("Use Selenium to test the login sequence and verify retrieved data", true); // Replace this line with your own test
}

Summary

As a result of the "dependsOnGroups", TestNG will manage the dependencies and execution order. If multiple methods don't specify "dependsOnGroups", they will be executed in no particular order.
If we map out the execution, we would end up with something like this:

As you can see, it follows the Bottom-Up Testing technique to the letter.

If you now run "my-integration-tests" again, the new tests will show up on the "ShowMyTests" screen (hidden, since they would all pass by default). To see how userDB affects the upstream functions, simply change the assertTrue() call from true to false and re-run the tests, as in the sketch below.
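For example, a deliberately failing version of userDB (purely for demonstration) could look like this; with the dependencies above, authenticateUser, retrieveUserDetails and login would then be skipped rather than executed:
@Test(groups = {"MyApp1", "userDB"})
public void userDB() {
  assertTrue("Simulate a broken database to see the upstream impact", false);
}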



Congratulations! You have now created your very first tests as part of 21st Century Integration Testing!

But this is only the beginning. If you haven't already, replace the tests with your own tests and see it all come to life.


Some help along the way

Refactoring classes

The names of the classes probably won't make much sense for your application, so let's take a closer look at how we can change them.

ShowMyTests places any test run from a class whose name ends with Frontend into the Frontends layer, Service into the Service layer and Backend into the Backends layer. This means no change should be required for the monitor page, as long as you follow the given naming convention.
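As a hypothetical illustration (WebShopFrontend and the checkout test are made-up names, not part of the provided code), any class following the convention is picked up automatically:
// Ends with "Frontend", so ShowMyTests places its results in the Frontends layer
public class WebShopFrontend {
  @Test(groups = {"MyApp1", "checkout"})
  public void checkout() {
    assertTrue("Use Selenium to drive the checkout flow", true); // Replace this line with your own test
  }
}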

TestNG, however, needs to know which classes to look for tests in. Inside /src/main/resources/testng-MyApp1.xml, make sure your new classes are listed inside <classes></classes> and they will be included in the test run.

Making SoapUI tests smarter

SoapUI provides an interface to override any custom properties defined inside its projects. For example, we pass in "-DcrmId=399" while running the tests above.
This crmId is then picked up in CRMService and passed to the SoapUI project as:
String[] props = {
  "crmId=" + Init.getProperty("crmId")
};
System.out.println("crmId=" + Init.getProperty("crmId"));
runner.setProjectProperties(props);

Inside SoapUI we have crmId declared as a Project Custom Property:



Which we use inside one of the Test Steps:



This makes it very easy to adjust, for example, environment-specific values while running the integration tests, without having to duplicate the tests.

One custom property that normally comes in handy is "myEndpoint". It makes it possible to pick and choose which server you want to run the tests against.


Here is how it can be implemented:

1. In SoapUI, add the custom property (which also serves as the default value when running the tests inside SoapUI):



2. In SoapUI, change the endpoint(s) to use this new custom property:



3. Inside CRMService, add the myEndpoint line shown below (line 27):
String[] props = {
  "crmId=" + Init.getProperty("crmId"),
  "myEndpoint=" + Init.getProperty("myEndpoint")
};
System.out.println("crmId=" + Init.getProperty("crmId"));
runner.setProjectProperties(props);

Now, the next time you run my-integration-tests, it is possible to run the Service tests against any server using the "-DmyEndpoint=<new_value>" argument.

Friday, 14 September 2012

Human readable cron expressions using JavaScript

So I was faced with the task of presenting cron expressions (like '3 * * * *') on a web page in a human-readable format. I figured there would be plenty of scripts available to do just that, ready for use.
With the limitation of using JavaScript, I found nothing! Either it is hidden away somewhere, or no one has published a script to convert cron expressions into a human-readable format.

After a little soul searching I decided to write my own implementation.
The final result looks like this:
Cron expression: 15 3,8,10,12,14,16,18 16 * *
Human readable: Every 15th minute past the 3, 8, 10, 12, 14, 16 and 18th hour on the 16th every month
Next run: Sunday at 3:15 AM

Cron expression: 30 * * * *
Human readable: Every 30th minute past every hour
Next run: Today at 1:30 AM

Cron expression: 0 * * * *
Human readable: Every hour, on the hour
Next run: Today at 2:00 AM

Cron expression: * 8,10,12,14,16,18,20 * * *
Human readable: Every minute of 8, 10, 12, 14, 16, 18 and 20th hour
Next run: Today at 8:00 AM

Cron expression: 2 8,10,12,14,16,18 * 4 0,3
Human readable: Every 2nd minute past the 8, 10, 12, 14, 16 and 18th hour in Apr on Sun and Thu
Next run: 04/07/2013

Dependencies

First download the dependencies:
moment.min.js - Credit Moment.js
later.min.js - Credit bunkat
prettycron.js - Credit myself

How to use

Copy the dependencies above to the web server.
In the same directory, create a new basic HTML file called prettycron.html.
Include the dependencies in prettycron.html:
    <script src="later.min.js" type="text/javascript"></script>
    <script src="moment.min.js" type="text/javascript"></script>
    <script src="prettycron.js" type="text/javascript"></script>

Add this in the body of prettycron.html:
<script type="text/javascript">
document.write(getPrettyCron(cronParser().parse('0 * * * *', false)));
</script>

Open the page in a web browser and you should see the following output:
Every minute

That is it: just replace '0 * * * *' with any basic cron expression to see it in a human-readable format.


Full example

Please find below the full example that produces the output from the introduction:
<html>
  <head>
    <title>Human readable cron schedule</title>
    <script src="later.min.js" type="text/javascript"></script>
    <script src="moment.min.js" type="text/javascript"></script>
    <script src="prettycron.js" type="text/javascript"></script>
  </head>
<body>
<script type="text/javascript">
function printCron(cron) {
    var s1 = cronParser().parse(cron, false);
    document.write('<b>Cron expression: </b>' + cron + '<br/>');
    document.write('<b>Human readable: </b>' + getPrettyCron(s1['schedules'][0]) + '<br/>');
    document.write('<b>Next run: </b>' + moment(later(60,true).getNext(s1)).calendar() + '<br/><br/>');
}
printCron('15 3,8,10,12,14,16,18 16 * *');
printCron('30 * * * *');
printCron('0 * * * *');
printCron('* 8,10,12,14,16,18,20 * * * ');
printCron('2 8,10,12,14,16,18 * 4 0,3 ');
</script>
</body>
</html>


Internet Explorer pre-9

Support for Array.indexOf only arrived in Internet Explorer 9. The scripts depend heavily on this method, so to make them work in Internet Explorer 6-8, perform the following steps:
1. Create ltie9.js containing the code from the "Compatibility" section of:
https://developer.mozilla.org/en-US/docs/JavaScript/Reference/Global_Objects/Array/indexOf

2. Inside <head></head> add:
<!--[if lt IE 9]><script src="ltie9.js" type="text/javascript" charset="UTF-8"></script><![endif]-->

Sunday, 9 September 2012

My first Android app

I just finished writing my first Android app, Screensaver Photo Frame:
During this project I hit a few gotchas which I wanted to share. Below are the most critical things to remember when starting to develop for Android.

1. Exceptions = Force close

Whenever the application throws an uncaught exception, Android will force close it. So, to avoid frustrating your users, catch any potential exceptions and handle them properly, as in the sketch below.
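Here is a minimal sketch (loadPhotosFromSdCard() and the messages are hypothetical) of wrapping a risky call inside an Activity so the user sees a message instead of a force close:
try {
    loadPhotosFromSdCard(); // hypothetical call that may throw
} catch (Exception e) {
    // Log the problem and tell the user instead of letting Android kill the app
    Log.e("ScreensaverPhotoFrame", "Failed to load photos", e);
    Toast.makeText(this, "Could not load photos", Toast.LENGTH_LONG).show();
}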

2. Database locking issues

As you might have already noticed, multi-threading is critical for any Android application. The issue with multi-threading and the built-in database, however, is that you are bound to run into SQLiteDatabaseLockedException if you follow most getting-started guides.
The trick is to use SQLiteOpenHelper as a singleton and let SQLite manage the synchronization.
In the cut-down example below, two threads can call addSomething() simultaneously and SQLite will handle the synchronization internally, preventing lock exceptions.
DBAdapter:
public class DBAdapter {
    // Single shared SQLiteDatabase instance, opened once and reused by all threads.
    private static SQLiteDatabase db = null;
    // Lazily opens the database the first time an adapter is requested.
    public static synchronized DBAdapter getInstance(Context ctx)
    {
        if (db == null) {
            DatabaseHelper DBHelper = new DatabaseHelper(ctx);
            db = DBHelper.getWritableDatabase();
        }
        return new DBAdapter();
    }
    public void addSomething(String something) {
        // Safe to call from multiple threads: all callers share the one db handle
        // and SQLite serialises access to it internally.
        ContentValues initialValues = new ContentValues();
        initialValues.put(KEY_SOMETHING, something);
        db.insertWithOnConflict(DATABASE_SOMETHING_TABLE, null, initialValues,
                SQLiteDatabase.CONFLICT_REPLACE);
    }
    private static class DatabaseHelper extends SQLiteOpenHelper
    {
        ...
    }
}
Activity:
public class MainActivity extends Activity {
    private DBAdapter database;
    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        database = DBAdapter.getInstance(getApplicationContext());
        database.addSomething("something");
    }
}


3. Multi-row SQL execution

Since roughly Android 4, it is no longer possible to run multiple statements separated by semicolons in a single execSQL() call.
So instead of executing SQL as:
db.execSQL("CREATE TABLE A ...; CREATE TABLE B ...;");
Split it up into multiple executions:
db.execSQL("CREATE TABLE A ...;");
db.execSQL("CREATE TABLE B ...;");



Tuesday, 27 March 2012

Dansguardian Log Analyser

I recently started to learn Dojo and thought that writing an application would be a fun way of doing so.

When writing a new web application I wanted to create something useful, so I reflected back to when I installed Dansguardian on the My Book Live.

After installation I found out there wasn't any good log analyser available (at least not outside webmin). So when deciding to write an application, the choice was easy. The application turned out to be "Dansguardian Log Analyser" which is based on Dojo and PHP.

As with my other small projects I published the final product on github for everyone's use.

Read on for details and installation instructions...


Details

There are three sections available:
  • Summary
    Shows a graph of today's usage.
  • Denied requests
    A list of today's denied requests. Clicking on a row opens the denied page in the iframe below the grid.
  • Realtime log
    This simply tails the Dansguardian logfile to the browser. It displays all requests logged in access.log from the time the tab is selected.


Screenshots



Installation

Download the package from here.
Copy dla/ directory to your web/proxy server running Dansguardian. (E.g. to /var/www/dla/)
Access it through: http://your.web.server/dla/

Notes

Make sure "logfileformat = 1" is set in dansguardian.conf
If "loglocation" is change from default, update getDansguardianLog.php with the correct path.


Thursday, 1 March 2012

My Book Live with Squid and fakeauth

Next up for the new My Book Live was to install and configure Squid. Since the MBL runs Debian, it is fairly straightforward to get Squid up and running. But here we also want to set up fakeauth, which complicates things a little bit.

The reason for using fakeauth is to authenticate users on a home network without the need for any central authentication database. It is far from secure, but for the sole purpose of identifying the user it is good enough.


Install Squid3
First we need to update the package repositories from where we wish to install.
Simply run the following in the shell:
echo 'deb http://ftp.us.debian.org/debian/ squeeze main
deb-src http://ftp.us.debian.org/debian/ squeeze main
#deb http://ftp.us.debian.org/debian/ sid main
#deb http://ftp.us.debian.org/debian/ experimental main' > /etc/apt/sources.list
Then execute the next line to update the local cache and install Squid 3:
apt-get update && apt-get -y install squid3
This will download and install Squid3 and start it listening on port 3128.


Install fakeauth
Fakeauth is part of Squid3 but it is not bundled with the installable package. This forces us to build it from source.

Let's start by installing dependencies:
apt-get -y install gcc g++ make patch
Next we download and unpack the source for Squid 3.1.6:
cd ~/ && apt-get source squid3
tar xzf squid3_3.*.orig.tar.gz && cd squid-*/
Below we build fakeauth with a patch required for the MBL:
./configure
cd compat/ && make
cd ../lib && make
cd ../helpers/ntlm_auth/fakeauth/
echo '279a280,281
>     // Fix for platform
>     auth->flags = le32toh(auth->flags);
282c284
<     debug("ntlmDecodeAuth: usr o(%d) l(%d)\n", auth->user.offset, auth->user.len);
---
>     debug("ntlmDecodeAuth: usr o(%d) l(%d)\n", le32toh(auth->user.offset), le16toh(auth->user.len));
' | patch fakeauth_auth.c
make
This produces an executable called fakeauth_auth in the current directory, compatible with the MBL.
Copy this file to Squid's library path to keep it all in one place:
cp fakeauth_auth /usr/lib/squid3/
chmod 755 /usr/lib/squid3/fakeauth_auth

Configure fakeauth
Fakeauth is an NTLM authentication helper which we configure in /etc/squid3/squid.conf by adding the following:
auth_param ntlm program /usr/lib/squid3/fakeauth_auth -S
auth_param ntlm children 5
auth_param ntlm keep_alive on
This will enable the module but won't restrict any users from accessing the proxy.

At approximately line 760, add the following two lines to enable the fakeauth module
(note that the http_access deny rule needs to be above any other http_access allow rules to work properly):
acl dummyAuth proxy_auth REQUIRED
http_access deny !dummyAuth all
This basically tells Squid to only allow clients that support NTLM. This will only work for Windows users, so additional rules need to be added to allow other clients to bypass the authentication.

After restarting Squid (/etc/init.d/squid3 restart) the username should now appear in the logfile when using Squid as a proxy:
# tail /var/log/squid3/access.log
1221111156.178    170 10.0.0.9 TCP_MISS/204 351 GET http://www.google.com/csi? charlie DIRECT/74.125.237.112 image/gif





My Book Live not working with iTunes 10.5

I recently bought a My Book Live and wanted to use it as a streaming server for Apple appliances.
During the initial setup there was a prompt for new firmware, which I installed straight away. Then, after moving some music over to the shared drive, I tried to connect to it in iTunes, which obviously failed.

After searching around for a while, I found a lot of people experiencing similar issues, not only with the MBL but with a lot of other NAS devices as well.
Amongst the suggested resolutions were:
  1. Make sure "Media Serving" option on the share was set to "All"
  2. Under Settings > Media > iTunes, execute the "rescan"

None of them worked; iTunes just showed the spinning wheel as it tried to connect.

After spending a bit more time checking the debug logs from forked-daapd (the iTunes server) and the network traffic, everything looked fine; it just wouldn't work.

Since the firmware had already been updated, I didn't have much hope when visiting WD's support page to look for new firmware supporting iTunes 10.5. Surprisingly, there was a much newer version available. After installing it and rescanning the library, it worked perfectly. Well, for music at least.
As it turns out, iTunes 10.5 and the My Book Live currently only support streaming music. For videos, iTunes 10.2 must be used.


Resolution
Install latest firmware (MyBookLive 02.10.09-124 : Core F/W).

For instructions on how to update the firmware, please follow the instructions provided by WD: http://wdc.custhelp.com/app/answers/detail/a_id/5735


Note
Clearly Apple is causing all these issues for the sole purpose of saying, "You should be using our products!". Time will tell if it is right or wrong.