TestNG

Introduction

TestNG is a testing framework designed to simplify a broad range of testing needs, from unit testing (testing a class in isolation of the others) to integration testing (testing entire systems made of several classes, several packages and even several external frameworks, such as application servers).

Writing a test is typically a three-step process:

  • Write the business logic of your test and insert TestNG annotations in your code.
  • Add the information about your test (e.g. the class name, the groups you wish to run, etc...) in a file or in build.xml.
  • Run TestNG.
You can find a quick example on the Welcome page.

The concepts used in this documentation are as follows:

  • A suite is represented by one XML file. It can contain one or more tests and is defined by the <suite> tag.
  • A test is represented by <test> and can contain one or more TestNG classes.
  • A TestNG class is a Java class that contains at least one TestNG annotation. It is represented by the <class> tag and can contain one or more test methods.
  • A test method is a Java method annotated by @Test in your source.
A TestNG test can be configured by @BeforeXXX and @AfterXXX annotations, which allow you to perform some Java logic before and after a certain point, these points being either of the items listed above.

The rest of this manual will explain the following:

  • A list of all the annotations with a brief explanation. This will give you an idea of the various functionalities offered by TestNG but you will probably want to consult the section dedicated to each of these annotations to learn the details.
  • A description of the testng.xml file, its syntax and what you can specify in it.
  • A detailed list of the various features and how to use them with a combination of annotations and testng.xml.

Annotations

Here is a quick overview of the annotations available in TestNG along with their attributes.
Configuration information for a TestNG class:

@BeforeSuite: The annotated method will be run before all tests in this suite have run.
@AfterSuite: The annotated method will be run after all tests in this suite have run.
@BeforeTest: The annotated method will be run before any test method belonging to the classes inside the <test> tag is run.
@AfterTest: The annotated method will be run after all the test methods belonging to the classes inside the <test> tag have run.
@BeforeGroups: The list of groups that this configuration method will run before. This method is guaranteed to run shortly before the first test method that belongs to any of these groups is invoked.
@AfterGroups: The list of groups that this configuration method will run after. This method is guaranteed to run shortly after the last test method that belongs to any of these groups is invoked.
@BeforeClass: The annotated method will be run before the first test method in the current class is invoked.
@AfterClass: The annotated method will be run after all the test methods in the current class have been run.
@BeforeMethod: The annotated method will be run before each test method.
@AfterMethod: The annotated method will be run after each test method.
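
To make the life cycle concrete, here is a minimal sketch (the class and method names are illustrative, not part of TestNG) showing where a few of these configuration methods fit around a test method:

import org.testng.annotations.AfterClass;
import org.testng.annotations.AfterMethod;
import org.testng.annotations.BeforeClass;
import org.testng.annotations.BeforeMethod;
import org.testng.annotations.Test;

public class LifecycleSketch {

  @BeforeClass
  public void setUpClass() {
    // runs once, before the first @Test method of this class is invoked
  }

  @BeforeMethod
  public void setUpMethod() {
    // runs before each @Test method
  }

  @Test
  public void shouldDoSomething() {
    // the actual test
  }

  @AfterMethod
  public void tearDownMethod() {
    // runs after each @Test method
  }

  @AfterClass
  public void tearDownClass() {
    // runs once, after all @Test methods of this class have run
  }
}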

Behaviour of annotations in superclass of a TestNG class

The annotations above will also be honored (inherited) when placed on a superclass of a TestNG class. This is useful for example to centralize test setup for multiple test classes in a common superclass.

In that case, TestNG guarantees that the "@Before" methods are executed in inheritance order (highest superclass first, then going down the inheritance chain), and the "@After" methods in reverse order (going up the inheritance chain).
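
For example, with a hypothetical base/child pair like the one below, the execution order around each test method is baseSetUp(), childSetUp(), the @Test method, childTearDown(), baseTearDown():

import org.testng.annotations.AfterMethod;
import org.testng.annotations.BeforeMethod;
import org.testng.annotations.Test;

class BaseTest {
  @BeforeMethod
  public void baseSetUp() {
    // runs first: "@Before" methods execute in inheritance order
  }

  @AfterMethod
  public void baseTearDown() {
    // runs last: "@After" methods execute in reverse order
  }
}

public class ChildTest extends BaseTest {
  @BeforeMethod
  public void childSetUp() {
    // runs after baseSetUp()
  }

  @Test
  public void test() {
  }

  @AfterMethod
  public void childTearDown() {
    // runs before baseTearDown()
  }
}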

These configuration annotations accept the following attributes:

alwaysRun: For before methods (beforeSuite, beforeTest, beforeTestClass and beforeTestMethod, but not beforeGroups): if set to true, this configuration method will be run regardless of what groups it belongs to. For after methods (afterSuite, afterClass, ...): if set to true, this configuration method will be run even if one or more methods invoked previously failed or were skipped.
dependsOnGroups: The list of groups this method depends on.
dependsOnMethods: The list of methods this method depends on.
enabled: Whether methods on this class/method are enabled.
groups: The list of groups this class/method belongs to.
inheritGroups: If true, this method will belong to groups specified in the @Test annotation at the class level.
 
@DataProvider: Marks a method as supplying data for a test method. The annotated method must return an Object[][] where each Object[] can be assigned to the parameter list of the test method. The @Test method that wants to receive data from this data provider needs to use a dataProvider name equal to the name of this annotation.
name: The name of this data provider. If it's not supplied, the name of this data provider will automatically be set to the name of the method.
parallel: If set to true, tests generated using this data provider are run in parallel. Default value is false.
 
@Factory: Marks a method as a factory that returns objects that will be used by TestNG as Test classes. The method must return Object[].
 
@Listeners: Defines listeners on a test class.
value: An array of classes that extend org.testng.ITestNGListener.
 
@Parameters: Describes how to pass parameters to a @Test method.
value: The list of variables used to fill the parameters of this method.
 
@Test: Marks a class or a method as part of the test.
alwaysRun: If set to true, this test method will always be run even if it depends on a method that failed.
dataProvider: The name of the data provider for this test method.
dataProviderClass: The class where to look for the data provider. If not specified, the data provider will be looked up on the class of the current test method or one of its base classes. If this attribute is specified, the data provider method needs to be static on the specified class.
dependsOnGroups: The list of groups this method depends on.
dependsOnMethods: The list of methods this method depends on.
description: The description for this method.
enabled: Whether methods on this class/method are enabled.
expectedExceptions: The list of exceptions that a test method is expected to throw. If no exception is thrown, or one different from those on this list, this test will be marked as a failure.
groups: The list of groups this class/method belongs to.
invocationCount: The number of times this method should be invoked.
invocationTimeOut: The maximum number of milliseconds this test should take for the cumulated time of all its invocations. This attribute will be ignored if invocationCount is not specified.
priority: The priority for this test method. Lower priorities will be scheduled first.
successPercentage: The percentage of success expected from this method.
singleThreaded: If set to true, all the methods on this test class are guaranteed to run in the same thread, even if the tests are currently being run with parallel="methods". This attribute can only be used at the class level and it will be ignored if used at the method level. Note: this attribute used to be called sequential (now deprecated).
timeOut: The maximum number of milliseconds this test should take.
threadPoolSize: The size of the thread pool for this method. The method will be invoked from multiple threads as specified by invocationCount.
Note: this attribute is ignored if invocationCount is not specified.

testng.xml

You can invoke TestNG in several different ways:

  • With a testng.xml file
  • With ant
  • From the command line

This section describes the format of testng.xml (you will find documentation on ant and the command line below).

The current DTD for testng.xml can be found on the main Web site: testng-1.0.dtd (for your convenience, you might prefer to browse the HTML version).

Here is an example file:

testng.xml

<!DOCTYPE suite SYSTEM "http://testng.org/testng-1.0.dtd" >

<suite name="Suite1" verbose="1" >
  <test name="Nopackage" >
    <classes>
      <class name="NoPackageTest" />
    </classes>
  </test>

  <test name="Regression1">
    <classes>
      <class name="test.sample.ParameterSample"/>
      <class name="test.sample.ParameterTest"/>
    </classes>
  </test>
</suite>

You can specify package names instead of class names:

testng.xml

<!DOCTYPE suite SYSTEM "http://testng.org/testng-1.0.dtd" >

<suite name="Suite1" verbose="1" >
  <test name="Regression1" >
    <packages>
      <package name="test.sample" />
    </packages>
  </test>
</suite>

In this example, TestNG will look at all the classes in the package test.sample and will retain only classes that have TestNG annotations.

You can also specify groups and methods to be included and excluded:

testng.xml

<test name="Regression1"> <groups> <run> <exclude name="brokenTests" /> <include name="checkinTests" /> </run> </groups> <classes> <class name="test.IndividualMethodsTest"> <methods> <include name="testMethod" /> </methods> </class> </classes> </test>

You can also define new groups inside testng.xml and specify additional details in attributes, such as whether to run the tests in parallel, how many threads to use, whether you are running JUnit tests, etc...

By default, TestNG will run your tests in the order they are found in the XML file. If you want the classes and methods listed in this file to be run in an unpredictable order, set the preserve-order attribute to false:

testng.xml

<test name="Regression1" preserve-order="false"> <classes> <class name="test.Test1"> <methods> <include name="m1" /> <include name="m2" /> </methods> </class> <class name="test.Test2" /> </classes> </test>

Please see the DTD for a complete list of the features, or read on.

Running TestNG

TestNG can be invoked in different ways: from the command line, with ant, or from your IDE. This section only explains how to invoke TestNG from the command line. Please refer to the relevant documentation if you are interested in one of the other ways.

Assuming that you have TestNG in your class path, the simplest way to invoke TestNG is as follows:

java org.testng.TestNG testng1.xml [testng2.xml testng3.xml ...]

You need to specify at least one XML file describing the TestNG suite you are trying to run. Additionally, the following command-line switches are available:
  • -configfailurepolicy (skip|continue): Whether TestNG should continue to execute the remaining tests in the suite or skip them if an @Before* method fails. The default behavior is skip.
  • -d (a directory): The directory where the reports will be generated (defaults to test-output).
  • -dataproviderthreadcount (the default number of threads to use for data providers when running tests in parallel): This sets the default maximum number of threads to use for data providers when running tests in parallel. It will only take effect if the parallel mode has been selected (for example, with the -parallel option). This can be overridden in the suite definition.
  • -excludegroups (a comma-separated list of groups): The list of groups you want to be excluded from this run.
  • -groups (a comma-separated list of groups): The list of groups you want to run (e.g. "windows,linux,regression").
  • -listener (a comma-separated list of Java classes that can be found on your classpath): Lets you specify your own test listeners. The classes need to implement org.testng.ITestListener.
  • -methods (a comma-separated list of fully qualified class names and methods, e.g. com.example.Foo.f1,com.example.Bar.f2): Lets you specify individual methods to run.
  • -methodselectors (a comma-separated list of Java classes and method priorities that define method selectors): Lets you specify method selectors on the command line (for example: com.example.Selector1:3,com.example.Selector2:2).
  • -parallel (methods|tests|classes): If specified, sets the default mechanism used to determine how to use parallel threads when running tests. If not set, the default mechanism is not to use parallel threads at all. This can be overridden in the suite definition.
  • -reporter (the extended configuration for a custom report listener): Similar to the -listener option, except that it allows the configuration of JavaBeans-style properties on the reporter instance. Example: -reporter com.test.MyReporter:methodFilter=*insert*,enableFiltering=true. You can have as many occurrences of this option as you need, one for each reporter that needs to be added.
  • -sourcedir (a semicolon-separated list of directories): The directories where your javadoc-annotated test sources are. This option is only necessary if you are using javadoc type annotations.
  • -suitename (the default name to use for a test suite): Specifies the suite name for a test suite defined on the command line. This option is ignored if the suite.xml file or the source code specifies a different suite name. It is possible to create a suite name with spaces in it if you surround it with double-quotes "like this".
  • -testclass (a comma-separated list of classes that can be found in your classpath): A list of class files separated by commas (e.g. org.foo.Test1,org.foo.Test2).
  • -testjar (a jar file): Specifies a jar file that contains test classes. If a testng.xml file is found at the root of that jar file, it will be used, otherwise all the test classes found in this jar file will be considered test classes.
  • -testname (the default name to use for a test): Specifies the name for a test defined on the command line. This option is ignored if the suite.xml file or the source code specifies a different test name. It is possible to create a test name with spaces in it if you surround it with double-quotes "like this".
  • -testnames (a comma-separated list of test names): Only tests defined in a <test> tag matching one of these names will be run.
  • -testrunfactory (a Java class that can be found on your classpath): Lets you specify your own test runners. The class needs to implement org.testng.ITestRunnerFactory.
  • -threadcount (the default number of threads to use when running tests in parallel): This sets the default maximum number of threads to use for running tests in parallel. It will only take effect if the parallel mode has been selected (for example, with the -parallel option). This can be overridden in the suite definition.
  • -xmlpathinjar (the path of the XML file inside the jar file): This attribute should contain the path to a valid XML file inside the test jar (e.g. "resources/testng.xml"). The default is "testng.xml", which means a file called "testng.xml" at the root of the jar file. This option will be ignored unless -testjar is specified.

This documentation can be obtained by invoking TestNG without any arguments.

You can also put the command line switches in a text file, say c:\command.txt, and tell TestNG to use that file to retrieve its parameters:

C:> more c:\command.txt
-d test-output testng.xml
C:> java org.testng.TestNG @c:\command.txt

Additionally, TestNG can be passed properties on the command line of the Java Virtual Machine, for example

java -Dtestng.test.classpath="c:/build;c:/java/classes;" org.testng.TestNG testng.xml

Here are the properties that TestNG understands:
testng.test.classpath (a semicolon-separated series of directories that contain your test classes): If this property is set, TestNG will use it to look for your test classes instead of the class path. This is convenient if you are using the <packages> tag in your XML file and you have a lot of classes in your classpath, most of them not being test classes.

Example:

java org.testng.TestNG -groups windows,linux -testclass org.test.MyTest

The ant task and testng.xml allow you to launch TestNG with more parameters (methods to include, specifying parameters, etc...), so you should consider using the command line only when you are trying to learn about TestNG and you want to get up and running quickly.

Important: The command line flags that specify what tests should be run will be ignored if you also specify a testng.xml file, with the exception of -groups and -excludegroups, which will override all the group inclusions/exclusions found in testng.xml.

Test methods, Test classes and Test groups

Test methods

Test methods are annotated with @Test. Methods annotated with @Test that happen to return a value will be ignored, unless you set allow-return-values to true in your testng.xml:

<suite allow-return-values="true">

or

<test allow-return-values="true">

Test groups

TestNG allows you to perform sophisticated groupings of test methods. Not only can you declare that methods belong to groups, but you can also specify groups that contain other groups. Then TestNG can be invoked and asked to include a certain set of groups (or regular expressions) while excluding another set.  This gives you maximum flexibility in how you partition your tests and doesn't require you to recompile anything if you want to run two different sets of tests back to back.

Groups are specified in your testng.xml file and can be found either under the <test> or <suite> tag. Groups specified in the <suite> tag apply to all the <test> tags underneath. Note that groups are accumulative in these tags: if you specify group "a" in <suite> and "b" in <test>, then both "a" and "b" will be included.

For example, it is quite common to have at least two categories of tests

  • Check-in tests.  These tests should be run before you submit new code.  They should typically be fast and just make sure no basic functionality was broken.
     
  • Functional tests.  These tests should cover all the functionalities of your software and be run at least once a day, although ideally you would want to run them continuously.
Typically, check-in tests are a subset of functional tests.  TestNG allows you to specify this in a very intuitive way with test groups.  For example, you could structure your test by saying that your entire test class belongs to the "functest" group, and additionally that a couple of methods belong to the group "checkintest":

Test1.java

public class Test1 {
  @Test(groups = { "functest", "checkintest" })
  public void testMethod1() {
  }

  @Test(groups = {"functest", "checkintest"} )
  public void testMethod2() {
  }

  @Test(groups = { "functest" })
  public void testMethod3() {
  }
}

Invoking TestNG with

testng.xml

<test name="Test1"> <groups> <run> <include name="functest"/> </run> </groups> <classes> <class name="example1.Test1"/> </classes> </test>

will run all the test methods in that class, while invoking it with checkintest instead will only run testMethod1() and testMethod2().

Here is another example, using regular expressions this time.  Assume that some of your test methods should not be run on Linux, your test would look like:

Test1.java

@Test
public class Test1 {
  @Test(groups = { "windows.checkintest" })
  public void testWindowsOnly() {
  }

  @Test(groups = {"linux.checkintest"} )
  public void testLinuxOnly() {
  }

  @Test(groups = { "windows.functest" })
  public void testWindowsToo() {
  }
}

You could use the following testng.xml to launch only the Windows methods:

testng.xml

<test name="Test1"> <groups> <run> <include name="windows.*"/> </run> </groups> <classes> <class name="example1.Test1"/> </classes> </test>

Note: TestNG uses regular expressions, and not wildcards. Be aware of the difference (for example, "anything" is matched by ".*" -- dot star -- and not by "*").

Method groups

You can also exclude or include individual methods:

testng.xml

<test name="Test1"> <classes> <class name="example1.Test1"> <methods> <include name=".*enabledTestMethod.*"/> <exclude name=".*brokenTestMethod.*"/> </methods> </class> </classes> </test> This can come in handy to deactivate a single method without having to recompile anything, but I don't recommend using this technique too much since it makes your testing framework likely to break if you start refactoring your Java code (the regular expressions used in the tags might not match your methods any more).

Groups of groups

Groups can also include other groups. These groups are called "MetaGroups".  For example, you might want to define a group "all" that includes "checkintest" and "functest".  "functest" itself will contain the groups "windows" and "linux" while "checkintest" will only contain "windows".  Here is how you would define this in your testng.xml file:

testng.xml

<test name="Regression1"> <groups> <define name="functest"> <include name="windows"/> <include name="linux"/> </define> <define name="all"> <include name="functest"/> <include name="checkintest"/> </define> <run> <include name="all"/> </run> </groups> <classes> <class name="test.sample.Test1"/> </classes> </test>

Exclusion groups

TestNG allows you to include groups as well as exclude them.

For example, it is quite usual to have tests that temporarily break because of a recent change, and you don't have time to fix the breakage yet.  However, you do want to have clean runs of your functional tests, so you need to deactivate these tests but keep in mind they will need to be reactivated.

A simple way to solve this problem is to create a group called "broken" and make these test methods belong to it.  For example, in the above example, I know that testMethod2() is now broken so I want to disable it:

Java

@Test(groups = {"checkintest", "broken"} ) public void testMethod2() { } All I need to do now is to exclude this group from the run:

testng.xml

<test name="Simple example"> <groups> <run> <include name="checkintest"/> <exclude name="broken"/> </run> </groups> <classes> <class name="example1.Test1"/> </classes> </test>

This way, I will get a clean test run while keeping track of what tests are broken and need to be fixed later.

Note:  you can also disable tests on an individual basis by using the "enabled" property available on both @Test and @Before/After annotations.
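
For example, a single method can be switched off directly in the annotation (a minimal sketch with a hypothetical method name):

@Test(enabled = false)
public void brokenAtTheMoment() {
  // this method will not be run until enabled is set back to true
}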

Partial groups

You can define groups at the class level and then add groups at the method level:

All.java

@Test(groups = { "checkin-test" }) public class All { @Test(groups = { "func-test" ) public void method1() { ... } public void method2() { ... } } In this class, method2() is part of the group "checkin-test", which is defined at the class level, while method1() belongs to both "checkin-test" and "func-test".

Parameters

Test methods don't have to be parameterless.  You can use an arbitrary number of parameters on each of your test methods, and you instruct TestNG to pass you the correct parameters with the @Parameters annotation.

There are two ways to set these parameters: with testng.xml or programmatically.

Parameters from testng.xml
If you are using simple values for your parameters, you can specify them in your testng.xml:

Java

@Parameters({ "first-name" }) @Test public void testSingleString(String firstName) { System.out.println("Invoked testString " + firstName); assert "Cedric".equals(firstName); } In this code, we specify that the parameter of your Java method should receive the value of the XML parameter called .  This XML parameter is defined in :

testng.xml

<suite name="My suite"> <parameter name="first-name" value="Cedric"/> <test name="Simple example"> <-- ... -->

The same technique can be used for @Before/@After and @Factory annotations:

@Parameters({ "datasource", "jdbcDriver" }) @BeforeMethod public void beforeTest(String ds, String driver) { m_dataSource = ...; // look up the value of datasource m_jdbcDriver = driver; } This time, the two Java parameter ds and driver will receive the value given to the properties and respectively. 

Parameters can be declared optional with the @Optional annotation:

@Parameters("db") @Test public void testNonExistentParameter(@Optional("mysql") String db) { ... } If no parameter named "db" is found in your file, your test method will receive the default value specified inside the annotation: "mysql".

The @Parameters annotation can be placed at the following locations:

  • On any method that already has a @Test, @Before/@After or @Factory annotation.
  • On at most one constructor of your test class.  In this case, TestNG will invoke this particular constructor with the parameters initialized to the values specified in testng.xml whenever it needs to instantiate your test class.  This feature can be used to initialize fields inside your classes to values that will then be used by your test methods.

Notes:

  • The XML parameters are mapped to the Java parameters in the same order as they are found in the annotation, and TestNG will issue an error if the numbers don't match.
  • Parameters are scoped. In testng.xml, you can declare them either under a <suite> tag or under <test>. If two parameters have the same name, it's the one defined in <test> that has precedence. This is convenient if you need to specify a parameter applicable to all your tests and override its value only for certain tests.
Parameters with DataProviders

Specifying parameters in testng.xml might not be sufficient if you need to pass complex parameters, or parameters that need to be created from Java (complex objects, objects read from a property file or a database, etc...). In this case, you can use a Data Provider to supply the values you need to test.  A Data Provider is a method on your class that returns an array of array of objects.  This method is annotated with @DataProvider:

Java

//This method will provide data to any test method that declares that its
//Data Provider is named "test1"
@DataProvider(name = "test1")
public Object[][] createData1() {
  return new Object[][] {
    { "Cedric", new Integer(36) },
    { "Anne", new Integer(37)},
  };
}

//This test method declares that its data should be supplied by the Data Provider
//named "test1"
@Test(dataProvider = "test1")
public void verifyData1(String n1, Integer n2) {
  System.out.println(n1 + " " + n2);
}

will print

Cedric 36
Anne 37

A @Test method specifies its Data Provider with the dataProvider attribute.  This name must correspond to a method on the same class annotated with @DataProvider with a matching name.

By default, the data provider will be looked for in the current test class or one of its base classes. If you want to put your data provider in a different class, it needs to be a static method or a class with a no-arg constructor, and you specify the class where it can be found in the dataProviderClass attribute:

StaticProvider.java

public class StaticProvider {
  @DataProvider(name = "create")
  public static Object[][] createData() {
    return new Object[][] {
      new Object[] { new Integer(42) }
    };
  }
}

public class MyTest {
  @Test(dataProvider = "create", dataProviderClass = StaticProvider.class)
  public void test(Integer n) {
    // ...
  }
}

The data provider supports injection too. TestNG will use the test context for the injection. The Data Provider method can return one of the following two types:
  • An array of array of objects (Object[][]) where the first dimension's size is the number of times the test method will be invoked and the second dimension contains an array of objects that must be compatible with the parameter types of the test method. This is the case illustrated by the example above.
  • An Iterator<Object[]>. The only difference with Object[][] is that an Iterator lets you create your test data lazily. TestNG will invoke the iterator and then the test method with the parameters returned by this iterator one by one. This is particularly useful if you have a lot of parameter sets to pass to the method and you don't want to create all of them upfront.

Here is an example of this feature:

@DataProvider(name = "test1")
public Iterator<Object[]> createData() {
  return new MyIterator(DATA);
}

If you declare your @DataProvider as taking a java.lang.reflect.Method as first parameter, TestNG will pass the current test method for this first parameter. This is particularly useful when several test methods use the same @DataProvider and you want it to return different values depending on which test method it is supplying data for.

For example, the following code prints the name of the test method inside its @DataProvider:

@DataProvider(name = "dp") public Object[][] createData(Method m) { System.out.println(m.getName()); // print test method name return new Object[][] { new Object[] { "Cedric" }}; } @Test(dataProvider = "dp") public void test1(String s) { } @Test(dataProvider = "dp") public void test2(String s) { } and will therefore display: test1 test2 Data providers can run in parallel with the attribute : @DataProvider(parallel = true) // ... Parallel data providers running from an XML file share the same pool of threads, which has a size of 10 by default. You can modify this value in the tag of your XML file: <suite name="Suite1" data-provider-thread-count="20" > ... If you want to run a few specific data providers in a different thread pool, you need to run them from a different XML file.
Parameters in reports

Parameters used to invoke your test methods are shown in the HTML reports generated by TestNG.

Dependencies

Sometimes, you need your test methods to be invoked in a certain order.  Here are a few examples:

  • To make sure a certain number of test methods have completed and succeeded before running more test methods.
  • To initialize your tests while wanting these initialization methods to be test methods as well (methods tagged with @Before/@After will not be part of the final report).
TestNG allows you to specify dependencies either with annotations or in XML.
Dependencies with annotations

You can use the attributes dependsOnMethods or dependsOnGroups, found on the @Test annotation.

There are two kinds of dependencies:
  • Hard dependencies. All the methods you depend on must have run and succeeded for you to run. If at least one failure occurred in your dependencies, you will not be invoked and will be marked as a SKIP in the report.
  • Soft dependencies. You will always be run after the methods you depend on, even if some of them have failed. This is useful when you just want to make sure that your test methods are run in a certain order but their success doesn't really depend on the success of others. A soft dependency is obtained by adding "alwaysRun = true" in your @Test annotation.
Here is an example of a hard dependency:

@Test
public void serverStartedOk() {}

@Test(dependsOnMethods = { "serverStartedOk" })
public void method1() {}

In this example, method1() is declared as depending on method serverStartedOk(), which guarantees that serverStartedOk() will always be invoked first.
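
And here is what the corresponding soft dependency could look like (a minimal sketch reusing the same method names): method1() will still run after serverStartedOk(), even if serverStartedOk() failed.

@Test
public void serverStartedOk() {}

@Test(dependsOnMethods = { "serverStartedOk" }, alwaysRun = true)
public void method1() {}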

You can also have methods that depend on entire groups:

@Test(groups = { "init" }) public void serverStartedOk() {} @Test(groups = { "init" }) public void initEnvironment() {} @Test(dependsOnGroups = { "init.*" }) public void method1() {}

In this example, method1() is declared as depending on any group matching the regular expression "init.*", which guarantees that the methods serverStartedOk() and initEnvironment() will always be invoked before method1().

Note:  as stated before, the order of invocation for methods that belong in the same group is not guaranteed to be the same across test runs.

If a method depended upon fails and you have a hard dependency on it (alwaysRun = false, which is the default), the methods that depend on it are not marked as FAILED but as SKIPPED.  Skipped methods will be reported as such in the final report (in a color that is neither red nor green in HTML), which is important since skipped methods are not necessarily failures.

Both dependsOnGroups and dependsOnMethods accept regular expressions as parameters.  For dependsOnMethods, if you are depending on a method which happens to have several overloaded versions, all the overloaded methods will be invoked.  If you only want to invoke one of the overloaded methods, you should use dependsOnGroups.

For a more advanced example of dependent methods, please refer to this article, which uses inheritance to provide an elegant solution to the problem of multiple dependencies.

By default, dependent methods are grouped by class. For example, if method b() depends on method a() and you have several instances of the class that contains these methods (because of a factory or a data provider), then the invocation order will be as follows:

a(1) a(2) b(1) b(2)

TestNG will not run b() until all the instances have invoked their a() method.

This behavior might not be desirable in certain scenarios, such as for example testing a sign in and sign out of a web browser for various countries. In such a case, you would like the following ordering:

signIn("us") signOut("us") signIn("uk") signOut("uk") For this ordering, you can use the XML attribute . This attribute is valid either on <suite> or <test>: <suite name="Factory" group-by-instances="true"> or <test name="Factory" group-by-instances="true">
Dependencies in XML
Alternatively, you can specify your group dependencies in the testng.xml file. You use the <dependencies> tag to achieve this:

<test name="My suite">
  <groups>
    <dependencies>
      <group name="c" depends-on="a  b" />
      <group name="z" depends-on="c" />
    </dependencies>
  </groups>
</test>

The depends-on attribute contains a space-separated list of groups.

Factories

Factories allow you to create tests dynamically. For example, imagine you want to create a test method that will access a page on a Web site several times, and you want to invoke it with different values:

TestWebServer.java

public class TestWebServer {
  @Test(parameters = { "number-of-times" })
  public void accessPage(int numberOfTimes) {
    while (numberOfTimes-- > 0) {
      // access the web page
    }
  }
}

testng.xml

<test name="T1">   <parameter name="number-of-times" value="10"/>   <class name= "TestWebServer" /> </test> <test name="T2">   <parameter name="number-of-times" value="20"/>   <class name= "TestWebServer"/> </test> <test name="T3">   <parameter name="number-of-times" value="30"/>   <class name= "TestWebServer"/> </test> This can become quickly impossible to manage, so instead, you should use a factory:

WebTestFactory.java

public class WebTestFactory {
  @Factory
  public Object[] createInstances() {
    Object[] result = new Object[10];
    for (int i = 0; i < 10; i++) {
      result[i] = new WebTest(i * 10);
    }
    return result;
  }
}

and the new test class is now:

WebTest.java

public class WebTest {
  private int m_numberOfTimes;

  public WebTest(int numberOfTimes) {
    m_numberOfTimes = numberOfTimes;
  }

  @Test
  public void testServer() {
    for (int i = 0; i < m_numberOfTimes; i++) {
      // access the web page
    }
  }
}

Your testng.xml only needs to reference the class that contains the factory method, since the test instances themselves will be created at runtime:

<class name="WebTestFactory" />

Or, if building a test suite instance programmatically, you can add the factory in the same manner as for tests:

TestNG testNG = new TestNG();
testNG.setTestClasses(new Class[] { WebTestFactory.class });
testNG.run();

The factory method can receive parameters just like @Test and @Before/@After methods, and it must return Object[].  The objects returned can be of any class (not necessarily the same class as the factory class) and they don't even need to contain TestNG annotations (in which case they will be ignored by TestNG).

Factories can also be used with data providers, and you can leverage this functionality by putting the @Factory annotation either on a regular method or on a constructor. Here is an example of a constructor factory:

@Factory(dataProvider = "dp") public FactoryDataProviderSampleTest(int n) { super(n); } @DataProvider static public Object[][] dp() { return new Object[][] { new Object[] { 41 }, new Object[] { 42 }, }; } The example will make TestNG create two test classes, on with the constructor invoked with the value 41 and the other with 42.

Class level annotations

The @Test annotation can be put on a class instead of a test method:

Test1.java

@Test
public class Test1 {
  public void test1() {
  }

  public void test2() {
  }
}

The effect of a class level annotation is to make all the public methods of this class become test methods, even if they are not annotated. You can still repeat the @Test annotation on a method if you want to add certain attributes.

For example:

Test1.java

@Test
public class Test1 {
  public void test1() {
  }

  @Test(groups = "g1")
  public void test2() {
  }
}

will make both test1() and test2() test methods, but on top of that, test2() now belongs to the group "g1".

Ignoring tests

TestNG lets you ignore all the methods:
  • In a class (or)
  • In a particular package (or)
  • In a package and all of its child packages
using the @Ignore annotation.

When used at the method level, @Ignore is functionally equivalent to @Test(enabled = false). Here's a sample that shows how to ignore all tests within a class.

TestcaseSample.java

import org.testng.annotations.Ignore;
import org.testng.annotations.Test;

@Ignore
public class TestcaseSample {

  @Test
  public void testMethod1() {
  }

  @Test
  public void testMethod2() {
  }
}

The @Ignore annotation has a higher priority than individual method annotations. When @Ignore is placed on a class, all the tests in that class will be disabled. To ignore all tests in a particular package, you just need to create a package-info.java file and add the @Ignore annotation to it. Here's a sample:

package-info.java

@Ignore
package com.testng.master;

import org.testng.annotations.Ignore;

This causes all the methods to be ignored in the package com.testng.master and all of its sub-packages.

Parallelism and time-outs

You can instruct TestNG to run your tests in separate threads in various ways.
Parallel suites
This is useful if you are running several suite files (e.g. several testng.xml files) and you want each of these suites to be run in a separate thread. You can use the following command line flag to specify the size of a thread pool:

java org.testng.TestNG -suitethreadpoolsize 3 testng1.xml testng2.xml testng3.xml

The corresponding ant task attribute is suitethreadpoolsize.
Parallel tests, classes and methods
The parallel attribute on the <suite> tag can take one of the following values:

<suite name="My suite" parallel="methods" thread-count="5">
<suite name="My suite" parallel="tests" thread-count="5">
<suite name="My suite" parallel="classes" thread-count="5">
<suite name="My suite" parallel="instances" thread-count="5">

  • parallel="methods": TestNG will run all your test methods in separate threads. Dependent methods will also run in separate threads but they will respect the order that you specified.

  • parallel="tests": TestNG will run all the methods in the same <test> tag in the same thread, but each <test> tag will be in a separate thread. This allows you to group all your classes that are not thread safe in the same <test> and guarantee they will all run in the same thread while taking advantage of TestNG using as many threads as possible to run your tests.

  • parallel="classes": TestNG will run all the methods in the same class in the same thread, but each class will be run in a separate thread.

  • parallel="instances": TestNG will run all the methods in the same instance in the same thread, but two methods on two different instances will be running in different threads.

Additionally, the attribute thread-count allows you to specify how many threads should be allocated for this execution.

Note: the timeOut attribute works in both parallel and non-parallel mode.

You can also specify that a method should be invoked from different threads. You can use the threadPoolSize attribute to achieve this result:

@Test(threadPoolSize = 3, invocationCount = 10, timeOut = 10000)
public void testServer() {
  ...
}

In this example, testServer() will be invoked ten times from three different threads. Additionally, a time-out of ten seconds guarantees that none of the threads will block on this thread forever.

Rerunning failed tests

Every time tests fail in a suite, TestNG creates a file called testng-failed.xml in the output directory. This XML file contains the necessary information to rerun only those methods that failed, allowing you to quickly reproduce the failures without having to run the entirety of your tests.  Therefore, a typical session would look like this:

java -classpath testng.jar;%CLASSPATH% org.testng.TestNG -d test-outputs testng.xml
java -classpath testng.jar;%CLASSPATH% org.testng.TestNG -d test-outputs test-outputs\testng-failed.xml

Note that testng-failed.xml will contain all the necessary dependent methods so that you are guaranteed to run the methods that failed without any SKIP failures.

JUnit tests

TestNG can run JUnit 3 and JUnit 4 tests.  All you need to do is put the JUnit jar file on the classpath, specify your JUnit test classes inside the <classes> tag and set the junit attribute to true:

testng.xml

<test name="Test1" junit="true"> <classes> <!-- ... -->

The behavior of TestNG in this case is similar to JUnit depending on the JUnit version found on the class path:

  • JUnit 3:
    • All methods starting with test* in your classes will be run
    • If there is a method setUp() on your test class, it will be invoked before every test method
    • If there is a method tearDown() on your test class, it will be invoked after every test method
    • If your test class contains a method suite(), all the tests returned by this method will be invoked
  • JUnit 4:
    • TestNG will use the JUnitCore runner to run your tests

Running TestNG programmatically

You can invoke TestNG from your own programs very easily:

Java

TestListenerAdapter tla = new TestListenerAdapter();
TestNG testng = new TestNG();
testng.setTestClasses(new Class[] { Run2.class });
testng.addListener(tla);
testng.run();

This example creates a TestNG object and runs the test class Run2. It also adds a TestListenerAdapter. You can either use this adapter class or implement org.testng.ITestListener yourself. This interface contains various callback methods that let you keep track of when a test starts, succeeds, fails, etc...

Similarly, you can invoke TestNG on a testng.xml file or you can create a virtual testng.xml file yourself. In order to do this, you can use the classes found in the package org.testng.xml: XmlSuite, XmlTest, XmlClass, etc... Each of these classes corresponds to its XML tag counterpart.

For example, suppose you want to create the following virtual file:

<suite name="TmpSuite" > <test name="TmpTest" > <classes> <class name="test.failures.Child" /> <classes> </test> </suite> You would use the following code: XmlSuite suite = new XmlSuite(); suite.setName("TmpSuite"); XmlTest test = new XmlTest(suite); test.setName("TmpTest"); List<XmlClass> classes = new ArrayList<XmlClass>(); classes.add(new XmlClass("test.failures.Child")); test.setXmlClasses(classes) ; And then you can pass this to TestNG: List<XmlSuite> suites = new ArrayList<XmlSuite>(); suites.add(suite); TestNG tng = new TestNG(); tng.setXmlSuites(suites); tng.run();

Please see the JavaDocs for the entire API.

BeanShell and advanced group selection

If the <include> and <exclude> tags in testng.xml are not enough for your needs, you can use a BeanShell expression to decide whether a certain test method should be included in a test run or not. You specify this expression just under the <test> tag:

testng.xml

<test name="BeanShell test"> <method-selectors> <method-selector> <script language="beanshell"><![CDATA[ groups.containsKey("test1") ]]></script> </method-selector> </method-selectors> <!-- ... --> When a tag is found in , TestNG will ignore subsequent and of groups and methods in the current tag:  your BeanShell expression will be the only way to decide whether a test method is included or not.

Here is additional information on the BeanShell script:

  • It must return a boolean value.  Except for this constraint, any valid BeanShell code is allowed (for example, you might want to return true during week days and false during weekends, which would allow you to run tests differently depending on the date).
     
  • TestNG defines the following variables for your convenience:
      java.lang.reflect.Method method:  the current test method.
      org.testng.ITestNGMethod testngMethod:  the description of the current test method.
      java.util.Map<String, String> groups:  a map of the groups the current test method belongs to.

  • You might want to surround your expression with a CDATA declaration (as shown above) to avoid tedious quoting of reserved XML characters.
     

Annotation Transformers

TestNG allows you to modify the content of all the annotations at runtime. This is especially useful if the annotations in the source code are right most of the time, but there are a few situations where you'd like to override their value.

In order to achieve this, you need to use an Annotation Transformer.

An Annotation Transformer is a class that implements the following interface:

public interface IAnnotationTransformer {

  /**
   * This method will be invoked by TestNG to give you a chance
   * to modify a TestNG annotation read from your test classes.
   * You can change the values you need by calling any of the
   * setters on the ITestAnnotation interface.
   *
   * Note that only one of the three parameters testClass,
   * testConstructor and testMethod will be non-null.
   *
   * @param annotation The annotation that was read from your
   * test class.
   * @param testClass If the annotation was found on a class, this
   * parameter represents this class (null otherwise).
   * @param testConstructor If the annotation was found on a constructor,
   * this parameter represents this constructor (null otherwise).
   * @param testMethod If the annotation was found on a method,
   * this parameter represents this method (null otherwise).
   */
  public void transform(ITestAnnotation annotation, Class testClass,
      Constructor testConstructor, Method testMethod);
}

Like all the other TestNG listeners, you can specify this class either on the command line or with ant:

java org.testng.TestNG -listener MyTransformer testng.xml

or programmatically:

TestNG tng = new TestNG();
tng.setAnnotationTransformer(new MyTransformer());
// ...

When the transform() method is invoked, you can call any of the setters on the annotation parameter to alter its value before TestNG proceeds further.

For example, here is how you would override the invocationCount attribute, but only on the test method invoke() of one of your test classes:

public class MyTransformer implements IAnnotationTransformer {
  public void transform(ITestAnnotation annotation, Class testClass,
      Constructor testConstructor, Method testMethod) {
    if ("invoke".equals(testMethod.getName())) {
      annotation.setInvocationCount(5);
    }
  }
}

IAnnotationTransformer only lets you modify a @Test annotation. If you need to modify another TestNG annotation (a configuration annotation, @Factory or @DataProvider), use an IAnnotationTransformer2.

Method Interceptors

Once TestNG has calculated in what order the test methods will be invoked, these methods are split in two groups:
  • Methods run sequentially. These are all the test methods that have dependencies or dependents. These methods will be run in a specific order.
  • Methods run in no particular order. These are all the methods that don't belong in the first category. The order in which these test methods are run is random and can vary from one run to the next (although by default, TestNG will try to group test methods by class).
In order to give you more control over the methods that belong to the second category, TestNG defines the following interface:

public interface IMethodInterceptor {
  List<IMethodInstance> intercept(List<IMethodInstance> methods, ITestContext context);
}

The list of methods passed in parameter contains all the methods that can be run in any order. Your intercept method is expected to return a similar list of IMethodInstance, which can be either of the following:
  • The same list you received in parameter but in a different order.
  • A smaller list of objects.
  • A bigger list of objects.
Once you have defined your interceptor, you pass it to TestNG as a listener. For example:

Shell

java -classpath "testng-jdk15.jar:test/build" org.testng.TestNG -listener test.methodinterceptors.NullMethodInterceptor -testclass test.methodinterceptors.FooTest For the equivalent syntax, see the attribute in the ant documentation.

For example, here is a Method Interceptor that will reorder the methods so that test methods that belong to the group "fast" are always run first:

public List<IMethodInstance> intercept(List<IMethodInstance> methods, ITestContext context) {
  List<IMethodInstance> result = new ArrayList<IMethodInstance>();
  for (IMethodInstance m : methods) {
    Test test = m.getMethod().getConstructorOrMethod().getMethod().getAnnotation(Test.class);
    Set<String> groups = new HashSet<String>();
    for (String group : test.groups()) {
      groups.add(group);
    }
    if (groups.contains("fast")) {
      result.add(0, m);
    } else {
      result.add(m);
    }
  }
  return result;
}

TestNG Listeners

There are several interfaces that allow you to modify TestNG's behavior. These interfaces are broadly called "TestNG Listeners". Here are a few listeners: IAnnotationTransformer, IAnnotationTransformer2, IHookable, IInvokedMethodListener, IMethodInterceptor, IReporter, ISuiteListener and ITestListener. When you implement one of these interfaces, you can let TestNG know about it in either of the following ways:
Specifying listeners with testng.xml or in Java
Here is how you can define listeners in your testng.xml file:

<suite>
  <listeners>
    <listener class-name="com.example.MyListener" />
    <listener class-name="com.example.MyMethodInterceptor" />
  </listeners>
...

Or if you prefer to define these listeners in Java:

@Listeners({ com.example.MyListener.class, com.example.MyMethodInterceptor.class })
public class MyTest {
  // ...
}

The @Listeners annotation can contain any class that extends org.testng.ITestNGListener except IAnnotationTransformer and IAnnotationTransformer2. The reason is that these listeners need to be known very early in the process so that TestNG can use them to rewrite your annotations, therefore you need to specify these listeners in your testng.xml file.

Note that the @Listeners annotation will apply to your entire suite file, just as if you had specified it in a testng.xml file. If you want to restrict its scope (for example, only running on the current class), the code in your listener could first check the test method that's about to run and decide what to do then. Here's how it can be done.

  1. First define a new custom annotation that can be used to specify this restriction:

     @Retention(RetentionPolicy.RUNTIME)
     @Target ({ElementType.TYPE})
     public @interface DisableListener {}

  2. Add an edit check as below within your regular listeners:

     public void beforeInvocation(IInvokedMethod iInvokedMethod, ITestResult iTestResult) {
       ConstructorOrMethod consOrMethod = iInvokedMethod.getTestMethod().getConstructorOrMethod();
       DisableListener disable = consOrMethod.getMethod().getDeclaringClass().getAnnotation(DisableListener.class);
       if (disable != null) {
         return;
       }
       // else resume your normal operations
     }

  3. Annotate test classes wherein the listener is not to be invoked:

     @DisableListener
     @Listeners({ com.example.MyListener.class, com.example.MyMethodInterceptor.class })
     public class MyTest {
       // ...
     }
Specifying listeners with ServiceLoader
Finally, the JDK offers a very elegant mechanism to specify implementations of interfaces on the class path via the ServiceLoader class.

With ServiceLoader, all you need to do is create a jar file that contains your listener(s) and a few configuration files, put that jar file on the classpath when you run TestNG and TestNG will automatically find them.

Here is a concrete example of how it works.

Let's start by creating a listener (any TestNG listener should work):

package test.tmp;

public class TmpSuiteListener implements ISuiteListener {
  @Override
  public void onFinish(ISuite suite) {
    System.out.println("Finishing");
  }

  @Override
  public void onStart(ISuite suite) {
    System.out.println("Starting");
  }
}

Compile this file, then create a file at the location META-INF/services/org.testng.ITestNGListener, which will name the implementation(s) you want for this interface.

You should end up with the following directory structure, with only two files:

$ tree
|____META-INF
| |____services
| | |____org.testng.ITestNGListener
|____test
| |____tmp
| | |____TmpSuiteListener.class

$ cat META-INF/services/org.testng.ITestNGListener
test.tmp.TmpSuiteListener

Create a jar of this directory:

$ jar cvf ../sl.jar .
added manifest
ignoring entry META-INF/
adding: META-INF/services/(in = 0) (out= 0)(stored 0%)
adding: META-INF/services/org.testng.ITestNGListener(in = 26) (out= 28)(deflated -7%)
adding: test/(in = 0) (out= 0)(stored 0%)
adding: test/tmp/(in = 0) (out= 0)(stored 0%)
adding: test/tmp/TmpSuiteListener.class(in = 849) (out= 470)(deflated 44%)

Next, put this jar file on your classpath when you invoke TestNG:

$ java -classpath sl.jar:testng.jar org.testng.TestNG testng-single.yaml
Starting
f2 11 2
PASSED: f2("2")
Finishing

This mechanism allows you to apply the same set of listeners to an entire organization just by adding a jar file to the classpath, instead of asking every single developer to remember to specify these listeners in their testng.xml file.

Dependency injection

TestNG supports two different kinds of dependency injection: native (performed by TestNG itself) and external (performed by a dependency injection framework such as Guice).
Native dependency injection
TestNG lets you declare additional parameters in your methods. When this happens, TestNG will automatically fill these parameters with the right value. Dependency injection can be used in the following places:
  • Any @Before method or @Test method can declare a parameter of type ITestContext.
  • Any @AfterMethod method can declare a parameter of type ITestResult, which will reflect the result of the test method that was just run.
  • Any @Before and @After methods (except @BeforeSuite and @AfterSuite) can declare a parameter of type XmlTest, which contains the current <test> tag.
  • Any @BeforeMethod (and @AfterMethod) can declare a parameter of type java.lang.reflect.Method. This parameter will receive the test method that will be called once this @BeforeMethod finishes (or after the method has run, for @AfterMethod).
  • Any @BeforeMethod can declare a parameter of type Object[]. This parameter will receive the list of parameters that are about to be fed to the upcoming test method, which could be either injected by TestNG, such as java.lang.reflect.Method, or come from a @DataProvider.
  • Any @DataProvider can declare a parameter of type ITestContext or java.lang.reflect.Method. The latter parameter will receive the test method that is about to be invoked.
You can turn off injection with the @NoInjection annotation:

public class NoInjectionTest {

  @DataProvider(name = "provider")
  public Object[][] provide() throws Exception {
    return new Object[][] { { CC.class.getMethod("f") } };
  }

  @Test(dataProvider = "provider")
  public void withoutInjection(@NoInjection Method m) {
    Assert.assertEquals(m.getName(), "f");
  }

  @Test(dataProvider = "provider")
  public void withInjection(Method m) {
    Assert.assertEquals(m.getName(), "withInjection");
  }
}

The table below summarises the parameter types that can be natively injected for the various TestNG annotations:

Annotation      ITestContext  XmlTest  Method  Object[]  ITestResult
BeforeSuite     Yes           No       No      No        No
BeforeTest      Yes           Yes      No      No        No
BeforeGroups    Yes           Yes      No      No        No
BeforeClass     Yes           Yes      No      No        No
BeforeMethod    Yes           Yes      Yes     Yes       Yes
Test            Yes           No       No      No        No
DataProvider    Yes           No       Yes     No        No
AfterMethod     Yes           Yes      Yes     Yes       Yes
AfterClass      Yes           Yes      No      No        No
AfterGroups     Yes           Yes      No      No        No
AfterTest       Yes           Yes      No      No        No
AfterSuite      Yes           No       No      No        No
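
To illustrate native injection, here is a minimal sketch (the class and method names are hypothetical) of a @BeforeMethod that receives both the upcoming test method and the current test context:

import java.lang.reflect.Method;

import org.testng.ITestContext;
import org.testng.annotations.BeforeMethod;
import org.testng.annotations.Test;

public class InjectionSketch {

  @BeforeMethod
  public void logUpcomingTest(Method testMethod, ITestContext context) {
    // TestNG fills in both parameters automatically
    System.out.println("About to run " + testMethod.getName()
        + " in test <" + context.getName() + ">");
  }

  @Test
  public void someTest() {
  }
}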
Guice dependency injection
If you use Guice, TestNG gives you an easy way to inject your test objects with a Guice module:

@Guice(modules = GuiceExampleModule.class)
public class GuiceTest extends SimpleBaseTest {

  @Inject
  ISingleton m_singleton;

  @Test
  public void singletonShouldWork() {
    m_singleton.doSomething();
  }
}

In this example, GuiceExampleModule is expected to bind the interface ISingleton to some concrete class:

public class GuiceExampleModule implements Module {

  @Override
  public void configure(Binder binder) {
    binder.bind(ISingleton.class).to(ExampleSingleton.class).in(Singleton.class);
  }
}

If you need more flexibility in specifying which modules should be used to instantiate your test classes, you can specify a module factory:

@Guice(moduleFactory = ModuleFactory.class)
public class GuiceModuleFactoryTest {

  @Inject
  ISingleton m_singleton;

  @Test
  public void singletonShouldWork() {
    m_singleton.doSomething();
  }
}

The module factory needs to implement the interface IModuleFactory:

public interface IModuleFactory {
  /**
   * @param context The current test context
   * @param testClass The test class
   *
   * @return The Guice module that should be used to get an instance of this
   * test class.
   */
  Module createModule(ITestContext context, Class<?> testClass);
}

Your factory will be passed an instance of the test context and the test class that TestNG needs to instantiate. Your createModule() method should return a Guice Module that will know how to instantiate this test class. You can use the test context to find out more information about your environment, such as parameters specified in testng.xml, etc...

You will get even more flexibility and Guice power with the parent-module and guice-stage suite parameters. guice-stage allows you to choose the Stage used to create the parent injector. The default one is DEVELOPMENT; other allowed values are PRODUCTION and TOOL. Here is how you can define parent-module in your test.xml file (a minimal example):

<suite name="Guice suite" parent-module="com.example.ParentModule">

TestNG will create this module only once for a given suite. It will also use this module for obtaining instances of test-specific Guice modules and module factories, and then will create a child injector for each test class. With this approach you can declare all common bindings in the parent-module, and you can also inject bindings declared in the parent-module into modules and module factories. Here is an example of this functionality:

package com.example;

public class ParentModule extends AbstractModule {
  @Override
  protected void configure() {
    bind(MyService.class).toProvider(MyServiceProvider.class);
    bind(MyContext.class).to(MyContextImpl.class).in(Singleton.class);
  }
}

package com.example;

public class TestModule extends AbstractModule {

  private final MyContext myContext;

  @Inject
  TestModule(MyContext myContext) {
    this.myContext = myContext;
  }

  @Override
  protected void configure() {
    bind(MySession.class).toInstance(myContext.getSession());
  }
}

package com.example;

@Test
@Guice(modules = TestModule.class)
public class TestClass {

  @Inject
  MyService myService;

  @Inject
  MySession mySession;

  public void testServiceWithSession() {
    myService.serve(mySession);
  }
}

As you can see, ParentModule declares bindings for the MyService and MyContext classes. MyContext is then injected using constructor injection into the TestModule class, which also declares a binding for MySession. Then parent-module in the test XML file is set to com.example.ParentModule, which enables injection in TestModule.

Later, in TestClass, you see two injections:
  • MyService: binding taken from ParentModule
  • MySession: binding taken from TestModule

This configuration ensures that all tests in this suite will be run with the same session instance; the MyContextImpl object is only created once per suite, which gives you the possibility to configure a common environment state for all tests in the suite.

Listening to method invocations

The IInvokedMethodListener listener allows you to be notified whenever TestNG is about to invoke a test (annotated with @Test) or configuration (annotated with any of the @Before or @After annotations) method. You need to implement the following interface:

public interface IInvokedMethodListener extends ITestNGListener {
  void beforeInvocation(IInvokedMethod method, ITestResult testResult);
  void afterInvocation(IInvokedMethod method, ITestResult testResult);
}

and declare it as a listener, as explained in the section about TestNG listeners.
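
For illustration, here is a bare-bones implementation (the class name is arbitrary) that simply logs each invocation:

import org.testng.IInvokedMethod;
import org.testng.IInvokedMethodListener;
import org.testng.ITestResult;

public class LoggingInvokedMethodListener implements IInvokedMethodListener {

  @Override
  public void beforeInvocation(IInvokedMethod method, ITestResult testResult) {
    // called just before each test or configuration method
    System.out.println("Before: " + method.getTestMethod().getMethodName());
  }

  @Override
  public void afterInvocation(IInvokedMethod method, ITestResult testResult) {
    // called just after each test or configuration method
    System.out.println("After: " + method.getTestMethod().getMethodName());
  }
}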

Overriding test methods

TestNG allows you to override and possibly skip the invocation of test methods. One example of where this is useful is if you need to run your test methods with a specific security manager. You achieve this by providing a listener that implements IHookable.

Here is an example with JAAS:

public class MyHook implements IHookable {
   public void run(final IHookCallBack icb, ITestResult testResult) {
      // Preferably initialized in a @Configuration method
      mySubject = authenticateWithJAAS();
      Subject.doAs(mySubject, new PrivilegedExceptionAction() {
         public Object run() {
            icb.callback(testResult);
            return null;
         }
      });
   }
}

Altering suites or tests

Sometimes you may want to alter a suite (or a test tag) in a suite XML file at runtime, without having to change the contents of the suite file.

A classic example would be trying to leverage your existing suite file to simulate a load test on your "Application under test". At a minimum, you would end up duplicating the contents of your <test> tag multiple times, creating a new suite XML file, and working with that. This approach doesn't scale well.

TestNG allows you to alter a suite (or a test tag) in your suite XML file at runtime via listeners. You achieve this by providing a listener that implements IAlterSuiteListener. Please refer to the Listeners section to learn more about listeners.

Here is an example that shows how the suite name is getting altered in runtime:

public class AlterSuiteNameListener implements IAlterSuiteListener {

   @Override
   public void alter(List<XmlSuite> suites) {
      XmlSuite suite = suites.get(0);
      suite.setName(getClass().getSimpleName());
   }
}

This listener can only be added in either of the following ways:
  • Through the <listeners> tag in the suite XML file.
  • Through a Service Loader.
This listener cannot be added to the execution using the @Listeners annotation.

Test results

Success, failure and assert

A test is considered successful if it completed without throwing any exception, or if it threw an exception that was expected (see the documentation for the expectedExceptions attribute found on the @Test annotation).

Your test methods will typically be made of calls that can throw an exception, or of various assertions (using the Java "assert" keyword).  A failing "assert" will trigger an AssertionError, which in turn will mark the method as failed (remember to use -ea on the JVM if you are not seeing the assertion errors).

Here is an example test method:

@Test
public void verifyLastName() {
   assert "Beust".equals(m_lastName) : "Expected name Beust but found " + m_lastName;
}

TestNG also includes JUnit's Assert class, which lets you perform assertions on complex objects:

import static org.testng.AssertJUnit.*;
//...
@Test
public void verify() {
   assertEquals("Beust", m_lastName);
}

Note that the above code uses a static import in order to be able to use the assertEquals method without having to prefix it by its class.

Logging and results

The results of the test run are created in a file called index.html in the directory specified when launching SuiteRunner.  This file points to various other HTML and text files that contain the results of the entire test run.

It's very easy to generate your own reports with TestNG with Listeners and Reporters:

  • Listeners implement the interface org.testng.ITestListener and are notified in real time of when a test starts, passes, fails, etc...
  • Reporters implement the interface org.testng.IReporter and are notified when all the suites have been run by TestNG. The IReporter instance receives a list of objects that describe the entire test run.
For example, if you want to generate a PDF report of your test run, you don't need to be notified in real time of the test run, so you should probably use an IReporter. If you'd like to write real-time reporting of your tests, such as a GUI with a progress bar or a text reporter displaying dots (".") as each test is invoked (as explained below), ITestListener is your best choice.
Logging Listeners
Here is a listener that displays a "." for each passed test, an "F" for each failure and an "S" for each skip:

public class DotTestListener extends TestListenerAdapter {
   private int m_count = 0;

   @Override
   public void onTestFailure(ITestResult tr) {
      log("F");
   }

   @Override
   public void onTestSkipped(ITestResult tr) {
      log("S");
   }

   @Override
   public void onTestSuccess(ITestResult tr) {
      log(".");
   }

   private void log(String string) {
      System.out.print(string);
      if (++m_count % 40 == 0) {
         System.out.println("");
      }
   }
}

In this example, I chose to extend TestListenerAdapter, which implements ITestListener with empty methods, so I don't have to override the other methods from the interface that I have no interest in. You can implement the interface directly if you prefer.

Here is how I invoke TestNG to use this new listener:

Shell

java -classpath testng.jar;%CLASSPATH% org.testng.TestNG -listener org.testng.reporters.DotTestListener test\testng.xml and the output:

Shell

........................................
........................................
........................................
........................................
........................................
.........................
===============================================
TestNG JDK 1.5
Total tests run: 226, Failures: 0, Skips: 0
===============================================

Note that when you use -listener, TestNG will automatically determine the type of listener you want to use.
Logging Reporters
The IReporter interface only has one method:

public void generateReport(List<ISuite> suites, String outputDirectory)

This method will be invoked by TestNG when all the suites have been run, and you can inspect its parameters to access all the information on the run that was just completed.
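A minimal sketch of such a reporter, assuming the single-method signature shown above (more recent TestNG versions pass an additional List<XmlSuite> parameter first, so adjust the signature to match your version):

import java.util.List;
import org.testng.IReporter;
import org.testng.ISuite;
import org.testng.ISuiteResult;

public class SuiteCountReporter implements IReporter {

   public void generateReport(List<ISuite> suites, String outputDirectory) {
      // Invoked once, after all suites have finished running
      for (ISuite suite : suites) {
         // Each suite exposes its per-<test> results keyed by test name
         for (ISuiteResult result : suite.getResults().values()) {
            System.out.println(suite.getName() + " / " + result.getTestContext().getName()
                  + ": " + result.getTestContext().getPassedTests().size() + " passed");
         }
      }
   }
}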
JUnitReports

TestNG contains a listener that takes the TestNG results and outputs an XML file that can then be fed to JUnitReport. Here is an example, and the ant task to create this report:

build.xml

<target name="reports"> <junitreport todir="test-report"> <fileset dir="test-output"> <include name="*/*.xml"/> </fileset> <report format="noframes" todir="test-report"/> </junitreport> </target>
Note:  a current incompatibility between the JDK 1.5 and JUnitReports prevents the frame version from working, so you need to specify "noframes" to get this to work for now.
Reporter API

If you need to log messages that should appear in the generated HTML reports, you can use the class org.testng.Reporter:

    Reporter.log("M3 WAS CALLED");
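For instance, a test method might log progress messages that end up in the HTML report (a small sketch; the two-argument overload that also echoes to standard output is assumed to be available in your TestNG version):

import org.testng.Reporter;
import org.testng.annotations.Test;

public class ReporterLogTest {

   @Test
   public void transferFunds() {
      Reporter.log("Starting the funds transfer scenario");
      // ... exercise the code under test here ...
      Reporter.log("Transfer completed", true);  // also print to standard output
   }
}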

XML Reports

TestNG offers an XML reporter capturing TestNG-specific information that is not available in JUnit reports. This is particularly useful when the user's test environment needs to consume XML results with TestNG-specific data that the JUnit format can't provide. This reporter can be injected into TestNG via the command line with -reporter.

A sample usage is shown below.
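A plausible invocation (a sketch only; the reporter class name org.testng.reporters.XMLReporter is assumed, and the property names are taken from the table below):

Shell

java org.testng.TestNG -reporter org.testng.reporters.XMLReporter:generateTestResultAttributes=true,generateGroupsAttribute=true testng.xml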

The full set of options that can be passed is detailed in the table below. Make sure to use:

  • ":" - to separate the reporter name from its properties
  • "=" - to separate key/value pairs for properties
  • "," - to separate multiple key/value pairs

Below is a sample of the output of such a reporter:

<testng-results> <suite name="Suite1"> <groups> <group name="group1"> <method signature="com.test.TestOne.test2()" name="test2" class="com.test.TestOne"/> <method signature="com.test.TestOne.test1()" name="test1" class="com.test.TestOne"/> </group> <group name="group2"> <method signature="com.test.TestOne.test2()" name="test2" class="com.test.TestOne"/> </group> </groups> <test name="test1"> <class name="com.test.TestOne"> <test-method status="FAIL" signature="test1()" name="test1" duration-ms="0" started-at="2007-05-28T12:14:37Z" description="someDescription2" finished-at="2007-05-28T12:14:37Z"> <exception class="java.lang.AssertionError"> <short-stacktrace> <![CDATA[ java.lang.AssertionError ... Removed 22 stack frames ]]> </short-stacktrace> </exception> </test-method> <test-method status="PASS" signature="test2()" name="test2" duration-ms="0" started-at="2007-05-28T12:14:37Z" description="someDescription1" finished-at="2007-05-28T12:14:37Z"> </test-method> <test-method status="PASS" signature="setUp()" name="setUp" is-config="true" duration-ms="15" started-at="2007-05-28T12:14:37Z" finished-at="2007-05-28T12:14:37Z"> </test-method> </class> </test> </suite> </testng-results>

This reporter is injected along with the other default listeners so you can get this type of output by default. The listener provides some properties that can tweak the reporter to fit your needs. The following table contains a list of these properties with a short explanation:

Property - Comment - Default value

  • outputDirectory: A String indicating the directory where the XML files should be output. Default: the TestNG output directory.
  • timestampFormat: Specifies the format of the date fields that are generated by this reporter. Default: yyyy-MM-dd'T'HH:mm:ss'Z'.
  • fileFragmentationLevel: An integer having the values 1, 2 or 3, indicating the way the XML files are generated: 1 - all the results in one file; 2 - each suite in a separate XML file linked to the main file; 3 - same as 2, plus separate files for test cases, referenced from the suite files. Default: 1.
  • splitClassAndPackageNames: This boolean specifies the way class names are generated for the <class> element: the fully qualified class name in a single attribute for false, or separate class and package attributes for true. Default: false.
  • generateGroupsAttribute: A boolean indicating whether a groups attribute should be generated for the <test-method> element. This feature aims at providing a straightforward way of retrieving the groups that include a test method without having to surf through the <group> elements. Default: false.
  • generateTestResultAttributes: A boolean indicating whether an <attributes> tag should be generated for each <test-method> element, containing the test result attributes (see the documentation on setting test result attributes). Each attribute is written in its own <attribute> tag. Default: false.
  • stackTraceOutputMethod: Specifies the type of stack trace generated for exceptions: 0 - no stacktrace (just the exception class and message); 1 - a short version of the stack trace keeping just a few lines from the top; 2 - the complete stacktrace with all the inner exceptions; 3 - both the short and the long stacktrace. Default: 2.
  • generateDependsOnMethods: Enable/disable the generation of a depends-on-methods attribute for the <test-method> element. Default: true.
  • generateDependsOnGroups: Enable/disable the generation of a depends-on-groups attribute for the <test-method> element. Default: true.

In order to configure this reporter you can use the -reporter option on the command line, or the Ant task with the nested <reporter> element. For each of these you must specify the class org.testng.reporters.XMLReporter. Please note that you cannot configure the built-in reporter this way, because that one will only use default settings. If you need just the XML report with custom settings, you will have to add it manually with one of the two methods above and disable the default listeners.

YAML

TestNG supports YAML as an alternate way of specifying your suite file. For example, the following XML file:
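For instance, a suite XML along these lines (a sketch; names and values are reconstructed from the YAML version below):

<suite name="SingleSuite" thread-count="4">
   <parameter name="n" value="42"/>
   <test name="Regression2">
      <parameter name="count" value="10"/>
      <groups>
         <run>
            <exclude name="broken"/>
         </run>
      </groups>
      <classes>
         <class name="test.listeners.ResultEndMillisTest"/>
      </classes>
   </test>
</suite>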

and here is its YAML version:

name: SingleSuite
threadCount: 4
parameters: { n: 42 }

tests:
  - name: Regression2
    parameters: { count: 10 }
    excludedGroups: [ broken ]
    classes:
      - test.listeners.ResultEndMillisTest

Here is TestNG's own suite file, and its YAML counterpart.

You might find the YAML file format easier to read and to maintain. YAML files are also recognized by the TestNG Eclipse plug-in. You can find more information about YAML and TestNG in this blog post.

Dry Run for your tests

When launched in dry run mode, TestNG will display a list of the test methods that would be invoked but without actually calling them.

You can enable dry run mode for TestNG by passing a JVM argument, as sketched below.
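A sketch of the invocation, assuming the testng.mode.dryrun system property supported by recent TestNG versions:

Shell

java -Dtestng.mode.dryrun=true org.testng.TestNG testng.xml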

 


Introduction to Unit Testing Framework

The various types of software testing include:

  • Unit Test: Test individual component/class in isolation.
  • Integration Test: Test a group of associated components/classes.
  • Acceptance Test (or Functional Test): operate on a fully integrated system, testing against the user interface (e.g., HTML for browser or XML/JSON for web services).
  • Regression Test: Tests to ensure that a change (such as an enhancement, patch or configuration change) does not break the system or introduce new faults.

Unit Testing is concerned with testing individual programs/classes to ascertain that each program/class runs as per specification. Prior to the arrival of unit testing frameworks, programmers tended to write test expressions which print to the console or a trace file (the amount of output is sometimes controlled by a trace-level or debug-level flag). This approach is not satisfactory because it requires human judgment to analyze the results produced. Too many print statements cause the dreaded scroll blindness.

JDK 1.4 provides an assertion feature (read Assertion), which enables you to test (or assert) your assumptions about your program logic (such as pre-conditions, post-conditions, and invariants). Nonetheless, assertion is primitive compared with the unit testing framework.

With a proper unit testing framework, you can automate the entire unit testing process. Your job becomes designing proper test cases to exercise the program. Furthermore, the unit testing process can be integrated into the build process. In this case, the build process not only checks for syntax errors, but for semantic errors as well.

Extreme Programming

Extreme programming (@ www.xprogramming.com) advocates "write test first, before writing codes".

xUnit Unit Testing Framework

xUnit is the family name given to a group of unit testing frameworks that share the same architecture, such as JUnit (for Java), NUnit (for .NET), CppUnit (for C++), PHPUnit (for PHP) and many others.

The xUnit architecture has these common components:

  • Test case / Test suites:
  • Test fixture:
  • Test runner:
  • Test result formatter:
  • Assertions:

JUnit

JUnit (@ http://junit.org/) is an open-source Java Unit Testing Framework designed by Kent Beck and Erich Gamma. It is the de facto standard for Java unit testing. JUnit is not included in the JDK, but it is included in most IDEs such as Eclipse and NetBeans.

Installing and Using JUnit

Installing JUnit: Go to http://junit.org/ ⇒ "Download and Install Guide" ⇒ Download "junit-4.11.jar" and "hamcrest-core-1.3.jar" (or the latest versions). You could download the API documentation as well as the source code.

Using JUnit: To use JUnit, include the JUnit jar-files "junit-4.11.jar" and "hamcrest-core-1.3.jar" in your CLASSPATH.

Using JUnit under Eclipse

Include JUnit Library in your Java Project: Create a new Java project ⇒ right-click on the project ⇒ Properties ⇒ Java Build Path ⇒ "Libraries" tab ⇒ Add Library ⇒ JUnit ⇒ In "JUnit library version", choose "JUnit 4" ⇒ In "current location" use the eclipse's JUnit or your own download. [Alternatively, when you create a new test case or test suite (as describe below), Eclipse will prompt you to include the JUnit library.]

Create Test case (or Test Suite): To create a new JUnit test case (or test suite, which contains many test cases) ⇒ File ⇒ Others ⇒ Java ⇒ JUnit ⇒ JUnit test case (or JUnit test suite).

Run Test case (or Test Suite): To run a test case (or test suite), right-click the file ⇒ Run As ⇒ JUnit Test.

JUnit 4

There are two versions of JUnit, version 3 and version 4, which are radically different. JUnit 4 uses the annotation feature (since JDK 1.5) to streamline the process and drop the strict naming conventions of test methods.

Getting Started with an Example

Suppose that we wish to carry out unit testing on the following Java program, which uses static methods to perform arithmetic operations on two integers. Take note that the divide methods throw an IllegalArgumentException for a divisor of zero.

public class Calculator {
   public static int add(int number1, int number2) {
      return number1 + number2;
   }
   public static int sub(int number1, int number2) {
      return number1 - number2;
   }
   public static int mul(int number1, int number2) {
      return number1 * number2;
   }
   public static int divInt(int number1, int number2) {
      if (number2 == 0) {
         throw new IllegalArgumentException("Cannot divide by 0!");
      }
      return number1 / number2;
   }
   public static double divReal(int number1, int number2) {
      if (number2 == 0) {
         throw new IllegalArgumentException("Cannot divide by 0!");
      }
      return (double) number1 / number2;
   }
}
First Test Case

Let's do it under Eclipse.

  1. Create a new Eclipse Java project.
  2. Create a new class called "Calculator" under the "src" folder, with the above program code.
  3. Create a new folder called "test" for storing test scripts ⇒ Right-click on the project ⇒ New ⇒ Folder ⇒ In folder name, enter "test". Make "test" a source folder by right-clicking on it ⇒ Build Path ⇒ Use as Source Folder.
  4. Create the first test case called "AddSubTest" ⇒ Right-click on the folder "test" ⇒ New ⇒ Other ⇒ Java ⇒ JUnit ⇒ JUnit Test Case ⇒ New JUnit 4 test ⇒ In Name, enter "AddSubTest". Enter the following code:
import static org.junit.Assert.*;
import org.junit.Test;

public class AddSubTest {
   @Test
   public void testAddPass() {
      assertEquals("error in add()", 3, Calculator.add(1, 2));
      assertEquals("error in add()", -3, Calculator.add(-1, -2));
      assertEquals("error in add()", 9, Calculator.add(9, 0));
   }
   @Test
   public void testAddFail() {
      assertNotEquals("error in add()", 0, Calculator.add(1, 2));
   }
   @Test
   public void testSubPass() {
      assertEquals("error in sub()", 1, Calculator.sub(2, 1));
      assertEquals("error in sub()", -1, Calculator.sub(-2, -1));
      assertEquals("error in sub()", 0, Calculator.sub(2, 2));
   }
   @Test
   public void testSubFail() {
      assertNotEquals("error in sub()", 0, Calculator.sub(2, 1));
   }
}
  5. To run the test case, right-click on the file ⇒ Run as ⇒ JUnit Test. The test result is shown in the JUnit panel. 4 tests were run and all succeeded. Study the test results.
  6. Try modifying one of the tests to force a test failure and observe the test result, e.g.,
    @Test
    public void testAddPass() {
       assertEquals("error in add()", 0, Calculator.add(1, 2));
       .....
    }
Explanation
  • A test case contains a number of tests, marked with the "@Test" annotation. Each test is run independently of the other tests.
  • Inside the test method, we can use the static methods of the org.junit.Assert class to assert the expected and actual test outcomes, such as:
    public static void assertEquals([String message,] long expected, long actual)
    public static void assertEquals([String message,] double expected, double actual, double epsilon)
    public static void assertEquals([String message,] Object expected, Object actual)
    public static void assertNotEquals(.....)
    public static void assertSame([String message,] Object expected, Object actual)
    public static void assertNotSame(.....)
    public static void assertTrue([String message,] boolean condition)
    public static void assertFalse([String message,] boolean condition)
    public static void assertNull([String message,] Object object)
    public static void assertNotNull(......)
    public static void assertArrayEquals([String message,] int[] expecteds, int[] actuals)
    public static void assertArrayEquals([String message,] byte[] expecteds, byte[] actuals)
    public static void assertArrayEquals([String message,] char[] expecteds, char[] actuals)
    public static void assertArrayEquals([String message,] long[] expecteds, long[] actuals)
    public static void assertArrayEquals([String message,] short[] expecteds, short[] actuals)
    public static void assertArrayEquals([String message,] Object[] expecteds, Object[] actuals)
    public static <T> void assertThat([String message,] T actual, org.hamcrest.Matcher<T> matcher)
Run Test Standalone via Test Runner

To run your test standalone (outside Eclipse), you could write a Test Runner as follows:

import org.junit.runner.JUnitCore;
import org.junit.runner.Result;
import org.junit.runner.notification.Failure;

public class RunTestStandalone {
   public static void main(String[] args) {
      Result result = JUnitCore.runClasses(AddSubTest.class);
      for (Failure failure : result.getFailures()) {
         System.out.println(failure.toString());
      }
      System.out.println(result.wasSuccessful());
   }
}

You can include more than one test case via JUnitCore.runClasses(TestClass1.class, TestClass2.class, ...).

Run Test in Command-line

JUnit also provides a console version of test-runner called org.junit.runner.JUnitCore for you to run the tests in command-line, with the following syntax:

$ java org.junit.runner.JUnitCore TestClass1 [TestClass2 ...]
  1. Copy all your classes into one folder (for simplicity).
  2. Set the CLASSPATH to include the JUnit jar-files:
    $ export CLASSPATH=.:$CLASSPATH:/path/to/junit/junit-4.11.jar:/path/to/junit/hamcrest-core-1.3.jar
    > set CLASSPATH=.;%CLASSPATH%;x:\path\to\junit\junit-4.11.jar;x:\path\to\junit\hamcrest-core-1.3.jar
  3. Compile all the source files: $ cd /path/to/source-files $ javac Calculator.java AddSubTest.java
  4. Run the test: $ java org.junit.runner.JUnitCore AddSubTest JUnit version 4.11 .... Time: 0.006 OK (4 tests)
Second Test Case

Let's write another test case to test the divide methods, which throw an exception for a divisor of zero. Furthermore, the divReal() method returns a double, which cannot be compared with absolute precision.

import static org.junit.Assert.*;
import org.junit.Test;

public class DivTest {
   @Test
   public void testDivIntPass() {
      assertEquals("error in divInt()", 3, Calculator.divInt(9, 3));
      assertEquals("error in divInt()", 0, Calculator.divInt(1, 9));
   }
   @Test
   public void testDivIntFail() {
      assertNotEquals("error in divInt()", 1, Calculator.divInt(9, 3));
   }
   @Test(expected = IllegalArgumentException.class)
   public void testDivIntByZero() {
      Calculator.divInt(9, 0);
   }
   @Test
   public void testDivRealPass() {
      assertEquals("error in divInt()", 0.333333, Calculator.divReal(1, 3), 1e-6);
      assertEquals("error in divInt()", 0.111111, Calculator.divReal(1, 9), 1e-6);
   }
   @Test(expected = IllegalArgumentException.class)
   public void testDivRealByZero() {
      Calculator.divReal(9, 0);
   }
}

Run the test and observe the test result. Change testDivRealPass()'s expected value from 0.333333 to 0.333330 and check the test result.

Explanation
  • To test for an exception, use the @Test annotation with the expected attribute.
  • To compare doubles, use a tolerance (epsilon) as shown.
First Test Suite

A test suite comprises many test cases.

To create a test suite under Eclipse ⇒ right-click on the "test" folder ⇒ New ⇒ Other ⇒ Java ⇒ JUnit ⇒ JUnit Test Suite ⇒ In Name, enter "AllTests" ⇒ Select the test cases to be included - AddSubTest and DivTest.

The following test script will be created:

import org.junit.runner.RunWith;
import org.junit.runners.Suite;
import org.junit.runners.Suite.SuiteClasses;

@RunWith(Suite.class)
@SuiteClasses({ AddSubTest.class, DivTest.class })
public class AllTests { }

Take note that the test suite class is marked by the @RunWith(Suite.class) and @SuiteClasses annotations, and has an empty class body.

To run the test suite ⇒ right-click on the file ⇒ Run as ⇒ JUnit Test. Observe the test results produced.

You can also run the test suite via Test Runner , just like running test cases (as described earlier).

Testing Java Classes By Example

Instead of testing static methods in a Java class, let's carry out unit test on a proper self-contained and encapsulated Java class with its own private variables and public operations.

Suppose that we have a class called MyNumber that represents a number and is capable of performing arithmetic operations.

Again, we shall work under Eclipse.

  1. Create a new Java project.
  2. Create a new Java class called "MyNumber", as follows:
public class MyNumber {
   int number;

   public MyNumber() {
      this.number = 0;
   }
   public MyNumber(int number) {
      this.number = number;
   }
   public int getNumber() {
      return number;
   }
   public void setNumber(int number) {
      this.number = number;
   }
   public MyNumber add(MyNumber rhs) {
      this.number += rhs.number;
      return this;
   }
   public MyNumber div(MyNumber rhs) {
      if (rhs.number == 0) throw new IllegalArgumentException("Cannot divide by 0!");
      this.number /= rhs.number;
      return this;
   }
}
  3. Create a new source folder called "test" for storing test scripts. Make it a source folder by right-clicking on it ⇒ Build Path ⇒ Use as Source Folder.
  4. Create the first test case called MyNumberTest (under the "test" folder), as follows:
import static org.junit.Assert.*;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;

public class MyNumberTest {
   private MyNumber number1, number2;

   @Before
   public void setUp() throws Exception {
      System.out.println("Run @Before");
      number1 = new MyNumber(11);
      number2 = new MyNumber(22);
   }
   @After
   public void tearDown() throws Exception {
      System.out.println("Run @After");
   }
   @Test
   public void testGetterSetter() {
      System.out.println("Run @Test testGetterSetter");
      int value = 33;
      number1.setNumber(value);
      assertEquals("error in getter/setter", value, number1.getNumber());
   }
   @Test
   public void testAdd() {
      System.out.println("Run @Test testAdd");
      assertEquals("error in add()", 33, number1.add(number2).getNumber());
      assertEquals("error in add()", 55, number1.add(number2).getNumber());
   }
   @Test
   public void testDiv() {
      System.out.println("Run @Test testDiv");
      assertEquals("error in div()", 2, number2.div(number1).getNumber());
      assertEquals("error in div()", 0, number2.div(number1).getNumber());
   }
   @Test(expected = IllegalArgumentException.class)
   public void testDivByZero() {
      System.out.println("Run @Test testDivByZero");
      number2.setNumber(0);
      number1.div(number2);
   }
}
  5. Run the test and observe the result. Modify some lines to make the test fail and observe the result.

    The output, used for illustrating the sequence of operations, is as follows:

    Run @Before Run @Test testDivByZero Run @After Run @Before Run @Test testAdd Run @After Run @Before Run @Test testDiv Run @After Run @Before Run @Test testGetterSetter Run @After
Test Fixtures, @Before and @After

A test fixture is a fixed state of a set of objects used as a baseline for running tests. The purpose of a test fixture is to ensure that there is a well-known and fixed environment in which tests are run, so that results are repeatable.

In JUnit 4, fixtures are set up via the @Before and @After annotations.

  • The @Before annotated method (known as setUp()) will be run before EACH test method (annotated with @Test) to set up the fixtures.
  • The @After annotated method (known as tearDown()) will be run after EACH test.

We typically declare test fixtures as private instance variables; initialize them in the @Before annotated method; and clean up in the @After annotated method. Each test method runs on its own set of test fixtures with the same initial states. This ensures isolation between the test methods.

@BeforeClass and @AfterClass

Besides @Before and @After, there are also @BeforeClass and @AfterClass.

  • The @BeforeClass annotated method will be run once before any test, and can be used to perform one-time initialization tasks such as setting up a database connection.
  • The @AfterClass annotated method will be run once after all tests, and can be used to perform housekeeping tasks such as closing the database connection. A small sketch is shown below.
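A minimal sketch of these annotations in JUnit 4 (note that @BeforeClass and @AfterClass methods must be static; the shared list here stands in for an expensive resource such as a database connection):

import java.util.ArrayList;
import java.util.List;
import org.junit.AfterClass;
import org.junit.Assert;
import org.junit.BeforeClass;
import org.junit.Test;

public class OneTimeSetupTest {
   // shared, expensive-to-build resource created once for the whole class
   private static List<String> sharedData;

   @BeforeClass
   public static void oneTimeSetUp() {
      sharedData = new ArrayList<>();
      sharedData.add("alpha");
      sharedData.add("beta");
   }

   @AfterClass
   public static void oneTimeTearDown() {
      sharedData.clear();   // housekeeping, e.g. closing connections in real code
   }

   @Test
   public void sharedDataIsVisible() {
      Assert.assertEquals(2, sharedData.size());
   }
}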

JUnit 4's Annotations

JUnit 4 defines the following six annotations in package org.junit.

Annotation - Description
  • @Test: The annotated method is to be run as a test method.
  • @Before: The annotated method is to be run before EACH of the test methods.
  • @After: The annotated method is to be run after EACH of the test methods.
  • @BeforeClass: The annotated method is to be run ONCE before any of the test methods.
  • @AfterClass: The annotated method is to be run ONCE after all the test methods.
  • @Ignore: Ignore (don't run) the test method. This is a convenient way to mark out a test method (e.g., after some code changes that fail this test.)
  • [TODO]
Example of @Ignore
@Ignore("Under Construction") @Test public void someTest() { ...... }

JUnit - Exceptions Test

To test whether the code throws a desired exception, use the @Test annotation with the expected attribute, as illustrated in the previous example.

JUnit - Timing Test

To handle or test timeouts, use the @Test annotation with the timeout attribute (in milliseconds). For example,

import org.junit.Test;

public class TimeoutTest {
   @Test(timeout = 1000)
   public void test() {
      while (true) {}
   }
}
java.lang.Exception: test timed out after 1000 milliseconds ......

JUnit - Parameterized Test

JUnit 4 introduces parameterized test which allows you to run the same test over and over again using different values. To use parameterized test:

  1. Annotate the test class with @RunWith(Parameterized.class).
  2. Create a public static method annotated with @Parameters that returns a collection of Object[] (e.g., an Iterable<Object[]>) as the test data set.
  3. Create a public constructor that takes its input from the @Parameters method to set up the test fixtures defined as instance variables. The constructor will be run before EACH test.
  4. Create your tests case(s) using the instance variables as the source of the test data.

For example,

import static org.junit.Assert.*;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.junit.runners.Parameterized;
import org.junit.runners.Parameterized.Parameters;
import java.util.Arrays;

@RunWith(Parameterized.class)
public class MyNumberParameterizedTest {
   private MyNumber number1, number2;
   private int result;

   @Parameters
   public static Iterable<Object[]> data() {
      System.out.println("Run @Parameters");
      return Arrays.asList(new Object[][] {
         { new MyNumber(1), new MyNumber(2), 3 },
         { new MyNumber(-1), new MyNumber(-2), -3 },
         { new MyNumber(3), new MyNumber(-3), 0 },
      });
   }

   public MyNumberParameterizedTest(MyNumber number1, MyNumber number2, int result) {
      System.out.println("Run constructor");
      this.number1 = number1;
      this.number2 = number2;
      this.result = result;
      System.out.println("-- number1=" + number1.getNumber()
            + " number2=" + number2.getNumber() + " result=" + result);
   }

   @Before
   public void setUp() throws Exception {
      System.out.println("Run @Before");
      System.out.println("-- number1=" + number1.getNumber()
            + " number2=" + number2.getNumber());
   }

   @Test
   public void test() {
      System.out.println("Run @Test");
      assertEquals(result, number1.add(number2).getNumber());
   }

   @After
   public void tearDown() throws Exception { }
}
Run @Parameters Run constructor -- number1=1 number2=2 result=3 Run @Before -- number1=1 number2=2 Run @Test Run constructor -- number1=-1 number2=-2 result=-3 Run @Before -- number1=-1 number2=-2 Run @Test Run constructor -- number1=3 number2=-3 result=0 Run @Before -- number1=3 number2=-3 Run @Test

The output trace suggests that the @Parameters method is run once. For EACH test, the constructor is run first, followed by the @Before, @Test and @After methods.

Another Example

import org.junit.*; import java.util.ArrayList; import org.junit.runner.Result; public class ArrayListTest { private ArrayList<String> lst; @Before public void init() throws Exception { lst = new ArrayList<String>(); lst.add("alpha"); lst.add("beta"); } @Test public void insertTest() { Assert.assertEquals("wrong size", 2, lst.size()); lst.add(1, "charlie"); Assert.assertEquals("wrong size", 3, lst.size()); Assert.assertEquals("wrong element", "alpha", lst.get(0)); Assert.assertEquals("wrong element", "charlie", lst.get(1)); Assert.assertEquals("wrong element", "beta", lst.get(2)); } @Test public void replaceTest() { Assert.assertEquals("wrong size", 2, lst.size()); lst.set(1, "charlie"); Assert.assertEquals("wrong size", 2, lst.size()); Assert.assertEquals("wrong element", "alpha", lst.get(0)); Assert.assertEquals("wrong element", "charlie", lst.get(1)); } public static void main(String[] args) { Result r = org.junit.runner.JUnitCore.runClasses(ArrayListTest.class); System.out.println(r.wasSuccessful()); } }

To run the test, you can either include a main() method as above, or use the command-line.

JUnit Package org.junit

The core package for JUnit 4 is org.junit, which is simple and elegantly designed.

  • Assert class: contains methods assertEquals(), assertNotEquals(), assertSame(), assertNotSame(), assertTrue(), assertFalse(), assertNull(), assertNotNull(), assertArrayEquals(), assertThat().
  • Assume class: contains methods such as assumeTrue(), assumeFalse(), assumeNotNull(), assumeThat().
  • @Test: marks the method as a test method.
  • @Test(expected = SomeException.class): the test is expected to trigger this exception.
  • @Test(timeout = milliseconds): treat the test as failed if it exceeds the specified milliseconds.
  • @Before and @After: mark the method to be run before and after EACH test method, for initializing and cleaning up test fixtures.
  • @BeforeClass and @AfterClass: mark the method to be run ONCE before and after all the test methods for the class.
  • @Ignore: ignore this test method (annotated with @Test).
  • : [TODO]

JUnit 3.8 (deprecated?)

JUnit 3.8, which uses strict naming conventions to denote the various entities, is probably deprecated. I suggest that you move to JUnit 4, which is more intuitive through its use of annotations.

Let's begin with an Example

Below is a Java program to be tested. Note that there is a logical error in the program.

public class Grade {
   public static char getLetterGrade(int mark) {
      assert (mark >= 0 && mark <= 100) : "mark is out-of-range: " + mark;
      if (mark >= 75) {
         return 'A';
      } else if (mark >= 60) {
         return 'B';
      } else if (mark > 50) {
         return 'C';
      } else {
         return 'F';
      }
   }
}

The unit-test program (using JUnit framework) is as follows. Black-box test cases are set up to test typical values as well as boundary values.

import junit.framework.Test;
import junit.framework.TestCase;
import junit.framework.TestSuite;

public class GradeUnitTest extends TestCase {
   public GradeUnitTest(String name) {
      super(name);
   }
   protected void setUp() throws Exception {
      super.setUp();
   }
   protected void tearDown() throws Exception {
      super.tearDown();
   }
   public void testTypical() {
      assertEquals("wrong grade", 'A', Grade.getLetterGrade(95));
      assertEquals("wrong grade", 'B', Grade.getLetterGrade(72));
      assertEquals("wrong grade", 'C', Grade.getLetterGrade(55));
      assertEquals("wrong grade", 'F', Grade.getLetterGrade(30));
   }
   public void testBoundaries() {
      assertEquals("wrong grade", 'A', Grade.getLetterGrade(75));
      assertEquals("wrong grade", 'A', Grade.getLetterGrade(100));
      assertEquals("wrong grade", 'B', Grade.getLetterGrade(60));
      assertEquals("wrong grade", 'B', Grade.getLetterGrade(74));
      assertEquals("wrong grade", 'C', Grade.getLetterGrade(50));
      assertEquals("wrong grade", 'C', Grade.getLetterGrade(59));
      assertEquals("wrong grade", 'F', Grade.getLetterGrade(0));
      assertEquals("wrong grade", 'F', Grade.getLetterGrade(49));
   }
   public static Test suite() {
      return new TestSuite(GradeUnitTest.class);
   }
   public static void main(String[] args) {
      junit.textui.TestRunner.run(GradeUnitTest.class);
   }
}

Compile and execute the program (with the JUnit JAR file included in the CLASSPATH) as follows. Note that one of the unit-test cases catches the logical error.

> set CLASSPATH=.;%CLASSPATH%;c:\junit\junit-3.8.2.jar > javac GradeUnitTest.java > java GradeUnitTest ..F Time: 0.006 There was 1 failure: 1) testBoundaries(GradeUnitTest)junit.framework.AssertionFailedError: wrong grade expected:<C> but was:<F> at GradeUnitTest.testBoundaries(GradeUnitTest.java:23) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25) at GradeUnitTest.main(GradeUnitTest.java:34) FAILURES!!! Tests run: 2, Failures: 1, Errors: 0

JUnit Terminology

  • Class TestCase: A class that contains test methods should derive from this superclass. Each TestCase can include many test methods.
  • Test Methods: A test method must be named testXxx(). This is because JUnit uses the reflection mechanism to find and run these methods. Inside a test method, you can use a variation of the assert methods (e.g., assertEquals(), assertTrue(), assertNull()) to compare the expected and actual results.
  • Test Fixture: The set of objects that a test method operates on. You declare these objects as private variables, and initialize them by overriding setUp() or via the constructor. You can perform clean-up operations by overriding tearDown(). Each test method runs on its own TestCase instance with its own set of test fixtures. This ensures isolation between the test methods.
  • Class TestSuite: You can combine many TestCases (e.g., written by different people) into a test suite, and run them at once.
  • Class TestRunner: for running the TestCase or TestSuite.

Writing Tests

Step 1: Extend the TestCase superclass:

import junit.framework.*; public class JUnit38DemoArrayList extends TestCase { public JUnit38DemoArrayList(String name) { super(name); } }

Step 2: If two or more test methods use a common set of test objects (called test fixtures), declare the test fixtures as instance variables. For example, suppose we are testing the class ArrayList.

private ArrayList<String> lst;

Step 3: Initialize the test fixture. You can override setUp() or use the constructor. Each test method runs on its own TestCase instance; this provides isolation between test methods. Each test method invokes the constructor to construct an instance of the TestCase, followed by setUp(), runs the steps coded inside the test method, and then tearDown() before exiting. The test methods may run concurrently. For example, let's initialize our test fixture with two elements.

protected void setUp() throws Exception { lst = new ArrayList<String>(); lst.add("alpha"); lst.add("beta"); } protected void tearDown() throws Exception { super.tearDown(); }

Step 4: Write the test methods for this TestCase. All the test methods must be named testXxx(), as JUnit uses reflection to find these test methods. For example,

public void testInsert() { assertEquals("wrong size", 2, lst.size()); lst.add(1, "charlie"); assertEquals("wrong size", 3, lst.size()); assertEquals("wrong element", "alpha", lst.get(0)); assertEquals("wrong element", "charlie", lst.get(1)); assertEquals("wrong element", "beta", lst.get(2)); } public void testReplace() { assertEquals("wrong size", 2, lst.size()); lst.set(1, "charlie"); assertEquals("wrong size", 2, lst.size()); assertEquals("wrong element", "alpha", lst.get(0)); assertEquals("wrong element", "charlie", lst.get(1)); }

Step 5: You can now run the TestCase, using a JUnit-provided TestRunner. There are two versions of TestRunner: the text-based junit.textui.TestRunner, and the GUI-based junit.swingui.TestRunner. To use the text-based TestRunner, you could include a main() method as follows:

public static void main(String[] args) { junit.textui.TestRunner.run(JUnit38DemoArrayList.class); }

The expected outputs are:

.. Time: 0.001 OK (2 tests)

You can also invoke the text-based TestRunner from the command-line:

> java junit.textui.TestRunner JUnit38DemoArrayList

You can invoke the GUI-based TestRunner from the command-line:

> java junit.swingui.TestRunner JUnit38DemoArrayList

Step 6: If there are many TestCases (which could be written by different people), you can put them together into a TestSuite and run all the TestCases at once. To do so, in each of the TestCases, create a static suite() method to extract all the test methods as follows:

public static Test suite() { return new TestSuite(JUnit38DemoArrayList.class); }

Next, write a class to include all the TestCases into a TestSuite.

import junit.framework.*;

public class AllTests {
   public static void main(String[] args) {
      junit.textui.TestRunner.run(suite());
   }
   public static Test suite() {
      TestSuite suite = new TestSuite();
      suite.addTest(JUnit38DemoArrayList.suite());
      //suite.addTest(OtherTestCase1.suite());
      //suite.addTest(OtherTestCase2.suite());
      return suite;
   }
}

Automating Unit Testing with ANT or Maven

[TODO] To tidy up.

Apache's ANT is the de facto standard for automated building of Java applications (similar to Unix's "make" utility). You can download ANT from ant.apache.org (download the ZIP version, and unzip it into a directory of your choice).

We shall use ANT to automate building and testing. First, create a configuration file called "build.xml" as follows:

<?xml version="1.0"?>
<!-- to save as "build.xml" -->
<project name="Black-Box Unit Test Demo" default="run" basedir=".">
   <!-- build all classes in this directory -->
   <!-- To run this: use "ant build" -->
   <!-- need to include junit.jar in the classpath -->
   <target name="build">
      <javac srcdir="${basedir}"/>
      <echo message="Build done" />
   </target>
   <!-- Test and build all files -->
   <!-- To run this: use "ant" (default) or "ant run" -->
   <target name="run" depends="build">
      <java taskname="Test" classname="GradeTestCase" fork="true" failonerror="true" />
      <echo message="Unit Test done" />
   </target>
   <!-- delete all class files -->
   <!-- To run this: use "ant clean" -->
   <target name="clean">
      <delete>
         <fileset dir="${basedir}" includes="*.class" />
      </delete>
      <echo message="clean done" />
   </target>
</project>

To build using the above build file, run "ant". (By default, it executes the "run" target, which depends on "build"; "build" gets executed to compile the program, then "run" gets executed to run the tests. To run only the compilation, use "ant build". To run only the cleanup, use "ant clean".)

prompt> ant Buildfile: build.xml build: [javac] Compiling 4 source files [echo] Build done run: [Test] ..F [Test] Time: 0.005 [Test] There was 1 failure: [Test] 1) testBoundaries(GradeTestCase)junit.framework.AssertionFailedError: expected:<C> but was:<F> [Test] at GradeTestCase.testBoundaries(GradeTestCase.java:23) [Test] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) [Test] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39) [Test] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25) [Test] at GradeTestCase.main(GradeTestCase.java:34) [Test] [Test] FAILURES!!! [Test] Tests run: 2, Failures: 1, Errors: 0 [Test] [echo] Unit Test done

[TODO] to be continued...

Unit Testing Best Practices

Writing Test Cases

How do you test a program to ensure correctness? There are two techniques: white-box testing and black-box testing. White-box testing inspects the program code and tests the program logic. Black-box testing does not inspect the program code, but looks at the input-output behavior, treating the program as a black box.

For black-box testing, the most common approach is to partition the inputs, and design test cases for each input partition. The test cases could test on a typical input value as well as the boundaries.

For example, the following program converts a given mark (0-100) to a letter grade ('A' to 'F'). There is a logical error in the program.

import static org.junit.Assert.assertEquals;
import org.junit.Test;

public class GradeLetters {
   public static char getLetterGrade(int mark) {
      assert (mark >= 0 && mark <= 100) : "mark is out-of-range: " + mark;
      if (mark >= 75) {
         return 'A';
      } else if (mark >= 60) {
         return 'B';
      } else if (mark > 50) {
         return 'C';
      } else {
         return 'F';
      }
   }

   @Test
   public void testTypical() {
      assertEquals("wrong grade", 'A', GradeLetters.getLetterGrade(95));
      assertEquals("wrong grade", 'B', GradeLetters.getLetterGrade(72));
      assertEquals("wrong grade", 'C', GradeLetters.getLetterGrade(55));
      assertEquals("wrong grade", 'F', GradeLetters.getLetterGrade(30));
   }

   @Test
   public void testBoundaries() {
      assertEquals("wrong grade", 'A', GradeLetters.getLetterGrade(75));
      assertEquals("wrong grade", 'A', GradeLetters.getLetterGrade(100));
      assertEquals("wrong grade", 'B', GradeLetters.getLetterGrade(60));
      assertEquals("wrong grade", 'B', GradeLetters.getLetterGrade(74));
      assertEquals("wrong grade", 'C', GradeLetters.getLetterGrade(50));
      assertEquals("wrong grade", 'C', GradeLetters.getLetterGrade(59));
      assertEquals("wrong grade", 'F', GradeLetters.getLetterGrade(0));
      assertEquals("wrong grade", 'F', GradeLetters.getLetterGrade(49));
   }
}

Try to run the above tests to find the logical error. Take note that assertEquals() does not accept char arguments directly; the chars are upcast to numbers. That is, the failure output shows the characters' numeric codes (see the snippet below).
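For instance, a failing char comparison might report as follows (a small illustration; the exact failure message format may vary slightly between JUnit versions):

assertEquals("wrong grade", 'C', GradeLetters.getLetterGrade(50));
// fails with: wrong grade expected:<67> but was:<70>   ('C' is 67, 'F' is 70)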

Unit Testing Best Practices (From JUnit FAQ)

The following points are extracted from the JUnit FAQ:

  1. When should the tests be written?
    Tests should be written before the code. Good tests tell you how to best design the system for its intended use. They also prevent tendencies to over-build the software. When all the tests pass, you know you're done. Whenever a customer reports a bug, first write the necessary unit test(s) to expose the bug(s), then fix them. This makes it almost impossible for the same bug to resurface later.
  2. Do I have to write a test for everything?
    No, just test things that could reasonably break. Don't write tests that turn out to be testing the operating system, the environment, or the compiler. For example,
    public class AClass {
       int x;
       public AClass(int x) { this.x = x; }
       int getX() { return x; }
       void setX(int x) { this.x = x; }
    }
    A test that exercises getX()/setX() is merely testing the Java compiler and interpreter, i.e., it can't break unless the compiler or the interpreter breaks!
  3. How often should I run my tests?
    Run your unit tests as often as possible, ideally every time the code is changed. Run all your acceptance, integration, stress, and unit tests at least once per day (for your nightly build).

TestNG

TestNG (Test Next Generation) (@ http://testng.org/) is a testing framework inspired by JUnit and NUnit (the xUnit family), but it introduces new functionality such as dependency testing and a grouping concept to make testing easier and more powerful.

TestNG is designed to cover all types of tests: unit, integration, functional, etc.

Installing TestNG

Installing TestNG: From the TestNG download site (@ http://testng.org/doc/download.html), download the zip distribution and unzip it. The binaries (jar files) are kept in the unzipped folder.

To install TestNG Eclipse Plug-in ⇒ Launch Eclipse ⇒ Help ⇒ Install New Software ⇒ In Work with, enter http://beust.com/eclipse ⇒ Add ⇒ Select TestNG.

Using TestNG: To use TestNG, include the TestNG jar-file in the CLASSPATH.

In Eclipse, right-click on the project ⇒ Add Library ⇒ TestNG.

API Documentation: The TestNG API documentation is available @ http://testng.org/javadocs/.

Getting Started with TestNG with Example

I shall assume that you are familiar with JUnit 4.

Let's use TestNG (instead of JUnit) to test the MyNumber class written in the earlier section.

In Eclipse, right-click on the project ⇒ New ⇒ TestNG ⇒ TestNG class.

import static org.testng.Assert.*;
import org.testng.annotations.*;

public class MyNumberTestNGTest {
   private MyNumber number1, number2;

   @BeforeClass
   public void oneTimeSetUp() {
      System.out.println("@BeforeClass - oneTimeSetUp");
   }
   @AfterClass
   public void oneTimeTearDown() {
      System.out.println("@AfterClass - oneTimeTearDown");
   }
   @BeforeMethod
   public void setUp() {
      number1 = new MyNumber();
      number2 = new MyNumber();
      System.out.println("@BeforeMethod - setUp before each test");
   }
   @AfterMethod
   public void tearDown() {
      System.out.println("@AfterMethod - tearDown before each test");
   }
   @Test
   public void testAdd() {
      System.out.println("@Test - testAdd");
      number1.setNumber(1);
      number2.setNumber(2);
      assertEquals(number1.add(number2).getNumber(), 3);
   }
   @Test(expectedExceptions = IllegalArgumentException.class)
   public void testDiv() {
      System.out.println("@Test - testDiv with exception");
      number1.setNumber(1);
      number2.setNumber(0);
      number1.div(number2);
   }
}

To run the test case under Eclipse, right-click on the file ⇒ Run as ⇒ TestNG Test.

@BeforeClass - oneTimeSetUp @BeforeMethod - setUp before each test @Test - testAdd @AfterMethod - tearDown before each test @BeforeMethod - setUp before each test @Test - testDiv with exception @AfterMethod - tearDown before each test @AfterClass - oneTimeTearDown PASSED: testAdd PASSED: testDiv =============================================== Default test Tests run: 2, Failures: 0, Skips: 0 ===============================================

As seen from the output, the @BeforeClass annotated method is run ONCE for one-time setup, and the @AfterClass method is run ONCE for one-time tear down. The @BeforeMethod and @AfterMethod methods (called @Before and @After in JUnit 4) are run before and after EACH @Test.

So far, everything is similar to JUnit 4, except some name changes.

Running TestNG Test Cases

There are several ways to run test case for TestNG:

  • With an XML description file.
  • With ANT build tool.
Via TestNG XML Description File

TestNG (compared with JUnit) introduces an XML description to describe test suite/test cases to provide more flexibility in running tests.

Prepare the following XML description file (say, "testing.xml"), which describes a test suite comprising test cases. Each test case comprises one or more Java classes.

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE suite SYSTEM "http://testng.org/testng-1.0.dtd" >
<suite name="MyNumberTestSuite">
   <test name="MyNumberTest">
      <classes>
         <class name="MyNumberTestNGTest"/>
      </classes>
   </test>
</suite>

You can run the test suite in command-line:

// Set CLASSPATH to include the TestNG jar-file // For Linux/Mac with bash shell $ export CLASSPATH=.:$CLASSPATH:/path/to/testng-6.8/testng-6.8.jar // For Windows > set CLASSPATH=.;%CLASSPATH%;x:\path\to\testng-6.8\testng-6.8.jar // Compile the Java test classes $ javac MyNumberTestNGTest.java // Run the test thru XML description file $ java org.testng.TestNG testing.xml @BeforeClass - oneTimeSetUp @BeforeMethod - setUp before each test @Test - testAdd @AfterMethod - tearDown before each test @BeforeMethod - setUp before each test @Test - testDiv with exception @AfterMethod - tearDown before each test @AfterClass - oneTimeTearDown =============================================== MyNumberTestSuite Total tests run: 2, Failures: 0, Skips: 0 ===============================================

In Eclipse, to run a suite description file ⇒ Run ⇒ Run Configurations ⇒ Suite ⇒ Select the desired XML description file.

The XML description file has this syntax:

  1. The root tag is <suite>.
  2. The <suite> tag can contain one or more <test> tags.
  3. The <test> tag can contain one or more <classes> tags.
  4. The <classes> tag can contain one or more <class> tags.
Via ANT Script

[TODO]

TestNG's Annotations

Name - Description
  • @Test: Mark a method (or class) as a test method (or class).
  • @BeforeSuite / @AfterSuite: Run ONCE before and after all tests in this suite.
  • @BeforeClass / @AfterClass: Run ONCE before and after all tests in this class.
  • @BeforeMethod / @AfterMethod: Run before and after EACH @Test method.
  • [TODO]
  • [TODO]
  • @Parameters: Mark this test method as one that gets its parameters from the XML description file.
  • @DataProvider: Mark the method, which returns an Object[][], as the data source for a test method.
[TODO]
[TODO]

TestNG - Exception Test

Mark a test method that is expected to throw an exception with the expectedExceptions attribute of @Test, as seen in the above example.

JUnit 4 uses the @Test annotation's expected attribute instead.

TestNG - Ignore Test

To ignore a test, mark it with the @Test annotation's enabled = false attribute.

JUnit 4 uses a dedicated @Ignore annotation to override the @Test annotation. A small sketch of the TestNG form is shown below.
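@Test(enabled = false)
public void notReadyYet() {
   // this method will not be run by TestNG
}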

TestNG - Timing Test

To set a timeout (in milliseconds) for a test, use the @Test annotation's timeOut attribute (analogous to JUnit 4's timeout attribute). A small sketch is shown below.
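@Test(timeOut = 1000)   // fail the test if it runs for more than 1000 ms
public void mustFinishWithinOneSecond() {
   while (true) {}      // this will be aborted and reported as a failure
}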

TestNG - Parameterized Test

Via @Parameters and the XML Description File <parameter> Tag

The test class is as follows:

import static org.testng.Assert.*;
import org.testng.annotations.*;

public class TestNGParameterizedTest1 {
   private MyNumber number1 = new MyNumber();
   private MyNumber number2 = new MyNumber();

   @Test
   @Parameters(value = {"value1", "value2", "result"})
   public void testAdd(int value1, int value2, int result) {
      System.out.println("value1=" + value1 + " value2=" + value2 + " result=" + result);
      number1.setNumber(value1);
      number2.setNumber(value2);
      assertEquals(number1.add(number2).getNumber(), result);
   }
}

Mark the parameterized test method with the annotation @Parameters(value = xxx), where xxx is a String[] of parameter names. The values will be passed into the arguments of the method in the same order.

The parameters are fed from the XML description file with the <parameter> tag. For example,

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE suite SYSTEM "http://testng.org/testng-1.0.dtd" >
<suite name="MyNumberTestSuite">
   <test name="MyNumberTest">
      <parameter name="value1" value="11" />
      <parameter name="value2" value="22" />
      <parameter name="result" value="33" />
      <classes>
         <class name="TestNGParameterizedTest1"/>
      </classes>
   </test>
</suite>
Via the @DataProvider

The @Parameters annotation can only be used to pass simple types (such as int and String). To pass objects, you need to use the @DataProvider annotation.

For example,

import static org.testng.Assert.*;
import org.testng.annotations.*;

public class TestNGParameterizedTest2 {
   @Test(dataProvider = "testAddDataProvider")
   public void testAdd(MyNumber number1, MyNumber number2, int result) {
      System.out.println("number1=" + number1.getNumber()
            + " number2=" + number2.getNumber() + " result=" + result);
      assertEquals(number1.add(number2).getNumber(), result);
   }

   @DataProvider(name = "testAddDataProvider")
   public Object[][] parameterIntTestProvider() {
      return new Object[][] {
         {new MyNumber(11), new MyNumber(22), 33},
         {new MyNumber(111), new MyNumber(222), 333},
         {new MyNumber(1111), new MyNumber(2222), 3333}
      };
   }
}
number1=11 number2=22 result=33 number1=111 number2=222 result=333 number1=1111 number2=2222 result=3333 PASSED: testAdd(MyNumber@1e53fc13, MyNumber@1bca52f3, 33) PASSED: testAdd(MyNumber@74b1896c, MyNumber@33b54d4e, 333) PASSED: testAdd(MyNumber@15e19d13, MyNumber@f0f559e, 3333) =============================================== Default test Tests run: 3, Failures: 0, Skips: 0 ===============================================

TestNG - Method Dependency Test

TestNG (compared with JUnit) introduces test dependency. For example,

import static org.testng.Assert.*;
import org.testng.annotations.*;

public class TestNGDependsTest {
   @Test
   public void method1() {
      System.out.println("@Test: method1");
      assertTrue(true);
   }
   @Test(dependsOnMethods = {"method1"})
   public void method2() {
      System.out.println("@Test: method2");
      assertTrue(true);
   }
   @Test(dependsOnMethods = {"method1", "method2"})
   public void method3() {
      System.out.println("@Test: method3");
   }
}
@Test: method1 @Test: method2 @Test: method3 PASSED: method1 PASSED: method2 PASSED: method3 =============================================== Default test Tests run: 3, Failures: 0, Skips: 0 ===============================================

In method1(), if we change assertTrue(true) to assertTrue(false) to fail the test, method2() and method3() will not be run, but will be marked as skipped (instead of failed, as in JUnit 4), as shown in the following output:

@Test: method1 FAILED: method1 java.lang.AssertionError: expected [true] but found [false] SKIPPED: method2 SKIPPED: method3 =============================================== Default test Tests run: 3, Failures: 1, Skips: 2 ===============================================

TestNG - Group Test and Dependency

Each test method can be assigned to one or more groups. We can select one or more groups to test via XML description file. For example,

import org.testng.annotations.*;

public class TestNGGroupTest {
   @Test(groups = {"init"})
   public void method1() {
      System.out.println("@Test: method1");
   }
   @Test(groups = {"init", "post-init"})
   public void method2() {
      System.out.println("@Test: method2");
   }
   @Test(groups = {"main"})
   public void method3() {
      System.out.println("@Test: method3");
   }
}

Here is the XML description file to run the methods in group "init" only:

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE suite SYSTEM "http://testng.org/testng-1.0.dtd" >
<suite name="MyNumberTestSuite">
   <test name="MyNumberTest">
      <groups>
         <run>
            <include name="init" />
         </run>
      </groups>
      <classes>
         <class name="TestNGGroupTest"/>
      </classes>
   </test>
</suite>
Dependency on Groups

Instead of specifying dependency on individual method names as in the previous section, we can place related methods (e.g., init methods) in groups, and specify dependency on groups of methods. For example,

import static org.testng.Assert.*;
import org.testng.annotations.*;

public class TestNGGroupDependsTest {
   @Test(groups = {"init"})
   public void method1() {
      System.out.println("@Test: method1");
      assertTrue(true);
   }
   @Test(groups = {"init", "post-init"})
   public void method2() {
      System.out.println("@Test: method2");
   }
   @Test(groups = {"main"}, dependsOnGroups = {"init"})
   public void method3() {
      System.out.println("@Test: method3");
   }
}

If we use assertTrue(true) in method1(), method3() will be run. However, if we use assertTrue(false) in method1(), method3() will be skipped.
