There are various types of tests and every team has its own terminology. Below you'll find the XWiki terminology and best practices related to testing.

Check the general development flow to understand better how testing fits in the larger picture.

Here's the general methodology used:

  • Committed code must have associated automated tests. There's a check in the build (in the quality profile) to ensure that committed code does not reduce the total test coverage for its module. The CI runs the quality profile for each commit and fails the build if the test coverage is reduced.
  • Developers run unit and integration tests on their machines daily and frequently.
  • Functional tests are executed by the CI, at each commit.
  • Performance tests are executed manually several times per year (usually for LTS releases and sometimes for stable releases too).
  • Automated tests are currently running in a single environment (HSQLDB/Jetty/Firefox). Manual tests are executed at each release by dedicated contributors, who make sure to run them on various combinations of databases, servlet containers and browser versions.
  • Responsiveness tests (e.g. verify that XWiki displays fine on a Mobile) are currently executed manually from time to time by dedicated contributors.
  • Accessibility tests are performed automatically at each commit.
  • HTML5 validity tests are performed automatically at each commit.

Unit Testing

A unit test only tests a single class in isolation from other classes. Since in the XWiki project we write code using Components, this means a unit test is testing a Component in isolation from other Components.

Java Unit Testing

  • These are tests performed in isolation using Mock Objects. More specifically we're using JUnit 5.x and Mockito (we were using JMock 2.x before and still have a lot of tests not converted to Mockito yet)
  • These tests must not interact with the environment (Database, Container, File System, etc) and do not need any setup to execute
  • These tests must not output anything to stdout or stderr (or the build will fail). Note that if the code under test outputs logs, they need to be captured and possibly asserted. For example:

    // Capture logs at WARN level and above.
    @RegisterExtension
    static LogCaptureExtension logCapture = new LogCaptureExtension(LogLevel.WARN);
    ...
    assertEquals("Error getting resource [bad resource] because of invalid path format. Reason: [invalid url]",
        logCapture.getMessage(0));

    Note that the JUnit5 extension must be used with @RegisterExtension on a static variable; if you want to remove the static keyword, you'll need to annotate the test class with @TestInstance(TestInstance.Lifecycle.PER_CLASS).

  • Your Maven module must depend on the xwiki-commons-tool-test-simple (for tests not testing Components) or xwiki-commons-tool-test-component (for tests testing Components) modules.
    • If you're testing Components, use @ComponentTest, @MockComponent & @InjectMockComponents:
      • Example 1
        @ComponentTest
        // Real component, not mock (Component3Impl is an example implementation)
        @ComponentList(Component3Impl.class)
        public class MyTest
        {
            // Pure Mockito mock
            @Mock
            private List<String> list;

            // Mock of the Component1Role component
            @MockComponent
            private Component1Role component1;

            // Component under test; its @Inject-ed dependencies are injected as mocks
            @InjectMockComponents
            private Component4Impl component4;

            @BeforeComponent
            public void beforeComponent()
            {
                // Called before @InjectMockComponents creates mocks for all @Inject in the component implementation.
            }

            @BeforeEach
            public void before(MockitoComponentManager componentManager) { }

            @Test
            public void test1(MockitoComponentManager componentManager) { }

            @Test
            public void test2(ComponentManager componentManager) { }

            @Test
            public void test3() { }
        }
      • Example 2
        @ComponentTest
        public class DefaultVelocityConfigurationTest
        {
            @InjectMockComponents
            private DefaultVelocityConfiguration configuration;

            @BeforeComponent
            public void configure(ComponentManager componentManager) throws Exception
            {
                ConfigurationSource source = componentManager.getInstance(ConfigurationSource.class);
                when(source.getProperty("velocity.tools", Properties.class)).thenReturn(new Properties());
                when(source.getProperty("velocity.properties", Properties.class)).thenReturn(new Properties());
            }

            @Test
            public void testDefaultToolsPresent() throws Exception
            {
                // Verify for example that the List tool is present.
                assertEquals(ListTool.class.getName(), this.configuration.getTools().get("listtool"));
            }
        }
      • Example 3:
        @ComponentTest
        public class DefaultVelocityContextFactoryTest
        {
            @MockComponent
            private VelocityConfiguration configuration;

            @InjectComponentManager
            private ComponentManager componentManager;

            @InjectMockComponents
            private DefaultVelocityContextFactory factory;

            @BeforeEach
            public void configure() throws Exception
            {
                Properties properties = new Properties();
                properties.put("listtool", ListTool.class.getName());
                when(this.configuration.getTools()).thenReturn(properties);
            }

            @Test
            public void createDifferentContext() throws Exception
            {
            }
        }
    • If you're testing non Components, just use pure JUnit5 and Mockito (you're on your own!). Don't forget that you can use Mockito's @Mock annotation too.
  • To test oldcore code, use the @OldcoreTest annotation. This will internally initialize a MockitoOldcore object that can be used in the tests (you get access to it by accepting it as a method parameter). For example:
    @OldcoreTest
    public class WikiUIExtensionComponentBuilderTest implements WikiUIExtensionConstants
    {
        @InjectMockComponents
        private WikiUIExtensionComponentBuilder builder;

        private XWikiDocument componentDoc;

        @BeforeEach
        public void configure(MockitoComponentManager componentManager, MockitoOldcore oldcore) throws Exception
        {
            // Required by BaseObjectReference
            DocumentReferenceResolver<String> resolver =
                componentManager.registerMockComponent(DocumentReferenceResolver.TYPE_STRING, "current");
            when(resolver.resolve(any(String.class), any()))
                .thenReturn(new DocumentReference("xwiki", "XWiki", "UIExtensionClass"));

            DelegateComponentManager wikiComponentManager = new DelegateComponentManager();
            wikiComponentManager.setComponentManager(componentManager);
            componentManager.registerComponent(ComponentManager.class, "wiki", wikiComponentManager);

            // Components accessed through dynamic lookup.
            VelocityManager velocityManager = componentManager.registerMockComponent(VelocityManager.class);

            ModelContext modelContext = componentManager.registerMockComponent(ModelContext.class);

            componentManager.registerMockComponent(Transformation.class, "macro");

            // The document holding the UI extension object.
            this.componentDoc = mock(XWikiDocument.class, "xwiki:XWiki.MyUIExtension");
            oldcore.getDocuments().put(DOC_REF, this.componentDoc);
        }

        @Test
        public void buildGlobalComponentsWithoutPR()
        {
            BaseObject extensionObject = createExtensionObject("id", "extensionPointId", "content", "parameters", "global");
            Throwable exception = assertThrows(WikiComponentException.class, () -> {
                this.builder.buildComponents(extensionObject);
            });
            assertEquals("Registering global UI extensions requires programming rights", exception.getMessage());
        }
    }

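The Maven dependencies mentioned above can be declared as follows in your module's pom.xml. This is a sketch: the `commons.version` property is an assumption and depends on how your parent POM defines versions.

```xml
<!-- For tests testing Components. Use xwiki-commons-tool-test-simple instead
     if your tests don't involve Components. -->
<dependency>
  <groupId>org.xwiki.commons</groupId>
  <artifactId>xwiki-commons-tool-test-component</artifactId>
  <version>${commons.version}</version>
  <scope>test</scope>
</dependency>
```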
Best practices

  • Name the test class after the class under test, suffixed with Test. For example the JUnit test class for XWikiMessageTool should be named XWikiMessageToolTest.
  • Name the test methods after the method under test, followed by a qualifier describing the test. For example importWithHeterogeneousEncodings().
  • When you're testing for an exception, use the following strategy, shown on an example:
    Throwable exception = assertThrows(IllegalArgumentException.class, () -> {
        // Content throwing exception here
    });
    assertEquals("Invalid character: the input contains U+0000.", exception.getMessage());


  • When mocking with JMock (legacy tests), to ignore all debug logging calls and to allow all calls to is*Enabled you could write:
    // Ignore all calls to debug() and enable all logs so that we can assert info(), warn() and error() calls.
    allowing(any(Logger.class)).method("is.*Enabled");
    will(returnValue(true));

Old Testing with JUnit4

This is now deprecated and you should use JUnit5 and the tools mentioned in the section just above.

Old documentation:

  • These tests must not output anything to stdout or stderr (or the build will fail). Note that if the code under test outputs logs, they need to be captured and possibly asserted. For example:

    // Capture logs.
    @Rule
    public AllLogRule logRule = new AllLogRule();
    ...
    assertEquals("Error getting resource [bad resource] because of invalid path format. Reason: [invalid url]",
        this.logRule.getMessage(0));
  • If you're testing Components, use MockitoComponentMockingRule (its javadoc explains in detail how to use it)
    • Example 1: Canonical use case
      public class DefaultDiffManagerTest
      {
          @Rule
          public final MockitoComponentMockingRule<DiffManager> mocker =
              new MockitoComponentMockingRule(DefaultDiffManager.class);

          @Test
          public void testDiffStringList() throws Exception
          {
              // Null
              DiffResult<String> result = this.mocker.getComponentUnderTest().diff(null, null, null);
              ...
          }
      }
    • Example 2: Example showing how to not mock one dependency being injected (ObservationManager is excluded from mocking in the example)
      @ComponentList({DefaultLoggerManager.class, DefaultObservationManager.class, LogbackEventGenerator.class})
      public class DefaultLoggerManagerTest
      {
          @Rule
          public final MockitoComponentMockingRule<DefaultLoggerManager> mocker =
              new MockitoComponentMockingRule<DefaultLoggerManager>(DefaultLoggerManager.class);
      }
  • If you're testing non Components, just use pure JUnit and Mockito (you're on your own!)
  • Examples:

JavaScript Unit Testing

  • These are tests that do not rely on the DOM, written as "behavioral specifications", using the Jasmine test framework.
  • In development mode, you can start the test runner process by running mvn jasmine:bdd in your command line. This will start a test runner at http://localhost:8234, that will run all tests in the src/test/javascript directory. Write your tests there and hit refresh to see the results directly in your browser.
  • For tests that need a DOM, see Functional Testing

Integration Testing

An integration test tests several classes together but without a running XWiki instance. For example if you have one Component which has dependencies on other Components and they are tested together this is called Integration testing.

Java Integration Testing

  • These tests are written using JUnit 5.x and Mockito (we were using JMock 2.x before and still have a lot of tests not converted to Mockito yet).
  • Your Maven module must depend on the xwiki-commons-test module.
  • Use the MockitoComponentMockingRule (its javadoc explains in detail how to use it).
  • Examples:
    • example2
    • Other example:
      public class DefaultExtensionLicenseManagerTest
      {
          @Rule
          public final ComponentManagerRule componentManager = new ComponentManagerRule();

          @Before
          public void setUp() throws Exception
          {
              this.licenseManager = this.componentManager.getInstance(ExtensionLicenseManager.class);
          }
      }
    • Another example (using the new @BeforeComponent annotation):
      public class DefaultExtensionManagerConfigurationTest
      {
          @Rule
          public final MockitoComponentManagerRule componentManager = new MockitoComponentManagerRule();

          private ExtensionManagerConfiguration configuration;

          private MemoryConfigurationSource source;

          @BeforeComponent
          public void registerComponents() throws Exception
          {
              // Register a mocked Environment since we need to provide one.
              this.componentManager.registerMockComponent(Environment.class);

              // Register some in-memory Configuration Source for the test
              this.source = this.componentManager.registerMemoryConfigurationSource();
          }

          @Before
          public void setUp() throws Exception
          {
              this.configuration = this.componentManager.getInstance(ExtensionManagerConfiguration.class);
          }
      }

      @BeforeComponent is used by ComponentManagerRule and methods annotated with it are called before other components are registered (i.e. before the processing of @AllComponents and @ComponentList annotations).

      MockitoComponentManagerRule extends ComponentManagerRule and just adds some helper methods to register mock components.

Best practices

Java Rendering Testing

We have a special framework for making it easy to write Rendering tests, see the Rendering Testing Framework

XAR Testing

Since XWiki 7.3M1, it's possible to write integration tests for wiki pages on the filesystem (in XML format).

The way those tests work is that the XML files representing the wiki pages are loaded from the filesystem into XWikiDocument instances, and a stubbed environment is defined so that the XWikiDocument can then be rendered in the desired syntax.

To write such a test:

  • Make your POM depend on the org.xwiki.platform:xwiki-platform-test-page module
  • Write a Java test class that extends PageTest
  • Possibly add some extra component registration that you need through existing annotations (*ComponentList annotations) or through the custom ComponentList annotation. Note that when extending PageTest this automatically brings some base components registration (the list defined in PageComponentList and ReferenceComponentList). Example of other existing annotations:
    • XWikiSyntax20ComponentList: XWiki Syntax 2.0-related components
    • XWikiSyntax21ComponentList: XWiki Syntax 2.1-related components
    • XHTML10ComponentList: XHTML 1.0-related components
  • You then verify the rendering of a page by setting the output syntax you wish to have, the query parameters and then call renderPage(). For example:
    request.put("section", "Links");
    request.put("xpage", "print");
    String result = renderPage(new DocumentReference("xwiki", "XWiki", "XWikiSyntax"));
    assertTrue("...explanation if test fails", result.contains("...content that we wish to verify..."));
  • If you need to have some other pages loaded from sources too (for example if the page you're rendering contains an include macro that loads another page), use the loadPage() API, as in:

    loadPage(new DocumentReference("xwiki", "XWiki", "XWikiSyntaxLinks"));
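The first step above translates to a test-scoped dependency in your pom.xml. This is a sketch: the `platform.version` property name is an assumption, adjust to your build's version management.

```xml
<dependency>
  <groupId>org.xwiki.platform</groupId>
  <artifactId>xwiki-platform-test-page</artifactId>
  <version>${platform.version}</version>
  <scope>test</scope>
</dependency>
```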

Here's a full example:

public class WebRssTest extends PageTest
{
    private ScriptQuery query;

    @BeforeEach
    public void setUp() throws Exception
    {
        request.put("outputSyntax", "plain");
        request.put("xpage", "plain");

        QueryManagerScriptService qmss = mock(QueryManagerScriptService.class);
        oldcore.getMocker().registerComponent(ScriptService.class, "query", qmss);
        query = mock(ScriptQuery.class);
        when(qmss.xwql("where 1=1 order by desc")).thenReturn(query);
    }

    @Test
    public void webRssFiltersHiddenDocuments() throws Exception
    {
        // Render the page to test
        renderPage(new DocumentReference("xwiki", "Main", "WebRss"));

        // This is the real test!!
        // We want to verify that the hidden document filter is called when executing the XWQL
        // query to get the list of modified pages
        verify(query).addFilter("hidden/document");
    }

    @Test
    public void webRssDisplay() throws Exception
    {
        when(query.execute()).thenReturn(Arrays.<Object>asList("Space1.Page1", "Space2.Page2"));

        FeedPlugin plugin = new FeedPlugin("feed", FeedPlugin.class.getName(), context);
        FeedPluginApi pluginApi = new FeedPluginApi(plugin, context);
        when(xwiki.getPluginApi("feed", context)).thenReturn(pluginApi);

        // Render the page to test
        String xml = renderPage(new DocumentReference("xwiki", "Main", "WebRss"));
    }
}


Functional Testing

A functional test requires a running XWiki instance.

GUI tests

We now have 2 frameworks for running GUI tests:

  • One based on Selenium2/Webdriver which is the framework to use when writing new functional UI tests.
  • One based on Selenium1 which is now deprecated and shouldn't be used. We encourage committers to port tests written for it to the Selenium2 framework, especially when they bring modifications to the old tests.

Selenium2-based Framework


  • To debug a test simply start XE somewhere and then debug your JUnit tests as a normal JUnit test in your IDE.
    • Note that functional test Maven modules will create an XWiki instance in target/xwiki thanks to the execution of the XWiki Packager plugin, so it's simpler to start this XWiki instance when debugging.
  • In order to more easily debug flickering tests you can simply add the @Intermittent annotation on your test method and it'll be executed 100 times in a row (you can also specify @Intermittent(repetition=N) to repeat it N times). This is achieved thanks to the Tempus Fugit framework that we've integrated.
  • To run a specific test, pass the pattern property (it's a regex on the test class name) as in: mvn install -Dpattern=TestClass#testName (this will run the testName test from TestClass)
  • To run the tests on your own running instance (instead of letting mvn close your instance and start a fresh one), use -Dxwiki.test.verifyRunningXWikiAtStart=true. It could be useful to verify that you have not broken the tests on your instance before committing your changes.
  • By default the Firefox browser will be used but if you wish to run with another browser just pass the browser parameter as in:
    • Firefox (default): -Dbrowser=*firefox
    • Internet Explorer: -Dbrowser=*iexplore
    • Chrome: -Dbrowser=*chrome
    • PhantomJS: -Dbrowser=*phantomjs
  • You may run into a compatibility problem between the version of the browser and the version of Selenium. For example, at the moment, Firefox 32.0 is required. You may install it on your computer and refer to it with -Dwebdriver.firefox.bin="/path/to/your/firefox-32.0"
  • We check for pages that require Programming Rights automatically by bundling a Listener component that listens to the ScriptEvaluatingEvent event and drops Programming Rights, in an effort to make the tests fail so that developers notice that PR is required and can fix it. Where necessary, this can be disabled by setting <testProgrammingRights>false</testProgrammingRights> in the configuration of the Package Mojo, and/or by using a System Property to control where the check is performed, e.g. <xwikiPropertiesAdditionalProperties>test.prchecker.excludePattern=.*:XWiki\.XWikiPreferences</xwikiPropertiesAdditionalProperties>. Since XWiki 9.8RC1
  • If the xwiki.test.startXWiki system property is set to true then the test itself will start/stop XWiki. If set to false then it's the responsibility of the build to start/stop XWiki. Useful when starting/stopping XWiki inside a Docker container handled by the Maven build, for example. Since XWiki 10.0

Best practices

  • Tests are located in xwiki-platform inside the module that they are testing. Note that in the past we were putting functional tests in xwiki-platform-distribution/xwiki-platform-distribution-flavor/xwiki-platform-distribution-flavor-test but we have started to move them inside the specific modules they are testing.
  • Name the tests using <prefix>
  • Use a test suite to ensure that XWiki is started/stopped only once for all tests in the module:
    public class AllITs
  • Use the maven-failsafe-plugin for functional UI tests:
  • Locate tests in src/test/java (i.e. the default Maven location for tests)
  • Tests should be written using the Page Objects Pattern:
  • Since functional tests take a long time to execute (XWiki to start, Browser to start, navigation, etc) it's important to write tests that execute as fast as possible. Here are some tips:
    • Write scenarios (i.e. write only a few test methods, even only one if you can so that you can have a single fixture) instead of a lot of individual tests as you'd do for unit tests.
    • Use getUtil() to perform quick actions that are not part of what you wish to test (i.e. that are part of the test fixture). For example to create a page you need in your fixture, write:
      getUtil().createPage(SPACE_NAME, DOC_NAME, "content", "title");
      instead of
      WikiEditPage wep = new WikiEditPage();
      wep.switchToEdit(SPACE_NAME, DOC_NAME);
      ViewPage vp = wep.clickSaveAndView();
    • Your test Maven module may be bound to a specific profile (as a best practice, use integration-tests). This way, it is built only when specifically asked for with -Pintegration-tests (see below for an example of the pom.xml)
  • Never, ever, wait on a timer! Instead wait on elements.
  • If you need to create, update or delete a page during your tests, use a space name specific to your test scenario with getTestClassName(). For example:
    getUtil().createPage(getTestClassName(), "Page", "Content", "Title");
  • Tests must not depend on one another. In other words, it should be possible to execute tests in any order and running only one test class should work fine.
  • Tests that need to change existing configuration (e.g. change the default language, set specific rights, etc) must put the configuration back as it was. This only matters in flavor tests or when functional tests of different domains are executed one after another. Functional tests located in xwiki-platform specific modules only run the tests for their own module, so skipping the cleanup there is acceptable and saves time.
  • Tests are allowed to create new documents and don't need to remove them at the end of the test.
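To illustrate the maven-failsafe-plugin and integration-tests profile best practices above, here's a minimal pom.xml sketch. The plugin wiring shown is an assumption; check an existing functional test module in xwiki-platform for the real configuration.

```xml
<profiles>
  <!-- Functional tests are only built/run when the profile is activated
       with -Pintegration-tests -->
  <profile>
    <id>integration-tests</id>
    <build>
      <plugins>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-failsafe-plugin</artifactId>
          <executions>
            <execution>
              <goals>
                <goal>integration-test</goal>
                <goal>verify</goal>
              </goals>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>
  </profile>
</profiles>
```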

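The Page Objects Pattern mentioned above can be sketched as follows. This is a self-contained toy with a stubbed Driver interface standing in for Selenium's WebDriver; all class and method names here are hypothetical, not XWiki's actual page objects.

```java
// Stand-in for Selenium's WebDriver so the sketch is self-contained.
interface Driver {
    void click(String locator);
    void type(String locator, String text);
}

// A page object encapsulates one page's locators and exposes user-level
// actions; actions that navigate return the destination page's page object.
class LoginPage {
    private final Driver driver;

    LoginPage(Driver driver) {
        this.driver = driver;
    }

    HomePage loginAs(String user, String password) {
        this.driver.type("id=user", user);
        this.driver.type("id=password", password);
        this.driver.click("id=submit");
        return new HomePage(this.driver);
    }
}

class HomePage {
    private final Driver driver;

    HomePage(Driver driver) {
        this.driver = driver;
    }
}

public class PageObjectDemo {
    public static void main(String[] args) {
        // A trivial driver stub that just logs the interactions.
        Driver driver = new Driver() {
            public void click(String locator) { System.out.println("click " + locator); }
            public void type(String locator, String text) { System.out.println("type " + locator); }
        };
        // The test scenario reads as user intent, not raw element manipulation.
        HomePage home = new LoginPage(driver).loginAs("Admin", "admin");
    }
}
```

The benefit is that when the UI changes, only the page object needs updating, not every test.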
Examples of functional tests:

The Office Importer tests

In XWiki 7.3, we have introduced in xwiki-platform some functional tests for the Office Importer Application. To enable them, you need to enable the profile office-tests. An OpenOffice (or LibreOffice) server is needed on your system. You might also need to set an environment variable that points to the office home path, if not standard. This variable is called XWIKI_OFFICE_HOME and can be set like this:

## For Unix systems:
export XWIKI_OFFICE_HOME="/opt/libreoffice3.6"

You should set this environment variable in your CI agents.

Old Selenium1-based Framework

  • We were using Selenium RC to perform functional tests for GUIs. We had created a JUnit extension to easily write Selenium tests in Java.
  • To run these tests on your local machine go to xwiki-enterprise/xwiki-enterprise-test/xwiki-enterprise-test-selenium and type mvn install.
  • To run a specific test, pass the pattern property as in: mvn install -Dpattern=DeletePageTest (this will run the DeletePageTest - note that you don't have to specify the extension). In addition if you wish to execute only a specific method from a Test Case class, you can pass the patternMethod property as in: mvn install -Dpattern=DeletePageTest -DpatternMethod=testDeletePageCanDoRedirect.
  • To enable debug output from the Selenium server, start Maven with the -Ddebug=true switch; all messages from the Selenium server are then saved to xwiki-enterprise/xwiki-enterprise-test/xwiki-enterprise-test-selenium/target/selenium/server.log.
  • To debug a functional Selenium test in your favourite Java IDE, go to xwiki-enterprise/xwiki-enterprise-test/xwiki-enterprise-test-selenium and run Maven with -Dmaven.surefire.debug. Maven will wait until you connect your IDE to the running JVM on port 5005. You can then put a breakpoint in your IDE and debug the test.

Browser version

Currently we only run our functional tests on the Firefox browser.
The browser you use for functional tests needs to match the selenium version, otherwise unpredictable results may occur.

  • The current Selenium version we are using is 2.44.0
    • (valid for 18.March.2015. Actual version used can be verified here under the selenium.version property)
  • The Firefox version we use in our continuous integration agents is 32.0.1.
    • (valid for 18.March.2015. Ask on the list or on IRC for the actual used version since it's not publicly verifiable) 
    • To determine browser compatibility with the Selenium version used, scan Selenium's changelog and look for entries like "* Updating Native events to support Firefox 24, 31, 32 and 33". That shows the supported browser version for the particular Selenium version.

If you wish to run tests with the exact configuration as XWiki's Continuous Integration server uses, you need to install and use locally the same Firefox version. To do so, you have to:

  1. Download the corresponding Firefox release
  2. Unzip it locally
  3. Use the webdriver.firefox.bin java system property to specify the location of your firefox version
    1. Depending on how you are starting the functional tests, you'd have to either add the system property in your Maven build (surefire plugin configuration) or in your IDE (run configuration)
    2. Read Selenium's FirefoxDriver documentation for more information and options

XHTML, CSS & WCAG Validations

Performance Testing

  • These are memory leakage tests, load tests and speed of execution tests.
  • They are performed manually and in an ad hoc fashion for now. They are executed for some stable versions and for all super stable versions.
  • See Methodology and reports.

See the Profiling topic for details on how to use a profiler for detecting performance issues.

Mutation Testing

In order to test the quality of the tests themselves, we use Pitest/Descartes. Pitest generates a mutation score, which is a measure of how well the tests catch mutations made to the code being tested.
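To make the idea concrete, here's a minimal, self-contained sketch (hypothetical code, not from XWiki) of what a mutation is and what it means for a test to "kill" it. Descartes performs "extreme" mutations that replace whole method bodies with constants:

```java
public class MutationDemo {
    // Method under test.
    static int max(int a, int b) {
        return a > b ? a : b;
    }

    // The kind of mutant Descartes generates: the whole body replaced by a constant.
    static int maxMutant(int a, int b) {
        return 0;
    }

    public static void main(String[] args) {
        // A weak test that only checks the call doesn't throw would survive the mutant.
        maxMutant(3, 5);

        // A test asserting the actual value kills the mutant, raising the mutation score.
        if (max(3, 5) != 5) {
            throw new AssertionError("real code failed");
        }
        if (maxMutant(3, 5) == 5) {
            throw new AssertionError("mutant survived: the test suite is too weak");
        }
        System.out.println("mutant killed");
    }
}
```

A low mutation score means many such mutants survive, i.e. the tests execute the code but don't assert enough about its behavior.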

If you add new tests, verify that the mutation score doesn't go down by running:

mvn install -Pquality -Dxwiki.pitest.skip=false

To know more about the check, see how it's implemented in the build.

Manual testing

Here's the spirit we'd like to have from the XWiki Manual Testing Team: The Black Team.

Besides automated testing, ensuring software quality requires that features be tested by actual users (see Manual testing). In order to manage manual testing, a test plan is required. 

Tools Reference

We use the following tools in our automated tests:

  • JUnit: test framework
  • Mockito, JMock: mocking frameworks used to isolate the code under test
  • Hamcrest: extra assertions beyond what JUnit provides
  • GreenMail: for testing email sending
  • WireMock: for simulating HTTP connections to remote servers
  • JMeter: for performance tests
  • Dumbbench: for manual performance tests

Test Coverage

We now have a SonarQube instance showing Test coverage for both unit tests and integration tests. However it doesn't aggregate coverage data across top level modules (commons, rendering, platform, enterprise, etc).

We support both Jacoco and Clover to generate test coverage reports.

To generate test coverage reports make sure you can build your module and then pick one of the following strategies below depending on what you wish to generate.

After running Clover you'll have some instrumented JARs in your local repository, so be careful not to use them later on (when deploying, for example). And if you copy them into an existing XWiki setup in WEB-INF/lib you'll get a runtime error saying that the Clover JAR is missing...

Single Maven Reactor

Using Jacoco

  • Go in the first top level module (e.g. xwiki-commons) and run: mvn clean jacoco:prepare-agent install -Djacoco.destFile=/tmp/jacoco.exec -Djacoco.append=false -Plegacy,integration-tests -Dxwiki.revapi.skip=true
  • Go in all the other top level modules you wish to add and run: mvn clean jacoco:prepare-agent install -Djacoco.destFile=/tmp/jacoco.exec -Djacoco.append=true -Plegacy,integration-tests  -Dxwiki.revapi.skip=true
  • Then whenever you wish to generate the full test report, run:
    Jacoco supports generating a report from a single module, and it even supports generating an aggregated report from several modules using the report-aggregate mojo introduced in 0.7.7 (but note that it's unclear whether that includes coverage induced by module B on module A). However Jacoco doesn't seem to support executing several Maven reactors (for example for building code located in various GitHub repositories) with the same Jacoco exec file and then generating a report from that. See also https://groups.google.com/forum/#!topic/jacoco/odVzr7P5i6w

    If you try to run mvn jacoco:report -Djacoco.destFile=/tmp/jacoco.exec -N you get the error: "Skipping JaCoCo execution due to missing execution data file.".

Using Clover

Go to the top level module containing children modules for which to generate a Clover report. Note that you won't be able to aggregate Clover data across different Maven runs with this strategy so you really need a single parent module.

  1. Run the following command:
    mvn clean clover:setup install clover:aggregate clover:clover -Pclover,integration-tests,dev,jetty,hsqldb -Dxwiki.revapi.skip=true
    You might need to run the "install" goal instead of the "test" one if your local Maven repository doesn't already contain some test jars (apparently and for some reason Maven won't download them from the remote repository under some conditions).

Multiple Maven Reactors

Using Jacoco


Using Clover

Use a single Clover database to which you add coverage information as you build modules one after another. This strategy is especially useful when you wish to manually run some modules and ensure that coverage data aggregate in a single place so that when you generate the report you have the result of all your runs.

  1. Instrument the source code with Clover for all modules that you want to include in the report, using:
    mvn clover:setup install -Pclover,integration-tests,dev,jetty,hsqldb -Dmaven.clover.cloverDatabase=/path/to/user/home/.xwiki/clover/clover.db  -Dxwiki.revapi.skip=true

    When tests are executed they'll generate coverage data in the specified Clover database. Since there's a single Clover database, there's no need to merge Clover databases as in strategy 1 above.

  2. To generate the Clover report, execute the following from any module:
    mvn clover:clover -N -Dmaven.clover.cloverDatabase=/path/to/user/home/.xwiki/clover/clover.db
  3. Remember to clean your clover database when you're done.
If you don't wish failing tests to stop the generation of the coverage report, you should pass -Dmaven.test.failure.ignore=true on the command line.

Here are typical steps you'd follow to generate full TPC for XWiki:

  • Clean your local repo and remove any previous clover DBs:
    rm -R ~/.m2/repository/org/xwiki
    rm -R ~/.m2/repository/com/xpn
    rm -R ~/.xwiki/clover
  • Generate coverage data for XWiki Commons:
    cd xwiki-commons
    mvn clean -Pclover,integration-tests -o
    mvn clover:setup install -Pclover,integration-tests -Dmaven.clover.cloverDatabase=/Users/vmassol/.xwiki/clover/clover.db -Dmaven.test.failure.ignore=true  -Dxwiki.revapi.skip=true -o -nsu
  • Generate Clover report just for Commons:
    mvn clover:clover -N -Dmaven.clover.cloverDatabase=/Users/vmassol/.xwiki/clover/clover.db -o -nsu
  • Generate coverage data for XWiki Rendering:
    cd xwiki-rendering
    mvn clean -Pclover,integration-tests -o
    mvn clover:setup install -Pclover,integration-tests -Dmaven.clover.cloverDatabase=/Users/vmassol/.xwiki/clover/clover.db -Dmaven.test.failure.ignore=true  -Dxwiki.revapi.skip=true -o -nsu
  • Generate Clover report for Commons and Rendering:
    mvn clover:clover -N -Dmaven.clover.cloverDatabase=/Users/vmassol/.xwiki/clover/clover.db -o -nsu
  • Generate coverage data for XWiki Platform:
    cd xwiki-platform
    mvn clean -Pclover,integration-tests -o
    mvn clover:setup install -Pclover,integration-tests -Dmaven.clover.cloverDatabase=/Users/vmassol/.xwiki/clover/clover.db -Dmaven.test.failure.ignore=true  -Dxwiki.revapi.skip=true -o -nsu
  • Generate Clover report for Commons, Rendering and Platform:
    mvn clover:clover -N -Dmaven.clover.cloverDatabase=/Users/vmassol/.xwiki/clover/clover.db -o -nsu
  • Generate coverage data for XWiki Enterprise:
    cd xwiki-enterprise
    mvn clean -Pclover,integration-tests,jetty,hsqldb -o
    mvn clover:setup install -Pclover,integration-tests,jetty,hsqldb -Dmaven.clover.cloverDatabase=/Users/vmassol/.xwiki/clover/clover.db  -Dmaven.test.failure.ignore=true -Dxwiki.revapi.skip=true -o -nsu
  • Generate full Clover report (for Commons, Rendering, Platform and Enterprise):
    mvn clover:clover -N -Dmaven.clover.cloverDatabase=/Users/vmassol/.xwiki/clover/clover.db -o -nsu

Using Clover + Jenkins

  • Install Jenkins 2.0+ and the following plugins:
    • Pipeline plugin
    • XVnc plugin
  • Make sure Maven (the "mvn" executable) and a VNC server (the "vncserver" executable) are installed on the build agent
  • The build agent must be running a Unix-like OS
  • Set up a Pipeline job and make it point to the following pipeline script.

Compute Differences between reports

To learn more about the script below and why it's done this way, check this blog post.

Put the following in a wiki page (adjust the URLs first if need be):

def saveMetrics(def packageName, def metricsElement, def map) {
    def coveredconditionals = metricsElement.@coveredconditionals.toDouble()
    def coveredstatements = metricsElement.@coveredstatements.toDouble()
    def coveredmethods = metricsElement.@coveredmethods.toDouble()
    def conditionals = metricsElement.@conditionals.toDouble()
    def statements = metricsElement.@statements.toDouble()
    def methods = metricsElement.@methods.toDouble()
    // If the package was already seen (e.g. in both project and testproject), aggregate.
    def mapEntry = map.get(packageName)
    if (mapEntry) {
        coveredconditionals += mapEntry.get('coveredconditionals')
        coveredstatements += mapEntry.get('coveredstatements')
        coveredmethods += mapEntry.get('coveredmethods')
        conditionals += mapEntry.get('conditionals')
        statements += mapEntry.get('statements')
        methods += mapEntry.get('methods')
    }
    def metrics = [:]
    metrics.put('coveredconditionals', coveredconditionals)
    metrics.put('coveredstatements', coveredstatements)
    metrics.put('coveredmethods', coveredmethods)
    metrics.put('conditionals', conditionals)
    metrics.put('statements', statements)
    metrics.put('methods', methods)
    map.put(packageName, metrics)
}

def scrapeData(url) {
    def root = new XmlSlurper().parseText(url.toURL().text)
    def map = [:]
    root.project.package.each() { packageElement ->
        def packageName = packageElement.@name
        saveMetrics(packageName.text(), packageElement.metrics, map)
    }
    root.testproject.package.each() { packageElement ->
        def packageName = packageElement.@name
        saveMetrics(packageName.text(), packageElement.metrics, map)
    }
    return map
}

def computeTPC(def map) {
    def tpcMap = [:]
    def totalcoveredconditionals = 0
    def totalcoveredstatements = 0
    def totalcoveredmethods = 0
    def totalconditionals = 0
    def totalstatements = 0
    def totalmethods = 0
    map.each() { packageName, metrics ->
        def coveredconditionals = metrics.get('coveredconditionals')
        totalcoveredconditionals += coveredconditionals
        def coveredstatements = metrics.get('coveredstatements')
        totalcoveredstatements += coveredstatements
        def coveredmethods = metrics.get('coveredmethods')
        totalcoveredmethods += coveredmethods
        def conditionals = metrics.get('conditionals')
        totalconditionals += conditionals
        def statements = metrics.get('statements')
        totalstatements += statements
        def methods = metrics.get('methods')
        totalmethods += methods
        def elementsCount = conditionals + statements + methods
        def tpc
        if (elementsCount == 0) {
            tpc = 0
        } else {
            tpc = ((coveredconditionals + coveredstatements + coveredmethods)/(conditionals + statements + methods)).trunc(4) * 100
        }
        tpcMap.put(packageName, tpc)
    }
    tpcMap.put("ALL", ((totalcoveredconditionals + totalcoveredstatements + totalcoveredmethods)/
        (totalconditionals + totalstatements + totalmethods)).trunc(4) * 100)
    return tpcMap
}

// map1 = old
def map1 = computeTPC(scrapeData('')).sort()

// map2 = new
def map2 = computeTPC(scrapeData('')).sort()

println "= Added Packages"
println "|=Package|=TPC New"
map2.each() { packageName, tpc ->
    if (!map1.containsKey(packageName)) {
        println "|${packageName}|${tpc}"
    }
}

println "= Differences"
println "|=Package|=TPC Old|=TPC New"
map2.each() { packageName, tpc ->
    def oldtpc = map1.get(packageName)
    if (oldtpc && tpc != oldtpc) {
        def css = oldtpc > tpc ? '(% style="color:red;" %)' : '(% style="color:green;" %)'
        println "|${packageName}|${oldtpc}|${css}${tpc}"
    }
}

println "= Removed Packages"
println "|=Package|=TPC Old"
map1.each() { packageName, tpc ->
    if (!map2.containsKey(packageName)) {
        println "|${packageName}|${tpc}"
    }
}
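For readers who want to experiment with the TPC (Total Percentage of Coverage) computation outside a wiki page, here is a minimal Python sketch of the same per-package calculation. It is not part of the official tooling: the sample XML below is invented for illustration, and only mimics the project/package/metrics shape of a Clover report that the Groovy script above parses.

```python
import xml.etree.ElementTree as ET

# Invented sample mimicking the structure of a Clover XML report.
SAMPLE = """
<coverage>
  <project>
    <package name="org.xwiki.example">
      <metrics conditionals="40" coveredconditionals="30"
               statements="100" coveredstatements="80"
               methods="20" coveredmethods="10"/>
    </package>
  </project>
</coverage>
"""

KEYS = ('conditionals', 'coveredconditionals',
        'statements', 'coveredstatements',
        'methods', 'coveredmethods')

def compute_tpc(metrics):
    """TPC = covered elements / total elements, truncated to 4 decimals, as a %."""
    covered = (metrics['coveredconditionals'] + metrics['coveredstatements']
               + metrics['coveredmethods'])
    total = (metrics['conditionals'] + metrics['statements']
             + metrics['methods'])
    if total == 0:
        return 0
    # Mirror the Groovy script's trunc(4) * 100: truncate the ratio, then scale.
    return int(covered / total * 10000) / 10000 * 100

def scrape(xml_text):
    """Return a map of package name to TPC, like computeTPC(scrapeData(url))."""
    root = ET.fromstring(xml_text)
    tpc = {}
    for package in root.iter('package'):
        m = package.find('metrics')
        values = {k: float(m.get(k)) for k in KEYS}
        tpc[package.get('name')] = compute_tpc(values)
    return tpc

print(scrape(SAMPLE))  # {'org.xwiki.example': 75.0}
```

With the sample data above, 120 covered elements out of 160 give a TPC of 75.0; comparing two such maps (old report vs new report) is then a matter of diffing dictionary keys and values, which is what the wiki page script prints as added, changed and removed packages.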

Example Reports

Created by Vincent Massol on 2018/04/29 18:50
