There are various types of tests, and everyone has their own terminology. Below you'll find the XWiki terminology and best practices related to testing.

Check the general development flow to better understand how testing fits into the larger picture.

Here's the general methodology used:

  • Committed code must have associated automated tests. There's a check in the build (in the quality profile) to ensure that committed code does not reduce the total test coverage for its module. The CI runs the quality profile for each commit and fails the build if the test coverage is reduced.
    • In addition, we also check test quality by computing Mutation Testing scores for all tests, and fail the build if the global quality in a given module is reduced by a new commit (this is checked daily by some CI jobs).
  • Developers run unit and integration tests on their machines daily and frequently.
  • Functional tests are executed by the CI, at each commit.
  • Performance tests are executed manually several times per year (usually for LTS releases and sometimes for stable releases too).
  • Most automated functional tests currently run in a single environment (HSQLDB/Jetty/Firefox). However, we've started to use Docker-based testing to automatically test across several configurations, and we're migrating more and more tests to it. There are jobs in XWiki's CI that execute tests in all configurations regularly (not at each commit, but as often as possible).
    • Manual tests are also executed at each release by dedicated contributors, who manually run the tests that are not covered by the automated ones.
  • Responsiveness tests (e.g. verifying that XWiki displays correctly on a mobile device) are currently executed manually, from time to time, by dedicated contributors.
  • Accessibility tests are performed automatically at each commit.
  • HTML5 validity tests are performed automatically at each commit.

Unit Testing

Java Unit Testing

See Java Unit Testing

JavaScript Unit Testing

  • These are tests that do not rely on the DOM, written as "behavioral specifications", using the Jasmine test framework.
  • In development mode, you can start the test runner process by running mvn jasmine:bdd in your command line. This will start a test runner at http://localhost:8234, that will run all tests in the src/test/javascript directory. Write your tests there and hit refresh to see the results directly in your browser.
  • For tests that need a DOM, see Functional Testing

Integration Testing

An integration test tests several classes together, but without a running XWiki instance. For example, if you have one Component that has dependencies on other Components and they are tested together, this is called integration testing.
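The idea can be sketched in plain Java, outside any XWiki API (all names below are hypothetical, for illustration only): two real components are wired together and exercised by one test, with no running server and no mocks for the components under test.

```java
// Hypothetical components (not XWiki APIs) illustrating integration testing:
// DefaultGreetingService depends on another component (Translator), and the
// test exercises both real implementations together.
interface Translator {
    String translate(String key);
}

class MemoryTranslator implements Translator {
    public String translate(String key) {
        return "greeting".equals(key) ? "Hello" : key;
    }
}

class DefaultGreetingService {
    private final Translator translator;

    DefaultGreetingService(Translator translator) {
        this.translator = translator;
    }

    String greet(String name) {
        return this.translator.translate("greeting") + ", " + name;
    }
}

public class GreetingIntegrationTest {
    static String runScenario() {
        // Wire the real components together, as an integration test would.
        return new DefaultGreetingService(new MemoryTranslator()).greet("XWiki");
    }

    public static void main(String[] args) {
        if (!"Hello, XWiki".equals(runScenario())) {
            throw new AssertionError("unexpected greeting: " + runScenario());
        }
        System.out.println(runScenario());
    }
}
```

In a real XWiki integration test the wiring is done by the component manager rather than by hand, but the principle is the same: the real implementations are tested together.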

Java Integration Testing

  • These tests are written using JUnit 5.x and Mockito (we were using JMock 2.x before and still have a lot of tests not converted to Mockito yet).
  • Your Maven module must depend on the xwiki-commons-test module.
  • Use the MockitoComponentMockingRule rule (its javadoc explains how to use it in detail).
  • Examples:
    • example2
    • Other example:
      public class DefaultExtensionLicenseManagerTest {
          @Rule
          public final ComponentManagerRule componentManager = new ComponentManagerRule();

          private ExtensionLicenseManager licenseManager;

          @Before
          public void setUp() throws Exception {
              this.licenseManager = this.componentManager.getInstance(ExtensionLicenseManager.class);
          }
      }
    • Another example (using the new @BeforeComponent annotation):
      public class DefaultExtensionManagerConfigurationTest {
          @Rule
          public final MockitoComponentManagerRule componentManager = new MockitoComponentManagerRule();

          private ExtensionManagerConfiguration configuration;

          private MemoryConfigurationSource source;

          @BeforeComponent
          public void registerComponents() throws Exception {
              // Register a mocked Environment since we need to provide one.
              this.componentManager.registerMockComponent(Environment.class);

              // Register an in-memory Configuration Source for the test.
              this.source = this.componentManager.registerMemoryConfigurationSource();
          }

          @Before
          public void setUp() throws Exception {
              this.configuration = this.componentManager.getInstance(ExtensionManagerConfiguration.class);
          }
      }

      @BeforeComponent is used by ComponentManagerRule and methods annotated with it are called before other components are registered (i.e. before processing of @AllComponents and @ComponentList annotations).

      MockitoComponentManagerRule extends ComponentManagerRule and just adds some helper methods to register a mock component.

Best practices

Java Rendering Testing

We have a special framework for making it easy to write Rendering tests; see the Rendering Testing Framework.

XAR Testing

Since XWiki 7.3M1 it's possible to write integration tests for wiki pages stored on the filesystem (in XML format).
Since XWiki 10.5RC1, tests extending PageTest must use JUnit 5.

The way those tests work is that the XML files representing the wiki pages are loaded from the filesystem into XWikiDocument instances, and a stubbed environment is defined so that the XWikiDocument instances can then be rendered in the desired syntax.

To write such a test:

  • Make your POM depend on the org.xwiki.platform:xwiki-platform-test-page module
  • Write a Java test class that extends PageTest
  • Possibly add any extra component registrations you need, through existing *ComponentList annotations or through the custom ComponentList annotation. Note that extending PageTest automatically brings some base component registrations (the lists defined in PageComponentList and ReferenceComponentList). Examples of other existing annotations:
    • XWikiSyntax20ComponentList: XWiki Syntax 2.0-related components
    • XWikiSyntax21ComponentList: XWiki Syntax 2.1-related components
    • XHTML10ComponentList: XHTML 1.0-related components
  • You then verify the rendering of a page by setting the output syntax you wish to have and the query parameters, and then calling renderPage(). For example:
    request.put("section", "Links");
    request.put("xpage", "print");
    String result = renderPage(new DocumentReference("xwiki", "XWiki", "XWikiSyntax"));
    assertTrue("...explanation if test fails", result.contains("...content that we wish to verify..."));
  • If you need to have some other pages loaded from sources too (for example if the page you're rendering contains an include macro that loads another page), use the loadPage() API, as in:

    loadPage(new DocumentReference("xwiki", "XWiki", "XWikiSyntaxLinks"));

Here's a full example:

public class WebRssTest extends PageTest {
    private ScriptQuery query;

    @Before
    public void setUp() throws Exception {
        request.put("outputSyntax", "plain");
        request.put("xpage", "plain");

        QueryManagerScriptService qmss = mock(QueryManagerScriptService.class);
        oldcore.getMocker().registerComponent(ScriptService.class, "query", qmss);
        query = mock(ScriptQuery.class);
        when(qmss.xwql("where 1=1 order by desc")).thenReturn(query);
    }

    @Test
    public void webRssFiltersHiddenDocuments() throws Exception {
        // Render the page to test
        renderPage(new DocumentReference("xwiki", "Main", "WebRss"));

        // This is the real test!!
        // We want to verify that the hidden document filter is called when executing the XWQL
        // query to get the list of modified pages
        verify(query).addFilter("hidden");
    }

    @Test
    public void webRssDisplay() throws Exception {
        when(query.execute()).thenReturn(Arrays.<Object>asList("Space1.Page1", "Space2.Page2"));

        FeedPlugin plugin = new FeedPlugin("feed", FeedPlugin.class.getName(), context);
        FeedPluginApi pluginApi = new FeedPluginApi(plugin, context);
        when(xwiki.getPluginApi("feed", context)).thenReturn(pluginApi);

        // Render the page to test
        String xml = renderPage(new DocumentReference("xwiki", "Main", "WebRss"));
    }
}

Functional Testing

A functional test requires a running XWiki instance.

GUI tests

We now have 3 frameworks for running GUI tests:

  • One based on Selenium3 and Docker. This is now the recommended framework to use for new tests.
  • One based on Selenium2/WebDriver, which is now deprecated and shouldn't be used. We encourage committers to port tests written for it to the Selenium3 framework, especially when modifying old tests.
  • A last one based on Selenium1, which is also deprecated and shouldn't be used; the same recommendation to port tests to the Selenium3 framework applies.

Selenium3-based Framework

  • Based on TestContainers and uses Docker to execute the tests under various Databases, Servlet Engines and Browsers.
  • The only requirements for running these tests are to have Docker installed locally and to have the user under which you run your IDE and the Maven build be able to use the docker executable.
  • Configuration options:
    • When passing options as system properties, the format is -Dxwiki.test.ui.<Option Name>=value
    • When passing them as Java annotation attributes in functional tests, the format is @UITest(<Option Name> = <Value>)

    Option Name | Default Value | Valid Values | Description
    browser | firefox (Browser.FIREFOX) | firefox (Browser.FIREFOX), chrome (Browser.CHROME) | The browser used in the tests. Note that the version of the browser used is controlled by the version of Selenium that is defined in the pom.xml of xwiki-platform-test-docker.
    database | hsqldb_embedded (Database.HSQLDB_EMBEDDED) | mysql (Database.MYSQL), postgres (Database.POSTGRESQL), hsqldb_embedded (Database.HSQLDB_EMBEDDED) | The database used in the tests.
    servletEngine | jetty_standalone (ServletEngine.JETTY_STANDALONE) | tomcat (ServletEngine.TOMCAT), jetty (ServletEngine.JETTY), jetty_standalone (ServletEngine.JETTY_STANDALONE), external (ServletEngine.EXTERNAL) | The Servlet Engine used in the tests. Use external to use your own started and already provisioned XWiki instance.
    verbose | false | true, false | When active, displays more logs in the console (especially container startup logs).
    saveDatabaseData | false | true, false | By default, database data is not saved between test executions. Note that if you decide to save database data, it'll be saved under a docker user by Docker and your local user needs to be able to remove it if you want to clean your Maven target directory.
    offline | false | true, false | When offline, the custom XWiki WAR generation and the XWiki provisioning are done solely from your local Maven repository. Otherwise, when artifacts are not present locally or newer SNAPSHOT versions are available, they'll be fetched from remote Maven repositories.
    databaseTag | Latest version supported by XWiki | Any Docker tag available from DockerHub for the container image | Version of the database to use. Not supported for HSQLDB Embedded since it doesn't run in a Docker container.
    servletEngineTag | Latest version supported by XWiki | Any Docker tag available from DockerHub for the container image | Version of the Servlet Engine to use. Not supported for Jetty Standalone since it doesn't run in a Docker container.
    jdbcDriverVersion | Latest version validated by the XWiki dev team | Any version for the specified database that is available on Maven Central (e.g. for MySQL the groupId/artifactId is mysql/mysql-connector-java) | Version of the JDBC driver to use.
    vnc | true | true, false | When active, a VNC container is started to record a video of the tests and, more generally, to allow connecting to the UI running the tests. Useful when debugging.
    properties | No default (empty) | Check the *.vm configuration file templates to see the list of possible values | Velocity properties that are applied when generating XWiki's configuration files (xwiki.cfg and hibernate.cfg.xml). Example, telling XWiki that minification is off: @UITest(properties = { "xwikiPropertiesAdditionalProperties=debug.minify=false" })
  • Tests get injected XWikiWebDriver and TestUtils instances as test method parameters.
  • Test execution is recorded in an FLV file in the target directory.
  • A screenshot of the UI is also taken in the target directory when a test fails.
  • While tests execute, it's possible to connect to the running VNC server and see the UI by using a VNC client and connecting to the VNC URL printed in the console, e.g. vnc://vnc:[email protected]:32936.
  • The version for the platform dependencies is specified using the platform.version property in the pom.xml file executing the tests.

General implementation architecture:

See Vincent's blog for more details


@UITest
public class SeleniumTest {
    @Test
    public void test(XWikiWebDriver driver, TestUtils setup) {
        assertThat(driver.getTitle(),
            containsString("XWiki - The Advanced Open Source Enterprise and Application Wiki"));
        driver.findElement(By.linkText("XWiki's concept")).click();
    }
}

Selenium2-based Framework


  • To debug a test simply start XE somewhere and then debug your JUnit tests as a normal JUnit test in your IDE.
    • Note that functional test Maven modules will create an XWiki instance in target/xwiki thanks to the execution of the XWiki Packager plugin, so it's simpler to start this XWiki instance when debugging.
  • In order to debug flickering tests more easily, you can simply add the @Intermittent annotation on your test method and it'll be executed 100 times in a row (you can also specify @Intermittent(repetition=N) to repeat it N times). This is achieved thanks to the Tempus Fugit framework that we've integrated.
  • To run a specific test, pass the pattern property (it's a regex on the test class name) as in: mvn install -Dpattern=TestClass#testName (this will run the testName test from TestClass)
  • To run the tests on your own running instance (instead of letting Maven close your instance and start a fresh one), use -Dxwiki.test.verifyRunningXWikiAtStart=true. It could be useful to verify that you have not broken the tests on your instance before committing your changes.
  • By default the Firefox browser will be used but if you wish to run with another browser just pass the browser parameter as in:
    • Firefox (default): -Dbrowser=*firefox
    • Internet Explorer: -Dbrowser=*iexplore
    • Chrome: -Dbrowser=*chrome
    • PhantomJS: -Dbrowser=*phantomjs
  • You may run into a compatibility problem between the version of the browser and the version of Selenium. For example, at the moment, Firefox 32.0 is required. You may install it on your computer and refer to it with -Dwebdriver.firefox.bin="/path/to/your/firefox-32.0"
  • We automatically check for pages that require Programming Rights by bundling a Listener component that listens to the ScriptEvaluatingEvent event and drops Programming Rights, in an effort to make the tests fail so that developers can notice that they require PR and fix it. In the cases where it's necessary, this can be disabled by setting <testProgrammingRights>false</testProgrammingRights> in the configuration of the Package Mojo, and/or by using a System Property to control where the check is performed, e.g. <xwikiPropertiesAdditionalProperties>test.prchecker.excludePattern=.*:XWiki\.XWikiPreferences</xwikiPropertiesAdditionalProperties>. Since XWiki 9.8RC1
  • If the xwiki.test.startXWiki system property is set to true, the test itself will start/stop XWiki. If set to false, it's the responsibility of the build to start/stop XWiki. Useful when starting/stopping XWiki inside a Docker container handled by the Maven build, for example. Since XWiki 10.0

Best practices

  • Tests are located in xwiki-platform inside the module that they are testing. Note that in the past we were putting functional tests in xwiki-platform-distribution/xwiki-platform-distribution-flavor/xwiki-platform-distribution-flavor-test but we have started to move them inside the specific modules they are testing.
  • Name the tests using <prefix>
  • Use a test suite to ensure that XWiki is started/stopped only once for all tests in the module:
    public class AllITs
  • Use the maven-failsafe-plugin for functional UI tests:
  • Locate tests in src/test/java (i.e. the default Maven location for tests)
  • Tests should be written using the Page Objects Pattern:
  • Since functional tests take a long time to execute (XWiki startup, browser startup, navigation, etc.), it's important to write tests that execute as fast as possible. Here are some tips:
    • Write scenarios (i.e. write only a few test methods, even a single one if you can, so that you have a single fixture) instead of a lot of individual tests as you'd do for unit tests.
    • Use getUtil() to perform quick actions that are not part of what you wish to test (i.e. that are part of the test fixture). For example, to create a page you need in your fixture, write:
      getUtil().createPage(SPACE_NAME, DOC_NAME, "content", "title");
      instead of:
      WikiEditPage wep = new WikiEditPage();
      wep.switchToEdit(SPACE_NAME, DOC_NAME);
      ViewPage vp = wep.clickSaveAndView();
    • Your test Maven module may be bound to a specific profile (as a best practice, you can use integration-tests). Doing this, it will be built only when specifically asked for with -Pintegration-tests (see below for an example of the pom.xml)
  • Never, ever, wait on a timer! Instead wait on elements.
  • If you need to create, update or delete a page during your tests, use a space name specific to your test scenario with getTestClassName(). For example:
    getUtil().createPage(getTestClassName(), "Page", "Content", "Title");
  • Tests must not depend on one another. In other words, it should be possible to execute tests in any order and running only one test class should work fine.
  • Tests that need to change existing configuration (e.g. change the default language, set specific rights, etc.) must restore the configuration as it was. This only matters in flavor tests, or when functional tests of several different domains are executed one after another. Functional tests located in specific xwiki-platform modules only run the tests for their own module, so not cleaning up there is acceptable and saves time.
  • Tests are allowed to create new documents and don't need to remove them at the end of the test.
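Two of the tips above, the Page Objects pattern and "wait on elements, never on a timer", can be sketched together in framework-free Java. All names below (FakeDriver, WikiEditPage, ViewPage) are hypothetical stand-ins for Selenium and the real XWiki page objects:

```java
import java.util.function.BooleanSupplier;

// The page object encapsulates page interactions, and waits on a condition
// with a deadline instead of sleeping a fixed amount of time.
class FakeDriver {
    private boolean viewPageShown;

    void clickSaveAndView() {
        this.viewPageShown = true; // a real browser would do this asynchronously
    }

    boolean isViewPageDisplayed() {
        return this.viewPageShown;
    }
}

class ViewPage {
    private final FakeDriver driver;

    ViewPage(FakeDriver driver) {
        this.driver = driver;
    }

    boolean isDisplayed() {
        return this.driver.isViewPageDisplayed();
    }
}

class WikiEditPage {
    private final FakeDriver driver;

    WikiEditPage(FakeDriver driver) {
        this.driver = driver;
    }

    ViewPage clickSaveAndView() {
        this.driver.clickSaveAndView();
        // Never wait on a raw timer: poll for the element/condition instead.
        waitUntil(this.driver::isViewPageDisplayed, 5000);
        return new ViewPage(this.driver);
    }

    static void waitUntil(BooleanSupplier condition, long timeoutMillis) {
        long deadline = System.currentTimeMillis() + timeoutMillis;
        while (!condition.getAsBoolean()) {
            if (System.currentTimeMillis() > deadline) {
                throw new IllegalStateException("Timed out waiting for condition");
            }
            try {
                Thread.sleep(50);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                throw new IllegalStateException(e);
            }
        }
    }
}

public class PageObjectSketch {
    public static void main(String[] args) {
        ViewPage vp = new WikiEditPage(new FakeDriver()).clickSaveAndView();
        System.out.println("view page displayed: " + vp.isDisplayed());
    }
}
```

The condition-with-deadline approach is what Selenium's explicit waits provide; the point is that the wait ends as soon as the condition holds, instead of always paying a fixed sleep.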

Examples of functional tests:

The Office Importer tests

In XWiki 7.3, we have introduced in xwiki-platform some functional tests for the Office Importer Application. To enable them, you need to enable the profile office-tests. An OpenOffice (or LibreOffice) server is needed on your system. You might also need to set an environment variable that points to the office home path, if not standard. This variable is called XWIKI_OFFICE_HOME and can be set like this:

## For Unix systems:
export XWIKI_OFFICE_HOME="/opt/libreoffice3.6"

You should set this environment variable in your CI agents.

Old Selenium1-based Framework

  • We were using Selenium RC to perform functional tests for GUI. We had created some JUnit extension to easily write Selenium tests in Java.
  • To run these tests on your local machine go to xwiki-enterprise/xwiki-enterprise-test/xwiki-enterprise-test-selenium and type mvn install.
  • To run a specific test, pass the pattern property as in: mvn install -Dpattern=DeletePageTest (this will run the DeletePageTest - Note that you don't have to specify the extension). In addition if you wish to execute only a specific method from a Test Case class, you can pass the patternMethod property as in: mvn install -Dpattern=DeletePageTest -DpatternMethod=testDeletePageCanDoRedirect.
  • To enable debug output from the Selenium server, start Maven with the -Ddebug=true switch; all messages from the Selenium server are then saved to xwiki-enterprise/xwiki-enterprise-test/xwiki-enterprise-test-selenium/target/selenium/server.log.
  • To debug a functional Selenium test in your favourite Java IDE, go to xwiki-enterprise/xwiki-enterprise-test/xwiki-enterprise-test-selenium and run Maven with -Dmaven.surefire.debug. Maven will wait until you connect your IDE to the running JVM on port 5005. You can then put a breakpoint in your IDE and debug the test.

Browser version

Currently we only run our functional tests on the Firefox browser.
The browser you use for functional tests needs to match the selenium version, otherwise unpredictable results may occur.

  • The current Selenium version we are using is 2.44.0
    • (valid for 18.March.2015. Actual version used can be verified here under the selenium.version property)
  • The Firefox version we use on our continuous integration agents is 32.0.1.
    • (valid for 18.March.2015. Ask on the list or on IRC for the actual version used since it's not publicly verifiable)
    • To determine browser compatibility with the Selenium version used, scan Selenium's changelog and look for entries like "* Updating Native events to support Firefox 24, 31, 32 and 33". That shows the supported browser version for the particular Selenium version.

If you wish to run tests with the exact same configuration as XWiki's Continuous Integration server, you need to install and use locally the same Firefox version. To do so, you have to:

  1. Download the corresponding Firefox release
  2. Unzip it locally
  3. Use the webdriver.firefox.bin java system property to specify the location of your firefox version
    1. Depending on how you are starting the functional tests, you'd have to either add the system property in your Maven build (surefire plugin configuration) or in your IDE (run configuration)
    2. Read Selenium's FirefoxDriver documentation for more information and options

XHTML, CSS & WCAG Validations

Performance Testing

  • These are memory leakage tests, load tests and speed of execution tests.
  • They are performed manually and in an ad hoc fashion for now. They are executed for some stable versions and for all super stable versions.
  • See Methodology and reports.

See the Profiling topic for details on how to use a profiler for detecting performance issues.
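As a minimal illustration of the speed-of-execution kind of check (a sketch only, not the actual XWiki performance methodology), one can time an operation over many iterations and look at the median rather than a single run:

```java
import java.util.Arrays;

// Minimal timing sketch: run an operation many times, sort the samples and
// take the median, which is less sensitive to outliers (GC pauses, JIT
// warm-up) than a single measurement or the mean.
public class SpeedCheck {
    static long medianNanos(Runnable operation, int iterations) {
        long[] samples = new long[iterations];
        for (int i = 0; i < iterations; i++) {
            long start = System.nanoTime();
            operation.run();
            samples[i] = System.nanoTime() - start;
        }
        Arrays.sort(samples);
        return samples[iterations / 2];
    }

    public static void main(String[] args) {
        long median = medianNanos(() -> Arrays.sort(new long[1000]), 1000);
        System.out.println("median ns: " + median);
        if (median < 0) {
            throw new AssertionError("median must not be negative");
        }
    }
}
```

For real load or memory-leak testing you'd use the tools listed below (JMeter, a profiler), but this shows why repeated measurements matter on the JVM.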

Manual testing

Here's the spirit we'd like to have from the XWiki Manual Testing Team: The Black Team.

Besides automated testing, ensuring software quality requires that features be tested by actual users (see Manual testing). In order to manage manual testing, a test plan is required. 

Tools Reference

We use the following tools in our automated tests:

  • JUnit: test framework
  • Mockito, JMock: mocking frameworks used to isolate the code under test
  • Hamcrest: extra assertions beyond what JUnit provides
  • GreenMail: for testing email sending
  • WireMock: for simulating HTTP connections to remote servers
  • JMeter: for performance tests
  • Dumbbench: for manual performance tests

Test Coverage

We now have a SonarQube instance showing Test coverage for both unit tests and integration tests. However it doesn't aggregate coverage data across top level modules (commons, rendering, platform, enterprise, etc).

We support both Jacoco and Clover to generate test coverage reports.

To generate test coverage reports make sure you can build your module and then pick one of the following strategies below depending on what you wish to generate.

Single Maven Reactor

Using Jacoco

  • Go in the first top level module (e.g. xwiki-commons) and run: mvn clean jacoco:prepare-agent install -Djacoco.destFile=/tmp/jacoco.exec -Djacoco.append=false -Plegacy,integration-tests -Dxwiki.revapi.skip=true -Dmaven.test.failure.ignore=true
  • Go in all the other top level modules you wish to add and run: mvn clean jacoco:prepare-agent install -Djacoco.destFile=/tmp/jacoco.exec -Djacoco.append=true -Plegacy,integration-tests  -Dxwiki.revapi.skip=true -Dmaven.test.failure.ignore=true
  • Then whenever you wish to generate the full test report, run: mvn jacoco:report -Djacoco.dataFile=/tmp/jacoco.exec


    Jacoco supports generating a report from a single module, and it even supports generating an aggregated report from several modules using the report-aggregate mojo introduced in 0.7.7 (but note that I don't know if that includes coverage induced by module B on module A). However, Jacoco doesn't seem to support executing several Maven reactors (for example for building code located in various GitHub repositories) with the same Jacoco exec file and then generating a report from that. See also https://groups.google.com/forum/#!topic/jacoco/odVzr7P5i6w

Using Clover

Go to the top level module containing children modules for which to generate a Clover report. Note that you won't be able to aggregate Clover data across different Maven runs with this strategy so you really need a single parent module.

  1. Run the following command (adjust the local repo path):
    mvn clean clover:setup install clover:aggregate clover:clover -Pclover,integration-tests,dev,jetty,hsqldb -Dxwiki.revapi.skip=true -Dmaven.repo.local=/Users/vmassol/.m2/repository-clover -Dmaven.test.failure.ignore=true -nsu
    • You might need to run the "install" goal instead of the "test" one if your local Maven repository doesn't already contain some test jars (apparently and for some reason Maven won't download them from the remote repository under some conditions).
    • Note that we use -Dmaven.repo.local to use a different Maven local repository so that instrumented source code and built binaries don't find their way in your standard local repository (which you could then deploy by mistake, or that'll simply fail your tests later on when you don't run with the Clover profile since instrumented artifacts require the Clover JAR to be present in the classpath at runtime...).

Multiple Maven Reactors

Using Jacoco


Using Clover

Use a single Clover database to which you add coverage information as you build modules one after another. This strategy is especially useful when you wish to manually run some modules and ensure that coverage data aggregate in a single place so that when you generate the report you have the result of all your runs.

  1. Instrument the source code with Clover for all modules that you want to include in the report, using (adjust the paths):
    mvn clover:setup install -Pclover,integration-tests,dev,jetty,hsqldb -Dmaven.clover.cloverDatabase=/Users/vmassol/.xwiki/clover/clover.db -Dxwiki.revapi.skip=true -Dmaven.repo.local=/Users/vmassol/.m2/repository-clover -nsu

    When tests are executed they'll generate coverage data in the specified Clover database. Since there's a single Clover database, there's no need to merge databases as in the first strategy above.

  2. To generate the Clover report, execute the following from any module (adjust path):
    mvn clover:clover -N -Dmaven.clover.cloverDatabase=/Users/vmassol/.xwiki/clover/clover.db -Dmaven.repo.local=/Users/vmassol/.m2/repository-clover -nsu
  3. Remember to clean your clover database when you're done.

If you don't wish failing tests to stop the generation of the coverage report, you should pass -Dmaven.test.failure.ignore=true on the command line.

Here are typical steps you'd follow to generate full TPC for XWiki:

  • Clean your local repo and remove any previous clover DBs:
    rm -R ~/.m2/repository-clover/org/xwiki
    rm -R ~/.m2/repository-clover/com/xpn
    rm -R ~/.xwiki/clover
  • Generate coverage data for XWiki Commons:
    cd xwiki-commons
    mvn clean -Pclover,integration-tests -Dmaven.repo.local=/Users/vmassol/.m2/repository-clover -nsu
    mvn clover:setup install -Pclover,integration-tests -Dmaven.clover.cloverDatabase=/Users/vmassol/.xwiki/clover/clover.db -Dmaven.repo.local=/Users/vmassol/.m2/repository-clover -Dmaven.test.failure.ignore=true  -Dxwiki.revapi.skip=true -nsu
  • Generate Clover report just for Commons:
    mvn clover:clover -N -Dmaven.clover.cloverDatabase=/Users/vmassol/.xwiki/clover/clover.db -Dmaven.repo.local=/Users/vmassol/.m2/repository-clover -nsu
  • Generate coverage data for XWiki Rendering:
    cd xwiki-rendering
    mvn clean -Pclover,integration-tests -Dmaven.repo.local=/Users/vmassol/.m2/repository-clover -nsu
    mvn clover:setup install -Pclover,integration-tests -Dmaven.clover.cloverDatabase=/Users/vmassol/.xwiki/clover/clover.db -Dmaven.repo.local=/Users/vmassol/.m2/repository-clover -Dmaven.test.failure.ignore=true  -Dxwiki.revapi.skip=true -nsu
  • Generate Clover report for Commons and Rendering:
    mvn clover:clover -N -Dmaven.clover.cloverDatabase=/Users/vmassol/.xwiki/clover/clover.db -Dmaven.repo.local=/Users/vmassol/.m2/repository-clover -nsu
  • Generate coverage data for XWiki Platform:
    cd xwiki-platform
    mvn clean -Pclover,integration-tests -Dmaven.repo.local=/Users/vmassol/.m2/repository-clover -nsu
    mvn clover:setup install -Pclover,integration-tests -Dmaven.clover.cloverDatabase=/Users/vmassol/.xwiki/clover/clover.db -Dmaven.repo.local=/Users/vmassol/.m2/repository-clover -Dmaven.test.failure.ignore=true  -Dxwiki.revapi.skip=true -nsu
  • Generate full Clover report (for Commons, Rendering and Platform):
    mvn clover:clover -N -Dmaven.clover.cloverDatabase=/Users/vmassol/.xwiki/clover/clover.db -Dmaven.repo.local=/Users/vmassol/.m2/repository-clover -nsu

Using Clover + Jenkins

  • Install Jenkins 2.0+ and the following plugins:
    • Pipeline plugin
    • XVnc plugin
  • Make sure to have Maven ("mvn" executable) + VNCServer ("vncserver" executable) installed on the build agent
  • The build agent must be running Unix
  • Setup a Pipeline job and make it point to the following pipeline script.

Note that the Clover pipeline script will generate a report with differences from the previous passing report showing the contributions (positive and negative) of each module to the global TPC.

Example Reports

Created by VincentMassol on 2007/03/07 13:33
