There are various types of tests and every project has its own terminology. You'll find below the XWiki terminology and best practices related to testing.

Check the general development flow to better understand how testing fits into the larger picture.

Here's the general methodology used:

  • Committed code must have associated automated tests. There's a check in the build (in the quality profile) to ensure that committed code does not reduce the total test coverage for that module. The CI runs the quality profile for each commit and fails the build if the test coverage is reduced.
  • Developers run unit and integration tests on their machines daily and frequently.
  • Functional tests are executed by the CI, at each commit.
  • Performance tests are executed manually several times per year (usually for LTS releases and sometimes for stable releases too).
  • Automated tests currently run in a single environment (HSQLDB/Jetty/Firefox). Manual tests are executed at each release by dedicated contributors, who make sure to run them on various combinations of databases, servlet containers and browser versions.
  • Responsiveness tests (e.g. verifying that XWiki displays fine on a mobile device) are currently executed manually from time to time by dedicated contributors.
  • Accessibility tests are performed automatically at each commit.
  • HTML5 validity tests are performed automatically at each commit.

Unit Testing

A unit test only tests a single class in isolation from other classes. Since in the XWiki project we write code using Components, this means a unit test is testing a Component in isolation from other Components.

Java Unit Testing

  • These are tests performed in isolation using Mock Objects. More specifically we're using JUnit 4.x and Mockito (we were using JMock 2.x before and still have a lot of tests not yet converted to Mockito).
  • These tests must not interact with the environment (Database, Container, File System, etc.) and must not need any setup to execute.
  • These tests must not output anything to stdout or stderr (or the build will fail). Note that if the code under test outputs logs, they need to be captured and possibly asserted. For example:
    ...
    /**
     * Capture logs.
     */

    @Rule
    public AllLogRule logRule = new AllLogRule();
    ...
    assertEquals("Error getting resource [bad resource] because of invalid path format. Reason: [invalid url]",
       this.logRule.getMessage(0));
    ...
  • Your Maven module must depend on the xwiki-commons-tool-test-simple (for tests not testing Components) or xwiki-commons-tool-test-component (for tests testing Components) modules.
    • If you're testing Components, use MockitoComponentMockingRule (its javadoc explains how to use it in detail)
      • Example 1: Canonical use case
        public class DefaultDiffManagerTest
        {
           @Rule
           public final MockitoComponentMockingRule<DiffManager> mocker =
                new MockitoComponentMockingRule<DiffManager>(DefaultDiffManager.class);
        ...
           @Test
           public void testDiffStringList() throws Exception
           {
               // Null

                DiffResult<String> result = this.mocker.getComponentUnderTest().diff(null, null, null);
        ...
      • Example 2: Example showing how to avoid mocking some injected dependencies (the components listed in @ComponentList, such as the ObservationManager implementation below, are registered as real components and not mocked)
        @ComponentList({DefaultLoggerManager.class, DefaultObservationManager.class, LogbackEventGenerator.class})
        public class DefaultLoggerManagerTest
        {
           @Rule
           public final MockitoComponentMockingRule<DefaultLoggerManager> mocker =
               new MockitoComponentMockingRule<DefaultLoggerManager>(DefaultLoggerManager.class);
        ...
    • If you're not testing Components, just use plain JUnit and Mockito (you're on your own!)
  • Examples:

Best practices

  • Name the test class with the name of the class under test suffixed with Test. For example, the JUnit test class for XWikiMessageTool should be named XWikiMessageToolTest.
  • Name the test methods with the method under test followed by a qualifier describing the test. For example importWithHeterogeneousEncodings().

Tips

  • When mocking with JMock (in legacy tests not yet converted to Mockito), to ignore all debug logging calls and to allow all calls to is*Enabled you could write:
    // Ignore all calls to debug() and enable all logs so that we can assert info(), warn() and error() calls.
    ignoring(any(Logger.class)).method("debug");
    allowing(any(Logger.class)).method("is.*Enabled"); will(returnValue(true));

JavaScript Unit Testing

  • These are tests that do not rely on the DOM, written as "behavioral specifications", using the Jasmine test framework.
  • In development mode, you can start the test runner process by running mvn jasmine:bdd on the command line. This starts a test runner at http://localhost:8234 which runs all tests in the src/test/javascript directory. Write your tests there and hit refresh to see the results directly in your browser.
  • For tests that need a DOM, see Functional Testing
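To give a feel for the "behavioral specification" style, here's a minimal Jasmine-style spec. In a real module the describe/it/expect globals are provided by the Jasmine runner (started via mvn jasmine:bdd); tiny stand-ins are defined here, together with a hypothetical stringReverse function, so that this sketch runs on its own:

```javascript
// Minimal stand-ins for the Jasmine globals, so the sketch is self-contained.
// A real spec in src/test/javascript would rely on the Jasmine runner instead.
function describe(name, suite) { console.log(name); suite(); }
function it(name, spec) { spec(); console.log('  ' + name + ' [ok]'); }
function expect(actual) {
  return {
    toEqual: function (expected) {
      if (actual !== expected) {
        throw new Error('Expected ' + expected + ' but got ' + actual);
      }
    }
  };
}

// Hypothetical code under test (not an actual XWiki API).
function stringReverse(s) { return s.split('').reverse().join(''); }

// The behavioral specification itself.
describe('stringReverse', function () {
  it('reverses a simple string', function () {
    expect(stringReverse('abc')).toEqual('cba');
  });
  it('leaves the empty string unchanged', function () {
    expect(stringReverse('')).toEqual('');
  });
});
```

Note that no DOM access appears anywhere in the spec; as stated above, tests that need a DOM belong in functional tests instead.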

Integration Testing

An integration test tests several classes together, but without a running XWiki instance. For example, if a Component has dependencies on other Components and they are tested together, this is called integration testing.
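To make the distinction concrete, here's a conceptual sketch stripped of the XWiki component framework and of JUnit/Mockito (all names below are hypothetical): a Greeter "component" depends on a Translator "component"; a unit test stubs the dependency, while an integration test wires in the real implementation.

```java
public class IntegrationSketch
{
    interface Translator
    {
        String translate(String text);
    }

    static class UppercaseTranslator implements Translator
    {
        public String translate(String text)
        {
            return text.toUpperCase();
        }
    }

    static class Greeter
    {
        private final Translator translator;

        Greeter(Translator translator)
        {
            this.translator = translator;
        }

        String greet(String name)
        {
            return this.translator.translate("hello " + name);
        }
    }

    public static void main(String[] args)
    {
        // Unit test style: Greeter is exercised in isolation, its Translator
        // dependency replaced by a stub (Mockito would normally provide this).
        Greeter isolated = new Greeter(text -> "stubbed");
        check("stubbed", isolated.greet("xwiki"));

        // Integration test style: Greeter is wired to the real
        // UppercaseTranslator so both classes are exercised together
        // (but still with no running XWiki instance).
        Greeter integrated = new Greeter(new UppercaseTranslator());
        check("HELLO XWIKI", integrated.greet("xwiki"));
    }

    private static void check(String expected, String actual)
    {
        if (!expected.equals(actual)) {
            throw new AssertionError(expected + " != " + actual);
        }
    }
}
```

In real XWiki tests the wiring shown in main() is done for you by the component manager rules described below (e.g. via @ComponentList).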

Java Integration Testing

  • These tests are written using JUnit 4.x and Mockito (we were using JMock 2.x before and still have a lot of tests not converted to Mockito yet).
  • Your Maven module must depend on the xwiki-commons-test module.
  • Use the MockitoComponentMockingRule Rule (its javadoc explains how to use it in detail).
  • Examples:
    • example2
    • Other example:
      @ComponentList({
          DefaultExtensionLicenseManager.class
      })
      public class DefaultExtensionLicenseManagerTest
      {
         @Rule
         public final ComponentManagerRule componentManager = new ComponentManagerRule();
      ...
         @Before
         public void setUp() throws Exception
         {
             this.licenseManager = this.componentManager.getInstance(ExtensionLicenseManager.class);
         }
      ...
    • Another example (using the new @BeforeComponent annotation):
      @ComponentList({
          DefaultExtensionManagerConfiguration.class,
          DefaultLoggerManager.class,
          DefaultObservationManager.class
      })
      public class DefaultExtensionManagerConfigurationTest
      {
         @Rule
         public final MockitoComponentManagerRule componentManager = new MockitoComponentManagerRule();

         private ExtensionManagerConfiguration configuration;

         private MemoryConfigurationSource source;

         @BeforeComponent
         public void registerComponents() throws Exception
         {
             // Register a Mocked Environment since we need to provide one.
             this.componentManager.registerMockComponent(Environment.class);

             // Register some in-memory Configuration Source for the test
             this.source = this.componentManager.registerMemoryConfigurationSource();
         }

         @Before
         public void setUp() throws Exception
         {
             this.configuration = this.componentManager.getInstance(ExtensionManagerConfiguration.class);
         }
      ...

      @BeforeComponent is used by ComponentManagerRule and methods annotated with it are called before other components are registered (i.e. before processing of @AllComponents and @ComponentList annotations).

      MockitoComponentManagerRule extends ComponentManagerRule and just adds some helper methods to register a mock component.

Best practices

Java Rendering Testing

We have a special framework for making it easy to write Rendering tests; see the Rendering Testing Framework.

XAR Testing

Since XWiki 7.3M1 it's possible to write integration tests for wiki pages located on the filesystem (in XML format).

The way those tests work is that the XML files representing the wiki pages are loaded from the filesystem into XWikiDocument instances, and a stubbed environment is defined so that the XWikiDocument instances can then be rendered in the desired syntax.

To write such a test:

  • Make your POM depend on the org.xwiki.platform:xwiki-platform-test-page module
  • Write a Java test class that extends PageTest
  • If needed, register extra components through existing annotations (*ComponentList annotations) or through the custom ComponentList annotation. Note that extending PageTest automatically brings some base component registrations (the lists defined in PageComponentList and ReferenceComponentList). Examples of existing annotations:
    • XWikiSyntax20ComponentList: XWiki Syntax 2.0-related components
    • XWikiSyntax21ComponentList: XWiki Syntax 2.1-related components
    • XHTML10ComponentList: XHTML 1.0-related components
  • You then verify the rendering of a page by setting the output syntax you wish, setting the query parameters, and then calling renderPage(). For example:
    setOutputSyntax(Syntax.XHTML_1_0);
    request.put("section", "Links");
    request.put("xpage", "print");
    ...
    String result = renderPage(new DocumentReference("xwiki", "XWiki", "XWikiSyntax"));
    assertTrue("...explanation if test fails", result.contains("...content that we wish to verify..."));
  • If you need to have some other pages loaded from sources too (for example if the page you're rendering contains an include macro that loads another page), you'll use the loadPage() API as in:

    loadPage(new DocumentReference("xwiki", "XWiki", "XWikiSyntaxLinks"));

Here's a full example:

@XWikiSyntax20ComponentList
@XHTML10ComponentList
public class WebRssTest extends PageTest
{
   private ScriptQuery query;

   @Before
   public void setUp() throws Exception
   {
        setOutputSyntax(Syntax.PLAIN_1_0);
        request.put("outputSyntax", "plain");
        request.put("xpage", "plain");

        QueryManagerScriptService qmss = mock(QueryManagerScriptService.class);
        oldcore.getMocker().registerComponent(ScriptService.class, "query", qmss);
        query = mock(ScriptQuery.class);
        when(qmss.xwql("where 1=1 order by doc.date desc")).thenReturn(query);
   }

   @Test
   public void webRssFiltersHiddenDocuments() throws Exception
   {
       // Render the page to test
       renderPage(new DocumentReference("xwiki", "Main", "WebRss"));

       // This is the real test!!
       // We want to verify that the hidden document filter is called when executing the XWQL
       // query to get the list of modified pages
       verify(query).addFilter("hidden/document");
   }

   @Test
   public void webRssDisplay() throws Exception
   {
        when(query.addFilter(anyString())).thenReturn(query);
        when(query.setLimit(20)).thenReturn(query);
        when(query.setOffset(0)).thenReturn(query);
        when(query.execute()).thenReturn(Arrays.<Object>asList("Space1.Page1", "Space2.Page2"));

        FeedPlugin plugin = new FeedPlugin("feed", FeedPlugin.class.getName(), context);
        FeedPluginApi pluginApi = new FeedPluginApi(plugin, context);
        when(xwiki.getPluginApi("feed", context)).thenReturn(pluginApi);

       // Render the page to test
       String xml = renderPage(new DocumentReference("xwiki", "Main", "WebRss"));

        assertTrue(xml.contains("<title>activity.rss.feed.description</title>"));
        assertTrue(xml.contains("<title>Page1</title>"));
        assertTrue(xml.contains("<title>Page2</title>"));
   }
}

Functional Testing

A functional test requires a running XWiki instance.

GUI tests

We now have 2 frameworks for running GUI tests:

  • One based on Selenium2/Webdriver which is the framework to use when writing new functional UI tests.
  • One based on Selenium1, which is now deprecated and shouldn't be used. We encourage committers to port tests written for it to the Selenium2 framework; in particular, when modifying old tests, committers are encouraged to rewrite them as Selenium2 tests.

Selenium2-based Framework

Using:

  • To debug a test simply start XE somewhere and then debug your JUnit tests as a normal JUnit test in your IDE.
    • Note that functional test Maven modules create an XWiki instance in target/xwiki thanks to the execution of the XWiki Packager plugin, so the simplest approach is to start that XWiki instance when debugging.
  • In order to more easily debug flickering tests you can simply add the @Intermittent annotation on your test method and it'll be executed 100 times in a row (you can also specify @Intermittent(repetition=N) to repeat it N times). This is achieved thanks to the Tempus Fugit framework that we've integrated.
  • To run a specific test, pass the pattern property (it's a regex on the test class name) as in: mvn install -Dpattern=TestClass#testName (this will run the testName test from TestClass)
  • To run the tests on your own running instance (instead of letting mvn close your instance and start a fresh one), use -Dxwiki.test.verifyRunningXWikiAtStart=true. It could be useful to verify that you have not broken the tests on your instance before committing your changes.
  • By default the Firefox browser will be used but if you wish to run with another browser just pass the browser parameter as in:
    • Firefox (default): -Dbrowser=*firefox
    • Internet Explorer: -Dbrowser=*iexplore
    • Chrome: -Dbrowser=*chrome
    • PhantomJS: -Dbrowser=*phantomjs
  • You may run into a compatibility problem between the version of the browser and the version of Selenium. For example, at the time of writing, Firefox 32.0 is required. You may install it on your computer and refer to it with -Dwebdriver.firefox.bin="/path/to/your/firefox-32.0"

Best practices

  • Tests are located in xwiki-platform inside the module that they are testing. Note that in the past we were putting functional tests in xwiki-enterprise/xwiki-enterprise-test/xwiki-enterprise-test-ui but we have now started to move them to xwiki-platform.
  • Tests should be written using the Page Objects Pattern:
  • Since functional tests take a long time to execute (XWiki to start, Browser to start, navigation, etc) it's important to write tests that execute as fast as possible. Here are some tips:
    • Write scenarios (i.e. write only a few test methods, even a single one if you can, so that you have a single fixture) instead of a lot of individual tests as you would for unit tests.
    • Use getUtil() to perform quick actions that are not part of what you wish to test (i.e. that are part of the test fixture). For example, to create a page you need in your fixture, write:
      getUtil().createPage(...);
      instead of
      WikiEditPage wep = new WikiEditPage();
      wep.switchToEdit(SPACE_NAME, DOC_NAME);
      wep.setTitle(DOC_TITLE);
      wep.setContent(CONTENT);
      ViewPage vp = wep.clickSaveAndView();
    • Your test Maven module may be bound to a specific profile (as a best practice, use integration-tests). This way it's built only when specifically asked for with -Pintegration-tests (see below for an example of the pom.xml):
      <profiles>
       <profile>
         <id>integration-tests</id>
         <modules>
           <module>application-task-test</module>
         </modules>
       </profile>
      </profiles>
  • Never, ever, wait on a timer! Instead wait on elements.
  • If you need to create, update or delete a page during your tests, use a space name specific to your test scenario with getTestClassName(). For example:
    getUtil().createPage(getTestClassName(), "Page", "Content", "Title");
  • Tests must not depend on one another. In other words, it should be possible to execute tests in any order and running only one test class should work fine.
  • Tests that need to change existing configuration (e.g. change the default language, set specific rights, etc.) must put the configuration back as it was. This only matters in xwiki-enterprise, or when functional tests from several different domains are executed one after another. Functional tests located in xwiki-platform only run the tests for their own module, so there it's not important if they don't clean up, and skipping the cleanup saves time.
  • Tests are allowed to create new documents and don't need to remove them at the end of the test.
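The Page Objects pattern recommended above can be sketched as follows. Note that the Driver interface below is a hand-rolled stand-in for Selenium's WebDriver, and the element id and URL used are hypothetical:

```java
public class PageObjectSketch
{
    // Stand-in for Selenium's WebDriver, reduced to what the sketch needs.
    interface Driver
    {
        void get(String url);

        String findText(String elementId);
    }

    // The page object: it encapsulates the page's URL and element locators so
    // that tests express user intent ("open the login page, read its title")
    // instead of raw element lookups scattered across test methods.
    static class LoginPage
    {
        private final Driver driver;

        LoginPage(Driver driver)
        {
            this.driver = driver;
        }

        void open()
        {
            this.driver.get("/bin/login/XWiki/XWikiLogin");
        }

        String getTitle()
        {
            return this.driver.findText("document-title");
        }
    }

    public static void main(String[] args)
    {
        // A fake driver so the sketch runs standalone; a real test would use
        // a WebDriver pointed at a running XWiki instance.
        Driver fake = new Driver()
        {
            public void get(String url)
            {
                // no-op: a real driver would navigate the browser
            }

            public String findText(String elementId)
            {
                return "Log-in";
            }
        };

        LoginPage page = new LoginPage(fake);
        page.open();
        if (!"Log-in".equals(page.getTitle())) {
            throw new AssertionError("unexpected title");
        }
        System.out.println("ok");
    }
}
```

The payoff is that when the page's HTML changes, only the page object needs updating, not every test that uses the page.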

Examples of functional tests:

The Office Importer tests

In XWiki 7.3 we introduced in xwiki-platform some functional tests for the Office Importer Application. To enable them, you need to activate the office-tests profile. An OpenOffice (or LibreOffice) server is needed on your system. You might also need to set an environment variable that points to the office home path, if it's not in a standard location. This variable is called XWIKI_OFFICE_HOME and can be set like this:

## For Unix systems:
export XWIKI_OFFICE_HOME="/opt/libreoffice3.6"

You should set this environment variable in your CI agents.

Old Selenium1-based Framework

  • We were using Selenium RC to perform functional tests of the GUI. We had created a JUnit extension to easily write Selenium tests in Java.
  • To run these tests on your local machine go to xwiki-enterprise/xwiki-enterprise-test/xwiki-enterprise-test-selenium and type mvn install.
  • To run a specific test, pass the pattern property as in: mvn install -Dpattern=DeletePageTest (this will run DeletePageTest - note that you don't have to specify the extension). In addition, if you wish to execute only a specific method from a test case class, you can pass the patternMethod property as in: mvn install -Dpattern=DeletePageTest -DpatternMethod=testDeletePageCanDoRedirect.
  • To enable debug output from the Selenium server, start Maven with the -Ddebug=true switch; all messages from the Selenium server are then saved to xwiki-enterprise/xwiki-enterprise-test/xwiki-enterprise-test-selenium/target/selenium/server.log.
  • To debug a functional Selenium test in your favourite Java IDE, go to xwiki-enterprise/xwiki-enterprise-test/xwiki-enterprise-test-selenium and run Maven with -Dmaven.surefire.debug. Maven will wait until you connect your IDE to the running JVM on port 5005. You can then put a breakpoint in your IDE and debug the test.

Browser version

Currently we only run our functional tests on the Firefox browser.
The browser you use for functional tests needs to match the Selenium version; otherwise unpredictable results may occur.

  • The current Selenium version we are using is 2.44.0
    • (valid as of 18 March 2015. The actual version used can be verified here under the selenium.version property)
  • The Firefox version we use on our continuous integration agents is 32.0.1.
    • (valid as of 18 March 2015. Ask on the list or on IRC for the actual version used, since it's not publicly verifiable)
    • To determine browser compatibility with the Selenium version used, scan Selenium's changelog and look for entries like "* Updating Native events to support Firefox 24, 31, 32 and 33". That shows the supported browser versions for that particular Selenium version.

If you wish to run tests with the exact same configuration as XWiki's Continuous Integration server, you need to install and use the same Firefox version locally. To do so, you have to:

  1. Download the corresponding Firefox release
  2. Unzip it locally
  3. Use the webdriver.firefox.bin java system property to specify the location of your firefox version
    1. Depending on how you start the functional tests, you'd have to add the system property either in your Maven build (Surefire plugin configuration) or in your IDE (run configuration)
    2. Read Selenium's FirefoxDriver documentation for more information and options

XHTML, CSS & WCAG Validations

Performance Testing

  • These are memory leakage tests, load tests and speed of execution tests.
  • They are performed manually and in an ad hoc fashion for now. They are executed for some stable versions and for all super stable versions.
  • See Methodology and reports.

See the Profiling topic for details on how to use a profiler for detecting performance issues.

Manual testing

Here's the spirit we'd like to have from the XWiki Manual Testing Team: The Black Team.

Besides automated testing, ensuring software quality requires that features be tested by actual users. In order to manage this manual testing, a test plan is required.

Tools Reference

We use the following tools in our automated tests:

  • JUnit: test framework
  • Mockito, JMock: mocking frameworks used to isolate the code under test
  • Hamcrest: extra assertions beyond what JUnit provides
  • GreenMail: for testing email sending
  • WireMock: for simulating HTTP connections to remote servers
  • JMeter: for performance tests
  • Dumbbench: for manual performance tests

Test Coverage

We now have a SonarQube instance showing Test coverage for both unit tests and integration tests. However it doesn't aggregate coverage data across top level modules (commons, rendering, platform, enterprise, etc).

We support both Jacoco and Clover to generate test coverage reports.

To generate test coverage reports make sure you can build your module and then pick one of the following strategies below depending on what you wish to generate.

After running Clover you'll have some instrumented JARs in your local repository, so be careful not to use those later on (when deploying, for example). If you copy them into an existing XWiki setup in WEB-INF/lib you'll get an error at runtime saying that the Clover JAR is missing.

Strategy 1

Using Jacoco

  • Go in the first top level module (e.g. xwiki-commons) and run: mvn clean jacoco:prepare-agent install -Djacoco.destFile=/tmp/jacoco.exec -Djacoco.append=false -Plegacy,integration-tests -Dxwiki.revapi.skip=true
  • Go in all the other top level modules you wish to add and run: mvn clean jacoco:prepare-agent install -Djacoco.destFile=/tmp/jacoco.exec -Djacoco.append=true -Plegacy,integration-tests  -Dxwiki.revapi.skip=true
  • Then whenever you wish to generate the full test report, run:
    TODO
    Jacoco supports generating a report from a single module, and it even supports generating an aggregated report from several modules using the report-aggregate mojo introduced in 0.7.7 (though it's unclear whether that includes coverage induced by module B on module A). However, Jacoco doesn't seem to support executing several Maven reactors (for example to build code located in various GitHub repositories) against the same Jacoco exec file and then generating a report from that. See also https://groups.google.com/forum/#!topic/jacoco/odVzr7P5i6w

    If you try to run mvn jacoco:report -Djacoco.destFile=/tmp/jacoco.exec -N you get the error: "Skipping JaCoCo execution due to missing execution data file."

Using Clover

Go to the top level module containing children modules for which to generate a Clover report. Note that you won't be able to aggregate Clover data across different Maven runs with this strategy so you really need a single parent module.

  1. Run the following command:
    mvn clean clover2:setup install clover2:aggregate clover2:clover -Pclover,integration-tests,dev,jetty,hsqldb -Dxwiki.revapi.skip=true
    You might need to run the "install" goal instead of the "test" one if your local Maven repository doesn't already contain some test jars (apparently and for some reason Maven won't download them from the remote repository under some conditions).

Strategy 2

Using Jacoco

TODO

Using Clover

Use a single Clover database to which you add coverage information as you build modules one after another. This strategy is especially useful when you wish to manually run some modules and ensure that coverage data aggregate in a single place so that when you generate the report you have the result of all your runs.

  1. Instrument the source code with Clover for all modules that you want to include in the report, using:
    mvn clover2:setup install -Pclover,integration-tests,dev,jetty,hsqldb -Dmaven.clover.cloverDatabase=/path/to/user/home/.xwiki/clover/clover.db  -Dxwiki.revapi.skip=true

    When tests are executed they'll generate coverage data in the specified Clover database. Since there's a single Clover database, there's no need to merge Clover databases as in strategy 1 above.

  2. To generate the Clover report, execute the following from any module:
    mvn clover2:clover -N -Dmaven.clover.cloverDatabase=/path/to/user/home/.xwiki/clover/clover.db
  3. Remember to clean your clover database when you're done.
If you don't wish failing tests to stop the generation of the coverage report, you should pass -Dmaven.test.failure.ignore=true on the command line.

Here are typical steps you'd follow to generate full TPC for XWiki:

  • Clean your local repo and remove any previous clover DBs:
    rm -R ~/.m2/repository/org/xwiki
    rm -R ~/.m2/repository/com/xpn
    rm -R ~/.xwiki/clover
  • Generate coverage data for XWiki Commons:
    cd xwiki-commons
    mvn clean -Pclover,integration-tests -o
    mvn clover:setup install -Pclover,integration-tests -Dmaven.clover.cloverDatabase=/Users/vmassol/.xwiki/clover/clover.db -Dmaven.test.failure.ignore=true  -Dxwiki.revapi.skip=true -o -nsu
  • Generate Clover report just for Commons:
    mvn clover:clover -N -Dmaven.clover.cloverDatabase=/Users/vmassol/.xwiki/clover/clover.db -o -nsu
  • Generate coverage data for XWiki Rendering:
    cd xwiki-rendering
    mvn clean -Pclover,integration-tests -o
    mvn clover:setup install -Pclover,integration-tests -Dmaven.clover.cloverDatabase=/Users/vmassol/.xwiki/clover/clover.db -Dmaven.test.failure.ignore=true  -Dxwiki.revapi.skip=true -o -nsu
  • Generate Clover report for Commons and Rendering:
    mvn clover:clover -N -Dmaven.clover.cloverDatabase=/Users/vmassol/.xwiki/clover/clover.db -o -nsu
  • Generate coverage data for XWiki Platform:
    cd xwiki-platform
    mvn clean -Pclover,integration-tests -o
    mvn clover:setup install -Pclover,integration-tests -Dmaven.clover.cloverDatabase=/Users/vmassol/.xwiki/clover/clover.db -Dmaven.test.failure.ignore=true  -Dxwiki.revapi.skip=true -o -nsu
  • Generate Clover report for Commons, Rendering and Platform:
    mvn clover:clover -N -Dmaven.clover.cloverDatabase=/Users/vmassol/.xwiki/clover/clover.db -o -nsu
  • Generate coverage data for XWiki Enterprise:
    cd xwiki-enterprise
    mvn clean -Pclover,integration-tests,jetty,hsqldb -o
    mvn clover:setup install -Pclover,integration-tests,jetty,hsqldb -Dmaven.clover.cloverDatabase=/Users/vmassol/.xwiki/clover/clover.db  -Dmaven.test.failure.ignore=true -Dxwiki.revapi.skip=true -o -nsu
  • Generate full Clover report (for Commons, Rendering, Platform and Enterprise):
    mvn clover:clover -N -Dmaven.clover.cloverDatabase=/Users/vmassol/.xwiki/clover/clover.db -o -nsu

Using Clover + Jenkins

  • Install Jenkins 2.0+ and the following plugins:
    • Pipeline plugin
    • XVnc plugin
  • Make sure to have Maven ("mvn" executable) + VNCServer ("vncserver" executable) installed on the build agent
  • Build agent must be running unix
  • Setup a Pipeline job with the following content:
    node() {
      def mvnHome
      def localRepository
      def cloverDir
      stage('Preparation') {
        def workspace = pwd()
        localRepository = "${workspace}/maven-repository"
        // Make sure that the special Maven local repository exists
        sh "mkdir -p ${localRepository}"
        // Remove all XWiki artifacts from it
        sh "rm -Rf ${localRepository}/org/xwiki"
        sh "rm -Rf ${localRepository}/com/xpn"
        // Make sure that the directory where clover will store its data exists in
        // the workspace and that it's clean
        cloverDir = "${workspace}/clover-data"
        sh "rm -Rf ${cloverDir}"
        sh "mkdir -p ${cloverDir}"
        // Get the Maven tool.
        // NOTE: Needs to be configured in the global configuration.           
        mvnHome = tool 'Maven'
      }
      // each() has problems in pipeline, thus using a standard for()
      // See https://issues.jenkins-ci.org/browse/JENKINS-26481
      for (String repoName : ["xwiki-commons", "xwiki-rendering", "xwiki-platform", "xwiki-enterprise"]) {
        stage("Cloverify ${repoName}") {
          dir (repoName) {
            git "https://github.com/xwiki/${repoName}.git"
            runCloverAndGenerateReport(mvnHome, localRepository, cloverDir)
          }  
        }      
      }
      stage("Publish Clover Reports") {
        def shortDateString = new Date().format("yyyyMMdd")
        def dateString = new Date().format("yyyyMMdd-HHmm")
        def prefix = "clover-"
        for (String repoName : ["commons", "rendering", "platform", "enterprise"]) {
          dir ("xwiki-${repoName}/target/site") {
            if (repoName != 'commons') {
              prefix = "${prefix}+${repoName}"
            } else {
              prefix = "${prefix}${repoName}"
            }
            sh "tar cvf ${prefix}-${dateString}.tar clover"
            sh "gzip ${prefix}-${dateString}.tar"
            sh "ssh [email protected] mkdir -p public_html/site/clover/${shortDateString}"
            sh "scp ${prefix}-${dateString}.tar.gz [email protected]:public_html/site/clover/${shortDateString}/"
            sh "rm ${prefix}-${dateString}.tar.gz"
            sh "ssh [email protected] 'cd public_html/site/clover/${shortDateString}; gunzip ${prefix}-${dateString}.tar.gz; tar xvf ${prefix}-${dateString}.tar; mv clover ${prefix}-${dateString};rm ${prefix}-${dateString}.tar'"
          }
        }
      }
    }
    def runCloverAndGenerateReport(def mvnHome, def localRepository, def cloverDir) {
      wrap([$class: 'Xvnc']) {
        withEnv(["PATH+MAVEN=${mvnHome}/bin", 'MAVEN_OPTS=-Xmx2048m']) {
          sh "mvn -Dmaven.repo.local='${localRepository}' clean clover:setup install -Pclover,integration-tests -Dmaven.clover.cloverDatabase=${cloverDir}/clover.db -Dmaven.test.failure.ignore=true -Dxwiki.revapi.skip=true"
          sh "mvn -Dmaven.repo.local='${localRepository}' clover:clover -N -Dmaven.clover.cloverDatabase=${cloverDir}/clover.db"
        }
      }
    }

Example Reports

Created by VincentMassol on 2007/03/07 13:33