Many of our customers seek to integrate automation test case execution with Spira so that the results are automatically reflected in Spira as new test runs.

In this example, we consider the case where a team has a Cucumber-based automation framework. In Cucumber you can add tags to each scenario. You then need to map these tagged scenarios to the corresponding test cases in Spira dynamically. That way, once you execute the regression pack, the results are automatically updated in Spira against each test case.

Consider the following example test case mapping between Spira and Cucumber, where each test case in Spira is matched, by tag name, to a scenario in the Cucumber feature file:

Cucumber tag names:
- CheckLogin
- CheckLogout
- CreateEmployee
- UpdateEmployee
- DeleteEmployee

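As an illustration, a tagged scenario in a Cucumber feature file might look like the following sketch (the feature name and step text are hypothetical; only the tag name matters for the mapping):

```gherkin
@CheckLogin
Scenario: Registered user can log in
  Given the user is on the login page
  When the user enters valid credentials
  Then the dashboard page is displayed
```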
The solution is to add code to your test framework so that once the Cucumber scenarios finish executing, you can dynamically send the results back to Spira.
The REST API call you should make is the following:

POST: projects/{project_id}/test-runs/record


Records the results of executing an automated test. You need to use this overload when you want to be able to set Test Run custom properties.

How to Execute

To access this REST web service, you need to use the following URL (make sure to replace any parameters, e.g. {project_id}, with the relevant value, e.g. 1):

https://{server-name}/services/v5_0/RestService.svc/projects/{project_id}/test-runs/record

Request Parameters

project_id - The id of the current project

Request Body

TestRunFormatId - The format of the automation results (1 = Plain Text, 2 = HTML) stored in the 'RunnerStackTrace' field
RunnerName - The name of the external automated tool that executed the test
RunnerTestName - The name of the test case as it is known in the external tool
RunnerAssertCount - The number of assertions/errors reported during the automated test execution
RunnerMessage - The summary result of the test case
RunnerStackTrace - The detailed trace of test results reported back from the automated testing tool
AutomationHostId - The id of the automation host that the result is being recorded for
AutomationEngineId - The id of the automation engine that the result is being recorded for
AutomationEngineToken - The token of the automation engine that the result is being recorded for (read-only)
AutomationAttachmentId - The id of the attachment that is being used to store the test script (file or url)
Parameters - The list of test case parameters that have been provided
ScheduledDate - The datetime the test was scheduled for
TestRunSteps - The list of test steps that comprise the automated test. These are optional for automated test runs. The status of the test run steps does not change the overall status of the automated test run; they are used simply to make reporting clearer inside the system. They will also update the status of the appropriate Test Step(s) if a valid test step id is provided.
TestRunId - The id of the test run
Name - The name of the test run (usually the same as the test case)
TestCaseId - The id of the test case that the test run is an instance of
TestRunTypeId - The id of the type of test run (automated vs. manual)
TesterId - The id of the user that executed the test. The authenticated user is used if no value is provided.
ExecutionStatusId - The id of the overall execution status for the test run (Failed = 1; Passed = 2; NotRun = 3; NotApplicable = 4; Blocked = 5; Caution = 6)
ReleaseId - The id of the release that the test run should be reported against
TestSetId - The id of the test set that the test run should be reported against
TestSetTestCaseId - The id of the unique test case entry in the test set
StartDate - The date/time that the test execution was started
EndDate - The date/time that the test execution was completed
BuildId - The id of the build that the test was executed against
EstimatedDuration - The estimated duration of how long the test should take to execute (read-only). This field is populated from the test case being executed.
ActualDuration - The actual duration of how long the test took to execute (read-only). This field is calculated from the start/end dates provided during execution.
ProjectId - The id of the project that the artifact belongs to. The current project is always used for Insert operations for security reasons.
ConcurrencyDate - The datetime used to track optimistic concurrency to prevent edit conflicts
CustomProperties - The list of associated custom properties/fields for this artifact
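As a sketch, a minimal request body for this call might look like the following (the ids, message, and date values are purely illustrative; the dates use Spira's WCF /Date(milliseconds-0000)/ serialization format):

```json
{
  "TestRunFormatId": 1,
  "RunnerName": "Cucumber",
  "RunnerTestName": "CheckLogin",
  "RunnerAssertCount": 1,
  "RunnerMessage": "Passed",
  "RunnerStackTrace": "Scenario: CheckLogin passed",
  "ExecutionStatusId": 2,
  "TestCaseId": 2,
  "StartDate": "/Date(1585699200000-0000)/",
  "EndDate": "/Date(1585699260000-0000)/"
}
```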


Sample Code

If you are using Java to execute your Cucumber scripts, you can use the following sample code to send the results back to Spira:

import com.google.gson.Gson;

import java.io.DataOutputStream;
import java.io.IOException;
import java.net.HttpURLConnection;
import java.util.*;
import java.net.URL;

/**
 * This defines the 'SpiraTestExecute' class that provides the Java facade for
 * calling the REST web service exposed by SpiraTest
 * @author Inflectra Corporation
 * @version 5.0.0
 */
public class SpiraTestExecute {
    /**
     * The URL appended to the base URL to access REST. Note that it ends with a slash
     */
    private static final String REST_SERVICE_URL = "/services/v5_0/RestService.svc/";

    public String url;
    public String userName;
    public String token;
    public int projectId;

    SpiraTestExecute(String url, String userName, String token, int projectId) {
        this.url = url;
        this.userName = userName;
        this.token = token;
        this.projectId = projectId;
    }

    /**
     * Performs an HTTP POST request to the specified URL
     * @param input The URL to perform the query on
     * @param body  The request body to be sent
     * @throws IOException
     */
    public static void httpPost(String input, String body) throws IOException {
        URL url = new URL(input);

        HttpURLConnection connection = (HttpURLConnection) url.openConnection();
        //allow sending a request body
        connection.setRequestMethod("POST");
        connection.setDoOutput(true);
        //have the connection send and retrieve JSON
        connection.setRequestProperty("accept", "application/json; charset=utf-8");
        connection.setRequestProperty("Content-Type", "application/json; charset=utf-8");
        //used to send data in the REST request
        DataOutputStream outputStream = new DataOutputStream(connection.getOutputStream());
        //write the body to the stream
        outputStream.writeBytes(body);
        //send the OutputStream to the server
        outputStream.flush();
        outputStream.close();
        //reading the response code forces the request to be sent
        connection.getResponseCode();
    }

    /**
     * Records a test run
     * @param testCaseId        The test case being executed
     * @param releaseId         The release being executed against (optional)
     * @param testSetId         The test set being executed against (optional)
     * @param startDate         When the test run started
     * @param endDate           When the test run ended
     * @param executionStatusId The status of the test run (pass/fail/not run)
     * @param runnerName        The name of the automated testing tool
     * @param runnerTestName    The name of the test as stored in JUnit
     * @param runnerAssertCount The number of assertions
     * @param runnerMessage     The failure message (if appropriate)
     * @param runnerStackTrace  The error stack trace (if any)
     */
    public void recordTestRun(int testCaseId, Integer releaseId, Integer testSetId, Date startDate,
                             Date endDate, int executionStatusId, String runnerName, String runnerTestName, int runnerAssertCount,
                             String runnerMessage, String runnerStackTrace) {
        String url = this.url + REST_SERVICE_URL + "projects/" + this.projectId + "/test-runs/record?username=" + this.userName + "&api-key=" + this.token;

        Gson gson = new Gson();

        //create the body of the request
        String body = "{\"TestRunFormatId\": 1, \"RunnerName\": \"" + runnerName;
        body += "\", \"RunnerTestName\": \"" + runnerTestName + "\",";
        body += "\"RunnerStackTrace\": " + gson.toJson(runnerStackTrace) + ",";
        body += "\"StartDate\": \"" + formatDate(startDate) + "\", " + "\"EndDate\": \"" + formatDate(endDate) + "\",";
        body += "\"ExecutionStatusId\": " + executionStatusId + ",\"RunnerAssertCount\": " + runnerAssertCount;
        body += ",\"RunnerMessage\": \"" + runnerMessage + "\",";
        body += "\"TestCaseId\": " + testCaseId;

        //only include the optional ids if they were provided
        if(releaseId != null) {
            body += ", \"ReleaseId\": " + releaseId;
        }
        if(testSetId != null) {
            body += ", \"TestSetId\": " + testSetId;
        }

        body += "}";

        //send the request
        try {
            httpPost(url, body);
        }
        catch (Exception e) {
            e.printStackTrace();
        }
    }

    /**
     * Turn the date into the format readable by Spira
     * @param d The date to format
     * @return The date in Spira's WCF serialization format
     */
    private static String formatDate(Date d) {
        return "/Date(" + d.getTime() + "-0000)/";
    }

    /**
     * Send a test run to Spira from the info in the given test run
     * @param testRun The test run information to record
     */
    public void recordTestRun(TestRun testRun) {
        Date now = new Date();
        recordTestRun(testRun.testCaseId, testRun.releaseId == -1 ? null : testRun.releaseId,
                testRun.testSetId == -1 ? null : testRun.testSetId, now, now, testRun.executionStatusId,
                "JUnit", testRun.testName, 1, testRun.message, testRun.stackTrace);
    }
}