Update Test Run results in Targetprocess with REST API
The Targetprocess REST API allows you to set up integrations with your external QA automation service. You can create an application that gets the results of tests executed within your tool and posts them to Targetprocess.
Here is the approach we recommend implementing in your client application:
Authentication
Obtain a security token to authenticate your client (this only needs to be done once). Save the token and pass it in all your further REST API requests.
More details can be found in the Authentication article.
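For illustration, the examples below build on a minimal Python sketch of such a client. The base URL, the token value, and the access_token query parameter are placeholders and assumptions; see the Authentication article for the exact way to obtain and pass the token in your account.
import requests

BASE_URL = "https://yourcompany.tpondemand.com"  # placeholder account URL
ACCESS_TOKEN = "your-saved-token"  # token obtained once, as described in the Authentication article

def tp_get(path, **params):
    # Attach the saved token to every GET request. The access_token parameter
    # name is an assumption; check the Authentication article for the exact
    # mechanism your account supports.
    params.update(access_token=ACCESS_TOKEN, format="json")
    response = requests.get(f"{BASE_URL}/api/v1/{path}", params=params)
    response.raise_for_status()
    return response.json()

def tp_post(path, body, **params):
    # POST a JSON body, again attaching the saved token.
    params.update(access_token=ACCESS_TOKEN, resultFormat="json")
    response = requests.post(f"{BASE_URL}/api/v1/{path}", params=params, json=body)
    response.raise_for_status()
    return response.json()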
Get details of test cases, test plans, and linked work items
Get list of test cases
Get the list of all Test Cases from Targetprocess:
GET /api/v1/TestCases/?include=[ID,Name,Project]
Extract and save the numeric IDs of all Test Cases you would like to update results for.
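As a hedged sketch, assuming the collection response carries the usual Items array (as in the JSON samples later in this article) and reusing the tp_get helper defined above:
# Collect the numeric IDs of all Test Cases whose results will be updated.
test_cases = tp_get("TestCases/", include="[ID,Name,Project]")
test_case_ids = [case["Id"] for case in test_cases["Items"]]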
Get details of each test case
Get the full information of Test Cases, including description, test steps with expected results, the run order of the test steps, and the parent Project:
GET /api/v1/TestCases/?include=[ID,Name,Description,TestSteps[Description,Result,RunOrder],Project]
Get parent test plans for test cases
The next step is to get the list of parent Test Plans for the Test Cases.
In a loop, query each Test Case by ID. Extract and save the numeric ID of the parent Test Plan. In the example below, the Test Case ID is 12345:
GET /api/v1/TestCases/12345?include=[TestPlans[ID]]
Refine the list of Test Plans so that it contains only test plans from the relevant Projects, and remove duplicates from the list.
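A sketch of this loop, assuming the nested TestPlans collection is returned under an Items key and reusing the helpers from the first sketch:
# For every Test Case, collect the IDs of its parent Test Plans;
# the set automatically excludes duplicates.
test_plan_ids = set()
for case_id in test_case_ids:
    case = tp_get(f"TestCases/{case_id}", include="[TestPlans[ID]]")
    for plan in case["TestPlans"]["Items"]:
        test_plan_ids.add(plan["Id"])
# If needed, drop Test Plans that belong to Projects outside your scope here.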
Get parent test plans and linked assignable work items for test case
This example extracts the list of all Test Plans that a Test Case with a known ID belongs to. For each Test Plan, it also returns a reference to the linked assignable entity: its ID, type, name, and the Project it belongs to.
GET /api/v1/TestCases/12345?include=[TestPlans[ID,Name,LinkedAssignable[Id,EntityType,Name,Project]]]
Get linked assignable work items and list of test cases for test plan
This example extracts the list of all Test Cases that belong to a particular Test Plan. It also returns a reference to the assignable entity linked to the Test Plan: its ID, type, name, and the Project it belongs to.
GET /api/v1/TestPlans/1234?include=[ID,Name,LinkedAssignable[Id,EntityType,Name,Project],TestCases]
Get assignable work item and list of linked test cases and test plans
GET /api/v1/UserStories/2345?include=[LinkedTestPlan[TestCases,ParentTestPlans,ChildTestPlans]]
Create test plan runs
Create a new Test Plan Run entity for each Test Plan. A new Test Plan Run entity is needed because it serves as a container for the most recent run results of a test plan and all of its test cases. This action automatically creates new Test Case Run entities with the status "Not run" for each Test Case in the Test Plan.
POST /api/v1/TestPlanRuns?resultFormat=json&resultInclude=[Id,TestCaseRuns[Id,TestCase]]
Content-Type: application/json
{"TestPlan": {"Id": 3456}}
Store the ID of each newly created Test Plan Run. This call also returns the numeric IDs of all newly created Test Case Runs in each test plan run, which helps to resolve which Test Case Run is connected to which Test Case.
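A sketch of this step, again with the hypothetical helpers from the first example; the response shape with Id, TestCaseRuns, and TestCase follows the resultInclude above:
# Create one Test Plan Run per Test Plan and remember which Test Case Run
# was created for which Test Case.
test_case_run_ids = {}  # TestCase Id -> TestCaseRun Id
for plan_id in test_plan_ids:
    run = tp_post("TestPlanRuns", {"TestPlan": {"Id": plan_id}},
                  resultInclude="[Id,TestCaseRuns[Id,TestCase]]")
    for case_run in run["TestCaseRuns"]["Items"]:
        test_case_run_ids[case_run["TestCase"]["Id"]] = case_run["Id"]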
Update run result per test case
Update the Test Case Runs with known IDs one by one with the run result of each Test Case.
POST /api/v1/TestCaseRuns/567
Content-Type: application/json
{"Status": "Failed", "Comment": "The feature does not work as expected."}
Update state of test plan run
Update the EntityState of your whole Test Plan Run to Done when needed. Use a REST API POST to the TestPlanRuns resource, append the Test Plan Run ID to the URL, and pass the Entity State ID and Name as parameters. This action automatically fills in the End Date field with the current timestamp.
POST /api/v1/TestPlanRuns/45678 HTTP/1.1
Content-Type: application/json
{"EntityState": {"ID": 31, "Name": "Done"}}
The EntityState ID and Name should be taken from the Workflow settings of the Test Plan Run entity and relate to the same Process as the parent Project of the Test Plan Run.
The following GET query to the API helps to obtain valid values:
GET /api/v1/TestPlanRuns/45678?include=[Project[Process[ID]]]&format=json
{
"ResourceType":"TestPlanRun",
"Id":45678,
"Project":{"ResourceType":"Project","Id":5145,"Process":{"Id":2}}
}
Extract the Process ID from this response. Here it is "2". Then perform one more GET call to the API and pass the Process ID as a parameter.
GET /api/v1/EntityStates?include=[Id,Name]&where=(EntityType.Name eq "TestPlanRun") and (Workflow.Process.ID eq "2")&format=json
{
"Items":[
{"Id":30,"Name":"Open"},
{"Id":31,"Name":"Done"}
]
}
Then use {"Id":31,"Name":"Done"} as the "EntityState" parameter if you would like to change the Entity State of your Test Plan Run to "Done".
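Putting the two lookups together, a sketch of closing a Test Plan Run with the helpers from the first example:
# Resolve the 'Done' state of the Test Plan Run workflow in the proper Process,
# then move the Test Plan Run to that state.
plan_run_id = 45678  # ID of the Test Plan Run to close
run = tp_get(f"TestPlanRuns/{plan_run_id}", include="[Project[Process[ID]]]")
process_id = run["Project"]["Process"]["Id"]
states = tp_get(
    "EntityStates",
    include="[Id,Name]",
    where=f'(EntityType.Name eq "TestPlanRun") and (Workflow.Process.ID eq "{process_id}")',
)
done = next(state for state in states["Items"] if state["Name"] == "Done")
tp_post(f"TestPlanRuns/{plan_run_id}", {"EntityState": {"ID": done["Id"], "Name": done["Name"]}})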
Let us know if you decide to go this way and if you require any further assistance.
Performance Saving Recommendations
Please keep in mind the following limitations, which we strongly recommend observing:
- Instead of a single Test Plan with 1,000-10,000 test cases, use split test plans with no more than 500-800 test cases each. For example, we recommend splitting a 10,000-test master test plan into 15-20 smaller parts.
- Create new test plan runs for these plans with throttling: about 5 minutes between the creation of new runs for plans with ~500 test cases, and 8-10 minutes for plans with ~800-1,000 test cases (see the sketch after this list).
- Do not mix the order of 'create test runs' and 'update test case run statuses' actions. Start updating statuses only after all runs are created.
- Smaller delays between 'close test plan run' operations should also be added. A one-minute delay between closing test plan runs should be enough.
- Performance may be affected only during 'create/update' actions, but this won't be a problem if you schedule these actions for night time. During the working day, system performance should not decrease.
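As referenced in the list above, a minimal throttling sketch using the recommended delays and the hypothetical tp_post helper from the first example:
import time

# Create the runs one by one and pause between them: about 5 minutes for plans
# with ~500 test cases, 8-10 minutes for plans with ~800-1,000 test cases.
for plan_id in test_plan_ids:
    tp_post("TestPlanRuns", {"TestPlan": {"Id": plan_id}})
    time.sleep(5 * 60)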
We have monitoring tools on our side that help measure overall system performance. Let us know when your production integration is established and launched; we will review our performance dashboards and check the graphs for bursts.
More examples: Get
Get last test plan run for all test plans
GET /api/v1/TestPlanRuns/?include=[TestPlan]&where=(IsLastStarted eq 'True')
Get last test plan run summary for single test plan
GET /api/v1/TestPlanRuns/?include=[ModifyDate,PassedCount,FailedCount,BlockedCount,OnHoldCount,NotRunCount]&where=(IsLastStarted eq 'True') and (TestPlan.Id eq 456)
Get last test plan run summary for work item
GET /api/v1/TestPlanRuns/?include=[TestPlan[LinkedAssignable],ModifyDate,PassedCount,FailedCount,BlockedCount,OnHoldCount,NotRunCount,Bugs]&where=(IsLastStarted eq 'True') and (TestPlan.LinkedAssignable.Id eq 12345)
Get test plan run summary
Get summary details of a particular test plan run.
GET /api/v1/TestPlanRuns/456?include=[TestPlan[LinkedAssignable],IsLastStarted,ModifyDate,PassedCount,FailedCount,BlockedCount,OnHoldCount,NotRunCount,Bugs]
Get statuses of all test cases in a particular test plan run
GET /api/v1/TestPlanRuns/?include=[TestCaseRuns[Status]]
Get all bugs related to test plan
Here we retrieve all bugs related to the test plan with ID = 234:
GET /api/v1/Bugs/?include=[TestPlanRuns[TestPlan]]&where=(TestPlanRuns.TestPlan.Id eq 234)
Get all bugs related to test case
Here we retrieve all bugs related to the test case with ID = 456:
GET /api/v1/Bugs/?include=[TestCaseRuns[TestCase]]&where=(TestCaseRuns.TestCase.Id eq 456)
Get all bugs related to user story
GET /api/v1/UserStories?include=[Id,Name,Description,EntityState,Bugs[Id,Name,Description,CreateDate,EntityState]]
More examples: Create and Assign
Create test plan
POST /api/v1/TestPlans?resultInclude=[Id]
Content-Type: application/json
{
"Name":"Demo Test Plan",
"Project":{"ID":2}
}
Create test plan with test cases
POST /api/v1/TestPlans?resultInclude=[Id]
Content-Type: application/json
{
"Name":"Demo Test Plan",
"Project":{"ID":2},
"TestCases":{
"Items":[
{"Name":"TC 1","Project":{"ID":2}},
{"Name":"TC 2","Project":{"ID":2}},
{"Name":"TC 3","Project":{"ID":2}}
]
}
}
Create test case with detailed steps
POST /api/v1/TestCases?resultInclude=[Id]
Content-Type: application/json
{
"Name":"Demo Test Case",
"Project":{"ID":2},
"TestSteps":{
"Items":[
{"Description":"Step #1 Action","Result":"Step #1 Expected result"},
{"Description":"Step #2 Action","Result":"Step #2 Expected result"},
{"Description":"Step #3 Action","Result":"Step #3 Expected result"}
]
}
}
Assign test case to an existing test plan
Existing assignments of this test case to other test plans are preserved.
POST /api/v1/TestCases/1234
Content-Type: application/json
{
"TestPlans":{
"Items":[
{"ID":123},
{"ID":124}
]
}
}
Link a test plan to a work item
Both requests produce the same result.
POST /api/v1/UserStories/1234
Content-Type: application/json
{
"LinkedTestPlan":{"ID":2345}
}
POST /api/v1/TestPlans/2345
Content-Type: application/json
{
"LinkedGeneral":{"ID":1234}
}
More examples: Delete and Unassign
Unassign multiple test cases from single test plan
DELETE /api/v1/TestPlans/234/TestCases?childrenIds=1234,1235,1236
Unassign a particular test case from multiple test plans
DELETE /api/v1/TestCases/2345/TestPlans?childrenIds=123,124,125