Integrate JMeter into ALM Octane

Recently I have been on different engagements where JMeter load/performance testing was part of a continuous integration (CI) pipeline. On other engagements, JMeter load/performance testing was not part of any CI pipeline. In both cases, we successfully connected ALM Octane with JMeter (with a CI pipeline and without). In the following, I will explain two different approaches to integrating JMeter with ALM Octane.

The easiest way to integrate a testing tool into ALM Octane is a continuous integration (CI) integration: if the testing tool can talk to the CI server (such as Jenkins) through a framework (such as JUnit), ALM Octane will in all likelihood understand this communication; in some cases, the XML produced by the testing tool needs to be transformed into the correct format first. The first approach is therefore to integrate the testing tool with the CI server; from there, the test results are pushed to the ALM Octane workspaces.

If the organization does not use CI pipelines to deliver performance testing, the results can still be pushed to ALM Octane through a valid JUnit XML file using a test run collection tool.

Getting Started! Before getting started, make sure to download & configure the following:
• JMeter: https://jmeter.apache.org/download_jmeter.cgi
o In this example, JMeter version 5.2.1 was used, the latest version at the time of writing.
• Jenkins: https://jenkins.io/download/
• ALM Octane Plugin for Jenkins: https://plugins.jenkins.io/hp-application-automation-tools-plugin
• ALM Octane Test Results Collection Tool: https://github.com/MicroFocus/octane-collection-tool
• Obviously, an ALM Octane instance. In case you don’t have ALM Octane, get a trial version here: https://www.microfocus.com/en-us/products/alm-octane/free-trial
• Python: https://www.python.org/downloads/ - required to transform the JMeter result.xml into the JUnit XML
Understand and decide how JMeter tests will be represented in ALM Octane

There are different options for how JMeter tests can be reflected in ALM Octane. You can choose to have tests represented by the JMeter sampler, which means a low number of automated tests in ALM Octane with a higher number of test runs (depending on the thread groups). Alternatively, each JMeter thread can represent an automated test in ALM Octane; in that scenario, you end up with a high number of tests with a low number of test runs. Tests could also be represented by JMeter test fragments. So there are several options, but in this article we will focus on the following two representation options in ALM Octane.
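For orientation, here is a small, hypothetical JMeter XML result file (the file names and values are illustrative only); the lb (label) and tn (thread name) attributes are what the two options key on, and the attribute meanings are documented in the comments of the XSL files later in this article:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<testResults version="1.2">
  <!-- lb = sampler label, tn = thread name, t = elapsed ms, ts = epoch ms,
       s = success flag, rc/rm = response code/message -->
  <httpSample t="231" ts="1577836800000" lb="Login" tn="Thread Group 1-1"
              s="true" rc="200" rm="OK" na="2" ng="2"/>
  <httpSample t="198" ts="1577836800400" lb="Login" tn="Thread Group 1-2"
              s="true" rc="200" rm="OK" na="2" ng="2"/>
</testResults>
```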
Why it makes sense to push performance test results into ALM Octane

ALM Octane is an application lifecycle management platform which acts as a DevOps design center covering all relevant phases of the lifecycle end to end. Performance and load tests are part of the continuous testing strategy in DevOps. Establishing a delivery pipeline which executes all relevant tests (unit, system, integration, functional, performance, security) is the main goal of organizations transforming to enterprise DevOps. ALM Octane has been built to connect all phases together and to deliver applications with high quality at a high pace.
Performance tests (whether LoadRunner Professional, LoadRunner Enterprise, LoadRunner Cloud, Gatling, JMeter, etc.) need to be pushed to increase visibility of coverage and to avoid broken traceability. Once performance tests are pushed into ALM Octane, they are represented as automated tests (AT). These performance tests can then be assigned to user stories, requirements and/or defects.
It is also possible to link the requirements, user stories and/or defects that come into ALM Octane from various other tools such as Atlassian Jira, Microsoft Azure DevOps or ServiceNow, in case you have a very heterogeneous tool chain. To establish this integration, use Micro Focus Connect Core.
Once everything is connected, enjoy the ALM Octane dashboard.
Option 1: JMeter sampler representation as ALM Octane automated tests

Each JMeter sampler is pushed as one single automated test. As a result, you have a smaller number of tests, which most closely matches JMeter's own behaviour. All thread runs are then represented in the Previous Runs tab of the automated run in ALM Octane.
Option 2: JMeter thread-run representation as ALM Octane automated tests

In this scenario, each thread run per sampler is pushed into ALM Octane as its own automated test. As a result, you have a higher number of tests with a lower number of runs.
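To make the difference concrete, assume a hypothetical sampler labeled Login executed by two threads, Thread Group 1-1 and Thread Group 1-2. Based on the name attributes generated by the XSL files shown later in this article, the two options produce:

```text
Option 1 (by sampler): 1 test  "Login"                    with 2 runs
Option 2 (by thread):  2 tests "Login_Thread Group 1-1",
                               "Login_Thread Group 1-2"   with 1 run each
```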
Integrate JMeter into ALM Octane through a continuous integration pipeline

In this scenario, we will use Jenkins. You need to configure the following:
• Make sure JMeter is installed on the Jenkins node that will execute the JMeter tests.
• Install the Micro Focus Automation Tools plugin for Jenkins (link is above in the Getting Started section).
• Configure an API client ID and client secret in ALM Octane so that it can communicate with Jenkins.
The XSL I use to upload JMeter tests by sampler (Option 1 above) is the following:

<?xml version="1.0" encoding="UTF-8"?>
<xsl:stylesheet xmlns:xsl="http://www.w3.org/1999/XSL/Transform" version="1.0">
  <xsl:output method="xml" indent="yes" encoding="UTF-8"/>

  <xsl:template name="millisecs-to-ISO">
    <xsl:param name="millisecs"/>
    <xsl:param name="JDN" select="floor($millisecs div 86400000) + 2440588"/>
    <xsl:param name="mSec" select="$millisecs mod 86400000"/>
    <xsl:param name="f" select="$JDN + 1401 + floor((floor((4 * $JDN + 274277) div 146097) * 3) div 4) - 38"/>
    <xsl:param name="e" select="4*$f + 3"/>
    <xsl:param name="g" select="floor(($e mod 1461) div 4)"/>
    <xsl:param name="h" select="5*$g + 2"/>
    <xsl:param name="d" select="floor(($h mod 153) div 5 ) + 1"/>
    <xsl:param name="m" select="(floor($h div 153) + 2) mod 12 + 1"/>
    <xsl:param name="y" select="floor($e div 1461) - 4716 + floor((14 - $m) div 12)"/>
    <xsl:param name="H" select="floor($mSec div 3600000)"/>
    <xsl:param name="M" select="floor($mSec mod 3600000 div 60000)"/>
    <xsl:param name="S" select="$mSec mod 60000 div 1000"/>
    <xsl:value-of select="concat($y, format-number($m, '-00'), format-number($d, '-00'))"/>
    <xsl:value-of select="concat(format-number($H, 'T00'), format-number($M, ':00'), format-number($S, ':00'))"/>
  </xsl:template>

  <!-- https://jmeter.apache.org/usermanual/listeners.html#attributes
       JMeter attribute meanings (if enabled in JMeter):
       by      Bytes
       sby     Sent Bytes
       de      Data encoding
       dt      Data type
       ec      Error count (0 or 1, unless multiple samples are aggregated)
       hn      Hostname where the sample was generated
       it      Idle Time = time not spent sampling (milliseconds) (generally 0)
       lb      Label
       lt      Latency = time to initial response (milliseconds) - not all samplers support this
       ct      Connect Time = time to establish the connection (milliseconds) - not all samplers support this
       na      Number of active threads for all thread groups
       ng      Number of active threads in this group
       rc      Response Code (e.g. 200)
       rm      Response Message (e.g. OK)
       s       Success flag (true/false)
       sc      Sample count (1, unless multiple samples are aggregated)
       t       Elapsed time (milliseconds)
       tn      Thread Name
       ts      timeStamp (milliseconds since midnight Jan 1, 1970 UTC)
       varname Value of the named variable -->

  <xsl:template match="/testResults">
    <testsuites>
      <testsuite>
        <!-- required for JUnit xsd - not available in the JMeter result -->
        <xsl:attribute name="id">1</xsl:attribute>
        <xsl:attribute name="name">web.protocol.http</xsl:attribute>
        <xsl:attribute name="package">http.protocol.listener</xsl:attribute>
        <xsl:attribute name="hostname">Jmeter-Executor</xsl:attribute>
        <!-- required for JUnit xsd -->
        <xsl:attribute name="timestamp">
          <xsl:call-template name="millisecs-to-ISO">
            <!-- get the timestamp from the first test result, convert it from epoch to ISO8601 -->
            <xsl:with-param name="millisecs" select="*[1]/@ts"/>
          </xsl:call-template>
        </xsl:attribute>
        <!-- required for JUnit xsd - count of test results -->
        <xsl:attribute name="tests"><xsl:value-of select="count(*)"/></xsl:attribute>
        <!-- required for JUnit xsd - count of test failures -->
        <xsl:attribute name="failures"><xsl:value-of select="count(*[./assertionResult/failure[text() = 'true']])"/></xsl:attribute>
        <!-- required for JUnit xsd - count of test errors -->
        <xsl:attribute name="errors"><xsl:value-of select="count(*[./assertionResult/error[text() = 'true']])"/></xsl:attribute>
        <!-- required for JUnit xsd - time taken (in seconds) to execute all the tests -->
        <xsl:attribute name="time"><xsl:value-of select="sum(*/@t) div 1000"/></xsl:attribute>
        <properties></properties>
        <xsl:for-each select="*">
          <testcase>
            <xsl:attribute name="classname"><xsl:value-of select="concat(name(), '.', substring-before(concat(@tn,' '),' '))"/></xsl:attribute>
            <xsl:attribute name="component"><xsl:value-of select="@lb"/></xsl:attribute>
            <xsl:attribute name="name"><xsl:value-of select="@lb"/></xsl:attribute>
            <xsl:attribute name="id"><xsl:value-of select="concat(concat(@lb, '.', @tn), '.', @ng)"/></xsl:attribute>
            <xsl:attribute name="package"><xsl:value-of select="concat(@rm, '.HTTP/', @rc)"/></xsl:attribute>
            <xsl:attribute name="time"><xsl:value-of select="@t div 1000"/></xsl:attribute>
            <xsl:attribute name="system-out"><xsl:value-of select="concat('Name: ', @lb, ', Thread: ', @tn, ', Number of active threads: ', @na, ', Number of active thread groups: ', @ng, ', Return Code: ', @rc, ', Return Message: ', @rm)"/></xsl:attribute>
            <xsl:attribute name="timestamp">
              <xsl:call-template name="millisecs-to-ISO">
                <xsl:with-param name="millisecs" select="@ts"/>
              </xsl:call-template>
            </xsl:attribute>
            <xsl:attribute name="status"><xsl:value-of select="concat(@rm, '.HTTP/', @rc)"/></xsl:attribute>
            <xsl:if test="assertionResult/failureMessage">
              <failure>
                <!-- show only the first failure message/type (if multiple) as the JUnit schema only supports one failure node -->
                <xsl:attribute name="message"><xsl:value-of select="assertionResult[./failure = 'true']/failureMessage"/></xsl:attribute>
                <xsl:attribute name="type"><xsl:value-of select="assertionResult[./failure = 'true']/name"/></xsl:attribute>
              </failure>
            </xsl:if>
            <xsl:if test="@s = 'false'">
              <xsl:if test="responseData">
                <error><xsl:value-of select="responseData"/></error>
              </xsl:if>
              <failure>
                <xsl:attribute name="message"><xsl:value-of select="concat('Response Code: ', @rc, ', Response Message: ', @rm)"/></xsl:attribute>
                <xsl:attribute name="type"><xsl:value-of select="@rc"/></xsl:attribute>
              </failure>
            </xsl:if>
          </testcase>
        </xsl:for-each>
        <!-- required for JUnit xsd -->
        <system-out></system-out>
        <system-err></system-err>
      </testsuite>
    </testsuites>
  </xsl:template>
</xsl:stylesheet>
The XSL I use to upload JMeter tests by thread name (Option 2 above) is the following:
<?xml version="1.0" encoding="UTF-8"?>
<xsl:stylesheet xmlns:xsl="http://www.w3.org/1999/XSL/Transform" version="1.0">
  <xsl:output method="xml" indent="yes" encoding="UTF-8"/>

  <xsl:template name="millisecs-to-ISO">
    <xsl:param name="millisecs"/>
    <xsl:param name="JDN" select="floor($millisecs div 86400000) + 2440588"/>
    <xsl:param name="mSec" select="$millisecs mod 86400000"/>
    <xsl:param name="f" select="$JDN + 1401 + floor((floor((4 * $JDN + 274277) div 146097) * 3) div 4) - 38"/>
    <xsl:param name="e" select="4*$f + 3"/>
    <xsl:param name="g" select="floor(($e mod 1461) div 4)"/>
    <xsl:param name="h" select="5*$g + 2"/>
    <xsl:param name="d" select="floor(($h mod 153) div 5 ) + 1"/>
    <xsl:param name="m" select="(floor($h div 153) + 2) mod 12 + 1"/>
    <xsl:param name="y" select="floor($e div 1461) - 4716 + floor((14 - $m) div 12)"/>
    <xsl:param name="H" select="floor($mSec div 3600000)"/>
    <xsl:param name="M" select="floor($mSec mod 3600000 div 60000)"/>
    <xsl:param name="S" select="$mSec mod 60000 div 1000"/>
    <xsl:value-of select="concat($y, format-number($m, '-00'), format-number($d, '-00'))"/>
    <xsl:value-of select="concat(format-number($H, 'T00'), format-number($M, ':00'), format-number($S, ':00'))"/>
  </xsl:template>

  <!-- https://jmeter.apache.org/usermanual/listeners.html#attributes
       JMeter attribute meanings (if enabled in JMeter):
       by      Bytes
       sby     Sent Bytes
       de      Data encoding
       dt      Data type
       ec      Error count (0 or 1, unless multiple samples are aggregated)
       hn      Hostname where the sample was generated
       it      Idle Time = time not spent sampling (milliseconds) (generally 0)
       lb      Label
       lt      Latency = time to initial response (milliseconds) - not all samplers support this
       ct      Connect Time = time to establish the connection (milliseconds) - not all samplers support this
       na      Number of active threads for all thread groups
       ng      Number of active threads in this group
       rc      Response Code (e.g. 200)
       rm      Response Message (e.g. OK)
       s       Success flag (true/false)
       sc      Sample count (1, unless multiple samples are aggregated)
       t       Elapsed time (milliseconds)
       tn      Thread Name
       ts      timeStamp (milliseconds since midnight Jan 1, 1970 UTC)
       varname Value of the named variable -->

  <xsl:template match="/testResults">
    <testsuites>
      <testsuite>
        <!-- required for JUnit xsd - not available in the JMeter result -->
        <xsl:attribute name="id">1</xsl:attribute>
        <xsl:attribute name="name">web.protocol.http</xsl:attribute>
        <xsl:attribute name="package">http.protocol.listener</xsl:attribute>
        <xsl:attribute name="hostname">Jmeter-Executor</xsl:attribute>
        <!-- required for JUnit xsd -->
        <xsl:attribute name="timestamp">
          <xsl:call-template name="millisecs-to-ISO">
            <!-- get the timestamp from the first test result, convert it from epoch to ISO8601 -->
            <xsl:with-param name="millisecs" select="*[1]/@ts"/>
          </xsl:call-template>
        </xsl:attribute>
        <!-- required for JUnit xsd - count of test results -->
        <xsl:attribute name="tests"><xsl:value-of select="count(*)"/></xsl:attribute>
        <!-- required for JUnit xsd - count of test failures -->
        <xsl:attribute name="failures"><xsl:value-of select="count(*[./assertionResult/failure[text() = 'true']])"/></xsl:attribute>
        <!-- required for JUnit xsd - count of test errors -->
        <xsl:attribute name="errors"><xsl:value-of select="count(*[./assertionResult/error[text() = 'true']])"/></xsl:attribute>
        <!-- required for JUnit xsd - time taken (in seconds) to execute all the tests -->
        <xsl:attribute name="time"><xsl:value-of select="sum(*/@t) div 1000"/></xsl:attribute>
        <properties></properties>
        <xsl:for-each select="*">
          <testcase>
            <xsl:attribute name="classname"><xsl:value-of select="concat(name(), '.', @lb)"/></xsl:attribute>
            <xsl:attribute name="component"><xsl:value-of select="@lb"/></xsl:attribute>
            <xsl:attribute name="name"><xsl:value-of select="concat(@lb, '_', @tn)"/></xsl:attribute>
            <xsl:attribute name="id"><xsl:value-of select="concat(concat(@lb, '.', @tn), '.', @ng)"/></xsl:attribute>
            <xsl:attribute name="package"><xsl:value-of select="concat(@rm, '.HTTP/', @rc)"/></xsl:attribute>
            <xsl:attribute name="time"><xsl:value-of select="@t div 1000"/></xsl:attribute>
            <xsl:attribute name="system-out"><xsl:value-of select="concat('Name: ', @lb, ', Thread: ', @tn, ', Number of active threads: ', @na, ', Number of active thread groups: ', @ng, ', Return Code: ', @rc, ', Return Message: ', @rm)"/></xsl:attribute>
            <xsl:attribute name="timestamp">
              <xsl:call-template name="millisecs-to-ISO">
                <xsl:with-param name="millisecs" select="@ts"/>
              </xsl:call-template>
            </xsl:attribute>
            <xsl:attribute name="status"><xsl:value-of select="concat(@rm, '.HTTP/', @rc)"/></xsl:attribute>
            <xsl:if test="assertionResult/failureMessage">
              <failure>
                <!-- show only the first failure message/type (if multiple) as the JUnit schema only supports one failure node -->
                <xsl:attribute name="message"><xsl:value-of select="assertionResult[./failure = 'true']/failureMessage"/></xsl:attribute>
                <xsl:attribute name="type"><xsl:value-of select="assertionResult[./failure = 'true']/name"/></xsl:attribute>
              </failure>
            </xsl:if>
            <xsl:if test="@s = 'false'">
              <xsl:if test="responseData">
                <error><xsl:value-of select="responseData"/></error>
              </xsl:if>
              <failure>
                <xsl:attribute name="message"><xsl:value-of select="concat('Response Code: ', @rc, ', Response Message: ', @rm)"/></xsl:attribute>
                <xsl:attribute name="type"><xsl:value-of select="@rc"/></xsl:attribute>
              </failure>
            </xsl:if>
          </testcase>
        </xsl:for-each>
        <!-- required for JUnit xsd -->
        <system-out></system-out>
        <system-err></system-err>
      </testsuite>
    </testsuites>
  </xsl:template>
</xsl:stylesheet>
Based on the representation you prefer to have in ALM Octane, you need to adapt the XSL file and use it to transform the JMeter XML into the desired JUnit XML file. To do this, save the Python code below into a script file, for example convert_jmeter_to_junit.py (the script name used later in this article).
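The original script is not reproduced in this transcript. A minimal sketch that does the same job might look as follows; it assumes the third-party lxml package (pip install lxml), which implements XSLT 1.0 as used by the stylesheets in this article, and the file names are taken from the later sections:

```python
# convert_jmeter_to_junit.py -- minimal sketch, not the article's original script.
# Assumes the lxml package (XSLT 1.0 support).
import sys
from lxml import etree

def transform(result_xml, xsl_file, out_file):
    """Apply the XSL transformation to the JMeter result file and write JUnit XML."""
    source = etree.parse(result_xml)                 # JMeter result-tree.xml
    stylesheet = etree.XSLT(etree.parse(xsl_file))   # one of the XSL files above
    junit = stylesheet(source)
    with open(out_file, "wb") as f:
        f.write(etree.tostring(junit, pretty_print=True,
                               xml_declaration=True, encoding="UTF-8"))

if __name__ == "__main__":
    # e.g. python convert_jmeter_to_junit.py result-tree.xml jmeter-junit-tests-by-sampler.xsl junit.xml
    transform(sys.argv[1], sys.argv[2], sys.argv[3])
```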
This can be performed directly from the ALM Octane Pipelines module, or in Jenkins via Build Now. In ALM Octane you can see that the pipeline is being executed…
View the Results after Pipeline Run
Once the Jenkins job is finished, it will push all test results into ALM Octane.
The Overview tab shows a summary of the current build run. In addition, you can open previous build runs to see the complete history of this pipeline.
ALM Octane automatically classifies problematic tests by type, such as continuously failing, regression, unstable, etc. This helps to speed up failure analysis.
Using the Dashboard module of ALM Octane, you can configure your performance test reports on
progress, execution and coverage.
In case you want to drill down into a specific test run, ALM Octane saves a direct link to the Jenkins build to which the automated run is linked.
From here you will be redirected to the Jenkins Job run with all the performance trend reporting.
You can map all pushed JMeter tests to ALM Octane application modules. This will allow you to understand in which business areas of your application under test performance needs more attention.
Integrate JMeter tests into ALM Octane without any continuous integration server

Now, if you don't have a continuous integration server, this section explains how you can still integrate and push JMeter tests into ALM Octane. To integrate JMeter without a CI server, you will require the following:
• Python must be installed on the machine from where you want to push JMeter results to ALM Octane
(link is above in the Getting Started section).
• Download the test result collection tool for ALM Octane from GitHub (link is above in the Getting
Started section).
Configure the Test Results Collection Tool
Once you have downloaded the test results collection tool from GitHub, you need to create a config.properties file in the same folder from which the test-result-collection-tool.jar will run. Copy the following configuration text into a new text file, adjust it to your environment, and save it as config.properties.
# Server URL with protocol and port
server = http://octane-server:8080
# Server sharedspace ID
sharedspace = 1001
# Server workspace ID
workspace = 1002
# Server username
user = [email protected]
# Proxy host address
proxyhost = proxy.microfocus.com
# Proxy port number
proxyport = 8080
# Proxy username
proxyuser = test
Run JMeter tests to generate results for ALM Octane
Run your JMeter performance tests from the command line or directly in JMeter; in non-GUI mode this is, for example, jmeter -n -t test-plan.jmx -l result-tree.xml (make sure JMeter writes the results as XML, e.g. jmeter.save.saveservice.output_format=xml). Once the test is completed, transform the JMeter XML into a well-formed JUnit XML.
We will use the same approach as in the first scenario: run the convert_jmeter_to_junit.py script with Python, passing the JMeter result-tree.xml and the XSL file from the first scenario (jmeter-junit-tests-by-sampler.xsl or jmeter-junit-tests-by-sampler-threadname.xsl) to generate the JUnit XML file.
Simply use the same command line as before. By now, you should have the generated JUnit XML file ready to be pushed to ALM Octane.
Push the JUnit XML to ALM Octane with the Test Results Collection Tool

To execute the test results collection tool, make sure you have the prepared config.properties file ready. Open a command line interface and run the collection tool (java -jar test-result-collection-tool.jar) against the generated JUnit XML file; the exact command line options are described in the tool's documentation on GitHub.