Create and run automated unit, system and integration tests for SoftwareAG's webMethods™ IntegrationServer with minimal effort. IwTest is a key success factor for making Continuous Integration work in your webMethods™ projects.
With IwTest's unique record & generate facility, the time it takes you to run a service is the time you need to create a test case. Literally. There is no excuse anymore for not having a set of automated tests.
IwTest is also indispensable in any upgrade project when you don't have a set of reliable regression tests. Let IwTest record service invocations in your QA environment and build a regression test suite overnight! You'll save an incredible amount of time, effort and tedious work.
IwTest's unique feature is the ability to record service invocations and use the recorded inputs and outputs to generate test cases. Never was creating test cases easier. No need to manually save inputs and outputs.
Creating a test suite is as simple as running the service you want to test: IwTest records the invocation and generates the test case from it.
Recording not only works on the IntegrationServer you're developing on. You can also record services in an environment with more realistic data, e.g. your User Acceptance Test environment, and download the recorded data for test suite generation.
With the same minimal effort you can also include mocks (also known as stubs). Just tell IwTest to additionally record the inputs and outputs of sub-services that access external systems, such as adapter services or web connectors. Exceptions are recorded as well. IwTest will include them automatically when generating a test suite.
Of course, if your service produces varying outputs, e.g. UUIDs, dates or timestamps, then you need to post-edit the expected data and change some fields in the expected results into regular expressions.
Things get complicated when one of the services that you want to mock is executed multiple times during a single invocation of the top-level service. Of course you don't want it to return the same result every time, and it's cumbersome to write an actual mock service as a replacement for the real one.
IwTest tackles this scenario with dynamic mocks. IwTest records the different input/output combinations of the sub-service. During test case execution, IwTest replaces the calls to the external service with a dynamic stub: the actual input is matched against the recorded inputs and the corresponding output (or exception!) is returned.
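Conceptually, a dynamic mock behaves like a lookup table of recorded invocations. The sketch below illustrates the idea in Python; it is not IwTest's actual implementation, and the service data and names are invented for the example:

```python
# Simplified sketch of a dynamic mock: recorded input/output pairs of a
# sub-service are replayed by matching the actual input against them.
# (Illustration only -- not IwTest's real API; all names are invented.)

recordings = [
    ({"orderId": "1001"}, {"status": "SHIPPED"}),
    ({"orderId": "1002"}, {"status": "PENDING"}),
    ({"orderId": "9999"}, RuntimeError("order not found")),  # a recorded exception
]

def dynamic_stub(pipeline):
    """Stands in for the real external-system call during a test run."""
    for recorded_input, recorded_output in recordings:
        if recorded_input == pipeline:        # match actual input against a recording
            if isinstance(recorded_output, Exception):
                raise recorded_output         # recorded exceptions are replayed too
            return recorded_output
    raise AssertionError("no recording matches input: %r" % pipeline)

print(dynamic_stub({"orderId": "1001"}))  # {'status': 'SHIPPED'}
```

Because each distinct input maps to its own recorded output, the stub can be called multiple times within one test case and still return different, realistic results.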
IwTest supports pub/sub scenarios via so-called callbacks. In your test case you tell IwTest to capture the pipeline that's present at the beginning or end of a service invocation and run assertions against that data. This service may even be located on another IntegrationServer!
IwTest doesn't care how the (asynchronous) execution of that service was triggered. It may have been a Broker message, JMS message, MQTT event, Web Service call etc.
By the way, you're not limited to just one callback: you can define as many in a test case as you like.
Of course, on the remote IntegrationServer you can again define stubs to prevent the asynchronously executed service from sending data to an external system.
Execute your test cases from your Continuous Integration tool (e.g. Jenkins or CruiseControl) with a single command. With another single command, download the results in JUnit format.
Jenkins has an excellent JUnit plugin that keeps a history of the various test runs, highlights the differences between them, produces nice graphs and notifies developers when a build fails or becomes unstable.
You can download and evaluate IwTest for free. For the first 90 days it's fully functional; only the number of test cases that you can execute is limited to 50. After the evaluation period, the recording and test suite generation functions are disabled.
You can either purchase a permanent license or take out a subscription. The permanent license doesn't expire, but is only valid for one major version. With a subscription, on the other hand, you're entitled to all upgrades.
| | Evaluation | Permanent license | Subscription |
| --- | --- | --- | --- |
| # Test cases | 50 | Unlimited | Unlimited |
| Bug fixes | 90 days | | |
| Minor upgrades | 90 days | | |
| Major upgrades | 90 days | | |
You can request a quote by sending an email to
IwTest consists of two regular IS packages: IwTest & IwTestAgent.
| Version | Date | Download | MD5 checksum |
| --- | --- | --- | --- |
| 0.9.4 | 10 July 2017 | IwTest v0.9.4 | 158e5f9a64bb79c21fe3afff1e91978f |
The latest version of each major release is always downloadable from this website. If you purchased a permanent license and would like to upgrade to the newest major version, then ask for a quote by sending an email to
You can send bug reports and feature requests to
The following features are planned for upcoming releases:
WmTestSuite does not provide a recording and test suite generation facility, nor does it support asynchronous scenarios.
CATE provides similar and more advanced functionality, but it is divided over two different products (CATE Developer Edition (DE) and CATE Enterprise Edition (EE)) and comes at a much higher license fee.
In IwTest a test suite is just another flow service.
IwTest does not support (JMS) messaging directly. However, IwTest does support asynchronous (pub/sub) scenarios.
In your test case you can wait for a service to execute asynchronously, either on the same IS, or on a remote IS, and validate its inputs, outputs or even exception.
If there is no subscribing service (yet), then you could also create a separate trigger in your test package and subscribe to the message that you want to capture.
Note that you can extend your test case with multiple asynchronous events.
Test cases come at a cost. First there is the cost to create them (with IwTest, that cost is relatively small, but still) and later there is the cost to update them as you apply fixes or enhance the functionality.
As a first approach, focus on your transformation services (of course you've separated your transformation logic from the transport logic, right?).
The best time is when initial development is done and your code is in functional acceptance testing, but before it's released to production.
Later you can shift your focus to end-to-end scenarios and - time and budget permitting - to utility services.
If you have your test data available, then with IwTest's record & generate facility it's literally a matter of minutes, even if you want to stub services.
If on the other hand you need to prepare your test data and vary it in order to test different mapping variations or to evoke different responses from the target system, then it will take as long as it takes you to prepare your data.
Yes, but IwTest cannot link asynchronous events to one test case. The data will be recorded, but in pub/sub scenarios you'll have to manually extend a test case with a so-called 'callback' to an asynchronous event.
No, IwTest doesn't produce excessive amounts of data unless you tell it to. There is a parameter that controls the minimum interval between two service invocation recordings. The default value is 2 seconds.
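The throttling idea can be sketched as follows. This is a hypothetical illustration of the behaviour described above, not IwTest's internal code; the names and the 2-second default are taken from the text:

```python
import time

MIN_INTERVAL_SECONDS = 2.0   # default from the text: minimum gap between recordings
_last_recorded = {}          # service name -> time of its most recent recording

def should_record(service_name, now=None):
    """Record an invocation only if the previous recording of this
    service is at least MIN_INTERVAL_SECONDS old."""
    now = time.monotonic() if now is None else now
    last = _last_recorded.get(service_name)
    if last is not None and now - last < MIN_INTERVAL_SECONDS:
        return False             # too soon: skip this invocation
    _last_recorded[service_name] = now
    return True                  # record it and remember the time
```

With this scheme, a service invoked hundreds of times per second yields at most one recording every two seconds, which keeps the recorded data volume small.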
Yes! Just specify the services IwTest should collect data for.
Warning! IntegrationWise does not recommend using IwTest in a production environment. Do this only in a QA environment.
Obviously there is a performance penalty when recording data that flows through your integration platform. And although IwTest has been tested extensively, IntegrationWise cannot assume responsibility for any negative effects the use of IwTest might have.
IwTest supports the same data types the IDataXMLCoder class supports, so that's everything you can do a 'Save Pipeline to File' for. This means IwTest does not support an XML 'node'. If the service you want to test takes a node as an input, simply create a wrapper flow service that takes an XML string as an input, call 'pub.xml:xmlStringToXMLNode' and then invoke your flow service with the 'node'.
Most probably your service produces different results each time it's executed. There might be for example a timestamp in the outputs, or an auto-generated id. This is something IwTest cannot detect.
The solution is to use a regular expression in the expected results. This also works for fields that are numbers or dates. For example, replace a concrete timestamp in the expected results with a pattern that matches any timestamp.
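To illustrate the idea (field values and patterns are invented for this example; IwTest's own syntax for marking a field as a pattern may differ), an assertion can treat the expected value as a regular expression instead of a literal:

```python
import re

def assert_field(expected, actual):
    """Compare an actual pipeline value against an expected value that is
    interpreted as a regular expression (full-match semantics)."""
    assert re.fullmatch(expected, actual), f"{actual!r} does not match {expected!r}"

# A literal timestamp in the recorded output...
recorded = "2017-07-10T14:23:05"
# ...is replaced in the expected results by a pattern:
expected = r"\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}"

assert_field(expected, recorded)   # passes for any timestamp in this format
assert_field(r"[0-9a-f-]{36}", "f47ac10b-58cc-4372-a567-0e02b2c3d479")  # a UUID
```

The same trick works for auto-generated ids or sequence numbers: keep the stable parts of the expected value literal and replace only the varying part with a pattern.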
Because CI tools can take care of this. Jenkins, for example, has a marvelous JUnit plugin. It compares the test results of the current run with those of the previous runs and presents the differences nicely. In addition, it draws nice graphs of the history of the test runs. There is nothing IwTest could add here.
No, use tools like JMeter for that. Nevertheless, IwTest can be of great help here. Before you run your performance tests, you can tell IwTest to create stubs for those services that interact with an external system. By isolating your integration platform like this, you can judge much better how well it performs. Usually the external systems are the performance bottlenecks, not the integration platform.
No, IwTest does not require wM DevOps Edition. However, you do need access to the file system of the IntegrationServer if you have to post-edit assertions because your service produces time-dependent results.
IwTest is a product of IntegrationWise B.V., a Netherlands-based IT consultancy that specialises in integration projects using SoftwareAG's webMethods™ Digital Business Platform.