The OpenLM User Interface web application has been designed to match our customers’ requirements, and has evolved over many iterations of production and feedback. Nevertheless, clients often want to present the license usage data accumulated in their OpenLM Database in a different, original way. To answer this need, OpenLM provides open APIs through a SOAP interface.
OpenLM exposes an integrated API implemented in SOAP. The API structure can be obtained by accessing the SOAP WSDL file from your browser:
The following is the service help page, which lists all supported methods:
- <hostname> stands for the host name of the OpenLM Server machine.
- The port number (7020 in the example above) is the default port for the SOAP proxy. It can be changed in the OpenLM Server configuration tool –> Port Settings tab.
- It is also possible to obtain the API description by entering the same URL in a WCF client tool such as STORM (see below). You can also use other tools such as “SoapUI” (https://www.soapui.org/) or “Postman” (https://www.getpostman.com/).
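Besides a browser or WCF client, the WSDL can also be retrieved programmatically. The sketch below only builds and fetches the WSDL URL; the host name is a placeholder for your own OpenLM Server machine, and the service path is a hypothetical example — use the actual path served by your installation:

```python
from urllib.parse import urlunsplit
from urllib.request import urlopen  # only needed to actually download the WSDL

# Placeholders -- substitute the values from your own OpenLM Server setup.
HOSTNAME = "openlm-server"   # <hostname> of the OpenLM Server machine
PORT = 7020                  # default SOAP proxy port (see Port Settings tab)
SERVICE_PATH = "/OpenLMAPI"  # hypothetical path; take the real one from your server

def wsdl_url(hostname: str, port: int, service_path: str) -> str:
    """Build the URL used to download the WSDL that describes the API."""
    return urlunsplit(("http", f"{hostname}:{port}", service_path, "wsdl", ""))

url = wsdl_url(HOSTNAME, PORT, SERVICE_PATH)
print(url)  # http://openlm-server:7020/OpenLMAPI?wsdl

# To actually download the WSDL (requires network access to the server):
# with urlopen(url) as resp:
#     wsdl_xml = resp.read().decode("utf-8")
```

The same URL string, pasted into STORM, SoapUI, or Postman, imports the full list of supported methods.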
Extracting the API data format
We find the following sequence of actions to be the most intuitive way to determine the exact message data required by each API:
- Open the OpenLM User Interface web application in Chrome.
- Press F12 and select the ‘Network’ tab.
- Click the ‘Record’ button and produce whatever report you need in the User Interface. You will see the exact format of the query you wish to reproduce through the API.
- Fill in the required data in your WCF testing tool (e.g. Storm).
- Please note that some tools fill in empty values within array-type variables, perhaps as ‘placeholders’. These will not comply with the expected message formats and must be removed before sending the request.
- It is not recommended to access the OpenLM Database tables directly, as OpenLM does not guarantee that the tables’ structure will be preserved across upcoming version releases.
- In any case, before developing a new type of report that is not found in the User Interface, please contact the OpenLM support team; it may already be implemented or currently on the drawing board.
- Assistance in the use of OpenLM APIs is outside the scope of normal support. If you require it, we suggest contacting our professional services team for a quote covering your implementation.
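The placeholder cleanup mentioned in the steps above can be done with a small pass over the request body before it is sent. This is a sketch, assuming the message is held as a Python dict; the field names (`servers`, `features`) are invented for illustration and are not actual OpenLM API fields:

```python
def strip_placeholders(payload):
    """Recursively drop the empty-string and None 'placeholder' entries that
    some WCF testing tools insert into array-type variables."""
    if isinstance(payload, dict):
        return {k: strip_placeholders(v) for k, v in payload.items()}
    if isinstance(payload, list):
        return [strip_placeholders(v) for v in payload if v not in ("", None)]
    return payload

# Hypothetical request body with tool-injected placeholders in its arrays.
request = {
    "servers": ["lm-server-1", "", None],
    "features": [{"name": "MATLAB", "tags": [""]}],
}
print(strip_placeholders(request))
# {'servers': ['lm-server-1'], 'features': [{'name': 'MATLAB', 'tags': []}]}
```

Running the cleaned payload through your testing tool avoids the format-compliance errors caused by empty array entries.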