OL Learn

Fast and Performant Return Parameter

Hello Community,

We are working on a solution that creates reminder files from an ERP system and archives them as PDF via Planet Press.

The source data comes from an IBM DB2 database: a COBOL business routine creates it, and a C# routine that connects to the DB2 extracts and pre-organizes it for Planet Press.

Planet Press fetches the relevant data via a single SQL request against a defined view on the DB2.

The first priority is that performance has to be top-notch.

The way the data is constructed in DB2 allows us only one SQL request per reminder series created by the ERP system.

So we need a return parameter for each finished SQL request: once the dedicated result set has been received correctly, Planet Press must send the state (success/fail) of that request to the C# routine at runtime, so that the routine can pre-organize the data for the next reminder series.

How can we (Planet Press developer / C# developer) realize this? I'm more on the code-developing side of this project, but my idea was to work with sockets, because they're fast and lightweight.

Any recommendations are welcome.

I do not know about sockets, but I suppose you run your query from the Database plugin in Workflow?
If so, you could set up a script right after it that looks at the returned data, decides whether it is a success or a failure, and sends a response from there.
In the script you can also set up a condition to decide what to do afterwards (continue on success or stop the process on failure).

Hi, yes, that was our first thought too, but I'm afraid it would decrease the performance of the workflow. Scripts slow down the process on our system. That's why we are thinking about the socket solution.

Isn't there built-in functionality for each step in the workflow to send a return parameter for each model used in the workflow?

I'm afraid scripting isn't a solution we can use.

Can't we use the built-in web server, and in this scenario use a WebSocket via the intranet?

Any other suggestions?

Am I getting it right that the intention is to report back the status of each record?

If so, I am guessing that ‘success’ happens only when the workflow process finishes successfully. For this, have a variable/job info holding the record ID and a status initially set to ‘fail’. When you reach the stage where the process can be deemed successful, change the status to ‘success’. Then pass that information to a separate ‘Update status’ process, where you are free to call your C# program to do the update. All errors can be handled by an on-error process - have a look at error.errormsg and error.errorid.

Keep in mind that the usual bottlenecks in Workflow are processing records one by one rather than a whole batch at a time, and not splitting processes, which prevents parallel runs and results in poor CPU utilisation.

ps. “Premature optimisation is the root of all evil.” In other words, build your workflow configuration first and see where the bottleneck actually is; otherwise you are likely to spend countless hours for negligible benefit.

The intention is (and has to be) to report back every successful receipt of one result set, which consists of several records (up to 5,000). These records per result set - let's call them DataBlocks - are in turn only parts of more data that Planet Press has to manage for document creation (reminder documents). So at peak, Planet Press has to receive several DataBlocks of up to 5,000 records each as one reminder series.

So we don't need a signal per workflow process, but one per received result set? The subsequent workflow processing can then be managed by Planet Press itself, FIFO-queue based?

So my question was whether every model - in this case the SQL statement model - offers the possibility to implement a return parameter that can be received by an external program?

I think I am missing something here. What do you mean by ‘model’?

‘Model’ means each step of the workflow; here it is the DataMapper plugin. An external C# program has to know when the DataMapper has received the dedicated result set. Only at that moment can the C# routine proceed with the next data extraction against the DB2.

I don’t think I fully understand your request, but from what I can gather you want to:

  • Execute the Data mapping (DM) task in Workflow
  • Once the DM task is complete, immediately notify a C# application

The problem is that you don’t explain how the C# application is supposed to receive the information. You mention sockets, but there is no task that can natively handle sockets in Workflow.

If your C# application can run a Web server, then you can use Workflow’s HTTP Client Input task immediately after the DM task. You would then point the task’s URL to the Web server run by the C# application and pass the Recordset ID as a parameter (e.g. http://localhost:1234?recordset=42).
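On the C# side, that web server can be very small. A minimal sketch using the .NET `HttpListener` class - the port (1234) and the `recordset` parameter name are assumptions taken from the example URL above:

```csharp
using System;
using System.Net;

class RecordsetListener
{
    static void Main()
    {
        // Listen on the port from the example URL (assumption: localhost:1234).
        var listener = new HttpListener();
        listener.Prefixes.Add("http://localhost:1234/");
        listener.Start();

        while (true)
        {
            // Blocks until Workflow's HTTP Client Input task calls us.
            HttpListenerContext context = listener.GetContext();

            // Read the Recordset ID passed as a query parameter, e.g. ?recordset=42.
            string recordsetId = context.Request.QueryString["recordset"];
            Console.WriteLine($"Result set {recordsetId} confirmed - next extraction can start.");

            // Acknowledge quickly so the Workflow task completes without delay.
            context.Response.StatusCode = 200;
            context.Response.Close();
        }
    }
}
```

Since the handler does nothing but record the ID and return 200, the round trip adds very little overhead to the Workflow process.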

If the C# app does not run a web server, then you could store the Recordset ID in Workflow’s Repository, and the C# app could periodically check whether a new ID is available in the Repository.

One other option is for the Workflow process to create an empty file (the name of the file would be the Recordset ID) in a specific folder. The C# application could then subscribe to Windows notifications for any change in that folder and immediately react to the creation of a new file.
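The file-based option maps directly onto .NET's `FileSystemWatcher`, which wraps those Windows change notifications. A sketch - the folder path is an assumption for illustration:

```csharp
using System;
using System.IO;

class DropFolderWatcher
{
    static void Main()
    {
        // Folder that the Workflow process writes the empty trigger files into
        // (path is an assumption for this sketch).
        var watcher = new FileSystemWatcher(@"C:\PlanetPress\triggers");

        watcher.Created += (sender, e) =>
        {
            // The file name itself carries the Recordset ID.
            string recordsetId = Path.GetFileNameWithoutExtension(e.Name);
            Console.WriteLine($"Result set {recordsetId} received - start next extraction.");
        };

        // Start receiving change notifications from Windows.
        watcher.EnableRaisingEvents = true;
        Console.WriteLine("Watching for trigger files. Press Enter to quit.");
        Console.ReadLine();
    }
}
```

This avoids polling entirely: the event fires as soon as Windows reports the file creation.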

Thanks so much, that was helpful. I will weigh these options against the idea of calling a socket client via a script that talks to a C# socket listener.
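For completeness, the C# side of that socket idea would be a plain `TcpListener`. The port number and the message format ("recordsetId;status") are assumptions for this sketch; the Workflow script would open a TCP connection and write that message:

```csharp
using System;
using System.Net;
using System.Net.Sockets;
using System.Text;

class SocketListener
{
    static void Main()
    {
        // Port is an assumption; the Workflow script would connect to it.
        var listener = new TcpListener(IPAddress.Loopback, 9100);
        listener.Start();

        while (true)
        {
            using (TcpClient client = listener.AcceptTcpClient())
            using (NetworkStream stream = client.GetStream())
            {
                // Expect a short message such as "42;success" from the Workflow script.
                var buffer = new byte[256];
                int read = stream.Read(buffer, 0, buffer.Length);
                string message = Encoding.UTF8.GetString(buffer, 0, read);
                Console.WriteLine($"Workflow reported: {message}");
            }
        }
    }
}
```

Note, though, that this still requires a script task in Workflow to act as the client, so it does not avoid the scripting overhead mentioned earlier.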

Since you have your own C# application, yet another option would be to forego Workflow entirely and directly talk to Connect Server using its REST API. There are endpoints to start, monitor and retrieve the result from data mapping operations, as well as all other operations (content creation, job creation, output creation). This way you would have complete control over the process, allowing you to customize and tweak it as you see fit.
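As a rough sketch of that approach from C#: submit a data mapping operation over HTTP and react once it is accepted. The endpoint path and payload below are placeholders, not the actual Connect REST API routes - consult the REST API reference for the real ones (the default Connect Server port is assumed here):

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;

class ConnectRestClient
{
    static async Task Main()
    {
        using (var http = new HttpClient())
        {
            // Assumption: Connect Server reachable on its default port.
            http.BaseAddress = new Uri("http://connect-server:9340/");

            // Placeholder route - look up the real data mapping endpoint
            // and its required payload in the Connect REST API documentation.
            HttpResponseMessage response =
                await http.PostAsync("rest/serverengine/workflow/datamining/...", null);

            if (response.IsSuccessStatusCode)
            {
                // The C# routine knows immediately that the operation was accepted
                // and can monitor it before starting the next extraction.
                Console.WriteLine("Data mapping operation submitted.");
            }
        }
    }
}
```

The key benefit over the Workflow-based options is that the C# routine drives the whole sequence itself, so no callback channel is needed at all.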

Great, we will discuss this. I didn't know that PP has a built-in REST API. This could be the fast and reliable solution I was searching for.