Stefan Cronje · Jan 22, 2023

For simplicity I am updating the table using the System Management Portal SQL.

I used auditing to get the PID, and logged the other one from within the trigger.
Both are the same value: 95437

The ^ERRORS global contains 3472 lines just for this. Do you want to see something specific in it?

There is no reference to %session in it.

Stefan Cronje · Jan 22, 2023

Thank you for clarifying. That makes sense.

It would have been nice to still have that information accessible somehow. If you add audit logs and other things based on triggers, you lose that traceability when the SMP is used, which creates a bit of a gap in the audit trail if you are keeping one based on trigger events. The Dynamic SQL audit event can be used, but you cannot link it directly to a record that has been updated or deleted if you rely on information that is only available in %session.

Can the InBackground behaviour be changed via a configuration setting?

Stefan Cronje · Jan 22, 2023

Thank you.

The tools used to run updates are not up to me. The end-customer has very strict security policies, so the only way to run SQL is via the SMP. Only that web port is open to most personnel, and you are working on a remote desktop as well.

Stefan Cronje · Jan 25, 2023

Will you please get the query plan for this without %PARALLEL first, so that we can see what it does internally?

From there we can determine full scans on tables, etc.

Also, as mentioned, which fields are indexed and the types of indices.

Stefan Cronje · Jan 25, 2023

As a starting point, you can try the following to force the IS NULL filter to be applied first - assuming there is an index on it:
FROM %FIRSTTABLE Records, SQLUser.Books Books
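
For a fuller picture, a sketch of the shape of such a query - the column names here are assumptions, so adjust them to your schema:

SELECT Records.ID, Books.Title
FROM %FIRSTTABLE Records, SQLUser.Books Books
WHERE Records.SomeField IS NULL
  AND Records.BookID = Books.ID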

Stefan Cronje · Jan 25, 2023

Do you want to check the file on the local disk or on a different machine?

What is the end goal of the BP? If the file did not transfer, what do you want it to do?

Stefan Cronje · Jan 25, 2023

So this code runs on Server A?

Server A handles the passthrough and then needs to check that the file landed on Server B, correct?

If that is the case, the solution is not going to be this simple. You will have to have another operation, called by the BP, to check whether the file is on Server B. In short (see the sketch after this list):

  • A custom operation that uses the FTP Outbound Adapter to FTP to Server B and pull a list of files.
  • The operation will need the filename, which should be provided by the Ensemble message from the BP.
  • The operation should then check whether the filename received in the Ensemble message is in the file list from Server B. You might be able to filter for the filename directly with FTP; I will have to confirm that.
  • The operation should then return a message to the BP that contains the "result" of the check. The BP can then act on that by creating an Alert Message and sending it to Ens.Alert.
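
A rough sketch of such an operation (the class and message names here are hypothetical, and the adapter's NameList() method should be confirmed against the class reference for your version):

Class MyPkg.FileCheckOperation Extends Ens.BusinessOperation
{

Parameter ADAPTER = "EnsLib.FTP.OutboundAdapter";

Parameter INVOCATION = "Queue";

/// Check whether the file named in the request is present on the remote server
Method CheckFile(pRequest As MyPkg.FileCheckRequest, Output pResponse As MyPkg.FileCheckResponse) As %Status
{
	set sc = $$$OK
	set pResponse = ##class(MyPkg.FileCheckResponse).%New()
	set pResponse.Found = 0
	// Pull the list of files in the configured remote directory
	set sc = ..Adapter.NameList(..Adapter.FilePath, .tFileList)
	quit:$$$ISERR(sc) sc
	// Look for the filename that the BP passed in the Ensemble message
	for i=1:1:tFileList.Count() {
		if (tFileList.GetAt(i) = pRequest.FileName) {
			set pResponse.Found = 1
			quit
		}
	}
	quit sc
}

XData MessageMap
{
<MapItems>
	<MapItem MessageType="MyPkg.FileCheckRequest"><Method>CheckFile</Method></MapItem>
</MapItems>
}

}

The BP can then branch on pResponse.Found and raise the alert when it is 0.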

I hope this helps. Let me know if I am misunderstanding the requirement.

Stefan Cronje · Jan 25, 2023

But, before you jump in and write a lot of code, check the "Stay Connected" setting on the Passthrough Operation.

I have found that most FTP servers do not like long-lived connections. I have solved many SFTP issues on business operations by setting "Stay Connected" to 0.

Stefan Cronje · Jan 25, 2023

With -1, the connection probably "dies" on the other server.

The Passthrough Operation does not know the connection died, so the FTP transfer is going to fail immediately. If E=R is set on the Reply Code Actions, it will retry; within the timeout period it will reconnect and transfer the file. But soon the next file will come, and the same will happen if the connection has died again.

I recommend changing that setting to 0 first and monitoring what happens.

UPDATED:
To ensure delivery:

  • Stay Connected: 0
  • Reply Code Actions: E=R
  • Failure Timeout: -1
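
For reference, in the production class definition that combination looks something like the below (the item name is made up, and the Setting Target attributes should be checked against your own production export):

<Item Name="FTP.FileOut" Category="" ClassName="EnsLib.FTP.PassthroughOperation" PoolSize="1" Enabled="true">
	<Setting Target="Adapter" Name="StayConnected">0</Setting>
	<Setting Target="Host" Name="ReplyCodeActions">E=R</Setting>
	<Setting Target="Host" Name="FailureTimeout">-1</Setting>
</Item>
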
Stefan Cronje · Jan 26, 2023

Hi there,

Are you using VSCode?

If so, you can convert the EOL for new files you create and ones you edit. In VSCode you can use LF on Windows too without issues.

Otherwise, after exporting the classes, do the following in the terminal:

// Open the exported file that has CRLF line endings
Set tOldFile = ##class(%Stream.FileCharacter).%New()
Write tOldFile.LinkToFile("C:\whereever\code-with-crlf.xml")
// Create the target file, which will get LF-only line endings
Set tNewFile = ##class(%Stream.FileCharacter).%New()
Write tNewFile.LinkToFile("C:\whereever\code-with-lf.xml")
Do tOldFile.Rewind()
While ('tOldFile.AtEnd) {
    // Strip any CR characters and terminate each line with LF only
    Set tTempStr = tOldFile.ReadLine()
    Do tNewFile.Write($ZSTRIP(tTempStr, "*", $CHAR(13)) _ $CHAR(10))
}
Write tNewFile.%Save()
Do tOldFile.%Close()
Do tNewFile.%Close()

Then import that file and see if this solution broke your code. :)

Stefan Cronje · Jan 31, 2023

Maybe have a look at %Library.FunctionalIndex, and at the "Defining Indexes" section of the documentation for BuildValueArray().
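
For the BuildValueArray() route, a minimal sketch of the pattern (the class, property, and delimiter are made up for illustration) - a classmethod named <Property>BuildValueArray controls which values get indexed:

Class MyPkg.Person Extends %Persistent
{

/// Semicolon-delimited list of phone numbers
Property Phones As %String;

/// Called when the index is built or maintained: split the delimited
/// string so that each phone number is indexed on its own.
ClassMethod PhonesBuildValueArray(value As %String, ByRef array) As %Status
{
	for i=1:1:$LENGTH(value, ";") {
		set array(i) = $PIECE(value, ";", i)
	}
	quit $$$OK
}

/// Index the individual values produced by PhonesBuildValueArray()
Index PhoneIdx On Phones(ELEMENTS);

}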

Stefan Cronje · Feb 3, 2023

This looks promising.
Struggling with docker-compose, though. Running it on Ubuntu, and docker-compose does not use BuildKit, or so it says, so the --mount option is not working.
I am trying to find a way around it to check this out.

Stefan Cronje · Feb 3, 2023

Thanks.

I have found the solution. This is for everyone who uses Ubuntu and needs to use docker-compose with BuildKit.
Do not use docker-compose up -d, but rather docker compose up -d. In other words, do not use the standalone docker-compose binary; use the Compose plugin of Docker.

See the link below for information on what to do:

Install the Compose plugin | Docker Documentation

Stefan Cronje · Feb 8, 2023

Which Business Service class are you using?

How is the production receiving the message?

Usually for REST services on a production, I do the following:

  • Create a "dispatch" class with the basics, which extends %CSP.REST.
  • This REST class will receive the messages, then invoke a business service in the production.
  • You set up a CSP application that uses this class as the dispatcher.
    • Set it as authenticated, which will use basic authentication.

Something like the code below. This is in one class, but you can split it into two. You then use this class as the dispatch class on the CSP application:

Class MyPackage.RESTService Extends (%CSP.REST, Ens.BusinessService)
{

Parameter UseSession = 0;

/// Name of Ensemble Business Service. Set in implementation
Parameter BUSINESSSERVICENAME = "Production Clever Methods";

XData UrlMap [ XMLNamespace = "http://www.intersystems.com/urlmap" ]
{
<Routes>
	<Route Url="/somemethod" Method="POST" Call="HandleSomeFancyMethod"/>
</Routes>
}



ClassMethod HandleSomeFancyMethod() As %Status
{
	set sc = $$$OK
	try {
		// Check Content Type
		if (%request.ContentType '= "application/json") {
			// Throw some error here or respond
		}
		set %response.ContentType = "application/json"

		// Check Data Received
		if (%request.Content = "") {
			// Empty data error or "bad request or something"
		}

		// Parse the input into a Dynamic Object or whatever and validate as you'd like. You can also just send the stream to the service as indicated further down

		// Create a business service
		set sc = ##class(Ens.Director).CreateBusinessService(..#BUSINESSSERVICENAME, .tService)
		if $$$ISERR(sc) {
			// throw some error
		}

		// Create input for Service
		set tEnsRequest = ##class(Ens.StreamContainer).%New()
		// Use the request body stream as the input for the service
		set tInput = %request.Content
		do tInput.Rewind()
		set sc = tEnsRequest.StreamSet(tInput)
		if $$$ISERR(sc) {
			// throw some error
		}

		Set tAttrs=##class(%ArrayOfDataTypes).%New()
		do tAttrs.SetAt(%response.ContentType,"ContentType")
		do tEnsRequest.SetAttributes(.tAttrs)

		// Process the input
		set sc = tService.ProcessInput(tEnsRequest, .tEnsOutput)
		// handle the sc however you see fit
		set sc = tEnsOutput.StreamGet().OutputToDevice()
		// handle the sc however you see fit

	} catch tEx {
		// error 500
	}

	quit sc
}

Method OnProcessInput(pInput As Ens.StreamContainer, Output pOutput As Ens.StreamContainer) As %Status
{
	set sc = $$$OK
	try {
		// do whatever you want to do
		// You can send to other business hosts and so forth


		// Set the response object into the stream
		set tStream = ##class(%Stream.GlobalCharacter).%New()
		// tDynamicObj in this case is the response object
		set sc = tStream.Write(tDynamicObj.%ToJSON())
		set sc = pOutput.StreamSet(tStream)
	} catch ex {
		set sc = ex.AsStatus()
	}

	quit sc
}

}
Stefan Cronje · Feb 9, 2023

Using the proposed approach, there will not be a port listening for REST messages on the Production.

All web service requests will have to go through the CSP gateway.
If there are other Business Operations utilising that REST service, they will have to connect via the web server, and you will have to set up credentials for them to use.

Also, as suggested by @Dmitry Maslennikov , you can split the services into smaller services if control is required at that level.

Then, for local services consumed by the same or other productions on the same server, you can use EnsLib.REST.Service, and control access to the port on the OS firewall to allow traffic from localhost only.

Also have a look at the class documentation.
You can disable the local port, so that the production will not listen on it, and enable the service to go via CSP instead. You then need to set the CSP application authentication options.
From the class documentation:

property EnableStandardRequests as %Boolean [ InitialExpression = 1 ];

Listen via the CSP WebServer in addition to listening on the HTTP.InboundAdapter's custom local port, if the Adapter is defined.

Note that SSLConfig only applies to the custom local port. To use SSL via the CSP WebServer, you must configure the WebServer separately.

If the Service is invoked via the CSP WebServer, the ?CfgItem= URL parameter may be used to distinguish between multiple configured same-class Services but the standard csp/namespace/classname URL must be used.

Stefan Cronje · Feb 9, 2023

Good question. I have had the requirement to do this a few times already, but I create a BPL instance every time.

This becomes an effort when exposing services for different API versions, where you build your BPL and logic to be compatible with the latest version. All you need to do is transform the old version to the new one, but you want that to happen somewhere "between" the service and the BPL.

The same goes for message versioning on Business Operations. Some systems you work with use different versions of the same API.

I have started adding APIVersion properties to the services and operations, but then I still need to write a lot of code to cater for each version of the message, or an additional BPL for every transformation.
Also, I need a Lookup to map versions to Business Host names.

It would be great if the following was possible:

  • Ens.Request and Ens.Response had a MessageVersion parameter or something similar.
  • The Business Host has a MessageVersion property as a SETTING.
  • The transformation could be handled by the Business Host class in some way: get the MessageVersion value of the message and check the MessageVersion of the TargetHost; if they differ, invoke a transformation class to transform the message. This class will have to have some rules in it to convert between the different versions of the messages.
  • On the Response message the same principle should apply. Some operations work on older versions of an API.
  • There will of course have to be a transformation class configuration of some sort per business host for the request and responses.

There may even be much simpler solutions than this, but I agree with the question asked.

Stefan Cronje · Feb 10, 2023

An installer manifest would be helpful.

Also, maybe clearer instructions on the installation? For example: how to clone the git repo, what to import, and the compile order, if any.

The root package name being "Demo" does not identify the author/provider. My boss would not want to see something with the name Demo in the Live environment.

Stefan Cronje · Feb 10, 2023

Thank you for the information and the proposal to have a brainstorming session as a community. 

Stefan Cronje · Feb 10, 2023

Great idea, this.

May I suggest one change: use a different tag for the unit test code.

The <example> tag is used to display nicely formatted code in Documatic for a developer - like a one-liner on how to use something, or just a block of code for context.

Now if you put in an example, it is going to mix with the unit tests.

Stefan Cronje · Feb 10, 2023

I understand completely. As a side comment, which is a bit off topic: I like what you've done and think I will contribute to it when time allows. I created the old Dynamic Object Adapter package for Ensemble. There are things in there we can add to the OpenAPI suite if needed, for example the reverse of API-first - generating a Swagger definition from existing class definitions.

Stefan Cronje · Feb 10, 2023

Thank you for your response on this. I see what you are saying regarding the different problems being solved.

This is basically the agenda I am pushing - let's talk about packages: what should and should not be packaged together, what we need, etc. BUT without adding "red tape" that will demotivate community members from contributing.

I seem to be giving a lot of 2c today. ;)

Stefan Cronje · Feb 10, 2023

This is a great tool.

I am wondering if it will work for everyone. In the world of finance, you do not get SSH access to servers.
Most of the time the super-server port is also closed off for everything except the web gateway.

If the web version can be run on it, that is great - but in a banking environment, not everyone is on the "containerised" buzz yet, so this will not be allowed.

Sure, I can probably install and configure the package and set up the web application.

Now there are two things left I want to raise:

  1. Multi-line SQL, without having SSH access (we also do not have SCP or SFTP access).
    1. If this is present and I have missed it, I apologise.
  2. Database transactions.
    1. I have an SQL shell I built a long time ago, which worked with DB transactions.
    2. When doing DML, you may want to verify the results before committing them to the DB, and have the option to roll back.
      1. It would be really great if the app could handle this.
Stefan Cronje · Feb 11, 2023

I updated it like that last week. Apparently, I did not send it for approval, which I thought I did.

Stefan Cronje · Feb 11, 2023

Thank you for the clarification.

If rollback and commit are supported, then verifying the results is just a matter of doing a SELECT before committing, to check that the update/insert was correct and as expected.
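
In SQL terms, something like this (the table and values are made up):

START TRANSACTION;
UPDATE SQLUser.Books SET Price = 10 WHERE ISBN = '123';
-- verify the change before making it permanent
SELECT ISBN, Price FROM SQLUser.Books WHERE ISBN = '123';
-- then either
COMMIT;
-- or, if it is not what you expected
ROLLBACK;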

Nothing special to it or automated in any way.

This is great. Thank you.

Stefan Cronje · Feb 12, 2023

For a start:

select distinct by (VISIT_SERIAL_NO,HQ_ORG_CODE) VISIT_SERIAL_NO,HQ_ORG_CODE
can be changed to
select distinct VISIT_SERIAL_NO,HQ_ORG_CODE

It will do the same.

Secondly:
Will you please remove the %PARALLEL and click on "Show Plan", then post that plan here? It will help to determine where the query is slow. It might be using the wrong index - there are many.

Lastly:
Have you tuned the table and checked the result after that?
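
If not, it can be run from the SQL window (the table name here is assumed):

TUNE TABLE SQLUser.TheTable

Then re-check the plan and the timing afterwards.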

Stefan Cronje · Feb 12, 2023

If you want to use the GROUP BY, then you should probably do the count where the GROUP BY is being done, and use DISTINCT BY as you had it:

select distinct by(serial,hq) hq, count(1)
from thetable
group by hq

If you want it per that grouping.

There are no filters in it, so it is going to do a full table scan and compare and calculate values for each row. Considering that, the amount of time it takes is actually fast.

Maybe look into bitslice indexes. They might help, but at the cost of performance on insert and update:
InterSystems Documentation
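
For reference, a bitslice index is declared on a numeric property like this (the index and property names here are assumed):

Index SomeFieldBSIdx On SomeNumericField [ Type = bitslice ];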