Mike.W · Apr 29, 2016

There may be arguments in favour of either solution, depending on the types of data involved and programmer preference, but if you are to embrace the full Ensemble "model" then I think the second option is far better.

By putting the non-HL7 data into a message sent to a business process, it gets stored and becomes visible in Ensemble in its raw form (or as close as you can make it) on the message queue into that Process. This makes support much easier, as you can see before and after messages in the Ensemble GUI. Also, a Business Service should do a minimum of work so that messages are input as fast as possible.

Using DTL is also the cleanest option, since it is meant for transforming messages, but I admit that sometimes it is more effort than it is worth. I have had to deal with complex XML documents, and ended up writing methods in my custom message class to make the extraction easier to understand in the DTL.

Mike

Mike.W · May 17, 2016

Yep, that's the first thing I noticed as well. I often go from one search to another and do not want to step back to the start each time. Search is probably the most important facility for me. (Thanks to InterSystems for opening it up for comment, by the way.)

Mike.W · May 17, 2016

Great idea. (I find SharePoint frustrating. All I want is a simple link to a folder in a library, but it's really hard to get.)

Mike.W · Jul 6, 2016

Hi Andrew,

Thanks for posting this. I am in the process of coding a similar automated statistics email, so I am reading your code with interest and may well use some of it. My only comment is that since this is for monitoring Ensemble, I actually decided to use Ensemble to implement it. So I have a message class with properties like "Subject" etc. and this is sent to an Operation that uses the Ensemble email adapter to send it. There is also a Service that runs once a day to pull the required information and builds the email message.
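In case it's useful, here's a minimal sketch of that shape (all class and config names are illustrative, not from a real production):

Class Demo.StatsEmailRequest Extends Ens.Request
{
Property Subject As %String;
Property Body As %String(MAXLEN = "");
}

Class Demo.StatsEmailOperation Extends Ens.BusinessOperation
{
Parameter ADAPTER = "EnsLib.EMail.OutboundAdapter";

Method OnMessage(pRequest As Demo.StatsEmailRequest, Output pResponse As Ens.Response) As %Status
{
 ; Recipients, SMTP server, etc. come from the adapter settings in the Production
 Set mail = ##class(%Net.MailMessage).%New()
 Set mail.Subject = pRequest.Subject
 Do mail.TextData.Write(pRequest.Body)
 Quit ..Adapter.SendMail(mail)
}
}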

Mike

Mike.W · Jul 7, 2016

I agree, the Production class is a major problem for Configuration Management. In the past we've tried System Defaults and found them very awkward to use. In the last project we started with separate classes for dev, test and live. The updates to the test and live versions were done in a release preparation namespace, with comparisons done to ensure they were in step, as mentioned earlier. Then a release was built from that and installed in the test slot and later in the live slot.

But it all got harder and harder to handle, and when the system went fully live we stopped doing full releases and went over to releasing only changed classes, etc., never releasing the production classes. Now the problem is that changes to the production have to be done manually via the front end. Fortunately, there are a lot fewer of these now.

Regards,

Mike

Mike.W · Oct 4, 2016

A lot of the code we have to deal with is like the "One Line" method, but some of it was written back in the days when there was very little space in the partition for the program and the source code was interpreted directly, so it had to be kept small. What is different is that while the code to handle the traverse is all kept on one line, which I think makes the scanning method clear, other code for pulling and writing stuff usually goes on lower "dotted" lines, like this:

S Sub1=""FS Sub1=$O(^Trans(Sub1)) Q:Sub1=""  D
.W !,"Sub1: ",Sub1
.S Sub2=""F  S Sub2=$O(^Trans(Sub1,Sub2)) Q:Sub2=""  D
.. ; etc

And yes, it uses the short version of commands (as you did for the Quit, I notice). I'm not saying this is the best style to use, obviously, as things have moved on, but we do still tend to use it when amending the old code.

I think it is useful to be aware of all these variations, as you may come across old code using them. Also, sometimes the array structure can be better or faster than objects and SQL, whether in global arrays or in temporary local arrays. It's a very neat and flexible storage system.

Mike

Mike.W · Jul 26, 2017

I had a similar situation and ended up with an Ensemble Service reading in the metadata file (like your XML), and composing an Ensemble message with that information, including a file reference for the data file (your PDF). This meant that the metadata file could be automatically archived by Ensemble, but now I had to archive the data file instead, using calls to the OS like you have done for your XML file above.
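For what it's worth, the message class was little more than a property bag like this (names invented):

Class Demo.DocMetaRequest Extends Ens.Request
{
/// Metadata pulled from the incoming file
Property DocumentType As %String;
Property PatientId As %String;
/// Reference to the data file (the PDF) left on disk
Property DataFilePath As %String(MAXLEN = 500);
}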

In my case this did make some sense, as I wanted to convert the data file using an OS call to an "exe", and at least the messages in Ensemble had all the meta information, file name, etc. But I also think it was a bit clumsy, so I would be interested in any better ideas.

Regards,

Mike

Mike.W · Aug 8, 2017

Many guides to "good programming" (in any language) would advise that the return from a function/method should be used for "real" data only, and any "exception" situations should be flagged as an error. While I'm not convinced this is always the best way, I can see the advantages. Code with repeated tests of returned status values can be messy and hard to read, and if the only thing it can do when the status is a fail is to quit out again with a status of "failed", then there is not a lot to be gained.
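To illustrate the kind of repetition I mean (a sketch, not real code):

 Set tSC = order.Validate()
 If $$$ISERR(tSC) Quit tSC
 Set tSC = order.Save()
 If $$$ISERR(tSC) Quit tSC
 ; ...and so on - every call wrapped in the same test, and all the
 ; caller can do on failure is pass the status straight back up
 Quit tSC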

Mike

Mike.W · Nov 1, 2017

Amir's answer with option 2 is what we did. The XML had to be Base64-encoded to allow it to be sent, so our code looked a bit like this:

Method ImportEpisode(pRequest As EnsLib.EDI.XML.Document, Output pResponse As Ens.Response) As %Status
{
 ; Use 8-bit format regardless of the Cache default (else Base64Encode gives an ILLEGAL VALUE error)
 Set sendingXML = pRequest.OutputToString("C(utf-8)",.tSC)
 If $$$ISERR(tSC) Quit tSC
 $$$TRACE("Sending: "_sendingXML)
 Set sending = $system.Encryption.Base64Encode(sendingXML)
 Set tSC = ..Adapter.InvokeMethod("ImportEpisode",.result,sending,{plus some other id parameters})
 If $$$ISERR(tSC) Quit tSC
 Set resultXML = $system.Encryption.Base64Decode(result)

 ...etc.

I hope this is useful to you.

Mike

Mike.W · Dec 8, 2017

I'm just thinking that maybe we need more information as to why this is needed before recommending anything. If the network connection is good enough for mirroring, then why not just map the classes to a central repository? Perhaps all that is needed is security settings to prevent updates from the "slave" systems. Or perhaps there is no network connection, in which case mirroring or shadowing is not possible, and what is needed is a good way to automate export/import to OS files.

Mike

Mike.W · Dec 8, 2017

I have used the %XML.Writer class to create a document, but only a fairly simple one that was destined for a SOAP outbound call. The SDA is tricky, so I would imagine using HealthShare (Ensemble) would be much easier. (I have used Ensemble classes like EnsLib.EDI.XML.Document that can be added to a message, etc. and used as the target for transformations once you have an appropriate document definition loaded in. This reduces the coding required, though not entirely, as repeating groups are an issue.)
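For anyone curious, basic %XML.Writer usage is along these lines (element names invented):

 Set writer = ##class(%XML.Writer).%New()
 Set writer.Indent = 1
 Set stream = ##class(%Stream.GlobalCharacter).%New()
 Set tSC = writer.OutputToStream(stream)  ; or OutputToFile()/OutputToDevice()
 Set tSC = writer.RootElement("Order")
 Set tSC = writer.WriteElement("Id","12345")
 Set tSC = writer.EndRootElement()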

Mike

Mike.W · Dec 8, 2017

I won't embarrass myself by listing the MUMPS code from 1991 that does this in our application, but I will comment that you need to work out how many birthdays have gone by, so it must compare month and day values once the basic year subtraction has been done. It gets quite complicated. (You might also like to look at whether you need more than just "years old", such as months or days for very low values.)
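One trick that sidesteps most of the month/day comparisons, assuming $HOROLOG-style dates (a sketch in keeping with the vintage):

AgeYears(dob,today) ; both dates in +$HOROLOG form
 ; Convert to YYYYMMDD integers; the difference divided by 10000 (integer
 ; division) gives completed years, because the MMDD digits only "borrow"
 ; from the year part when the birthday has not yet been reached.
 Q ($ZDATE(today,8)-$ZDATE(dob,8))\10000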

Mike

Mike.W · Jan 4, 2018

Hi. We had a site upgrade from 2012.2.5 to 2017.1.0 last year, and it included a mirror. We had very few code changes needed - just an issue with failing to save objects inside a Business Process where we had used some "unusual" structures. The upgrade itself went smoothly. The only issue was afterwards, when the next backup was a "full" one instead of the scheduled "partial", using more space than expected. Our Production was much smaller than yours, with only about 120 items, and it is hard to say how much effort went into pre-release testing as it was "fitted in" around other work by a team of people. Maybe a couple of man-months?

To be honest, it all depends on how much custom or unusual code you have, and how much testing the customer wants. We upgraded a development namespace and re-ran test messages through all the important paths, comparing the results before and after upgrade. Plus some connection testing to cover all the "types" we used: FTP, web service, HL7, etc. In our case the testers included people from the user side, so they could decide when they were happy with it.

InterSystems were very helpful. We raised a call a few months before, and they gave advice on testing and desk-checked our detailed plan of the upgrade itself, including how to do the mirror.

Good luck.

Mike.W · Jan 12, 2018

Hi.

I'm not sure what your "persistent objects" contain, but if the repeating data is in the form of strings or streams then perhaps you could put them into an XPath document object (%XML.XPATH.Document) and use the evaluator in that? I support a system that has a message with a transient property to hold the document created, so it's only built once (per processing), and a method that builds that property if needed and then calls the EvaluateExpression method on it with an expression supplied as a parameter. This is used in transformations to extract data to post into HL7 v2 messages, so the same calls should work in rules as well.
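As a rough sketch of the mechanics (XML content and paths invented):

 Set tSC = ##class(%XML.XPATH.Document).CreateFromString(xmlString,.tDoc)
 If $$$ISERR(tSC) Quit tSC
 Set tSC = tDoc.EvaluateExpression("/Patient","Name/text()",.tResults)
 If $$$ISERR(tSC) Quit tSC
 ; each entry is a %XML.XPATH.Result; simple values come back in .Value
 If tResults.Count() Set name = tResults.GetAt(1).Value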

Of course the XPATH expressions may be no easier to define than your exporting and importing methods. I certainly struggled with them and in the end had to build lots of special inputs that added the huge long list of nested items that usually went at the front of each expression, resulting in calls like source.Pull("Baby","ep2:id/@extension") where "Baby" added a prefix of over 160 characters to find the baby section of the Mother's full document before tunneling down to the hospital number. If you already have a sub-section of the full document then you may not need as much.

Mike

Mike.W · Feb 21, 2018

Anyone know why we ended up with this strange behaviour? Why doesn't COS store 2 and "2" in the same way in lists? The rest of the programming environment is based around them being the same - (2="2") - as everything is a string until used otherwise. It may use "an optimized binary representation", but surely that's not really an excuse. Just curious.
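For anyone following along, the oddity is easy to demonstrate:

 Write $LISTBUILD(2)=$LISTBUILD("2")  ; 0 - the binary encodings differ
 Write 2="2"                          ; 1 - as values they are equal
 Write $LIST($LISTBUILD(2))=$LIST($LISTBUILD("2"))  ; 1 - equal again once decoded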

Mike.W · Apr 20, 2018

Yes, we have something like that, except we use the letter "q" as the prefix, and we follow it by the programmer's initials so that it becomes a "personal" set that is left alone in all namespaces, dev-test and live. We also extend this rule to globals and things inside the application like functions, screens, tasks, etc. The in-house configuration management system we use ignores them so they are left untouched. It's a useful convention.

(We might have used "z" like you, but it was already taken for "utility/library" stuff.)

Mike.W · Jun 15, 2018

Hi,

I won't claim this is an answer, because it's not quite the same and people may object to the structure, but here is one solution that is used quite a lot in code I look after. Basically, a subroutine is called, then tests are done, and a Quit is used to drop out when a match is found. It is often used for validation, something like this, which returns a result in the zER variable:

V1 ; Validate ORGC
   S ORG=zORG D WC2^hZUTV I zER'="" Q
   I IPACC<9,'$D(^hIW(WAID)) S zER="No details set up for ward" Q
   D WARDON^hILO1 I LOCK S zER="Ward in use" Q

VQ2 Q

Apologies for the old-fashioned code! However, you can see each test can be quite complex and use lots of variables, but it is easy to understand as long as you expect the structure to work that way.

This is very similar to the "clean code" solution of making the whole thing into a function that returns a value:

ClassMethod Main(val1 As %String, val2 As %String)
{
  write ..MyOutput(val1,val2)
}

ClassMethod MyOutput(val1 As %String, val2 As %String) As %String
{
  if val1 = 1 return "case 1"
  if val1 = 2, val2="*" return "case 2"
  return "default match"
}

Of course there are probably as many answers as there are Cache programmers!  :-)

Mike.W · Jul 27, 2018

Hi, I also help support an NHS trust using Ensemble, and it also has ever-growing PDF files in messages. We have our incoming PDFs as external file streams and it helps, though you have to bear in mind that the files are not going to be part of the Caché backup for Disaster Recovery, etc. (Not sure about mirroring. I'd assume they don't get mirrored either, as the contents are not in the journal.)
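The trick itself is just a linked file stream, something like this (path invented):

 ; LinkToFile stores only the path in the database; the PDF stays on disk
 Set tSC = msg.PDF.LinkToFile("E:\Docs\report123.pdf")

where PDF is declared in the message class as a %Stream.FileBinary property.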

As yet, we don't have as big a problem as you - fewer messages, and we only keep 92 days - but that is just as well, as the PDF files are converted to Base64-encoded text in HL7 v2 messages, so they then do take up space in the database, and the journal, and the backup, which has resulted in the need to expand the disk space recently. I can recommend keeping Ensemble on a virtual server with disk expansion on demand.

I tend to think the problem is not going to go away whatever you do. I assume, like us, the PDFs come from 3rd party applications and they are always going to be producing  ever more and prettier documents as time goes by. So I recommend looking at more disk. :-)  / Mike

Mike.W · Oct 18, 2018

Hi. I was actually looking up some information about pattern matching, but came across this warning:

If a call attempts to use indirection to get or set the value of object properties, it may result in an error. Do not use calls of this kind, as they attempt to bypass property accessor methods (<PropertyName>Get and <PropertyName>Set). Instead, use the $CLASSMETHOD, $METHOD, and $PROPERTY functions, which are designed for this purpose.

This was from https://docs.intersystems.com/latest/csp/docbook/DocBook.UI.Page.cls?KEY=GCOS_operators#GCOS_operators_pattern

So it looks like it may well work now, but there's no guarantee it will always work.

Surely there is a $method() call you can use to get the next item in the array?
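For example, something like this avoids the indirection (names invented):

 Set value = $PROPERTY(obj,propName)       ; instead of @("obj."_propName)
 Set next = $METHOD(array,"GetNext",.key)  ; e.g. step through a collection by name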

Mike.W · Oct 25, 2018

I think that is my preferred method as well, but it depends to some extent what you are going to do with the result, and what you want to happen if the input number is too big. This $J solution will always return all the characters input, which may be the safest thing. (Though any space characters inside the input will get converted to zeros.)

When the fail mode needs to still return the same-length string, e.g. to avoid messing up some fixed-length message format, it might be best to use $E(), e.g.

W $E("0000"_number,$L(a)+1,*)

I've also seen the following used, but I'm not sure I recommend it. So many interesting ways it could go wrong!

w $E(number+10000,2,999)

Mike.W · May 3, 2019

Hi. The "clean code" people would recommend just one parameter max, and better would be none! But I think that's going a bit far, and agree that 3, or maybe 4, maximum should be aimed for to keep things easy to understand when reading, though there may be exceptions.

The array passing is a good idea to reduce the number for normal routines, but I think the ideal for classes is using objects. If you are truly embracing objects, and self-documenting code, then new classes are usually needed, and what used to be parameters becomes the setting of properties, like this:

Set table=##class(CMT.UI.Table).%New()
Set table.TopLine=10,table.BottomLine=21
Set table.HeaderFormat="Underline"
Do table.DefineQuery("CMT.UI.PatchSite:MyList")
Do table.AddColumn(3,,"BOLD")
...etc.

Do table.Display()

It works well in some cases, but I have to admit that there is a tendency for the number of classes to get a bit silly if you take it to the extreme. Sometimes simple code is best.  :-)     / Mike

Mike.W · May 3, 2019

Hi / It sounds like a good idea! I can think of a number of interfaces I've seen where the target application - a small local system - struggled to keep up with the flow of updates from a large PAS.

The only thing I've done like it was complicated, and had to use a proper Business Process. In that case the "department" was neonatal, so we were only interested in patients admitted to a particular ward. The solution looked for HL7 admissions and transfers to that ward, and when found used the data to create a local record in Caché. Then all other types of message could be checked against those records to see if it needed further processing and passing on (to "BadgerNet" eventually when a full episode was built up). Of course this only works if you can define a clear "starting point" that can be spotted in the message stream.                        / Mike

Mike.W · Oct 4, 2019

Hi. It depends on what you mean by "certain criteria". If it's a special file name, then you could amend the FileSpec property to skip the ones you don't want yet. If it's in the content, then maybe you should be reading in the file (creating a copy or allowing archive so the original continues to exist) and sending it as a message into Ensemble, where it can be held up in a business process until ready to send out to an Operation that creates an output file. That is the way Ensemble is supposed to work, so you get a full record of what happened, etc.

(Otherwise, I'm pretty certain that there are actions that reset the list of processed files - maybe resetting the file path or restarting the job - but I cannot find the documentation about it at the moment.)

Mike.W · Jul 7, 2016

Hi Steve,

I have done something like you describe. I used BPL, and at the time tried to keep away from using bits of code, but it got complex and in retrospect I'm not sure it was the best way. The diagrams are nice, but I think a bit of well written code might have been easier to follow!

First I created a "TempStore" class with an "MRN" (Medical Records Number) property and no permanent storage. This is used as the target class for a transform that pulls out the patient id and puts it in that property.

In the BPL Process I added an instance of the TempStore class to the BPL context object, and the first activity in the diagram is the transform with Source of "request" and Target "context.TempStore".

With the MRN found, I then use code like the following in the Value of an "assign" activity to put the target stored object into another context property, "context.BNetEpisode", already set to the same class.

##class(...).MRNIndexOpen(context.TempStore.MRN,4)

An "if" activity with a Condition like "$IsObject(context.BNetEpisode)" is used to see if anything was found, and create a new one if required by setting the "context.BNetEpisode.MRN" property equal to "context.TempStore.MRN".

The "context.BNetEpisode" property is then be used as the Target for "transform" activities later on with Create = "existing" used. Ensemble does a save automatically when the Process completes.

I hope this makes sense. (I cannot provide the full code as it belongs to the customer, and anyway it gets a lot more complex as there are 3 types of inbound message, one of which was an HL7 v3 document, and it was actually using an xml document inside the stored object to hold much of the data - but that's another story).

Mike

Mike.W · Sep 21, 2016

This may be slightly dodgy, as it depends on assuming how the DTL will be compiled, but I have used CODE actions to implement a loop around a set of standard DTL actions.

For one, the source message had a stream with a document in it, and it needed to be chopped up into chunks, each one being put in a field in an HL7 v2 OBX segment. So an initial CODE action opens the loop with something like "Do {", gets the next chunk, and quits if there is none. Then there are some normal ASSIGN actions setting up the new segment - easy to put in, and they appear in the DTL display. And finally there is a second CODE action to end the loop with "}".
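Roughly, the pair of CODE actions comes out like this (the chunking helper is hypothetical, and it relies on the generated code nesting as you'd expect):

 // CODE action 1: open the loop and get the next chunk
 Do {
 Set tChunk = ..NextChunk(source,.tPos)  // hypothetical helper on the transform class
 Quit:tChunk=""
 // ...the normal ASSIGN actions for the new OBX segment compile in here...
 // CODE action 2: close the loop
 } While tChunk'=""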

It works, but only as long as InterSystems keep the compiled code for the assigns nice and simple.

A similar use was when dealing with xml documents. Again I needed a loop that enclosed normal ASSIGN actions, but this time for things like stepping backwards and deleting unwanted elements, or creating them from an array.

Also, I used a CODE action to work out if a DTL was running inside a business process, or just in the Studio for testing, so that some expected properties could be created to allow the test to run. Just for debugging.

Regards,

Mike

Mike.W · Sep 22, 2016

Hi Jenny,

The FOREACH has to iterate over something, but there is only one input field, so I don't think there is any other way than a CODE action in a DTL (at least on v2012). I could have done the whole loop in the one CODE, or called an outside function, but I preferred to use the GUI to set up the ASSIGN actions for me, to provide the extra validation and visual links in the DT Builder. I have used DTL whenever possible in other areas like in BPL to extract data from messages into temporary variables. Maybe not always the quickest or easiest way, but I think it makes the system more transparent.

For the loop the DT is actually called from a routing rule. The source is a simple message class with a few fields, and the target an HL7 v2 MDM_T02 message. I suppose if a full BP was involved then the loop could have been implemented in BPL in some way, but I think that would get complicated. Perhaps DTL needs a more general WHILE/UNTIL loop action added?

Mike.W · Mar 10, 2017

Hi,

We have an Ensemble Service class that extends "EnsLib.SOAP.Service" and provides a web method with a parameter that is a class that includes a property of "%Stream.FileBinary", plus all the metadata in other properties. This allows the source application, written in .NET, to send us documents fairly easily, as all the translation back and forth into XML, etc. is done for you (I assume it is also easy to do at the .NET end). There is not a lot of code needed to define the class and web service; then it just needs to build a message object with that same input class as a property, and send that onwards as usual.
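Stripped right down, the service is something like this (all Demo.* names invented):

Class Demo.DocUploadService Extends EnsLib.SOAP.Service
{
Parameter SERVICENAME = "DocUpload";

Method UploadDocument(pInfo As Demo.DocInfo) As %Boolean [ WebMethod ]
{
 ; Demo.DocInfo carries the metadata properties plus a %Stream.FileBinary
 Set msg = ##class(Demo.DocMessage).%New()
 Set msg.Info = pInfo
 Set tSC = ..SendRequestAsync("DocRouter",msg)
 Quit $$$ISOK(tSC)
}
}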

(Unfortunately, we then have to convert the file into Base64-encoded chunks and insert them into segments in one of those MDM^T02 messages like you do. But that's another story.)

Mike

Mike.W · May 2, 2017

The timeouts for the web front end can be frustrating. Where we had searches that we wanted to run regularly, we ended up creating a Business Service class that does embedded SQL queries on the Ens.MessageHeader table and puts the results into a simple text message that then gets sent as an attachment to an email. This gives us our daily stats in a CSV format to copy and paste into a spreadsheet. Yes, we could have built an XML spreadsheet file directly, but that is tricky, and not much of an advantage as we want to build on it each day without the query working through many days of data.
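As a sketch of the query side (dynamic SQL here for brevity; the method name and CSV handling are illustrative):

ClassMethod DailyCounts(pFrom As %TimeStamp, pTo As %TimeStamp)
{
 ; Count the period's traffic per target and write it out as CSV lines
 Set sql = "SELECT TargetConfigName, COUNT(*) AS Total "_
           "FROM Ens.MessageHeader "_
           "WHERE TimeCreated >= ? AND TimeCreated < ? "_
           "GROUP BY TargetConfigName"
 Set rs = ##class(%SQL.Statement).%ExecDirect(,sql,pFrom,pTo)
 While rs.%Next() {
  Write rs.%Get("TargetConfigName"),",",rs.%Get("Total"),!
 }
}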

We also had a go at creating something using Ens.BusinessMetric for a "recent activity" graph, but the end result was a bit limited in how it could be displayed and analysed (using DeepSee), as we only had the Ensemble license.

Mike

Mike.W · Nov 17, 2017

If you open the class in Studio and view the "Inspector" pane, then pick "Property" from the first drop-down and your property in the other (or double-click it), it shows a very long list of possible things you can add. It may even show all of them, I don't know. You can then click on the values to enter them, sometimes getting drop-down lists of possible values.

I've often used the inspector as a way of finding out what might be available, and then searching any keywords in the Caché documentation to confirm how to use it.

Regards,

Mike

Mike.W · Nov 23, 2017

I may have misunderstood your requirement, but you may not need a Business Rule at all. To pull data from the request into the context I've actually used Transforms.

To get this to work, I created a new class that Extends (%SerialObject, %XML.Adaptor) and defined in it the properties that I need to store. I could then define a Transformation from the incoming message type to this new one, pulling everything I needed. In the BPL I then added a property called "TempStore" of that type to the Context object. To pull the data I added a Transform Activity with a Source of "request" and a Target of "context.TempStore" using the Transform.
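So the new class is really just a property bag, something like (properties invented to match the example below):

Class Demo.TempStore Extends (%SerialObject, %XML.Adaptor)
{
Property priorMRN As %String;
Property surname As %String;
}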

Later Activity boxes could then use the fields with references like "context.TempStore.priorMRN" to do tests, etc. I've also used the same trick to update outgoing messages with data from the context (using Create = existing in the Transform).

I hope this is useful.

Regards,

Mike