Article
· Oct 17, 2024
ORU to MDM with large HL7 v2 messages.

I was working on a DTL but kept getting ERROR #5002 <MAXSTRING> errors. The problem was that most of the DTL GUI action steps only support the string data type when working with segments. A %String has a limit of 3,641,144 characters, and the OBX:5.1 in my example was 5,242,952 characters long. Of course, the PACS admin stated that ultra-high-quality files, up to and including 4K resolution, were needed, so we could not get the vendor to compress or reformat these files into compressed JPGs or something similar.
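The usual workaround, as far as I can tell, is to move the oversized field around as a stream instead of a string, using EnsLib.HL7.Message's GetFieldStreamRaw/StoreFieldStreamRaw from a code action in the DTL. A minimal sketch; the segment paths are illustrative for your schema, and note that StoreFieldStreamRaw needs to be the last write to that segment:

    // source/target are the EnsLib.HL7.Message objects available in a DTL code action
    Set tStream = ##class(%Stream.TmpBinary).%New()
    // read OBX:5 as a stream so the 3,641,144-character string limit never applies
    Set tSC = source.GetFieldStreamRaw(.tStream, "OBX(1):5")
    If $$$ISOK(tSC) {
        // StoreFieldStreamRaw finalizes the segment, so set any other OBX fields before this
        Set tSC = target.StoreFieldStreamRaw(tStream, "OBX(1):5")
    }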


Hi Team,

I am converting an XML message into an HL7 message. The input XML contains a PDF, which is converted to Base64 and mapped to OBX:5.5 in the HL7 message before being sent downstream.

In the downstream service I am using the normal HL7 TCP class, EnsLib.HL7.Service.TCPService, but the message looks like the one below. I am not sure why the stream is being treated as another segment in the HL7 message.

Any thoughts on this?

Thanks,

Smythee
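One likely cause, offered as a guess: many Base64 encoders insert CR/LF line breaks every 76 characters, and HL7 v2 uses CR as its segment terminator, so the parser treats each wrapped line of the encoded PDF as a new segment. A sketch that encodes without line wrapping; the helper name is hypothetical:

    /// Hypothetical helper: Base64-encode a PDF stream with no line breaks,
    /// so the HL7 parser never sees embedded CRs.
    ClassMethod ToBase64NoWrap(pStream As %Stream.Object) As %String
    {
        Set tOut = ""
        Do pStream.Rewind()
        While 'pStream.AtEnd {
            // read in multiples of 3 bytes so chunk boundaries stay Base64-aligned
            Set tChunk = pStream.Read(24000)
            // second argument = 1 suppresses CR/LF insertion
            Set tOut = tOut_$SYSTEM.Encryption.Base64Encode(tChunk, 1)
        }
        // for very large PDFs, write to a stream instead of concatenating a string
        Quit tOut
    }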


Using the FHIR DEMO, I have pieced together how to make a FHIR request using OAuth against an external FHIR repository. When I execute the Patient search (HS.FHIRServer.Interop.Request), I get an HS.FHIRServer.Interop.Response that has a QuickStream ID, which I then use to convert the QuickStream to a JSON dynamic object. If I do a trace on the raw JSON object, I am able to pull out single elements; however, I want to pull the raw JSON into a defined class structure.
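One way to do that, sketched under the assumption that a %JSON.Adaptor projection fits the need; class and property names here are hypothetical:

    Class Demo.FHIR.PatientLite Extends (%RegisteredObject, %JSON.Adaptor)
    {
    Property id As %String;
    Property birthDate As %String;
    }

    // usage: open the QuickStream by the ID on the response, then import it
    Set tQuick = ##class(HS.SDA3.QuickStream).%OpenId(response.QuickStreamId)
    Set tPatient = ##class(Demo.FHIR.PatientLite).%New()
    Set tSC = tPatient.%JSONImport(tQuick)   // %JSONImport accepts a stream
    // note: a search returns a Bundle, so you would either model the Bundle
    // or pull entry(n).resource out of it before importing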


Hello Community,

I'm a beginner and currently working on a project to convert CCDA files to FHIR using InterSystems IRIS. I have developed a web form to upload CCDA files and am attempting to convert the uploaded files to FHIR. However, the conversion process results in an empty entry.
Here's the output it displays on the HTML page:
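For reference, a rough sketch of the usual two-step path (CCDA to SDA3 via XSLT, then SDA3 to FHIR R4); the XSL path and API details are assumptions to verify against your IRIS version, and an empty entry usually means the first step produced an empty SDA container:

    Set tXSLT = ##class(HS.Util.XSLTTransformer).%New()
    // step 1: CCDA -> SDA3 (the XSL path is an assumption; check your install)
    Set tSC = tXSLT.Transform(tCcdaStream, "SDA3/CCDA-to-SDA.xsl", .tSdaStream)
    // step 2: SDA3 -> FHIR R4 bundle
    Set tTransformer = ##class(HS.FHIR.DTL.Util.API.Transform.SDA3ToFHIR).TransformStream(tSdaStream, "HS.SDA3.Container", "R4")
    Write tTransformer.bundle.%ToJSON()
    // if entry is empty, inspect tSdaStream first: an empty container here means
    // the XSLT step, not the FHIR step, dropped the content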


I was looking for an easier way to build the FHIR query string, given the Record Map request that is passed into the DTL.

I built this function, but when I run a message through it, the query string passed back into the DTL is an object reference rather than the string I am looking for.
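A hedged sketch of what usually fixes this: declare the return type as %String and return the concatenated value itself, not a stream or object (returning an object is what shows up in the DTL as a reference like "12@%Stream.TmpCharacter"). The property names are hypothetical:

    ClassMethod BuildQueryString(pRequest As %RegisteredObject) As %String
    {
        // build plain name=value pairs from the record map fields
        Set tQuery = "family="_##class(%CSP.Page).EscapeURL(pRequest.LastName)
        Set tQuery = tQuery_"&given="_##class(%CSP.Page).EscapeURL(pRequest.FirstName)
        Quit tQuery   // a %String, not an OREF
    }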


I'm running into an intermittent issue with some of our custom operations/processes as a result of some large FHIR R4 Binaries. Essentially, we get a response from an AthenaHealth FHIR endpoint that appears to be too large to be processed using the IRIS built-in functions for FHIR:

I've replicated it on the command line using a file (binary.json) that contains the response from the FHIR endpoint. I'm not sharing the full contents due to PHI concerns.
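For what it's worth, a sketch of a stream-based approach that avoids materializing the oversized base64 value as a %String; the three-argument form of %Get is only available on recent IRIS versions, so treat that as an assumption to verify:

    Set tFile = ##class(%Stream.FileCharacter).%New()
    Do tFile.LinkToFile("/tmp/binary.json")
    // %FromJSON parses directly from a stream, so no string-length limit applies here
    Set tBinary = ##class(%DynamicAbstractObject).%FromJSON(tFile)
    // pull the huge "data" element as a stream instead of a string
    Set tB64 = tBinary.%Get("data", , "stream")            // still base64-encoded
    Set tRaw = tBinary.%Get("data", , "stream<base64")     // decoded binary stream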


We have a scenario where we follow the best-practice article on using a BP and DTL to split up HL7 messages, mainly ORUs:

https://community.intersystems.com/post/splitting-oru-messages-using-obj...

It is really useful, but we now have this code in many places and are trying to consolidate it in one place.
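One consolidation option, sketched with hypothetical names and segment paths: move the splitting loop into a shared class method that each BP calls, so the logic lives in one place:

    Class Demo.HL7.ORUSplitter [ Abstract ]
    {
    ClassMethod Split(pProcess As Ens.BusinessProcess, pMsg As EnsLib.HL7.Message, pDTL As %String, pTarget As %String) As %Status
    {
        Set tSC = $$$OK
        // "*" returns the repetition count of the repeating group; the path depends on your schema
        Set tCount = pMsg.GetValueAt("PIDgrpgrp(1).ORCgrp(*)")
        For i=1:1:tCount {
            // the DTL receives the group index via its aux argument
            Set tSC = $classmethod(pDTL, "Transform", pMsg, .tChild, i)
            Quit:$$$ISERR(tSC)
            Set tSC = pProcess.SendRequestAsync(pTarget, tChild)
            Quit:$$$ISERR(tSC)
        }
        Quit tSC
    }
    }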


Has anybody encountered this before?

I did a fairly usual HL7 ADT transformation DTL, but it uses a customized schema, ADT_ALL, a structure that tries to cover all the ADT schema scenarios so we can use one type for all ADT messages.

The DTL tests fine in the DTL testing tool, with all the segments transformed correctly, but once I put it in place and send a message through the source, the transformed message only gets as far as MSH and EVN; other segments such as PID are not transformed.
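A guess worth checking: the DTL tester assigns the DocType for you, but in production the business service does, so if the service is still parsing with a stock schema the later segments won't map. Something like this from a terminal can confirm it:

    Write tMsg.DocType   // expect "ADT_ALL:ADT_ALL"; seeing "2.5:ADT_A01" means the stock schema was applied
    // fix: set "Message Schema Category" to ADT_ALL on the business service,
    // or re-type the message before the DTL runs:
    Set tSC = tMsg.PokeDocType("ADT_ALL:ADT_ALL")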


I thought I knew how to return a Response from a Business Process back to the Source Config Name, but I guess not.

I am working on a proof of concept in which the request message class determines a "route" within a Business Process to make a FHIR call (search, read) against our external FHIR repository and return the HS.FHIR.DTL.vR4.Model.Resource.xxxxxxx as a response to the source config name.
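For comparison, a minimal sketch of the pattern I would expect to work, assuming a custom Business Process and a synchronous call from the source; the target config name is hypothetical. Whatever lands in pResponse from OnRequest is what the source config name receives:

    Class Demo.FHIR.RouteProcess Extends Ens.BusinessProcess
    {
    Method OnRequest(pRequest As Ens.Request, Output pResponse As Ens.Response) As %Status
    {
        // route on the request class, then call the FHIR operation synchronously;
        // pResponse is returned to the caller (the source config name)
        Set tSC = ..SendRequestSync("FHIR.InteropOperation", pRequest, .pResponse)
        Quit tSC
    }
    }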


Like many others probably find themselves, we were stuck doing live data mapping in our interface engine that we really didn't want to do, but we had no good alternative. We want to keep mappings only for as long as they might be needed and then purge expired rows based on a TTL value. We actually had four use cases for this ourselves before we built it. Use cases:
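A rough sketch of the TTL idea (all names hypothetical): each mapping row carries its own expiry timestamp, and a scheduled task deletes expired rows:

    Class Demo.Map.Entry Extends %Persistent
    {
    Property MapKey As %String(MAXLEN = 256);
    Property MapValue As %String(MAXLEN = 256);
    Property ExpiresAt As %TimeStamp;

    ClassMethod Put(pKey As %String, pValue As %String, pTTLSeconds As %Integer = 86400) As %Status
    {
        Set tRow = ..%New()
        Set tRow.MapKey = pKey, tRow.MapValue = pValue
        // expiry = now + TTL, in local time to match CURRENT_TIMESTAMP below
        Set tRow.ExpiresAt = $SYSTEM.SQL.Functions.DATEADD("ss", pTTLSeconds, $ZDATETIME($HOROLOG, 3))
        Quit tRow.%Save()
    }

    /// run periodically from a Task Manager task
    ClassMethod Purge() As %Status
    {
        &sql(DELETE FROM Demo_Map.Entry WHERE ExpiresAt < CURRENT_TIMESTAMP)
        Quit $$$OK
    }
    }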
