How to handle large data (around 50 lakh / 5 million characters) as base64Binary in a FHIR Binary resource?
I tried to save large data to the FHIR server as a Binary resource, but it failed.
Method: POST, URL: fhirservers/fhir/r4/Binary
When I call the API with the large data, the record is not saved; with the same API and base64 data of around 3 lakh (300,000) characters, it saves without failing.
I pass data in this format:
{
  "resourceType" : "Binary",
  // from Resource: id, meta, implicitRules, and language
  "contentType" : "<code>", // R! MimeType of the binary content
  "securityContext" : { Reference(Any) }, // Identifies another resource to use as proxy when enforcing access control
  "data" : "<base64Binary>" // The actual content
}
I got this response:
{
  "resourceType": "OperationOutcome",
  "issue": [
    {
      "severity": "error",
      "code": "exception",
      "diagnostics": "5002",
      "details": {
        "text": "ObjectScript error: <MAXSTRING>%GetNext+2^%Iterator.Object.1"
      }
    }
  ]
}
How can I fix this, or what is the solution?
Comments
The issue arises from attempting to deserialize data in a FHIR Binary resource that exceeds the maximum local string length (3641144 characters). InterSystems IRIS signals a <MAXSTRING> error when string data in an object exceeds this limit. Common solutions are as follows:
Stream usage: You can use the %Stream classes (e.g., %Stream.DynamicBinary for binary data or %Stream.DynamicCharacter for character data) to handle very large values. Change the Binary class property that currently holds the data from a %String-based type to a stream type. This avoids exceeding the maximum string length in memory [1].
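For illustration only, here is a minimal sketch of that idea. The class name Demo.LargeBinary and its members are hypothetical and are not the FHIR model classes shipped with IRIS:

Class Demo.LargeBinary Extends %Persistent
{
// A stream property instead of %String, so the base64 payload is never
// materialized as a single local string and cannot hit <MAXSTRING>
Property Data As %Stream.GlobalBinary;

Property ContentType As %String;

ClassMethod Store(json As %DynamicObject) As %Status
{
    Set obj = ..%New()
    Set obj.ContentType = json.contentType
    // %Get(key,,"stream") returns the value as a read-only dynamic stream
    Set src = json.%Get("data",,"stream")
    // Copy the stream contents into the persistent stream property
    Do obj.Data.CopyFrom(src)
    Quit obj.%Save()
}
}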
To address the issue with <MAXSTRING> in the FHIR Binary resource involving Base64 encoded content, the following solutions are recommended:
Use Streams Instead of Strings: Update your code to use stream classes (%Stream.DynamicBinary or %Stream.DynamicCharacter) for handling large data fields instead of %Binary (which maps to %String). Using streams allows handling values that exceed the maximum length of an ObjectScript string [2][3]. This can be implemented by defining a method that populates the Binary resource using streams, as shown:

ClassMethod SetBinaryR4(json As %DynamicObject)
{
    Set obj = ##class(HS.FHIR.DTL.vR4.Model.Resource.Binary).%New()
    Set obj.contentType = json.contentType
    // Retrieve the large data field as a stream instead of a string
    Set dataAsStrm = json.%Get("data",,"stream")
    Set obj.data = dataAsStrm
    Set obj.id = json.id
}

This approach bypasses <MAXSTRING> errors by keeping the large Base64-encoded content in memory-efficient streams [3].

Refactor %GetNext Usage: Modify all usages of the %GetNext method in your adapter classes. The two-argument form %GetNext(.key, .value) triggers a <MAXSTRING> error if the value exceeds the string length limit. Instead, use the three-argument form %GetNext(.key, .value, .type), which returns the value as a %Stream object when an oversized value has type "string" [2][3]. Example update:

While iter.%GetNext(.Name, .Value, .Type) {
    If Type = "string" {
        // Value may be a %Stream object here; handle it as a stream
    }
}

Workflow for Transformations: Create a workaround where (see the sketch after this list):
- You replace large Base64 strings in the JSON with placeholders.
- Perform DTL transformations excluding large strings.
- Reintroduce the Base64 strings using %Stream just before submission to the FHIR server [3].
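For illustration, a rough sketch of that workflow follows. The method name TransformWithPlaceholder, the RunDTL helper, and the placeholder token are hypothetical, and the %Set(key, stream, "stream") call assumes a recent IRIS version that supports the type argument:

ClassMethod TransformWithPlaceholder(json As %DynamicObject) As %DynamicObject
{
    // 1. Pull the large base64 value out as a stream and swap in a placeholder
    Set dataStream = json.%Get("data",,"stream")
    Do json.%Set("data", "##LARGE_DATA##")

    // 2. Run the DTL / transformation on the now-small JSON
    //    (RunDTL stands in for whatever transformation you apply)
    Set transformed = ..RunDTL(json)

    // 3. Reintroduce the large value from the stream just before submission
    Do transformed.%Set("data", dataStream, "stream")

    Quit transformed
}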
Implementing the above adjustments should prevent the <MAXSTRING> error when handling large FHIR Binary resources containing Base64 data.
Sources:
Hi Rutvik,
You can handle this error with the code below. Some sample code:
set datatype = "stream"
While iter.%GetNext(.key, .value, .datatype) {
    // value comes back as a %Stream object for oversized string values
}

If you want to get a value as a stream from a %DynamicObject, use %Get(key,,"stream"):

set stream = dao.%Get(key,,"stream")

This third parameter is supported in %GetNext() and %Get() from the 2022 version onwards.
There is a specific reason for using this custom code instead of relying on the built-in %FromJSON() method: the third argument is currently not implemented in ##class(HS.FHIR.DTL.vR4.Model.Resource.Binary).%FromJSON(fhirresourceStream).
ClassMethod SetBinaryR4(json As %DynamicObject)
{
    Set obj = ##class(HS.FHIR.DTL.vR4.Model.Resource.Binary).%New()
    Set obj.contentType = json.contentType
    // Convert the large data field to a stream
    Set dataAsStrm = json.%Get("data",,"stream")
    Set obj.data = dataAsStrm
    Set obj.id = json.id
}
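For context, here is one possible way to call this from code that receives the HTTP body as a stream. This is only a sketch: it assumes SetBinaryR4 is extended to return the populated object (the snippet above builds it but does not return it), and requestBodyStream stands in for whatever stream your REST handler provides:

// Parse the request body into a dynamic object; dynamic objects can hold
// oversized values internally, so this step does not hit <MAXSTRING>
Set payload = ##class(%DynamicAbstractObject).%FromJSON(requestBodyStream)

// Build the Binary model object without ever pulling "data" out as a string
// (assumes SetBinaryR4 is changed to Quit obj at the end)
Set binary = ..SetBinaryR4(payload)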
Thank you!