We continue with the series of articles related to the QuinielaML application. In this article we are going to cover how to prepare the raw data that we have captured using the Embedded Python functionality.
Welcome everybody!
This topic unites posts that describe approaches, tools, and solutions for importing data into and exporting data from InterSystems IRIS and other InterSystems data platform products: CSV, JSON, SQL, flat files, binary files, globals, streams, etc.
Hello guys,
I want to move a development IRIS for Health database to another server. I will do this manually for specific reasons. If I simply copy the /mgr folder along with all the files (.DAT, .GBK, etc.) and configure it in the new server, will it work?
Best regards.
Hello!
I'm new to Caché systems and I have a question...
I need to develop a custom task that exports the result of a query to a CSV file in the database directory every day (I use the UNIX version).
Does anyone have any code or help to solve this case?
Thanks!!!
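A minimal sketch of how such a task could look, assuming a hypothetical query over Sample.Person and a writable output directory (both placeholders); the class extends %SYS.Task.Definition so it can be scheduled daily from the Task Manager in the Management Portal:

/// Hypothetical daily export task; schedule it from the Task Manager
Class Demo.Task.DailyCsvExport Extends %SYS.Task.Definition
{

Parameter TaskName = "Daily CSV Export";

/// Output directory; the run date is appended to the file name
Property Directory As %String [ InitialExpression = "/usr/cachesys/mgr/" ];

Method OnTask() As %Status
{
    Set tSC = $$$OK
    Try {
        // Placeholder query; replace with your own SELECT
        Set stmt = ##class(%SQL.Statement).%New()
        $$$ThrowOnError(stmt.%Prepare("SELECT Name, DOB FROM Sample.Person"))
        Set rs = stmt.%Execute()

        Set file = ##class(%Stream.FileCharacter).%New()
        $$$ThrowOnError(file.LinkToFile(..Directory_"export_"_$ZDATE($HOROLOG,8)_".csv"))

        // Header row, then one comma-separated line per result row
        Do file.WriteLine("Name,DOB")
        While rs.%Next() {
            Do file.WriteLine(rs.%Get("Name")_","_$ZDATE(rs.%Get("DOB"),3))
        }
        $$$ThrowOnError(file.%Save())
    }
    Catch ex {
        Set tSC = ex.AsStatus()
    }
    Quit tSC
}

}

Real data will need proper CSV quoting of the field values; the point here is just the OnTask() hook plus dynamic SQL and a file stream.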
Hi,
I am trying to load all the data tables from one IRIS server to a client server, but some of the tables consistently fail to load. I can load around 100 tables successfully, but 8 to 10 tables fail every time. I set up an IRIS ODBC connection using the ODBC driver to load the data from the tables.
I can also see a "read server loop" error message on the IRIS server side at the same time the table loading fails.
Please find the attached screenshot, which shows the error on the client server.
Can anyone advise me on how to fix this issue?
Thanks
Jude
It can sometimes be useful to list or export all of the subclasses that are derived, directly or indirectly, from a given class. In Studio, the Class -> Derived Classes menu option will show such a list, but I'm not aware of a built-in API for programmatically exporting their source code.
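One possible workaround, sketched below: the %Dictionary.ClassDefinitionQuery class exposes a SubclassOf query, and its results can be fed to $SYSTEM.OBJ.Export(). The superclass name and output path are placeholders:

/// Export the source of every class derived (directly or indirectly) from pSuper
ClassMethod ExportSubclasses(pSuper As %String = "My.SuperClass", pFile As %String = "/tmp/subclasses.xml") As %Status
{
    Set tRS = ##class(%ResultSet).%New("%Dictionary.ClassDefinitionQuery:SubclassOf")
    Set tSC = tRS.Execute(pSuper)
    Quit:$$$ISERR(tSC) tSC

    // Build a multidimensional array of "Class.cls" items for $SYSTEM.OBJ.Export
    While tRS.Next() {
        Set tItems(tRS.Get("Name")_".cls") = ""
    }
    Quit $SYSTEM.OBJ.Export(.tItems, pFile)
}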
Hey Developers,
Enjoy watching the new video on InterSystems Developers YouTube:
⏯ Projecting Data into InterSystems IRIS with Foreign Tables
Idea sourced from InterSystems Ideas Portal
Previous Posts:
Using AI to Simplify Clinical Documents Storage, Retrieval, and Search
Doctor-Patient Conversations: AI-Powered Transcription and Summarization
The healthcare industry is continuously evolving, and the need for efficient document management and patient data management is more critical than ever. In this article, we will focus on the specific aspects of integrating Google Docs and Google Sheets with FHIR data in the context of healthcare data interoperability.
How can I import an IRIS.DAT file into an IRIS system? I am told that it is a database file that can be brought into IRIS as a database.
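An IRIS.DAT is itself the database file, so it is attached rather than imported: copy the directory containing it to the target system and register a database that points at that directory (Management Portal: System Administration > Configuration > System Configuration > Local Databases). A rough terminal sketch of the same step using the Config API, with the database name and path as placeholders:

    // Run in the %SYS namespace; "MYDB" and /data/mydb/ are placeholders
    Set $Namespace = "%SYS"
    Set tProps("Directory") = "/data/mydb/"        // directory that already holds the IRIS.DAT
    Set tSC = ##class(Config.Databases).Create("MYDB", .tProps)
    Write $SELECT($SYSTEM.Status.IsOK(tSC):"Database registered",1:$SYSTEM.Status.GetErrorText(tSC)),!

After that, add the database to a namespace (new or existing) so its globals, routines and classes become visible.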
Hi Developers,
Enjoy watching the new video on InterSystems Developers YouTube:
I exported the contents of a namespace on one server (classes, include files and lookup tables) and am importing that code into a newly created namespace on another server. Both servers run Ensemble 2018.1, same build. The export was done via InterSystems Studio and is around 18 MB in total (XML file sizes).
When importing and compiling on the new server, I get errors as below, with error #6301 (SAX XML Parser error) prominent, on a number of the imported files, all containing data transformations or business processes.
Assuming I have an SQL table with data, is it possible to get a DML export (INSERT statements for this data)?
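Not aware of a built-in DML export, but a small generator is easy to sketch with dynamic SQL. This assumes simple string quoting is good enough for the column types involved (dates, streams and binary data would need extra care); the table and file names are placeholders:

/// Write one INSERT statement per row of pTable into pFile (quick-and-dirty sketch)
ClassMethod ExportInserts(pTable As %String = "Sample.Person", pFile As %String = "/tmp/inserts.sql") As %Status
{
    Set stmt = ##class(%SQL.Statement).%New()
    Set tSC = stmt.%Prepare("SELECT * FROM "_pTable)
    Quit:$$$ISERR(tSC) tSC
    Set rs = stmt.%Execute()

    // Column names from the prepared statement's metadata
    Set cols = ""
    For i=1:1:stmt.%Metadata.columns.Count() {
        Set cols = cols_$LB(stmt.%Metadata.columns.GetAt(i).colName)
    }

    Set file = ##class(%Stream.FileCharacter).%New()
    Set tSC = file.LinkToFile(pFile)
    Quit:$$$ISERR(tSC) tSC

    While rs.%Next() {
        Set vals = ""
        For i=1:1:$LISTLENGTH(cols) {
            // Escape single quotes; empty values are treated as NULL
            Set v = rs.%GetData(i)
            Set vals = vals_$LB($SELECT(v="":"NULL", 1:"'"_$REPLACE(v,"'","''")_"'"))
        }
        Do file.WriteLine("INSERT INTO "_pTable_" ("_$LISTTOSTRING(cols,",")_") VALUES ("_$LISTTOSTRING(vals,",")_");")
    }
    Quit file.%Save()
}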
Hi, I have a problem: I want to move a huge amount of data from Ensemble to Oracle, about 3 years of business data. I have tried a few approaches:
1. EnsLib.SQL.OutboundAdapter: I parse the data row by row, translate each row into an INSERT or UPDATE SQL statement and write it to Oracle. This is the slowest option and takes more disk space.
2. Save the data to a new table and then use DBeaver to transport the data to Oracle.
3. Use a linked table and save objects as inserts, but I can't use the %OpenId method.
I hope to find a high-efficiency way!
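A note on approach 1: building the SQL text row by row forces Oracle to parse every statement; letting EnsLib.SQL.OutboundAdapter reuse one parameterized INSERT is usually noticeably faster. A sketch inside a business operation (the message class and target table are made-up names):

/// Sketch of a parameterized insert via EnsLib.SQL.OutboundAdapter (Demo.Msg.Order is hypothetical)
Method InsertOrder(pRequest As Demo.Msg.Order, Output pResponse As Ens.Response) As %Status
{
    // One statement text for every row lets the driver cache the prepared statement
    Set tSQL = "INSERT INTO ORDERS (ORDER_ID, AMOUNT, CREATED) VALUES (?, ?, ?)"
    Quit ..Adapter.ExecuteUpdate(.tRows, tSQL, pRequest.OrderId, pRequest.Amount, pRequest.Created)
}

For three years of history, though, a bulk path will normally beat any row-by-row approach: export the tables to flat files and load them with Oracle's own SQL*Loader, or let Oracle pull the data over a database link / gateway connection.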
WinSQL is the editor normally used by most users, but we can't download large amounts of data with it. So I have written a tutorial on how to connect with a Java-based editor called SQuirreL SQL, which can easily download or export data to Excel or other formats. I have also included a Java JDBC connection program for connecting to the IRIS database, in particular a mirrored/failover server.
A Java-based SQL editor to export large amounts of data, and a Java JDBC program for the IRIS connection
Is there a command that will loop through the flat files of a given Linux/Unix folder? I can write the code to open and read each file. But the file names are unknown. I am looking for a way to access each file given a named Linux folder. The files have differing structures so a record map will not work.
Thank you for reading and thank you even more for answering!
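One common pattern, in case it helps: the FileSet query of %Library.File returns one row per directory entry, so the loop over unknown file names can be driven from that (the directory below is a placeholder):

/// Process every regular file in pDir (sketch; pDir is a placeholder path)
ClassMethod ProcessFolder(pDir As %String = "/data/inbound/") As %Status
{
    // FileSet query of %Library.File: one row per entry in the directory
    Set tRS = ##class(%File).FileSetFunc(pDir, "*", , 0)
    While tRS.%Next() {
        Continue:tRS.Type'="F"          // skip subdirectories
        Set tFullPath = tRS.Name        // full path of the file
        Write "Processing ", tFullPath, !
        // ... open and read the file here ...
    }
    Quit $$$OK
}

$ZSEARCH is an older alternative for iterating file names that match a wildcard.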
Is there a way to do something like $SYSTEM.OBJ.ExportAllClassesIndividual() for the routines?
I'd like to export all routines to individual files.
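Not that I know of, but a small loop does the job. The sketch below lists routines with the %RoutineMgr:StudioOpenDialog query and exports each one to its own XML file; the spec and the output directory are placeholders (widen the spec to "*.mac,*.int,*.inc" if needed):

/// Export every *.mac routine of the current namespace to its own XML file (sketch)
ClassMethod ExportRoutinesIndividually(pDir As %String = "/tmp/routines/") As %Status
{
    // StudioOpenDialog lists items the way Studio's Open dialog does; "*.mac" = MAC routines
    Set tRS = ##class(%ResultSet).%New("%RoutineMgr:StudioOpenDialog")
    Set tSC = tRS.Execute("*.mac")
    Quit:$$$ISERR(tSC) tSC

    While tRS.Next() {
        Set tName = tRS.Get("Name")                              // e.g. MyRoutine.mac
        Set tSC = $SYSTEM.OBJ.Export(tName, pDir_$TRANSLATE(tName,".","_")_".xml")
        Quit:$$$ISERR(tSC)
    }
    Quit tSC
}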
In today's fast-paced and highly competitive manufacturing industry, efficient machine communication and data exchange is essential to maximize productivity and minimize downtime. That's where MTConnect comes in. MTConnect is an open, royalty-free standard that provides a common language for communication between machines, devices, and software applications in a manufacturing environment.
Hi
Is there a way to read JSON and transform it (in a DTL) using a VDOC representation of the JSON (without transforming it into an internal message), the way I can with HL7 or XML?
If that is possible, I guess I would need a schema for the JSON, so the second question is how to build a schema for JSON and load it into IRIS?
Thanks, Ori.
The capability to ingest numerous records every second while simultaneously serving real-time queries is called Hybrid Transactional Analytical Processing (HTAP). It is also called transactional analytics, transanalytics, or translytics, and it is a very useful element in scenarios where there is a constant flow of real-time data, such as readings from IIoT sensors or fluctuations in the stock market, combined with the need to query these data sets in real time or near real time.
In terms of general throughput design and long-term support, I'm considering what would be the best approach when I need to create multiple batch files in a few different layouts from the same data sets.
Hello,
I'm currently struggling with an HTTP request to a URL that returns a JPEG image file.
Testing the request with a browser or Postman results in the image being shown normally.
Using %Net.HttpRequest with different configurations has resulted in a corrupted file.
My code works perfectly fine for some URLs from other servers, but for others it produces corrupted file contents that are not a valid JPEG.
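The detail that usually matters with images is to treat the response body strictly as a binary stream and copy it without any character translation; a minimal sketch, with server, path and target file as placeholders (and an assumed SSL/TLS configuration name for HTTPS):

/// Download a JPEG to disk without character translation (sketch; URL parts are placeholders)
ClassMethod DownloadImage(pServer As %String = "example.org", pPath As %String = "/images/photo.jpg", pFile As %String = "/tmp/photo.jpg") As %Status
{
    Set req = ##class(%Net.HttpRequest).%New()
    Set req.Server = pServer
    Set req.Https = 1
    Set req.SSLConfiguration = "default"          // assumes an SSL/TLS configuration with this name exists
    Set tSC = req.Get(pPath)
    Quit:$$$ISERR(tSC) tSC

    // HttpResponse.Data is a stream; copy it byte-for-byte into a binary file stream
    Set file = ##class(%Stream.FileBinary).%New()
    Set tSC = file.LinkToFile(pFile)
    Quit:$$$ISERR(tSC) tSC
    Do req.HttpResponse.Data.Rewind()
    Set tSC = file.CopyFrom(req.HttpResponse.Data)
    Quit:$$$ISERR(tSC) tSC
    Quit file.%Save()
}

If the file is still corrupted only for certain servers, comparing the response headers (Content-Encoding, Transfer-Encoding, Content-Type) between a working and a failing URL is usually the next step.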
1. Review of Interface Settings: Ensure these are set purposefully! (Documentation linked)
2. DTL Editor Enhancements
In this GitHub repository we gather information from a CSV, use a DataTransformation to turn it into a FHIR object and then save that information to a FHIR server, all of it using only Python.
The objective is to show how easy it is to manipulate data into the output we want, here a FHIR Bundle, in the IRIS full-Python framework.
Has anyone had any success reading barcodes from PDFs or images in a Caché/IRIS application? I've been looking at some possible solutions for this, including the open-source ZXing libraries. I know we have the ability to create them in Zen and InterSystems Reports, but as far as I know, there's nothing built in to actually read data from a barcode. If anyone has suggestions on how to go about this, I'd love to hear them.
Thanks to @Yuri Marx we have seen a very nice example of DB migration from Postgres to IRIS.
My personal problem is the use of DBeaver as the migration tool, especially as one of the strengths of IRIS (and of Caché before it) is the availability of the SQL Gateways, which allow access to any external DB for which JDBC or ODBC access is available. So I extended the package to demonstrate this.
I am creating "models" that contain rows in several class tables called Model, Path, Node.
So, model 19 includes 1 row in the Model table, 11 rows in the Path table, and 10 rows in the Node table.
I'd like to write some utilities to "move" model 19 from one instance to another.
Hey there,
I'm writing an import routine to read files into a global. The code is working fine except for the 'Delete' command: the files are being imported and copied, but not deleted. Maybe someone has an idea what is happening.
I get a low-level return value of -32, but I couldn't find anything that explains what that actually means, and my Caché version doesn't support the $ZU command.
Here's the Code
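A general note on that return value: ##class(%File).Delete() hands the OS-level code back through its second, output argument, which is more informative than the boolean result. A sketch only (not the routine in question; the path is a placeholder), with the usual Windows reading of -32 offered as a likely, not certain, interpretation:

    // Sketch only; /data/inbound/file.txt is a placeholder
    Set ok = ##class(%File).Delete("/data/inbound/file.txt", .tReturn)
    If 'ok {
        // tReturn carries the OS return code; on Windows, -32 typically means the file is still open in another process
        Write "Delete failed, OS return code: ", tReturn, !
    }

So the first thing to check is whether the import step still has the file open (a stream or device not yet closed) when the Delete runs.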
Hi folks!
Sometimes we need to import data into InterSystems IRIS from CSV. It can be done, e.g., via the csvgen tool, which generates a class and imports all the data into it.
But what if you already have your own class and want to import data from CSV into your existing table?
There are numerous ways to do that, but you can use csvgen (or csvgen-ui) again! I prepared an example and am happy to share it. Here we go!
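As a side note before the example: on IRIS 2022.1 and later, plain SQL can also fill an existing table with LOAD DATA, independently of csvgen. A hedged sketch, with the table name, file path and the header option as assumptions to check against the documentation for your version:

    // Requires a recent IRIS (2022.1+); Sample.Person and /tmp/people.csv are placeholders
    Set tSQL = "LOAD DATA FROM FILE '/tmp/people.csv' INTO Sample.Person USING {""from"":{""file"":{""header"":true}}}"
    Set rs = ##class(%SQL.Statement).%ExecDirect(, tSQL)
    Write "SQLCODE ", rs.%SQLCODE, ", ", rs.%ROWCOUNT, " rows loaded", !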
Hi everyone,
How do you alter the primary key on a table that already has data?
Any example much appreciated.
Cheers,
Tom
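In plain SQL the usual pattern is to drop the existing constraint and add the new one. A sketch via dynamic SQL, with table, constraint and column names as placeholders; note that the new key columns must already be unique and non-null across the existing rows, and a primary key that is also the class's IDKEY cannot be changed while data exists:

    // Placeholders throughout; adjust to your own table, constraint and column names
    Set tDrop = "ALTER TABLE MyApp.Person DROP CONSTRAINT PERSONPK"
    Set tAdd  = "ALTER TABLE MyApp.Person ADD CONSTRAINT PERSONPK PRIMARY KEY (SSN)"
    For tSQL = tDrop, tAdd {
        Set rs = ##class(%SQL.Statement).%ExecDirect(, tSQL)
        If rs.%SQLCODE < 0 {
            Write tSQL, " failed: ", rs.%Message, !
            Quit
        }
    }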
Hello,
I am working on the new USCDI Requirement to store LOINC Codes/Descriptions for various types of Clinical Notes in the SDA.
According to all of the documentation I had been able to find, in order to store a coded value for Document Type, I have to use Custom Pairs:
Hello,
Our team is looking for a way to export all of our Caché SQL tables into Microsoft SQL Server. I have only found a method to export one table at a time into an ASCII file. We have over 170 tables, so this would be very tedious and time-consuming. Is there a way to export directly from Caché to SQL Server? Alternatively, is it possible to export the entire database in a single shot, or even multiple tables to text files?
Thanks!
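One way to avoid 170 manual exports is to drive them from a query over INFORMATION_SCHEMA (available in recent Caché and IRIS versions) and write one delimited file per table, which SQL Server can then pick up with BULK INSERT or bcp. A sketch, with the schema name and output directory as placeholders:

/// Write every table of pSchema to its own pipe-delimited file (sketch)
ClassMethod ExportSchema(pSchema As %String = "MyApp", pDir As %String = "/tmp/export/") As %Status
{
    Set tables = ##class(%SQL.Statement).%ExecDirect(, "SELECT TABLE_NAME FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_SCHEMA = ?", pSchema)
    While tables.%Next() {
        Set tName = tables.%Get("TABLE_NAME")
        Set rows = ##class(%SQL.Statement).%ExecDirect(, "SELECT * FROM "_pSchema_"."_tName)

        Set file = ##class(%Stream.FileCharacter).%New()
        Do file.LinkToFile(pDir_tName_".txt")
        While rows.%Next() {
            // Pipe-delimit the columns of the current row
            Set line = ""
            For i=1:1:rows.%ResultColumnCount {
                Set line = line_$SELECT(i=1:"",1:"|")_rows.%GetData(i)
            }
            Do file.WriteLine(line)
        }
        Do file.%Save()
    }
    Quit $$$OK
}

The other route worth weighing is a linked server defined in SQL Server over the Caché ODBC driver, which allows INSERT ... SELECT per table without intermediate files.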