Question Ankur Shah · Jan 4, 2022

Implement CI/CD with VSTS for an InterSystems IRIS REST API

Hi Team,

I want to implement a build/release pipeline for an InterSystems IRIS REST API with VSTS, without a Docker container or other tools.

Can you please provide a step-by-step guide for this?

Thanks,

Ankur Shah

Product version: IRIS 2019.1

Comments

Dmitry Maslennikov · Jan 4, 2022

Most CI/CD processes these days are container-based, and doing this without Docker makes the process much more complex. It's also not quite clear what you want to achieve.

In any case, this task is quite complex and depends very much on what kind of application you have, how you build it right now, in some cases the OS, and even on what other languages and technologies the application uses. You may contact me directly; I can help with this, as I have experience and knowledge in this area.

Ankur Shah  Jan 4, 2022 to Dmitry Maslennikov

We are using Angular as the front end and InterSystems IRIS as the backend.
We created a CI/CD pipeline for the Angular project without a Docker container using VSTS. We want to implement a CI/CD pipeline for InterSystems IRIS the same way.

The goal is to move our IRIS code from the staging server to the production server with the help of a CI/CD pipeline. Moreover, we don't have any experience with Docker and are not sure what additional infrastructure would be required to use Docker containers.

Ankur Shah  Jan 5, 2022 to Dmitry Maslennikov

Hi,

By mistake I accepted your answer. Can you please help me with this? It would be great if you could provide a video or tutorial.

Thanks,

Ankur Shah  

Yuji Ohata · Jan 4, 2022

Hi.

I implemented a CI/CD pipeline for IRIS on AWS without containers!
I use CodeCommit, which is a Git service, and CodeDeploy, which is a deployment service.

When source code (.cls files) is pushed to CodeCommit, CodeDeploy pulls the source files from CodeCommit and deploys them to the application server.
The application server has IRIS installed and uses Interoperability to monitor the deployed files.

When Interoperability detects the files, it executes $SYSTEM.OBJ.DeletePackage(path) and $SYSTEM.OBJ.ImportDir(path, "*.cls;*.mac;*.int;*.inc;*.dfi", flag).
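The delete-then-import step could also be driven from a plain deploy hook. The sketch below is not the poster's actual mechanism (he uses an Interoperability file-detection job); the `iris terminal` invocation style is borrowed from the buildspec shown later in this thread, the package name "App" is a placeholder, and IRIS_CMD is overridable for dry runs:

```shell
# deploy_import (sketch, not the poster's actual hook): drop the old
# package, then re-import everything under the deployed source directory.
# IRIS_CMD defaults to the iris CLI but can be stubbed for a dry run.
deploy_import() {
    src_dir="$1"
    iris_cmd="${IRIS_CMD:-iris terminal IRIS}"
    # Delete the previously imported package ("App" is a placeholder).
    $iris_cmd "##CLASS(%SYSTEM.OBJ).DeletePackage(\"App\")"
    # Re-import all source files from the deployed directory.
    $iris_cmd "##CLASS(%SYSTEM.OBJ).ImportDir(\"$src_dir\",\"*.cls;*.mac;*.int;*.inc;*.dfi\",\"ck\",,1)"
}
```

Run with `IRIS_CMD=echo deploy_import /path/to/src` first to see what would be executed before pointing it at a real instance.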

Ani Mayreddy  Jul 18, 2024 to Yuji Ohata

Hi @Yuji Ohata , we are trying to do something similar on our end. Did you execute any specific commands for Interoperability to monitor the files in the pipeline? Can you please share a sample GitHub workflow file? That would be very helpful. TIA

Yuji Ohata  Jul 18, 2024 to Ani Mayreddy

Hi @Ani Mayreddy 

> Did you execute any specific commands for Interoperability to monitor the files in the pipeline?
No, I don't use IRIS Interoperability in the pipeline architecture.
The pipeline was created with AWS CodePipeline, and it runs automatically when a push is made to the Git repository.

Ani Mayreddy  Jul 19, 2024 to Yuji Ohata

Thank you for sharing this. I am having issues with the buildspec.yml and appspec.yml files. Once I push changes to Git, what is the sequence of commands that will be executed? I am not sure how the changes will be detected and pushed to the IRIS app. If you don't mind, can you please provide the list of commands that you used in buildspec and appspec?

Yuji Ohata  Jul 22, 2024 to Ani Mayreddy

The buildspec.yaml just checks whether the Git sources have compile errors.

version: 0.2
env:
  variables:
    RESULT: 0
phases:
  build:
    commands:
      # Make a directory for the compile check.
      - mkdir -p /user/local/work
      # Copy the files fetched from CodeCommit.
      - cd ..
      - cp -r * /user/local/work
      - chmod -R 777 /user/local/work
      # Start the container's IRIS.
      - su - irisowner -c "iris start IRIS"
      # Import check.
      - su - irisowner -c "iris terminal IRIS \"##CLASS(%SYSTEM.OBJ).ImportDir(\\\"/user/local/work\\\",\\\"*.cls;*.mac;*.int;*.inc;*.dfi\\\",\\\"k\\\",,1)\" " | tee result.txt
      - sh ./chk_err.sh
      # Compile check.
      - su - irisowner -c "iris terminal IRIS \"##CLASS(%SYSTEM.OBJ).CompileAll(\\\"ck\\\")\" " | tee result.txt
      - sh ./chk_err.sh
      - su - irisowner -c "iris terminal IRIS \"##CLASS(%Library.MessageDictionary).ImportDir(\\\"/user/local/work/01/res\\\")\" " | tee result.txt
      - sh ./chk_err_md.sh
      # These sh scripts check result.txt for errors.
  post_build:
    commands:
      - cd $CODEBUILD_SRC_DIR_main_out
      - mv make_flg_file.bat ../
      - mv appspec.yml ../
      - cd ..
      - mv 00 base
      - mv 01 main
      - ls -al

And appspec.yaml copies the sources to a destination directory.

version: 0.0
os: windows
files:
  - source: /
    destination: E:\deploy\base

After that, importing into IRIS is done by creating an Interoperability job and using its file detection mechanism.

Import program.

ClassMethod ImportDir(path As %String, flag As %String = "k") As %Status
{
    Do ..Log("ImportDir Start:"_path_" / "_flag)
    Set errorLog = ""
    Set imported = ""
    Set selectedlist = ""
    Set count = 0
    If (##class(%File).DirectoryExists(path)) {
        Set status = $SYSTEM.OBJ.ImportDir(path, "*.cls;*.mac;*.int;*.inc;*.dfi", flag, .errorLog, 1, .imported, , .selectedlist)
        // Count the imported items.
        Set prg = ""
        For {
            Set prg = $ORDER(imported(prg))
            If (prg = "") {
                Quit
            }
            Set count = count + 1
        }
    } Else {
        Set status = $$$ERROR($$$GeneralError, "Path does not exist: "_path)
    }
    Do ..Log("ImportDir End:"_count)
    Return status
}
Ani Mayreddy  Jul 22, 2024 to Yuji Ohata

Thank you so much for sharing this

Niklas Thilmont · Jan 11, 2022

Might be a bit late to the party, but we use Studio project exports in one of our projects to create build artifacts, mainly because we work with customers that do not support containers or other methods of deployment.

Here is the snippet:

ClassMethod CreateStudioExport() As %Status
{
    #Dim rSC As %Status
    #Dim tSE As %Exception.StatusException
    #Dim tProject As %Studio.Project
    Try {
        set errorLog = ""
        set tRelevantFiles = ..GetAllRelevantFiles()
        set tProject = ##class(%Studio.Project).%New()
        set tProject.Name = "My Studio Export"
        set tIterator = tRelevantFiles.%GetIterator()
        while tIterator.%GetNext(.key, .classToAdd) {
            write "Adding "_classToAdd_" to project export",!
            $$$ThrowOnError(tProject.AddItem(classToAdd))
        }
        $$$ThrowOnError(tProject.%Save())
        zwrite tProject
        $$$ThrowOnError(tProject.Export("/opt/app/studio-project.xml", "ck", 0, .errorLog, "UTF-8"))
        Set rSC = $$$OK
    } Catch tSE {
        zwrite errorLog
        Set rSC = tSE.AsStatus()
    }
    Quit rSC
}

ClassMethod GetAllRelevantFiles() As %DynamicArray
{
    set tt=##class(%SYS.Python).Import("files")
    set string = tt."get_all_cls"("/opt/app/src/src")
    return ##class(%DynamicArray).%FromJSON(string)
}

Here is the python script:

# Used to gather relevant files during a build pipeline step!
import os
import json

def normalize_file_path(file, directory):
    # Remove the first part of the directory to normalize the class name
    class_name = file[len(directory):].replace("\\", ".").replace("/", ".")
    if class_name.startswith("."):
        class_name = class_name[1:]
    return class_name 

def is_relevant_file(file):
    file_lower = file.lower()
    return file_lower.endswith(".cls") \
        or file_lower.endswith(".inc") \
        or file_lower.endswith(".gbl") \
        or file_lower.endswith(".csp") \
        or file_lower.endswith(".lut") \
        or file_lower.endswith(".hl7") 

def get_all_cls(directory):
    all_files = [val for sublist in [[os.path.join(i[0], j) for j in i[2]] for i in os.walk(directory)] for val in sublist]
    all_relevant_files = list(filter(is_relevant_file, all_files))
    normalized = list(map(lambda file: normalize_file_path(file, directory), all_relevant_files))
    print(normalized)
    return json.dumps(normalized)

It is all rather hacky, and you will probably have to use the snippets I provided as a basis and implement some things yourself.

What we do is:

  1. Spin up a Docker container with Python enabled in the build pipeline, with the source files mounted to /opt/app/src
  2. Execute the CreateStudioExport() method in said Docker container
  3. Copy the newly created Studio export to the build pipeline host
  4. Tag the Studio export as an artifact and upload it to file storage
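Steps 3 and 4 above might be sketched as follows, assuming a build container named "iris-build" (a hypothetical name, not from the post) and a staging directory supplied by the pipeline; DOCKER is overridable so the script can be dry-run:

```shell
# stage_export (sketch of steps 3-4): copy the Studio export out of the
# build container and stage it where the pipeline collects artifacts.
# The container name "iris-build" is an assumption; DOCKER can be
# stubbed (e.g. DOCKER=echo) for a dry run.
stage_export() {
    staging="${1:?usage: stage_export <staging-dir>}"
    docker_cmd="${DOCKER:-docker}"
    $docker_cmd cp iris-build:/opt/app/studio-project.xml "$staging/"
}
```

In Azure DevOps / VSTS the staging directory would typically be `$(Build.ArtifactStagingDirectory)`, followed by a publish-artifact task.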

Maybe this helps! Let me know if you have questions!
