
Hi Nikhil,

If your main goal is to receive email alerts when there’s a disk space issue or any other system-related problem affecting the instance, then you may not need to create a custom task at all.

Instead, consider using InterSystems IRIS Log Monitor. It’s part of the System Monitor and is designed specifically for this kind of functionality.

🔗 Log Monitor documentation

The Log Monitor can be configured to automatically send email notifications when certain conditions are met, such as low disk space, write daemon problems, license issues, or other events that could impact the health of your instance.

So, if email monitoring is your objective, this built-in solution is likely more robust and easier to maintain than implementing a scheduled task from scratch. It handles the complexity for you and covers more ground than just database free space.
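For reference, here is a minimal outline of turning the alerts on, run from the %SYS namespace. Menu labels and option numbers vary between versions, so treat this as a sketch and check the linked documentation for your release:

```
%SYS>Do ^%SYSMONMGR
    ; From the menus, enable email notifications and set the
    ; SMTP server, sender address, and recipient list, then
    ; restart the System Monitor so the new settings take effect.
```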

Hope this helps!

If you use IRIS, the easiest and cleanest way to interact with AWS S3 is by using embedded Python and the boto3 library, which is the official AWS SDK for Python.

IRIS allows you to write class methods with Language = python, so you can write pure Python inside your ObjectScript classes. Here’s a simple example of listing objects in an S3 bucket:

Class MyApp.S3Utils
{

ClassMethod ListS3Objects(bucket As %String = "your-bucket-name") [ Language = python ]
{
    import boto3

    # Create an S3 client using the default AWS credential chain
    s3 = boto3.client("s3")

    # List objects under the given prefix (returns the first page only)
    response = s3.list_objects(Bucket=bucket, Prefix="your_folder")

    contents = response.get("Contents", [])
    for obj in contents:
        print("Key:", obj["Key"])
}

}

To call this method from ObjectScript:

Do ##class(MyApp.S3Utils).ListS3Objects("my-s3-bucket")

Just make sure the Python environment IRIS is using has boto3 installed:

pip install boto3

If you’re working with Ensemble, which doesn’t support embedded Python, you can still achieve the same result by calling a Python script externally using $ZF(-1):

Set cmd = "python3 /path/to/your_script.py"
Set status = $ZF(-1, cmd)

This way, you can keep the S3 logic in a Python script and still integrate it with Ensemble.
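As a sketch of what such an external script could contain (the bucket name and prefix are placeholders, and the listing logic is factored into a function that accepts any client exposing `list_objects_v2`, so it can be exercised without AWS credentials):

```python
def list_bucket_keys(s3_client, bucket, prefix=""):
    """Return all keys under `prefix` in `bucket`, following pagination."""
    keys = []
    token = None
    while True:
        kwargs = {"Bucket": bucket, "Prefix": prefix}
        if token:
            kwargs["ContinuationToken"] = token
        response = s3_client.list_objects_v2(**kwargs)
        keys.extend(obj["Key"] for obj in response.get("Contents", []))
        if not response.get("IsTruncated"):
            return keys
        token = response["NextContinuationToken"]

# Usage with the real AWS SDK (requires `pip install boto3` and credentials):
#   import boto3
#   print(list_bucket_keys(boto3.client("s3"), "your-bucket-name"))
```

Using `list_objects_v2` with a continuation token also handles buckets with more than 1000 objects, which a single `list_objects` call would truncate.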

So the one-line non-deprecated way would be: 

Set sc=person.%JSONExportToString(.json), dynObj = {}.%FromJSON(json)

    // or, if we have large objects:
    Set sc=person.%JSONExportToStream(.json), dynObj = {}.%FromJSON(json)

But I agree it would be nicer to have just one command, like the one that @Laura Blázquez García pointed out!
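In the meantime, a small generic wrapper (the class and method names here are hypothetical) can hide the two steps for any class that extends %JSON.Adaptor:

```
Class MyApp.JSONUtils
{

/// Export a %JSON.Adaptor object to a %DynamicObject in one call
ClassMethod ToDynamic(obj As %JSON.Adaptor, Output sc As %Status) As %DynamicObject
{
    Set json = ""
    Set sc = obj.%JSONExportToString(.json)
    If $$$ISERR(sc) Quit ""
    Quit {}.%FromJSON(json)
}

}
```

Then the call site really is one line: `Set dynObj = ##class(MyApp.JSONUtils).ToDynamic(person, .sc)`.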
