Question · Luis Angel Pérez Ramos (Sales Engineer at InterSystems Iberia) · Jun 19, 2023

The irishealth-ml-community container consumes all available memory

Hi community members!

I'm trying to deploy a container based on the IRIS Community for Health ML image available from this url, but when I start the container, memory consumption skyrockets to 99%, making it impossible to work with the instance (it never drops below 95% of memory). When I do the same with the plain IRIS Community for Health image, memory never goes above 80%.

Is this a known problem? Should I try another version? Or should I just sit down in a corner and cry like a baby?
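As a stopgap while the root cause is investigated, Docker can cap how much memory a container may claim. A minimal sketch, assuming the image is pulled from the InterSystems container registry (the exact image path and tag, the container name, and the 4g limit are illustrative assumptions, not values from the thread):

```shell
# Sketch: cap the container's memory so a runaway instance cannot exhaust
# the host. Adjust the limit, name, and image tag to your environment.
docker run -d --name irishealth-ml \
  --memory=4g --memory-swap=4g \
  -p 52773:52773 \
  containers.intersystems.com/intersystems/irishealth-ml-community:latest
```

Note that `--memory-swap` set equal to `--memory` disables swap for the container; if the processes inside genuinely need more than the cap, they will be OOM-killed rather than slow the whole host down.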

Comments

Erik Hemdal · Jun 26, 2023

I ran a quick test, Luis, on my Windows 10 laptop with 32GB of memory. Memory usage peaked at about 25GB and settled down to about 20GB. When I deleted the container and stopped Docker, memory dropped to about 10GB used.

I did not see obvious memory exhaustion in what was an informal test, though I don't have a good sense of what "good" memory usage should look like for this image. If this is causing you trouble, reach out to the WRC and they can get you more help.
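To make an informal test like this more precise, the standard Docker CLI can report per-container memory rather than whole-host usage (nothing InterSystems-specific; this is plain `docker stats`):

```shell
# One-shot snapshot of each running container's memory footprint.
# --no-stream prints once instead of refreshing continuously.
docker stats --no-stream --format "table {{.Name}}\t{{.MemUsage}}\t{{.MemPerc}}"
```

On Windows with the WSL2 backend, note that Task Manager shows the memory held by the WSL2 VM (the `vmmem` process), which can stay high even when containers inside it are idle, so the per-container numbers are the more meaningful ones.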

Robert Cemper · Jun 26, 2023, replying to Erik Hemdal

On Windows, Docker Desktop not only consumes a fast-growing .vhdx file, but also a lot of temp files that never get deleted or shrunk, not even by an uninstall/reinstall. They are typically found in:

C:\Users\<username>\AppData\Local\Temp\docker-scout\sha256
C:\Users\<username>\.docker\scout\sbom\sha256
C:\Users\<username>\AppData\Local\Temp\ (*.ico, *.vhdx)
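One way to reclaim some of that space is a sketch like the following, run in an elevated PowerShell session. The vhdx path below is the common default for the WSL2 backend but varies by installation, and `Optimize-VHD` requires the Hyper-V PowerShell module, so treat every line as an assumption to verify against your own setup:

```shell
# Remove unused images, stopped containers, networks, and volumes.
docker system prune -a --volumes

# Stop the WSL2 VM so its virtual disk is no longer in use.
wsl --shutdown

# Compact the Docker Desktop data disk (Hyper-V module required;
# the path is the usual default and may differ on your machine).
Optimize-VHD -Path "$env:LOCALAPPDATA\Docker\wsl\data\ext4.vhdx" -Mode Full
```

The loose temp files in the directories listed above can simply be deleted while Docker Desktop is not running.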
