Thanks for the input; it was obvious once somebody else pointed it out. I managed to import that sample XML into that object!
Cheers
I'm facing the same issue currently. Is it possible to share your results so I could get some leads on how to communicate with an NTLM-enabled server via Ensemble?
Cheers
Kari
Thank you very much for your response.
Yes, I think that might work in our situation. I will read the docs and look at the %ZSTART / OnStart routines and how to use them.
Thanks for the help!
Kari
Hello,
And a great tutorial, thanks! One question, though. Do you know how to pass environment variables to the %Installer? In my scenario, I would like to configure a CI/CD pipeline and define some environment variables when building the Docker container, then reference those variables within the %Installer to create, for example, a specific user account (username and password as env variables). How can I achieve this?
I've tried setting the env variables within the Dockerfile with ENV someEnv="1234" and reading the variable with %System.Util.GetEnviron("someEnv") within %Installer, but it just returns an empty string.
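Roughly, the pattern I'm after looks like this (just a sketch; RunInstall and the APP_USER/APP_PASSWORD names are placeholders of mine):

/// Sketch only: read env variables at install time and hand them to the
/// setup() method generated from the XData manifest. Inside the manifest,
/// the values would then be available as ${Username} and ${Password}.
ClassMethod RunInstall() As %Status
{
    // GetEnviron returns "" when the variable is not visible to this process
    Set vars("Username") = $SYSTEM.Util.GetEnviron("APP_USER")
    Set vars("Password") = $SYSTEM.Util.GetEnviron("APP_PASSWORD")
    Return ..setup(.vars)
}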
If you have any insight or tips, it would be appreciated.
Cheers!
Kari Vatjus-Anttila
Hi,
Thanks for the response. I upgraded the container to 2021.2.0.617.0, and it seems the FHIR metadata now gets populated with the security-related data correctly.
Thank you very much for the input and have a good day!
Kari Vatjus-Anttila
Thank you for the response.
It seems to work! I set the database permissions to read-only during my Installer script and it seems that the Code database is where it should be and is not persistent. Thank you very much for your responses.
I couldn't set the permissions via the Installer manifest, so I ended up writing a small ObjectScript method to achieve this. Here it is for future reference:
/// Change database permissions
/// <ul>
/// <li><var>dbDir</var> Path to the database.</li>
/// <li><var>mode</var> Permission mode. 0 = read/write, 1 = read-only. Optional</li>
/// </ul>
ClassMethod SetupDBPermissions(dbDir As %String, mode As %Integer = 0) As %Status {
New $NAMESPACE
Set $NAMESPACE = "%SYS"
Set sc = $$$OK
// Fail early if the database cannot be opened
Set db = ##class(SYS.Database).%OpenId(dbDir, , .sc)
$$$ThrowOnError(sc)
Write "Setting database permission for " _ db.Directory _ ". Setting ReadOnly from " _ db.ReadOnly _ " to " _ mode, !
Set db.ReadOnly = mode
$$$ThrowOnError(db.%Save())
Return sc
}

Nope, it does not make any difference. I checked out the extension's code and I suspect (I might be wrong) that the issue is here:
https://github.com/intersystems-community/vscode-objectscript/blob/mast… Line 289
const agent = new http.Agent({
keepAlive: true,
maxSockets: 10,
rejectUnauthorized: https && config("http.proxyStrictSSL"),
});

rejectUnauthorized is going to be true if the scheme is HTTPS (which mine is) and http.proxyStrictSSL is true. If that is the issue, it would be great if this parameter could be overridden with some general setting provided by the extension, for example "Allow insecure connections".
That works! I must admit, I was a bit perplexed as to why setting "http.proxyStrictSSL" to true didn't have any effect, but the property you suggested made the difference.
Like Brett said, maybe somebody should rewrite that line so it doesn't cause any further confusion among users :)
Cheers and thanks for the quick help yet again!
That was fast!
I downloaded the extension and tested it. It works.
I removed the objectscript.http.proxyStrictSSL property and set proxyStrictSSL to true -> the connection failed as expected: VSCode notified that the server is using a self-signed certificate. Next, I set proxyStrictSSL to false and tried to reconnect -> connection OK, and VSCode prompted me to choose the namespace I want.
Thank you for your effort, it seems like your change works as expected.
//Kari
Hi,
I know it is an old thread, but I thought I'd ask if there are any updates to this framework. The repo is 4 years old and I'm looking for a mocking framework at the moment. I also noticed that some classes are missing, as Jonathan Lent stated. Is there any possibility for you to update the repo with the latest changes, if there are any? I'm working on unit tests, trying to find a way to mock different business hosts, and I found this framework to be a promising one.
Cheers,
Kari
Hi,
Thank you very much for such a quick reply! I will take a look at the repo you provided and will see if I can get the framework up and running on my end.
Thanks again and have a good day!
//Kari
Edit: I created a separate thread about this so it gets more visibility. It can be found here: https://community.intersystems.com/post/test-coverage-coverage-report-not-generating-when-running-unit-tests-zpm
...
Hello,
@Timothy Leavitt, thanks for the great article! I am facing a slight problem and was wondering if you, or someone else, might have some insight into the matter.
I am running my unit tests in the following way with ZPM, as instructed. They work well and test reports are generated correctly. Test coverage is also measured correctly according to the logs. However, even though I instructed ZPM to generate Cobertura-style coverage reports, it is not generating one. When I run the GenerateReport() method manually, the report is generated correctly.
I am wondering what I am doing wrong. I used the test flags from the ObjectScript-Math repository, but they seem not to work.
Here is the ZPM command I use to run the unit tests:
zpm "common-unit-tests test -only -verbose
-DUnitTest.ManagerClass=TestCoverage.Manager
-DUnitTest.UserParam.CoverageReportClass=TestCoverage.Report.Cobertura.ReportGenerator
-DUnitTest.UserParam.CoverageReportFile=/opt/iris/test/CoverageReports/coverage.xml
-DUnitTest.Suite=Test.UnitTests.Fw
-DUnitTest.JUnitOutput=/opt/iris/test/TestReports/junit.xml
-DUnitTest.FailuresAreFatal=1":1

The test suite runs okay, but coverage reports do not generate. However, when I run the commands stated in the TestCoverage documentation, the reports are generated:
Set reportFile = "/opt/iris/test/CoverageReports/coverage.xml"
Do ##class(TestCoverage.Report.Cobertura.ReportGenerator).GenerateReport(<index>, reportFile)

Here is a short snippet from the logs where you can see that the test coverage analysis is run:
Collecting coverage data for Test: .036437 seconds
Test passed
Mapping to class/routine coverage: .041223 seconds
Aggregating coverage data: .019707 seconds
Code coverage: 41.92%
Use the following URL to view the result:
http://192.168.208.2:52773/csp/sys/%25UnitTest.Portal.Indices.cls?Index=19&$NAMESPACE=COMMON

Use the following URL to view test coverage data:
http://IRIS-LOCALDEV:52773/csp/common/TestCoverage.UI.AggregateResultViewer.cls?Index=17
All PASSED
[COMMON|common-unit-tests] Test SUCCESS

What am I doing wrong?
Thank you, and have a good day!
Kari Vatjus-Anttila
I tried to read through the source code to get an idea of what triggers the report generation. I didn't find any clues, so I ended up tweaking the Manager.cls source code a bit. I modified the OnAfterSaveResult() method, where I check whether the user has defined the CoverageReportClass and CoverageReportFile parameters and trigger the report generation if they are defined.
#; Generate coverage reports if needed
If (..UserFields.GetAt("CoverageReportClass") '= "") && (..UserFields.GetAt("CoverageReportFile") '= "") {
Set coverageReportClass = ..UserFields.GetAt("CoverageReportClass")
Set coverageReportFile = ..UserFields.GetAt("CoverageReportFile")
Set tSC = $ClassMethod(coverageReportClass, "GenerateReport", tRunIndex, coverageReportFile)
}

That seems to work and solves my problem. If anyone has a better solution, feel free to comment. For now, I will stick with this.
Cheers,
Kari
Hello,
"Have you made further changes to TestCoverage.Manager"
No, only that if-clause which checks if Coverage related flags are set.
I am not yet familiar enough with the Mocking Framework to know whether the unit test manager causes any issues. I thought that setting -DUnitTest.ManagerClass=TestCoverage.Manager overrides the unit test manager. I verified that by adding some debug logging to the TestCoverage.Manager class.
At the moment, my solution works as expected. If you manage to find any clues why it does not work, I am very interested to know.
By the way, where exactly are the arguments CoverageReportClass and CoverageReportFile checked? I didn't find any trace of them in the source code.
Hi,
I was struggling with the same thing but managed to find a way to input the username and password unattended as part of a CI/CD pipeline:
(echo 'username'; echo 'password'; echo 'halt') | iris session IRIS

I use that to check whether access to the terminal is OK.
You can also run a script file after login by replacing the last echo with cat <script_file>. For example:
(echo '_system'; echo 'password'; cat iris.script) | iris session IRIS
PS: It would be nice if iris session could accept a username and password via command-line arguments, so we wouldn't have to resort to these hacky methods.
Br. Kari
Hi,
And thanks for testing out the filter, @David Foard.
The example I showed is quite simple: everything (markdown files, src, etc.) resides in the same repo, so the Doxygen configuration is simple.
In your case I believe you need to link external documentation to your project, which is in another repo. Check out this link: Linking external documentation.
Also, here you can find all the configuration options Doxygen supports: Doxygen Configuration Reference.
- Kari
Sure, I will submit the solution to Open Exchange. Thanks for pointing that out @Evgeny Shvarov
Thanks for sharing your thoughts. When you have lots of REST APIs to deal with, it's a real hassle to split them into secure and non-secure apps because it makes maintenance complicated. Plus, the way JWT authentication works in the web app seems pretty basic and doesn't handle scopes properly, as per the docs.
In cases where you're using an external OAuth 2.0 token provider, not IRIS, and you want an easy way to check those tokens in your API, this solution still makes sense.
However, it's important to note that JWT signatures should be validated if the tokens are signed. You can achieve this in a couple of ways: by using an OAuth2 client defined in IRIS, which contains data about the token server, together with the %SYS.OAuth2.Validation class's ValidateJWT method to validate the token; or, alternatively, by using a predefined JWKS_URI parameter inside the REST implementation class to facilitate the validation process.
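For example, the first approach would look roughly like this (a sketch; "my-oauth-client" is a placeholder for an OAuth2 client definition in IRIS):

// accessToken holds the JWT extracted from the Authorization header
Set isValid = ##class(%SYS.OAuth2.Validation).ValidateJWT("my-oauth-client", accessToken, "", "", .claims, .securityParams, .sc)
If 'isValid || $$$ISERR(sc) {
    // signature (or issuer/audience) validation failed: reject the request
}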
I'll take a look at the JWT validation using these two methods and update the post accordingly.
Thanks, Kari
I'll send you a DM about this. Most likely, it's a configuration-related issue in GitHub. One thing to note is that you have to select the source branch for Pages deployment as the gh_pages branch.
You've raised some excellent points, and it's certainly something I need to consider for the future. Thank you for bringing up Delegated Authentication. I'll make the necessary modifications to the article to emphasize that Web Applications should not be granted %All permissions but rather only the permissions that are essential for the API usage.
An interesting approach, but it involves custom tinkering with internal classes. Ultimately, it comes down to personal preference, but I tend to favor building on top of verified and tested implementations, reserving the extension of InterSystems base classes as a last resort.
In my approach, there's no need to modify any class implementations. Instead, you can simply use a utility method for validation. As previously discussed, these methods should be invoked as part of the normal authentication process, not within the API implementation, as Rich Taylor mentioned.
If I understand your solution correctly, the client sends a request, and the logic checks whether the application has the correct privileges. Does it also verify whether the provided credentials are accurate? In my method, I verify Basic Auth credentials via $SYSTEM.Security.Login() and then use $SYSTEM.Security.Check() to confirm whether the user in question possesses the required privileges.
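For reference, my check boils down to roughly this (a sketch; the resource name "MyApp.API" is just a placeholder):

ClassMethod CheckAccess(username As %String, password As %String) As %Boolean
{
    // First verify that the Basic Auth credentials themselves are valid
    If '$SYSTEM.Security.Login(username, password) {
        Return 0
    }
    // Then confirm the now-authenticated user holds the required privilege
    Return $SYSTEM.Security.Check("MyApp.API", "USE")
}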
I'll need to explore Delegated Authentication in IRIS and adjust the article accordingly.
I looked through the source to understand how those links are generated and found this:
When you click the Health button in the Management Portal, the list of namespaces displayed in the chooser is linked to the default application of each namespace, which in my case happened to be /csp/<namespace>. That was why the URLs were returning 404. After I modified the namespace default application to /csp/healthshare/<namespace>, the links started working.
So after I upgraded the container, the web application data was stored in durable %SYS and was not updated on container startup. One has to modify the applications manually or via some method after instance startup (%ZSTART, for example).
Here is a short script that I used to update the default application for all namespaces:
Set $NAMESPACE = "%SYS"
Set namespace = "yournamespace" // Change this to your namespace, or make it a method parameter so you can loop through all your namespaces
Set namespace = $ZConvert(namespace, "L") // CSP application paths are lowercase
Set oldUiUrl = "/csp/" _ namespace _ "/"
Set newUiUrl = "/csp/healthshare/" _ namespace _ "/"
Write "Checking if app " _ oldUiUrl _ " exists...", !
If (##class(Security.Applications).Exists(oldUiUrl, .oldApp, .status)) {
If (##class(Security.Applications).Exists(newUiUrl, .newApp, .status)) {
Write "Switching namespace default application from " _ oldUiUrl _ " to " _ newUiUrl, !
Set props("IsNameSpaceDefault") = 0
Set sc = ##class(Security.Applications).Modify(oldUiUrl, .props)
If $$$ISERR(sc) {
Write "Failed to modify application " _ oldUiUrl _ ". Error=" _ $SYSTEM.sc.GetErrorText(sc), !
Return sc
} Else {
Write "Application " _ oldUiUrl _ " modified successfully", !
}
Set props("IsNameSpaceDefault") = 1
Set sc = ##class(Security.Applications).Modify(newUiUrl, .props)
If $$$ISERR(sc) {
Write "Failed to modify application " _ newUiUrl _ ". Error=" _ $SYSTEM.sc.GetErrorText(sc), !
Return sc
} Else {
Write "Application " _ newUiUrl _ " modified successfully", !
}
}
}
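If you want this to run automatically after instance startup, a %ZSTART hook is one option (a sketch; Util.Startup and UpdateApps are placeholder names of mine):

%ZSTART ; user-defined startup hooks; this routine lives in the %SYS namespace
SYSTEM ; called once when the instance starts
    // swallow errors so a failing hook cannot block instance startup
    Try {
        Do ##class(Util.Startup).UpdateApps()
    } Catch e {
        // log and continue
    }
    Quit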
If you have any good tips on how to improve the upgrade process, please share them in the comments.
According to WRC support, SOCKS proxies are unfortunately not supported. What a shame, especially when an open-source tool like curl handles them. At the moment, the only solution for me is to request a VPN that allows me to access company-internal services.
I ended up using Data Element Encryption (https://docs.intersystems.com/irislatest/csp/docbook/DocBook.UI.Page.cl…) together with Production Credentials to store my secrets for 3rd-party APIs securely, because in my use case that was much easier than dealing with password vaults.
I encrypt my secrets using the data-element encryption key I created (and activated), and then store them, encrypted, in Production Credentials.
When I need the credentials, e.g., in an HTTP Operation, I fetch the ones I want and decrypt them with the same key just before sending the request.
I implemented a simple class that handles these tasks for me, so I can just call Encrypt() and Decrypt() to get the ciphertext/plaintext when needed.
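The helper is essentially just this (a sketch; "MYKEY" stands for the ID of the activated key, and the method names are my own):

ClassMethod Encrypt(plaintext As %String) As %String
{
    // Encrypt with the activated data-element encryption key
    Return $SYSTEM.Encryption.AESCBCManagedKeyEncrypt(plaintext, "MYKEY")
}

ClassMethod Decrypt(ciphertext As %String) As %String
{
    // The key ID travels with the ciphertext, so only the matching
    // key has to be activated on this instance
    Return $SYSTEM.Encryption.AESCBCManagedKeyDecrypt(ciphertext)
}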
Let me know if you need more assistance with this approach.
Br, Kari