Timothy Leavitt · Dec 12, 2019

Another option, rather than maintaining two versions of the whole codebase, would be a wrapper module around webterminal (i.e., another module that depends on webterminal), with hooks in webterminal that allow the wrapper to turn off the projection-based, installation-related features.

Timothy Leavitt · Dec 16, 2019

For bootstrap-table, I think the examples on their site are probably more useful than anything I could dig up. https://examples.bootstrap-table.com/#welcomes/large-data.html shows pretty good performance for a large dataset. Tabulator looks nice too though.

In any case, it would probably be cleanest to load data via REST rather than rendering everything in the page as an HTML table and then using a library to make the table pretty.
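For what that might look like on the server side, here's a minimal sketch of a %CSP.REST dispatch class that returns rows as JSON for the table library to fetch; the class, route, and table names are placeholders, not anything from an existing project:

/// Minimal REST endpoint sketch for a client-side table library to consume.
/// The class, route, and table names are hypothetical.
Class Demo.Table.REST Extends %CSP.REST
{

XData UrlMap [ XMLNamespace = "http://www.intersystems.com/urlmap" ]
{
<Routes>
<Route Url="/people" Method="GET" Call="GetPeople" />
</Routes>
}

/// Returns a JSON array of {"id":..., "name":...} objects.
ClassMethod GetPeople() As %Status
{
    Set tSC = $$$OK
    Try {
        Set tArray = []
        Set tResult = ##class(%SQL.Statement).%ExecDirect(,"select ID, Name from Sample.Person")
        While tResult.%Next() {
            Do tArray.%Push({"id": (tResult.%Get("ID")), "name": (tResult.%Get("Name"))})
        }
        Set %response.ContentType = "application/json"
        Write tArray.%ToJSON()
    } Catch e {
        Set tSC = e.AsStatus()
    }
    Quit tSC
}

}

A web application's dispatch class would point at this, and the table library would then request /people from it.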

Timothy Leavitt · Jan 15, 2020

1. Suppose $TLevel > (tInitTLevel + 1). That means someone else's transaction was left open. You can't always guarantee that the code you're calling will behave and match each TSTART with a TCOMMIT or TROLLBACK 1, but you can account for the possibility of a misbehaving dependency in your own transaction cleanup (see the sketch after this list). Agreed on never using an argumentless TROLLBACK.

2. Great point - updated accordingly.
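To make (1) concrete, here's a rough sketch of the cleanup pattern I have in mind; the method and dependency names are made up:

ClassMethod DoWorkInTransaction() As %Status
{
    Set tSC = $$$OK
    Set tInitTLevel = $TLevel
    Try {
        TSTART
        // Hypothetical call to code that might open (and fail to close) its own transaction
        Do ..SomeDependency()
        // If the dependency left a transaction open, $TLevel is higher than expected;
        // treat that as an error rather than committing blindly.
        If $TLevel > (tInitTLevel + 1) {
            $$$ThrowStatus($$$ERROR($$$GeneralError,"A nested transaction was left open"))
        }
        TCOMMIT
    } Catch e {
        Set tSC = e.AsStatus()
        // Roll back one level at a time - never an argumentless TROLLBACK -
        // until we're back at the level where we started.
        While $TLevel > tInitTLevel {
            TROLLBACK 1
        }
    }
    Quit tSC
}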

Timothy Leavitt · Jan 15, 2020

That's interesting. I think it would really be:

If (^$LOCK("^MyGlobal(42)","OWNER") = $Job) {
    Lock -^MyGlobal(42)
}
Timothy Leavitt · Mar 15, 2016

Here are two perspectives, from different development workflows:

My team (working on a large Caché-based application) does development on a newer Ensemble version than the version on which our software is released, and we perform functional and performance testing against the older version. Most of the time, moving code from the newer version to the older one works just fine; when it does fail, it tends to fail in obvious ways. Class dictionary version changes used to be much more frequent, but the last one was in 2011.1, so that isn't a concern for upgrades involving recent versions. Working this way creates healthy pressure to upgrade when we find features or performance improvements that exist only in the newer version, and it eases concerns about eventually upgrading to the version the application was developed on.

For Caché-based internal applications at InterSystems, we have separate designated dev/test/live environments. Application code changes typically spend a relatively short time in dev, and a much shorter time in test, before going live. Upgrades happen over a short time frame and move through the environments in that same order; it would be incredibly risky to upgrade the live environment first. Rather, upgrades go through the same process and validation as functional changes to these applications. The dev environment is upgraded first, which may take a while if testing after the upgrade turns up application issues. The test environment is upgraded next, typically only a short time before the live environment. It's fine for code changes to land in dev before the test environment is upgraded, because those changes are still compiled and tested on the older Caché/Ensemble version before going live. Of course, if testing fails, the upgrade may become a prerequisite for the given change. Additionally, we periodically clone the test environment to upgrade to and validate against field test versions; using virtual machines makes this very easy.

Timothy Leavitt · Mar 16, 2016

1/2. I use the "My Content" page to quickly get back to my own posts. I don't use "My Collaborations" or understand how the content that appears there is determined.

3. I'd expect to see my own posts (and perhaps answers) on "My Content" and for "My Collaborations" to show links to questions that I have answered and to posts/answers that I have commented on. Answers could fit in either category (or both); I'm not sure if those are best understood as content or as a particular type of comment.

4. Show the content described in (3) on these pages. Also, for answers, rather than showing "answer406256" as the title (for example), show "Answer: <title of question>" and link to the answer within the question page rather than the answer on its own. I think the same would also apply to comments shown on "My Collaborations" (if that approach is taken). If "My Collaborations" shows comments it might make sense to group them by post, in case there's a very active back-and-forth.

Timothy Leavitt · Mar 17, 2016

I'd suggest something like this, after calling Get:

// Rebuild the query string from the request's parameters (empty if there are none)
Set tParams = request.ReturnParams()
Set tQuery = $Case(tParams,"":"",:"?"_tParams)
// Reassemble the full URL that was requested
Set tURL = $Select(request.Https:"https://",1:"http://")_request.Server_":"_request.Port_"/"_request.Location_tQuery
Timothy Leavitt · Mar 18, 2016

Try:

Do $System.Process.Terminate(,exitCode)

See documentation for reference. On older versions I believe the equivalent is:

Do $zu(4,$job,exitCode)

But this shouldn't be used if the nicer method is available.

Timothy Leavitt · Mar 21, 2016

In 2016.2, you can do this by overriding %ToDynamicObject in Data.Person as follows:

/// In addition to the default behavior, also set the ID property to ..%Id()
Method %ToDynamicObject(target As %Object = "", ignoreUnknown = 0) [ ServerOnly = 1 ]
{
    set:target="" target = {}
    set target.ID = ..%Id() //Set ID property first so it comes at the beginning of the JSON output.
    do ##super(.target,.ignoreUnknown)
}
Timothy Leavitt · Mar 28, 2016

An easier way to capture the stack and values of variables at all stack levels is:

Do LOG^%ETN

Then it's possible to view full information on stack and variables from terminal with:

Do ^%ER

Or in the management portal at System Operation > System Logs > Application Error Log.

If you're logging to globals to track more limited parts of the process state for debugging, it's helpful to use a ^CacheTemp* or ^mtemp* global so that the debugging information (a) isn't rolled back by TROLLBACK and (b) won't accumulate in an important database if the debugging code is accidentally left in.
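For example, a debug line like this (the global name and message are just for illustration) is safe to leave scattered around while investigating:

// Log to a scratch global: it isn't journaled, so TROLLBACK won't undo it,
// and it stays out of the application's databases if it's accidentally left in.
Set ^mtemp.debug($Job,$Increment(^mtemp.debug($Job))) = $ListBuild($zdatetime($ztimestamp,3),"before save",$Get(tSC))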

Timothy Leavitt · Apr 1, 2016

Presumably, if you're showing the results in a report sorted by the currently selected column (the tablePane's currColumn property), you could also look at the tablePane's sort order (the sortOrder property, "asc" or "desc") and $order over the index global in reverse when it's "desc".

Here's a class query/example that could help - you can modify the ROWSPEC to fit your purposes.

/// Queries snapshot data for a Zen tablePane, optionally sorted.
/// QuerySnapshotExecute returns an error if the snapshot or a required index (for sorting) is missing.
/// 
/// sessionId : CSP session ID of the user whose tablePane will be shown
/// snapshotId : snapshotId property of the tablePane
/// tablePaneIndex : index property of the tablePane
/// sortColumn : (optional) currColumn property of the tablePane: the column name of the column to sort by
/// sortOrder : (optional; default is ascending) sortOrder property of the tablePane (asc/desc)
Query QuerySnapshot(sessionId As %String, snapshotId As %Integer, tablePaneIndex As %Integer, sortColumn As %String = "", sortOrder As %String = "") As %Query(ROWSPEC = "col1:%String,col2:%String,col3:%String,col4:%String,col5:%String,col6:%String,col7:%String,col8:%String,col9:%String,col10:%String,col11:%String") [ SqlProc ]
{
}

ClassMethod QuerySnapshotExecute(ByRef qHandle As %Binary, sessionId As %String, snapshotId As %Integer, tablePaneIndex As %Integer, sortColumn As %String = "", sortOrder As %String = "") As %Status
{
    Set tDataGlobal = "^CacheTemp.zenData("""_sessionId_""","_snapshotId_","_tablePaneIndex_",""data"")"
    Quit:'$Data(@tDataGlobal) $$$ERROR($$$GeneralError,"Invalid reference to tablePane snapshot.")
    Set tIndexGlobal = $Case(sortColumn,"":"",:"^CacheTemp.zenData("""_sessionId_""","_snapshotId_","_tablePaneIndex_",""index"","""_sortColumn_""")")
    Quit:(tIndexGlobal'="")&&('$Data(@tIndexGlobal)) $$$ERROR($$$GeneralError,$$$FormatText("tablePane snapshot index not populated for property %1",sortColumn))
    Set qHandle = $ListBuild(tDataGlobal,tIndexGlobal,$Case(sortOrder,"desc":-1,:1),"","")
    Quit $$$OK
}

ClassMethod QuerySnapshotFetch(ByRef qHandle As %Binary, ByRef Row As %List, ByRef AtEnd As %Integer = 0) As %Status [ PlaceAfter = QuerySnapshotExecute ]
{
    Set $ListBuild(tDataGlobal,tIndexGlobal,tSortOrder,tSub1,tSub2) = qHandle
    If (tIndexGlobal = "") {
        // Not sorting by any column.
        Set tSub2 = $Order(@tDataGlobal@(tSub2),tSortOrder)
        If (tSub2 = "") { Set AtEnd = 1 }
    } Else {
        // First $order over values of the indexed column
        Set:tSub1="" tSub1 = $Order(@tIndexGlobal@(tSub1),tSortOrder)
        If (tSub1 '= "") {
            // There may be multiple matches for a single key in the index. Get the next one for this key.
            Set tSub2 = $Order(@tIndexGlobal@(tSub1,tSub2),tSortOrder)
            // If we previously were on the last value for the index key, move on to the next index key.
            If (tSub2 = "") {
                Set tSub1 = $Order(@tIndexGlobal@(tSub1),tSortOrder)
                Set:tSub1'="" tSub2 = $Order(@tIndexGlobal@(tSub1,tSub2),tSortOrder)
            }
        }
        If (tSub1 = "") && (tSub2 = "") { Set AtEnd = 1 }
    }
    
    If 'AtEnd {
        Set Row = @tDataGlobal@(tSub2)
        Set $List(qHandle,4) = tSub1
        Set $List(qHandle,5) = tSub2
    }
    Quit $$$OK
}

ClassMethod QuerySnapshotClose(ByRef qHandle As %Binary) As %Status [ PlaceAfter = QuerySnapshotExecute ]
{
    Quit $$$OK
}

Sample use, against /csp/samples/ZENTest.TableTest.cls (and, in my case, with the class query defined in App.TablePaneUtils):

call App.TablePaneUtils_QuerySnapshot(<your session ID>,<your snapshot number>,23)
call App.TablePaneUtils_QuerySnapshot(<your session ID>,<your snapshot number>,23,,'desc')
call App.TablePaneUtils_QuerySnapshot(<your session ID>,<your snapshot number>,23,'Title','asc')
Timothy Leavitt · Apr 5, 2016

Here's a simple example that'll run in the Samples namespace. It demonstrates both saving all the data at once and saving one row at a time after each cell is edited.

Class App.Sample.DataGridPage Extends %ZEN.Component.page
{

/// This Style block contains page-specific CSS style definitions.
XData Style
{
<style type="text/css">
#dataGrid {
    width: 100%;
    height: 500px;
}
</style>
}

/// This XML block defines the contents of this page.
XData Contents [ XMLNamespace = "http://www.intersystems.com/zen" ]
{
<page xmlns="http://www.intersystems.com/zen" title="dataGrid save sample">
<jsonSQLProvider id="json" OnSubmitContent="SubmitContent"
  targetClass="%ZEN.proxyObject" sql="select ID,Name,DOB,SSN from sample.person order by name" />
 <dataGrid pageSize="20" id="dataGrid" pagingMode="client" controllerId="json" sortMode="client"
  selectMode="cells" onchangecell="return zenPage.fireChangeCell(value);" onchange="zenPage.gridChanged();">
 <columnDescriptor caption="ID" type="string" readOnly="false"/>
 <columnDescriptor caption="Name" type="string" readOnly="false"/>
 <columnDescriptor caption="DOB" type="string" readOnly="false"/>
 <columnDescriptor caption="SSN" type="string" readOnly="false"/>
 </dataGrid>
 <hgroup labelPosition="left" cellAlign="even">
 <radioSet id="modeRadio" valueList="edit,manual" displayList="After Each Edit,Manually"
  label="Save Data: " value="edit" />
 <button onclick="zen('json').submitContent()" caption="Save Everything" />
 </hgroup>
</page>
}

ClientMethod fireChangeCell(value) [ Language = javascript ]
{
    // Capture the number of the last row that was changed.
    zenPage._lastChangedRow = zen('dataGrid').getProperty('currRow');
    return value;
}

ClientMethod gridChanged() [ Language = javascript ]
{
    if (zen('modeRadio').getValue() == 'edit') {
        zen('json').submitContent('saveRow:'+zenPage._lastChangedRow);
    }
}

Method SubmitContent(pCommand As %String,
pProvider As %ZEN.Auxiliary.jsonProvider,
pSubmitObject As %ZEN.proxyObject,
ByRef pResponseObject As %RegisteredObject) As %Status
{
    Set tSC = $$$OK
    Try {
        TSTART
        If (pCommand = "") {
            //Save everything.
            For {
                Set tProxy = pSubmitObject.children.GetNext(.tKey)
                Quit:tKey=""
                $$$ThrowOnError(..SavePersonProxy(tProxy))
            }
        } Else {
            Set tCommandInfo = $ListFromString(pCommand,":")
            If ($lg(tCommandInfo,1) = "saveRow") {
                //Save only the specified row (faster)
                Set tData = pSubmitObject.children.GetAt($lg(tCommandInfo,2))
                If $IsObject(tData) {
                    Set tObj = ##class(Sample.Person).%OpenId(tData.ID,,.tSC)
                    $$$ThrowOnError(tSC)
                    Set tObj.Name = tData.Name
                    Set tObj.SSN = tData.SSN
                    Set tObj.DOB = $zdh(tData.DOB)
                    $$$ThrowOnError(tObj.%Save())
                } Else {
                    $$$ThrowStatus($$$ERROR($$$GeneralError,"An error occurred saving row "_$lg(tCommandInfo,2)))
                }
            }
        }
        TCOMMIT
    } Catch anyException {
        TROLLBACK
        Set tSC = anyException.AsStatus()
    }
    Quit tSC
}

Method SavePersonProxy(pProxy As %ZEN.proxyObject) As %Status
{
    Set tObj = ##class(Sample.Person).%OpenId(pProxy.ID,,.tSC)
    Quit:$$$ISERR(tSC) tSC
    
    Set tObj.Name = pProxy.Name
    Set tObj.SSN = pProxy.SSN
    Set tObj.DOB = $zdh(pProxy.DOB)
    Quit tObj.%Save()
}

}
Timothy Leavitt · Apr 13, 2016

For batch/shell scripts, ccontrol runw may be better. You can see all the options with:

ccontrol help

With ccontrol runw, spaces are accepted; for example, this should work (after replacing <instancename> with the name of your Caché instance):

ccontrol runw <instancename> ^ANDYTST(\"c:\folder with spaces\\\") USER

I'm not sure if there are options other than OS authentication (which, if enabled, has such commands run as the Caché user matching the OS-level username).

This post may also be relevant: https://community.intersystems.com/post/how-return-status-code-cache-pr…

Timothy Leavitt · Apr 15, 2016

In short: you could put a Zen page with a dynaForm in an <iframe>, or use something other than Zen/dynaForm.

The documentation about custom workflow task forms says that the form should be a fragment of HTML in a CSP page, not an entire page. Although Zen pages are CSP pages, it looks like Zen pages can't be used directly as the form template. Under the hood, the inclusion of this CSP page bypasses %OnPreHTTP, which does some necessary setup for Zen pages (particularly, initializing %page and %application). Even if this wasn't the case, and a full Zen page could be inserted, it would end up looking pretty weird.

A fairly simple solution would be to create a very simple CSP page that has an <iframe> containing your Zen page, and to use that CSP page as the form template. Any necessary data from %task could be passed along in the Zen page's URL. The onAction method could also be propagated to the iframe, perhaps using Window.postMessage (etc.) to define how the frames can interact.

If that's getting too complicated, perhaps consider using something other than Zen/dynaForm that would fit more naturally in a CSP page. (Perhaps modern JS libraries, REST, etc.)

Timothy Leavitt · Apr 19, 2016

This is a really good point.

At some level, this is part of the behavior of %Studio.SourceControl.ISC, the Studio extension class for source control using Perforce. Studio doesn't automatically recompile the class and dependent classes after checkout either. This has bitten me before - I've undone a checkout but forgotten to recompile, leaving the old compiled version in effect. It might be reasonable for %Studio.SourceControl.ISC to have an option to automatically compile edited items after undo of a checkout, or even to just do that all the time.

Also, Atelier actually does have a separate "compile" option, in the toolbar at the top. (The icon has a file with "010" on it.)

This is an important feature; in addition to the case you noted, there are several situations I can think of offhand where a class would need to be recompiled even though it hasn't changed:

  • The behavior of a macro defined in a .inc file changes. Classes that use that macro must be recompiled to get the new behavior (see the sketch after this list).
  • A method in Class A is called from a [ CodeMode = objectgenerator ] method in Class B. If the implementation of the method in Class A changes, Class B may need to be recompiled. (It won't be recompiled automatically.)
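To illustrate the first bullet (the file and class names here are made up): a macro is expanded into the using class's generated code at compile time, so editing the .inc by itself changes nothing until that class is recompiled.

// MyMacros.inc (hypothetical)
#define TaxRate 0.07

// Demo.Invoice.cls (hypothetical)
Include MyMacros

Class Demo.Invoice Extends %RegisteredObject
{

ClassMethod Total(pSubtotal As %Numeric) As %Numeric
{
    // $$$TaxRate was expanded to 0.07 when this class was compiled;
    // changing MyMacros.inc has no effect until Demo.Invoice is recompiled.
    Quit pSubtotal * (1 + $$$TaxRate)
}

}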

One downside to automatically compiling impacted/dependent classes is that it can take a while - if a minor change impacts hundreds of classes, it might be reasonable to save the class and compile it in separate actions. There's a preference in Atelier (Preferences -> Atelier -> Save Settings, "server save action") to not compile files automatically when they're saved to the server. Atelier is much better about this than Studio, though; when compiling hundreds of dependent classes, Studio tends to freeze up. With Atelier there's the possibility of a timeout, but the editor should remain responsive while the compilation is happening.

Timothy Leavitt · Apr 20, 2016

There are good options for what you want in 2016.2, and there may be better answers for SQL -> JSON in versions after that.

In 2016.2, %RegisteredObject also supports $toJSON and $fromJSON, so there's no need to use %ZEN.Auxiliary.jsonProvider for that conversion. Under the hood, the path is really %RegisteredObject -> dynamic object (via $compose) -> JSON, and JSON -> dynamic object -> %RegisteredObject (via $compose).

Therefore, the behavior of $toJSON and $fromJSON can be modified for %RegisteredObject subclasses by overriding (typically) %ToDynamicObject and %FromObject. Here's an example that might serve as a useful starting point for Object -> JSON/JSON -> Object on 2016.2+:

Class DCDemo.JSONDateTime Extends (%Persistent, %Populate)
{

Property Name As %String;

Property DateField As %Date;

Property "Time_Stamp_Field" As %TimeStamp;

Property TimeField As %Time;

ClassMethod Run()
{
    Do ..%KillExtent()
    Do ..Populate(1)
    
    Set tObj = ..%OpenId(1)
    Write "Object ID 1",!
    zw tObj
    Write !
    
    Set tJSON = tObj.$toJSON()
    Write "JSON for that object:",!
    Write tJSON,!,!
    
    Set tObj2 = ..$fromJSON(tJSON)
    Write "Object from that JSON:",!
    zw tObj2
    Write !
}

Method %ToDynamicObject(target As %Object = "", ignoreUnknown = 0) [ ServerOnly = 1 ]
{
    Set tObj = ##super(target,ignoreUnknown)
    Do ..DateTimeToISO8601(tObj)
    Quit tObj
}

ClassMethod %FromObject(source = "", target = "", laxMode As %Integer = 1) As %RegisteredObject [ ServerOnly = 1 ]
{
    Set tObj = ##super(source,target,laxMode)
    If source.%IsA("%Library.AbstractObject") {
        Do ..ISO8601ToDateTime(tObj)
    }
    Quit tObj
}

ClassMethod DateTimeToISO8601(pObj As %Library.AbstractObject) [ CodeMode = objectgenerator ]
{
    #dim tProp As %Dictionary.CompiledProperty
    Set tKey = ""
    For {
        Set tProp = %compiledclass.Properties.GetNext(.tKey)
        Quit:tKey=""
        
        If (tProp.Type '= "") && 'tProp.ReadOnly && 'tProp.Calculated {
            Set tType = tProp.Type
            Set tExpr = ""
            If $ClassMethod(tType,"%Extends","%Library.Date") {
                Set tExpr = "Set %arg = $zd(%arg,3)"
            } ElseIf $ClassMethod(tType,"%Extends","%Library.Time") {
                Set tExpr = "Set %arg = $zt(%arg,1)"
            } ElseIf $ClassMethod(tType,"%Extends","%Library.TimeStamp") {
                Set tExpr = "Set %arg = $Case(%arg,"""":"""",:$Replace(%arg,"" "",""T"")_""Z"")"
            }
            Do:tExpr'="" %code.WriteLine($c(9)_$Replace(tExpr,"%arg","pObj."_$$$QN(tProp.Name)))
        }
    }
}

ClassMethod ISO8601ToDateTime(pObj As DCDemo.JSONDateTime) [ CodeMode = objectgenerator ]
{
    #dim tProp As %Dictionary.CompiledProperty
    Set tKey = ""
    For {
        Set tProp = %compiledclass.Properties.GetNext(.tKey)
        Quit:tKey=""
        
        If (tProp.Type '= "") && 'tProp.ReadOnly && 'tProp.Calculated {
            Set tType = tProp.Type
            Set tExpr = ""
            If $ClassMethod(tType,"%Extends","%Library.Date") {
                Set tExpr = "Set %arg = $zdh(%arg,3)"
            } ElseIf $ClassMethod(tType,"%Extends","%Library.Time") {
                Set tExpr = "Set %arg = $zth(%arg,1)"
            } ElseIf $ClassMethod(tType,"%Extends","%Library.TimeStamp") {
                Set tExpr = "Set %arg = $Extract($Replace(%arg,""T"","" ""),1,*-1)"
            }
            Do:tExpr'="" %code.WriteLine($c(9)_$Replace(tExpr,"%arg","pObj."_$$$QN(tProp.Name)))
        }
    }
}

}

The output of this is:

USER>d ##class(DCDemo.JSONDateTime).Run()
Object ID 1
tObj=<OBJECT REFERENCE>[1@DCDemo.JSONDateTime]
+----------------- general information ---------------
|      oref value: 1
|      class name: DCDemo.JSONDateTime
|           %%OID: $lb("1","DCDemo.JSONDateTime")
| reference count: 2
+----------------- attribute values ------------------
|       %Concurrency = 1  <Set>
|          DateField = 40424
|               Name = "North,Richard G."
|          TimeField = 74813
|   Time_Stamp_Field = "1963-11-18 01:49:29"
+-----------------------------------------------------
 
JSON for that object:
{"$CLASSNAME":"DCDemo.JSONDateTime","$REFERENCE":"1","DateField":"1951-09-05","Name":"North,Richard G.","TimeField":"20:46:53","Time_Stamp_Field":"1963-11-18T01:49:29Z"}
 
Object from that JSON:
tObj2=<OBJECT REFERENCE>[4@DCDemo.JSONDateTime]
+----------------- general information ---------------
|      oref value: 4
|      class name: DCDemo.JSONDateTime
| reference count: 2
+----------------- attribute values ------------------
|       %Concurrency = 1  <Set>
|          DateField = 40424
|               Name = "North,Richard G."
|          TimeField = 74813
|   Time_Stamp_Field = "1963-11-18 01:49:29"
+-----------------------------------------------------

The matter of SQL -> JSON is a bit more complicated. ODBC select mode for SQL is similar to ISO 8601, but not completely (the timestamp format is different). One option would be to create a class (extending %RegisteredObject) to represent a query result with date/time fields in ISO 8601 format, and to override the same methods in it so that:

  • It can be $compose'd from a %SQL.IResultSet (done in %FromObject)
  • Based on query column metadata, dates/times/timestamps are converted to the correct format when the object is represented as a %Object/%Array or, indirectly, in JSON (done in %ToDynamicObject / %ToDynamicArray).

This could probably be done in 2016.2, but might be less work to accomplish in a future version when SQL result sets support $fromJSON/$toJSON. (I think this plan was mentioned in a different post.)

I suppose there are some possible complications with all this, depending on whether times/timestamps in your application are actually local or UTC. (Or worse, a mix...)

Timothy Leavitt · Apr 20, 2016

Here's some code from the application I'm working on that might help. The "load/delete the test classes" behavior was annoying enough that we decided to always have the classes loaded on development/testing systems.

First, I think it's useful to have a Run() method in each unit test class, or in a subclass of %UnitTest.TestCase that your unit tests will extend. This code could live somewhere else too, but it's useful to be able to say:

do ##class(my.test.class).Run()

and not have to remember/type the test suite format and /nodelete. Sample implementation:

Class Tools.UnitTest.TestCase Extends %UnitTest.TestCase
{

/// Runs the test methods in this unit test class.
ClassMethod Run(ByRef pUTManager As %UnitTest.Manager = "", pBreakOnError As %Boolean = 0)
{
    If '$IsObject(pUTManager) {
        Set pUTManager = ##class(%UnitTest.Manager).%New() //Or Tools.UnitTest.Manager if you have that
        Set pUTManager.Debug = pBreakOnError
        Set pUTManager.Display = "log,error"
    }
    Set tTestSuite = $Piece($classname(),".",1,*-1)
    Set qspec = "/noload/nodelete"
    Set tSC = $$$qualifierParseAlterDefault("UnitTest","/keepsource",.qspec,.qstruct)
    Do pUTManager.RunOneTestSuite("",$Replace(tTestSuite,".","/"),tTestSuite_":"_$classname(),.qstruct)
}

}

This allows you to specify an instance of a %UnitTest.Manager to capture the test results in, which is useful if you're running a bunch of specific unit test classes (like you suggested, from a Studio project). My team organizes tests in packages rather than in projects, which makes more sense for us.
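For example, running a handful of specific test classes into a single result might look like this (the test class names are invented):

// Run several specific test classes and record them under one unit test result.
Set tStart = $zh
Set tManager = ##class(%UnitTest.Manager).%New()
Set tManager.Display = "log,error"
Do ##class(My.App.Tests.OrderTest).Run(.tManager)
Do ##class(My.App.Tests.InvoiceTest).Run(.tManager)
Do tManager.SaveResult($zh - tStart)
Do tManager.PrintURL()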

Next up, here's our %UnitTest.Manager subclass that works with the %UnitTest.TestCase subclass shown above, allowing all the classes in a particular namespace or package (or, really, with class names that contain a particular string) to be run without deleting them afterward:

Class Tools.UnitTest.Manager Extends %UnitTest.Manager
{

/// Runs all unit tests (assuming that they're already loaded)
/// May filter by package or output to a log file rather than terminal
ClassMethod RunAllTests(pPackage As %String = "", pLogFile As %String = "") As %Status
{
    Set tSuccess = 1
    Try {
        Set tLogFileOpen = 0
        Set tOldIO = $io
        If (pLogFile '= "") {
            Open pLogFile:"WNS":10
            Set tLogFileOpen = 1
            Use pLogFile
        }
        
        Write "*** Unit tests starting at ",$zdt($h,3)," ***",!
    
        Set tBegin = $zh
    
        Set tUnitTestManager = ..%New()
        Set tUnitTestManager.Display = "log,error"
        Set tStmt = ##class(%SQL.Statement).%New()
        $$$THROWONERROR(tSC,tStmt.%PrepareClassQuery("%Dictionary.ClassDefinition","SubclassOf"))
        Set tRes = tStmt.%Execute("Tools.UnitTest.TestCase")
        While tRes.%Next(.tSC) {
            If $$$ISERR(tSC) $$$ThrowStatus(tSC)
            Continue:(pPackage'="")&&(tRes.%Get("Name") '[ pPackage)
            Do $classmethod(tRes.%Get("Name"),"Run",.tUnitTestManager)
        }
    
        If $IsObject(tUnitTestManager) {
            Do tUnitTestManager.SaveResult($zh-tBegin)
            Do tUnitTestManager.PrintURL()
    
            &sql(select sum(case when c.Status = 0 then 1 else 0 end) as failed,
                        sum(case when c.Status = 1 then 1 else 0 end) as passed,
                        sum(case when c.Status = 2 then 1 else 0 end) as skipped
                        into :tFailed, :tPassed, :tSkipped
                   from %UnitTest_Result.TestSuite s
                   join %UnitTest_Result.TestCase c
                     on s.Id = c.TestSuite
                  where s.TestInstance = :tUnitTestManager.LogIndex)

            If (tFailed '= 0) {
                Set tSuccess = 0
            }
        } Else {
            Write "No unit tests found matching package: ",pPackage,!
        }
    } Catch anyException {
        Set tSuccess = 0
        Write anyException.DisplayString(),!
    }
    Write !,!,"Test cases: ",tPassed," passed, ",tSkipped," skipped, ",tFailed," failed",!
    If 'tSuccess {
        Write !,"ERROR(S) OCCURRED."
    }
    Use tOldIO
    Close:tLogFileOpen pLogFile
    Quit $Select(tSuccess:1,1:$$$ERROR($$$GeneralError,"One or more errors occurred in unit tests."))
}

}

This could probably be tweaked to use a project instead without too much work, but I think packages are a more reasonable way of organizing unit tests.

Timothy Leavitt · Apr 21, 2016

You're really close; the key is using the stream's OID (from %Oid()). Here's a simple example; you can substitute any appropriate file path.

Class Demo.DynamicImage Extends %ZEN.Component.page
{

/// This XML block defines the contents of this page.
XData Contents [ XMLNamespace = "http://www.intersystems.com/zen" ]
{
<page xmlns="http://www.intersystems.com/zen" title="">
<image id="myImage" src="" />
<button onclick="zenPage.ChangeImage(zen('myImage'))" caption="Dynamically Change Image" />
</page>
}

ClassMethod ChangeImage(pImage As %ZEN.Component.image) [ ZenMethod ]
{
    Set tStream = ##class(%Stream.FileBinary).%New()
    Do tStream.LinkToFile(##class(%File).ManagerDirectory()_"..\CSP\broker\images\einstein.jpg")
    Set tOID = ..Encrypt(tStream.%Oid())
    Set pImage.src = "%25CSP.StreamServer.cls?STREAMOID="_tOID
}

}

I'm really curious what that image is doing in /csp/broker/...

Timothy Leavitt · Apr 22, 2016

The SVG diagram is loaded in Eclipse's internal browser, which will always be IE for you. The preference you found applies to "external" browsers.

Within the internal browser in Eclipse, you can right click and select "view source." When you do so, you should see something like this near the top:

<meta http-equiv="X-UA-Compatible" content="IE=9" />

It would be interesting to know what <meta> tag you see, if any. It would also be useful to know the value of the User-Agent header sent by the internal browser. There are several ways to find that; here's one quick option:

  1. Open a BPL class in Atelier
  2. Run the following code in Terminal:
    k ^%ISCLOG s ^%ISCLOG = 2 read x s ^%ISCLOG = 0
  3. In Atelier, right click in the BPL class and click the "Open diagram editor" popup menu item
  4. Hit enter in Terminal to stop logging.

If you then zwrite ^%ISCLOG you should see the user-agent in a $listbuild list near the end of the output. I see:

^%ISCLOG("Data",180,0)=$lb(900,,0,5532241409,"0²!t"_$c(28,16)_"IÎ"_$c(22)_"F40"_$c(133)_"¯4_ài"_$c(156)_"èB_9}%"_$c(144,155,9)_"!`"_$c(135)_"ü",2,"ENSDEMO","001000010000OoTvE12bLJWATFMLUAodU0gK1Z8HvjdbJWLK3M",,0,"en-us","OoTvE12bLJ",2,1,"/csp/ensdemo/",$lb("UnknownUser","%All","%All",64,-559038737),"","","","2016-04-22 13:28:27","2016-04-22 13:28:30","","Mozilla/5.0 (compatible; MSIE 10.0; Windows NT 6.1; Win64; x64; Trident/6.0)","","",0,"",$lb($lb("%ZEN.SessionEvents","ENSDEMO",)),"","%iscmgtportal:5ykW4kOfOzwr7O8gcok8XQ--",0,"","","","","")

(It's awesome how IE says it's Mozilla, for compatibility reasons.)

Timothy Leavitt · Apr 25, 2016

In addition to %IsA (or, similarly, %Extends, which considers multiple inheritance rather than just primary superclasses), the following snippet (slightly modified from an answer I posted on one of your previous questions) may be helpful if you're looking for all of the names of unit test classes:

        Set tStmt = ##class(%SQL.Statement).%New()
        $$$ThrowOnError(tStmt.%PrepareClassQuery("%Dictionary.ClassDefinition","SubclassOf"))
        Set tRes = tStmt.%Execute("%UnitTest.TestCase")
        While tRes.%Next(.tSC) {
            $$$ThrowOnError(tSC)
            //TODO: something with tRes.%Get("Name")
        }
        $$$ThrowOnError(tSC)

If you're filtering by package - and it looks like https://github.com/litesolutions/cache-utcov/blob/master/src/utcov/ClassLookup.cls does this - then you can supply a second argument to the SubclassOf query with the package name for better performance. (i.e., Set tRes = tStmt.%Execute("%UnitTest.TestCase","Some.Package.Name."))

All of these approaches work recursively. (C extends B, B extends A -> C extends A.)
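As a quick illustration of the %IsA/%Extends distinction mentioned above, with hypothetical classes:

// Demo.B's primary superclass is %RegisteredObject; it also inherits from Demo.A.
Class Demo.A Extends %RegisteredObject
{
}

Class Demo.B Extends (%RegisteredObject, Demo.A)
{
}

// For an instance tObj of Demo.B:
//   Write tObj.%IsA("Demo.A")      // 0 - follows only the primary superclass chain
//   Write tObj.%Extends("Demo.A")  // 1 - considers all superclasses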

Timothy Leavitt · May 5, 2016

I don't think this is currently possible in Atelier. (I was looking for the same feature yesterday and couldn't find it.)

Using Eclipse as a Java editor, the override menu option is Source -> Override/Implement Methods...; presumably, the equivalent feature in Atelier would be in the same place, but there's nothing like that in the "Source" menu.

Timothy Leavitt · May 16, 2016

You could map the package containing the class related to that table using a package mapping, and the globals containing the table's data using global mappings.

You can see which globals the class uses in its storage definition - since the entire package is mapped, it might make sense to add a global mapping with the package name and a wildcard (*).

After taking those steps, you can insert into the table the way you usually would, without any special syntax or switching namespaces with zn / set $namespace.
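If you'd rather script the mappings than click through the Management Portal, something along these lines should work from the %SYS namespace; the namespace, package, and database names below are placeholders:

    New $Namespace
    Set $Namespace = "%SYS"

    // Map the Demo.Data package from the DEMODATA database into the APP namespace...
    Set tProps("Database") = "DEMODATA"
    Set tSC = ##class(Config.MapPackages).Create("APP","Demo.Data",.tProps)

    // ...and map the package's globals (per the storage definitions) with a wildcard.
    Set tSC = ##class(Config.MapGlobals).Create("APP","Demo.Data*",.tProps)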

Timothy Leavitt · May 23, 2016

The problem is that REST uses IO redirection itself, and OutputToStr changes the mnemonic routine but doesn't change it back at the end.

For a great example of the general safe approach to cleaning up after IO redirection (restoring to the previous state of everything), see %WriteJSONStreamFromObject in %ZEN.Auxiliary.jsonProvider.

Here's a simple approach that works for me, in this case:

set tOldIORedirected = ##class(%Device).ReDirectIO()
set tOldMnemonic = ##class(%Device).GetMnemonicRoutine()
set tOldIO = $io
try {
	set str=""

	//Redirect IO to the current routine - makes use of the labels defined below
	use $io::("^"_$ZNAME)

	//Enable redirection
	do ##class(%Device).ReDirectIO(1)

	if $isobject(pObj) {
		do $Method(pObj,pMethod,pArgs...)
	} elseif $$$comClassDefined(pObj) {
		do $ClassMethod(pObj,pMethod,pArgs...)
	}
} catch ex {
	set str = ""
}

//Return to original redirection/mnemonic routine settings
if (tOldMnemonic '= "") {
	use tOldIO::("^"_tOldMnemonic)
} else {
	use tOldIO
}
do ##class(%Device).ReDirectIO(tOldIORedirected)

quit str

It would be cool if something like this could work instead:

new $io
try {
	set str=""

	//Redirect IO to the current routine - makes use of the labels defined below
	use $io::("^"_$ZNAME)

	//Enable redirection
	do ##class(%Device).ReDirectIO(1)

	if $isobject(pObj) {
		do $Method(pObj,pMethod,pArgs...)
	} elseif $$$comClassDefined(pObj) {
		do $ClassMethod(pObj,pMethod,pArgs...)
	}
} catch ex {
	set str = ""
}

quit str

But $io can't be new'd.

Timothy Leavitt · May 31, 2016

Here's a sample, using %ToDynamicObject (2016.2+):

Class DC.CustomJSONSample extends %RegisteredObject
{
Property myProperty As %String [ InitialExpression = "hello" ];

Property other As %String [ InitialExpression = "world" ];

/// Rename myProperty to custom_property_name
Method %ToDynamicObject(target As %Object = "", ignoreUnknown = 0) [ ServerOnly = 1 ]
{
	Do ##super(.target,.ignoreUnknown)
	Do target.$set("custom_property_name",target.myProperty,target.$getTypeOf("myProperty"))
	Do target.$remove("myProperty")
}

ClassMethod Run()
{
	Set tObj = ..%New()
	Write tObj.$toJSON()
}

}

Output:

SAMPLES>d ##class(DC.CustomJSONSample).Run()
{"other":"world","custom_property_name":"hello"}

For other discussions with solutions/examples involving %ToDynamicObject, see:
https://community.intersystems.com/post/json-cache-and-datetime
https://community.intersystems.com/post/create-dynamic-object-object-id

Timothy Leavitt · Jun 2, 2016

Export with File > Export > General > Preferences; check "Keys Preferences" (which only appears if you've customized any preferences).

Import with File > Import > General > Preferences; select the file, then check "Keys Preferences".

See: http://stackoverflow.com/questions/481073/eclipse-keybindings-settings

It seems that the CSV export from Window > Preferences, General > Keys, Export CSV ... doesn't have a corresponding import feature.

Timothy Leavitt · Jun 7, 2016

The class you listed fails for me too, but it's because there's no package name, not because of the bracket mismatch. In Atelier, the class name gets an error marker.

If I change it to:

class somepackage.myclass {
//if someVar {
}

Then the file will happily sync and compile.

If adding the package doesn't fix things, it would be helpful to know what Atelier and Caché versions you're using.