You might also find some relief by querying %Dictionary.MethodDefinition WHERE SQLProc=1 or %Dictionary.QueryDefinition. If it's a really old version you might look in %Library schema.
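For example, to list the class methods projected as SQL stored procedures, something like this sketch should work (run from a terminal in any namespace; the column names parent and Name are the standard %Dictionary.MethodDefinition fields):

```objectscript
// List methods flagged as SQL procedures, with their parent class
Set tStatement = ##class(%SQL.Statement).%New()
Set tSC = tStatement.%Prepare("SELECT parent, Name FROM %Dictionary.MethodDefinition WHERE SQLProc = 1")
If $$$ISOK(tSC) {
    Set tResult = tStatement.%Execute()
    While tResult.%Next() {
        Write !, tResult.%Get("parent"), " :: ", tResult.%Get("Name")
    }
}
```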
Can you elaborate on what you mean by "hiding" a component under ReportDefinition? The ReportDefinition is where you define the XML document that is created and later used by the ReportDisplay to output the data for a PDF report; Excel output utilizes the ReportDefinition directly. Just because you have an entity (element/attribute/group) in a ReportDefinition doesn't mean it will be utilized in the ReportDisplay.
How about adding a note about the fact that each index generates a
<IndexName>Exists(value, .id)
class method that can be used to quickly determine whether a value exists in the index?
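For instance, given a hypothetical class Sample.Person with Index NameIndex On Name, the generated method can be called like this:

```objectscript
// <IndexName>Exists returns true if the value is in the index
// and passes the matching row ID back by reference
If ##class(Sample.Person).NameIndexExists("Smith,John", .tId) {
    Write "Value found, ID = ", tId
}
```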
And since others will view this, we really should include in the method GetPropertiesFromRelationship a call to %UnSwizzleAt. I don't think it would meaningfully improve performance in the above example, but when you encounter large relationship graphs this can be very important.
I'm not sure this is possible with %Library.ResultSet, but there are several reasons you might want to consider using %SQL.Statement, including that with %SQL.Statement you can use the instance method %GetImplementationDetails, which should provide you with what you are looking for.
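A rough sketch of that call (the table Sample.Person is hypothetical, and I'm recalling the %GetImplementationDetails output arguments from memory, so verify against the class reference):

```objectscript
Set tStatement = ##class(%SQL.Statement).%New()
Set tSC = tStatement.%Prepare("SELECT Name FROM Sample.Person WHERE Name %STARTSWITH ?")
If $$$ISOK(tSC) {
    // Returns the generated implementation class, statement text, and arguments
    Do tStatement.%GetImplementationDetails(.tClassName, .tText, .tArgs)
    Write !, "Implementation class: ", tClassName
}
```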
IRIS supports
At the same time, if you have the proper IRIS license you can create analytics cubes within IRIS and, as you say, connect directly and leverage those cubes.
Some thoughts on what you have presented:
1. It doesn't seem like usedclassesdef.zip is available any longer.
2. When examining a query plan, I mostly do CTRL-F and search for the word "looping". The statements that say "using the given idkey value" generally are not going to be of concern. I see your query plan has "Read index map MSDS_Serenity.CustomeInstrument.InstIndx looping on Instrument and ID." This seems like one area of concern: it is telling us, I believe, that a table scan is being done on CustomerInstrument. Should there be a join between Instrument and CustomerInstrument?
3. I don't think it technically makes a difference in performance, but I like to write my queries with JOIN clauses so that I have a clear understanding of the relationships between tables vs. the conditions that remove rows. So I might write it like
FROM MSDS_Serenity.Kit k
JOIN MSDS_Serenity.KitContent kc ON kc.KitId = k.Id
JOIN MSDS_Serenity.Instrument i ON i.Id = kc.Instrument
JOIN MSDS_Serenity.InstrumentGroup g ON g.Id = i.GroupId
JOIN MSDS_Serenity.CustomInstrument ci ON ci. -- maybe I'm incorrect, but I don't see where the CI table is related to any other table; this is likely what is causing the table scan, and if this table has a large number of rows it could very well be causing an issue
WHERE i.Active = 1 AND (i.IsImplants IS NULL OR i.IsImplants != 1)
  AND k.Loc = 5 AND k.Id = 9237
Again, I don't think this makes any technical difference, but it does allow me to isolate the relationships between tables vs. the filtering that occurs due to the WHERE clause.
4. I'm not saying this is absolutely true, but when I see DISTINCT in a query I wonder why it was added. Is it because a missing join condition was producing duplicate rows, and DISTINCT was a convenient way to "hide" the logic flaw? I could be completely off base here, so please ignore this if I'm out of line. This is a good article on DISTINCT: https://wiki.c2.com/?SelectDistinctIsaCodeSmell
5. Without knowing what the stored procedures are doing, they could be a place where performance issues are encountered.
In the past, I have utilized %CSP.StreamServer to accomplish this. While it may have a %CSP package name, it's perfectly usable in a ZEN context.
Based on the screenshot you shared, it looks like History.Views may not have indices, or at least doesn't have a bitmap index. If the table had a bitmap index, it shouldn't take 30 seconds to count the rows... but maybe I'm incorrect; seeing the query plan would have provided that insight.
You report that the report now generates in 60 seconds, where previously it took 28 minutes for a twelve-page report. Was the primary problem/time consumed in gathering the data, or something else?
ZEN Reports provide a number of methods to generate reports, including GenerateReportToFile. This method allows you to specify the output file and mode.
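A minimal sketch, assuming a %ZEN.Report.reportPage subclass named MyApp.MyReport (the class name and output path are hypothetical; mode 2 requests PDF output):

```objectscript
// Generate the report directly to a file instead of a browser
Set tSC = ##class(MyApp.MyReport).GenerateReport("/tmp/MyReport.pdf", 2)
If $$$ISERR(tSC) Do $System.Status.DisplayError(tSC)
```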
I too am not entirely sure what your issue is, but when I have particularly complex reports I leverage the ability to define multiple <group> entities in the ReportDefinition. Consider, for example, that you have a data model that has
While you could define a single <group> entity which calls a single query, I prefer to use class queries and leverage breakOnField to create multiple <group>s. I find it makes more sense to have something like this, albeit this is pseudocode and not 100% what is entered in a ReportDefinition:
<group name="Batches" queryName="GetBatches">
  <group name="Claims" queryName="ClaimsByBatch">
    <parameter field="Batch"/>
    <group name="Claim">
      <group name="Lines" queryName="LinesByClaim">
        <parameter field="Claim"/>
        <group name="Line">
          <group name="LineAdjudications" queryName="AdjudicationByClaimLine">
            <parameter field="Claim"/>
            <parameter field="Line"/>
            <group name="Line">
            </group>
          </group>
        </group>
      </group>
    </group>
  </group>
</group>
In this manner, each level/group is responsible for doing one thing. If needed, I've also added a filter attribute to the <group> tag. An actual example I have is this:
fields="ClassName,LocalTimeChanged,RemoteTimeChanged,Description,LocalFkeys,RemoteFkeys,LocalIndices,RemoteIndices,LocalMethods,RemoteMethods,LocalProperties,RemoteProperties,LocalQueries,RemoteQueries,LocalParameters,RemoteParameters,LocalTriggers,RemoteTrigger" filter="..Filter(..FilterSpec)">
whereby I send to my method Filter the field values specified in the fields attribute, as well as a FilterSpecification, to test the filter. You don't have to do it this way; it's just an example of using the fields attribute and the filter attribute, which can be any valid ObjectScript expression.
When looking at the code in %ZEN.Component.toolbar, it looks like the class is hard-coded and can't be changed... but I could be incorrect. Using a modern UI framework like Angular (or so many others) would allow you greater control.
If those fields are not used in the ReportDisplay, i.e. the HTML or PDF output, you could consider removing them from the ReportDefinition. If, on the other hand, you do need these fields, then leave them in. However, something else seems to be going on. I commonly have this style of ReportDefinition:
(screenshot of the ReportDefinition)
The usage of <macrodef> just allows me to have a single set of code that injects the same attributes into the <Master> node.
When running the report in XML mode it produces:
<Master runDt="10/13/2022" footerDate="October 13, 2022" runTm="08:47AM" runBy="_SYSTEM" Draft="" ISCPrime="0" Instance="HEALTHSHARE" Namespace="HSEDGE1" Server="LAPTOP-ET8APOSR" InternalURLLinkPrefix="http://localhost:52773" CustomerName="Ready Computing, Inc" CoverPageDate="October 13, 2022" CustomerLink="" PageNumbering="1" SystemMode="" FilterSpecDisplay="" ReportName="Ensemble Message Management" HeaderLine2="" Top="9999999999999" ReportClass="RC.Ens.Report.MessageManagement" ZVERSION="IRIS for Windows (x86-64) 2021.1.2 (Build 336_0_21564U) Tue Apr 12 2022 12:13:45 EDT" ReportTitle="">
I don't think attributes found in the root node, in my case Master, are emitted to the Excel file.
If I run the report in MODE=XLSX, it produces output that does not show the attributes:
(screenshot of the Excel output)
So this answers your question regarding your attributes at the top level. Your question regarding the elements at the ProssData node is something different. In my example, if I change
(screenshot of the original definition)
to
(screenshot with Namespace and TaskId defined as attributes)
and then run the report in XLSX mode, the attributes Namespace and TaskId do not appear in the Excel file. However, I would have to change the ReportDisplay so that I now reference
@Namespace and @TaskId
if I want the PDF and HTML output to work properly.
I can't speak for templates, but in the same area I do have Studio Add-Ins in use in VS Code.
I've run into something similar, but with a different streamlet type. The subscript error indicates that it is getting a null value where it expects a non-null one. I would suggest looking at the medication streamlet and ensuring that all of the required fields are defined. It might be that Medication.OrderItem.Code is null.
Nicely done. While I haven't contributed to the project, I do have a User Snippet
"Create Status Method": {
"prefix": "Method Create ClassMethod Returning %Status",
"body": [
"///${1:Description}",
"///<example>",
"/// Set tSC=##class($TM_FILENAME_BASE).${2:MethodName}()",
"///</example>",
"ClassMethod ${2:MethodName}(${3:parameters}) As %Status",
" {",
" #dim tSC \t\t\t As %Status=\\$\\$\\$OK",
" #dim eException \t As %Exception.AbstractException",
" try {",
" $0",
" }",
" catch eException {",
" Set tSC=eException.AsStatus()",
" }",
" Quit tSC",
" }"
],
"description": "Method Create ClassMethod Returning %Status"
}
that utilizes the <example> Documatic special tag, as I too was tired of typing ##class when testing methods. My approach is different and it's not perfect, but it works for me. It also produces a class method with a consistent pattern/structure.
The class query EnumerateCategories in Ens.Config.Production should provide what you are looking for.
This data is found in Ens_Config.Item.
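A sketch of running that class query through %ResultSet (the production name "MyApp.Production" and the column name "Category" are assumptions for illustration, as is whether Execute takes the production name as a parameter):

```objectscript
// Run the EnumerateCategories class query from Ens.Config.Production
Set tRS = ##class(%ResultSet).%New("Ens.Config.Production:EnumerateCategories")
Set tSC = tRS.Execute("MyApp.Production")
If $$$ISOK(tSC) {
    While tRS.Next() {
        Write !, tRS.Get("Category")
    }
}
Do tRS.Close()
```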
Without additional details it may be hard to tell for certain. If it's an issue on the Caché side, you might consider using $SEQUENCE in your storage model.
$SEQUENCE provides a fast way for multiple processes to obtain unique (non-duplicate) integer indices for the same global variable. For each process, $SEQUENCE allocates a sequence (range) of integer values. $SEQUENCE uses this allocated sequence to assign a value to gvar and returns this gvar value. Subsequent calls to $SEQUENCE increment to the next value in the allocated sequence for that process. When a process consumes all of the integer values in its allocated sequence, its next call to $SEQUENCE automatically allocates a new sequence of integer values. $SEQUENCE automatically determines the size of the sequence of integer values to allocate. It determines the size of the allocated sequence separately for each sequence allocation. In some cases, this allocated sequence may be a single integer.
$SEQUENCE is intended to be used when multiple processes concurrently increment the same global. $SEQUENCE allocates to each concurrent process a unique range of values for the gvar global. Each process can then call $SEQUENCE to assign sequential values from its allocated range of values.
In your class definition, within the storage definition, you could consider changing the IdFunction from the default, which is increment, to sequence,
but again more detail is needed.
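As a quick illustration of the difference (the global name ^MyApp.DataD is hypothetical), the two functions are called the same way, but $SEQUENCE hands each process a pre-allocated range:

```objectscript
// Each call returns a unique integer for ^MyApp.DataD;
// $SEQUENCE may jump ahead because ranges are allocated per process
Set id = $SEQUENCE(^MyApp.DataD)

// $INCREMENT performs one atomic increment per call
Set id2 = $INCREMENT(^MyApp.DataD)
```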
Users are added via HSPI Management -> Settings -> System Mode -> Configuration -> Assignment Username: https://docs.intersystems.com/hs20222/csp/docbook/DocBook.UI.Page.cls?K…
The original example (the first one) likely isn't the best test, as all of the elements are one character long. My understanding is that $LB data is an encoded string, with the first part of each element's encoding containing the length of that element's data. Knowing the length means you can jump a number of characters ahead to the next element. On the other hand, $PIECE does a character-by-character scan.
ClassMethod Test() As %Status
{
    #dim tSC As %Status = $$$OK
    #dim eException As %Exception.AbstractException
    try {
        Set tStart=$ZH
        For x=1:1:1000000 {
            Set a=$LB("sdfdkjfdjklfdjklfdjklfds","dlkfdjklfgjklfgjklfgjklfgjkl","fdklfdsjkljklfgjkfgjkfg")
            Set b=$LG(a,2)
        }
        Write !,"List Duration: ",$ZH-tStart
        Set tStart=$ZH
        For x=1:1:1000000 {
            Set a="sdfdkjfdjklfdjklfdjklfds_dlkfdjklfgjklfgjklfgjklfgjkl_fdklfdsjkljklfgjkfgjkfg"
            Set b=$P(a,"_",2)
        }
        Write !,"Piece Duration: ",$ZH-tStart
        Set tStart=$ZH
        For x=1:1:1000000 {
            Set a=$LB("sdfdkjfdjklfdjklfdjklfds","dlkfdjklfgjklfgjklfgjklfgjkl","fdklfdsjkljklfgjkfgjkfg")
            Set b=$LG(a,3)
        }
        Write !,"List Duration Getting last element: ",$ZH-tStart
        Set tStart=$ZH
        For x=1:1:1000000 {
            Set a="sdfdkjfdjklfdjklfdjklfds_dlkfdjklfgjklfgjklfgjklfgjkl_fdklfdsjkljklfgjkfgjkfg"
            Set b=$P(a,"_",3)
        }
        Write !,"Piece Duration Getting last element: ",$ZH-tStart
    }
    catch eException {
        Set tSC=eException.AsStatus()
    }
    Quit tSC
}
At the same time, I'd much rather deal with $LIST and not have to worry about escaping delimiters.
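That escaping point is easy to demonstrate. When the data itself contains the delimiter character, $PIECE silently parses the wrong fields, while $LISTBUILD keeps each element intact:

```objectscript
// $LISTBUILD is safe when data contains the delimiter character
Set a = $LB("Smith_John", "NY")
Write $LG(a, 1)         ; writes Smith_John (intact)

Set p = "Smith_John_NY"
Write $P(p, "_", 2)     ; writes John (the embedded "_" breaks the parse)
```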
Stefan is correct. Getting the query plan does two things.
My suspicion is you are missing an index on one of the tables included in the INNER JOIN. I imagine the number of rows for these tables is on the order of what you have,
and there is a missing index on one of the Order tables; but again, the results of Show Plan will provide all of the information that is needed.
The query plan tells us that it is doing a table scan of every single row in SQLUser.Records.
If er was indexed in Records, it would at least use the index.
Somewhat related: in the past I commonly used CTRL-G (goto) in IRIS Studio to jump to an entity within a class or routine. This dialog
(screenshot of the Studio Goto dialog)
allows you to go to either a line number, a tag, or a label.
I wanted to be able to do the same in VS Code and found by looking at
https://code.visualstudio.com/shortcuts/keyboard-shortcuts-windows.pdf
in VS Code you can use CTRL-SHIFT-O.
This isn't exactly your question, but the link to the shortcuts is useful, and for me CTRL-SHIFT-O is what I was looking for, as I wanted to jump around without using my mouse.
The Athena IDX data is mapped/described by class definitions. Documatic would be a good place to start. The class definitions come with a rich set of metadata, including
Many of the tables support bitmap indices.
I've integrated in the past with
you could construct this with usage of <hgroup> and <vgroup> if you had a fixed number of rows.
If you had a variable number of rows you could use a tablePane where the resultset object returns the rows with the first column being the label and the second column being the cost value for items.
When you call the compiler, did you call
Set tSC=##class(%SYSTEM.OBJ).Compile(ClassName,"cuk",.errorFile)
and was anything contained in the output array errorFile?
With the discussion of InterSystems IRIS Studio vs. VS Code, I understand the initial reaction is to think you are losing something by using VS Code. Will you miss some things? I'm sure you will. But measured on the whole, I feel VS Code is a much better environment, even if all you ever do is ObjectScript work. That's the critical thing to remember: there may be some differences, but measured on the whole it is better. Give it a try and give it a chance, especially by learning its full capabilities.