While it wouldn't be difficult to build a REST API to accept an arbitrary SQL statement and return results, there are several things to consider.

1. A single SQL SELECT statement might return tens of thousands of rows, making the REST service problematic with respect to timeouts and payload response size.

2. You will want to enforce SQL security so that someone isn't allowed to:

- perform a DELETE if they don't have access to do so

- SELECT data from tables they don't have access to

etc.
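
For what it's worth, a minimal sketch of such a service might look like the following. The class name, route, and 1000-row cap are all hypothetical choices, and the SELECT-only check is a crude guard, not a substitute for proper SQL privileges on the service's user account.

Class Demo.SQLGateway Extends %CSP.REST
{

XData UrlMap
{
<Routes>
<Route Url="/query" Method="POST" Call="RunQuery"/>
</Routes>
}

ClassMethod RunQuery() As %Status
{
    // Hypothetical: the SQL text arrives as the POST body
    Set sql = %request.Content.Read()
    // Crude guard: only allow SELECT; real protection should come from
    // SQL privileges granted (or not) to the service's user account
    If $ZConvert($Extract(sql,1,6),"U")'="SELECT" {
        Set %response.Status = "400 Bad Request"
        Quit $$$OK
    }
    Set stmt = ##class(%SQL.Statement).%New()
    Set sc = stmt.%Prepare(sql)
    If $$$ISERR(sc) Quit sc
    Set rs = stmt.%Execute()
    Set cols = rs.%GetMetadata().columns
    Set %response.ContentType = "application/json"
    // Cap the result at 1000 rows to keep timeouts and payload size in check
    Set arr = []
    While rs.%Next() && (arr.%Size() < 1000) {
        Set row = {}
        For i=1:1:cols.Count() {
            Do row.%Set(cols.GetAt(i).colName, rs.%Get(cols.GetAt(i).colName))
        }
        Do arr.%Push(row)
    }
    Write arr.%ToJSON()
    Quit $$$OK
}

}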

The REST API is for SQL Search:

The InterSystems IRIS® SQL Search tool integrates with the InterSystems IRIS Natural Language Processor (NLP) to perform context-aware text search operations on unstructured data, with capabilities such as fuzzy search, regular expressions, stemming, and decompounding. Searching is performed using a standard SQL query with a WHERE clause containing special InterSystems SQL Search syntax.
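
If it is, the search is done through an %iFind index plus the %FIND search_index() predicate in the WHERE clause, along these lines (class, column, and search term are hypothetical):

Class Demo.Report Extends %Persistent
{

Property Narrative As %String(MAXLEN = "");

// SQL Search (iFind) index on the unstructured text column
Index NarrIdx On (Narrative) As %iFind.Index.Basic;

}

SELECT %ID, Narrative
FROM Demo.Report
WHERE %ID %FIND search_index(NarrIdx, 'aircraft')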

Is your table using the NLP features?

The temp file oftentimes uses a process-private global.

Process-private globals are written to the IRISTEMP database. In contrast to global variables, InterSystems IRIS does not treat a SET or KILL of a local variable or a process-private global as a journaled transaction event; rolling back the transaction has no effect on these operations.
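
For example (global names here are arbitrary):

TSTART
SET ^||scratch(1) = "temp value"   ; process-private (^|| prefix), lives in IRISTEMP
SET ^Ordinary(1) = "journaled"     ; a regular global, journaled within the transaction
TROLLBACK                          ; rolls back ^Ordinary(1) but NOT ^||scratch(1)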

That being said, as Peter mentioned, if you had an index on code_1_text you could greatly improve performance. In fact, I suspect this type of query would be completely index satisfiable, i.e., the query would only examine the index and would not have to go to the master map, for even better performance.

Depending on the length of the values for code_1_text, if you do choose to add an index you might want to define the property as

Property code_1_text As %String(COLLATION = "SQLUPPER(113)", MAXLEN = 32000);

by setting the collation and length. If the values exceed the maximum global subscript length and you do not do this, you could encounter <SUBSCRIPT> errors.

A subscript has an illegal value or a global reference is too long. For further details, refer to $ZERROR. For more information on maximum length of global references, see “Determining the Maximum Length of a Subscript”.
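
Putting the collation/length advice together with the index suggestion, the class might contain something like this (the index name is hypothetical):

Property code_1_text As %String(COLLATION = "SQLUPPER(113)", MAXLEN = 32000);

// With this in place, a query filtering on code_1_text can often be
// satisfied from the index alone, without touching the master map
Index Code1TextIdx On code_1_text;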

One important note about QuickStream: if you decide to go down this path, you must make certain to Clear the QuickStream as part of your pipeline. The Message Purge task will purge the message header and the associated message body, but when it looks at the message body all it sees is a property called QuickStreamId, and it doesn't know it should also Clear the associated QuickStream object.
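
A cleanup step might look something like the following. This is only a sketch: it assumes the QuickStream class is HS.SDA3.QuickStream (as in HealthShare) and that pRequest is the message body carrying the QuickStreamId property; substitute whatever QuickStream class your pipeline actually uses.

Set qs = ##class(HS.SDA3.QuickStream).%OpenId(pRequest.QuickStreamId)
If $IsObject(qs) {
    // Clear() removes the stream data, which the purge task would otherwise leave behind
    Do qs.Clear()
}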

You could consider using SemaphoreSpec and have the folks placing the file in the FTP directory also place a semaphore file in the directory from their external system once the upload is complete.
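
For example, if the data files are *.csv, the inbound adapter settings might be (hypothetical patterns; the sender uploads data.csv first and data.SEM last, and the adapter will not process data.csv until the semaphore file appears):

FileSpec      = *.csv
SemaphoreSpec = *.SEM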

Answers to your questions

1. Think of the classes that describe the globals as being a metadata layer over the globals. The existing application will continue to run, and you will now have classes that expose the data so that you can write Object and SQL code. If you create new indices in the SQL-mapped classes, the globals representing the new indices would only be updated if something calls the object's %Save/%Delete, or SQL INSERT/UPDATE/DELETE, or the legacy filers are updated to maintain the new indices. The legacy application is unlikely to do this, which means your new indices would never be populated. That would be bad: the SQL engine would not "know" this, would attempt to read data from the new index, and would find no data there.

2. You are correct... if you add new indices to the class and the legacy application is not maintaining them, it would cause issues.

Hopefully you have a common filer for the legacy application, meaning one common routine that saves a given subject area (global). If that is the case, then it's a matter of updating the common filer. If the legacy application instead has a number of places where the data is updated, then all of those places would need to be updated, or consider adopting a common-filer approach.

On projects I have been involved in where we created classes to map the existing globals to enable Object and SQL access, we added to the class

Parameter READONLY = 1;

so as to ensure no one could accidentally perform an Object %Save/%Delete or SQL INSERT/UPDATE/DELETE operation.
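
For example (a hypothetical mapped class; the storage definition that maps the legacy global is omitted):

Class Legacy.Patient Extends %Persistent
{

// Prevents accidental Object %Save/%Delete and SQL INSERT/UPDATE/DELETE
Parameter READONLY = 1;

Property Name As %String;

// ... storage definition mapping the legacy global goes here ...

}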

There were several questions asked during the presentation. One of them was:

How can we restrict usage of the cubes so that users are only allowed to see specific data?

If this were SQL, one way of approaching this would be to define a view and then grant users access to the view but not the table. Fortunately, Analytics has the same concept: just as SQL tables are the resource that holds the data and a view is a defined representation of that data, Analytics has cubes, which hold the data, and subject areas.

A subject area is a virtual view of the cube, but does not require the additional storage/build/sync time. Within the subject area you can:

1. Define a filter. Say we want a subject area for senior citizens: we would define a filter on Age > X. Then when we make the Senior Citizens subject area available to a user, they will only ever see patients with Age > X (see the sketch after this list).

2. Define what dimensions are available 

3. Define listings
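
A minimal subject-area sketch for the Senior Citizens example in item 1 might look like this (the cube, dimension, and member names are hypothetical):

Class Demo.SeniorCitizens Extends %DeepSee.SubjectArea
{

XData SubjectArea [ XMLNamespace = "http://www.intersystems.com/deepsee/subjectarea" ]
{
<subjectArea name="SeniorCitizens" baseCube="Patients"
    filterSpec="[AgeD].[H1].[Age Group].[65+]">
</subjectArea>
}

}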

When building data for a cube, either a full build or a synchronization, think about it this way: the code generated to support the build essentially does a SELECT AllOfYourDimensionsMeasuresRelationships FROM SourceTable.

When you have non-primary field references, these expressions are evaluated as ObjectScript expressions.

These are documented, but not all on a single page:

%expression 

%cube

%source

%sourceID, although this one is typically used in a detail listing (specifically if you chose that option), whereas the others are used when defining the cube's dimensions, measures, etc.
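
For instance, in a cube definition an expression-based measure is evaluated as ObjectScript for each source row, with %source bound to that row (property names are hypothetical):

<measure name="BMI"
    sourceExpression="%source.Weight / ((%source.Height / 100) ** 2)" />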

Given that the routes in your REST-enabled %CSP.REST class point to class methods, you should be able to debug/step through line by line by calling the class method directly. In the case where you have a POST/PUT route that calls the class method, you can always define the formal specification of the class method to accept parameters, and then have the code in your class method look at either your parameter values or %request.Data to determine whether the class method is being called by an HTTP call or by your simple debugging.
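
A sketch of that pattern (route and parameter names are hypothetical):

XData UrlMap
{
<Routes>
<Route Url="/widget" Method="POST" Call="CreateWidget"/>
</Routes>
}

ClassMethod CreateWidget(pName As %String = "") As %Status
{
    // Called over HTTP: pName is empty, so read it from the request data.
    // Called from a terminal for debugging: pass pName directly and step through.
    If (pName = "") && $IsObject($Get(%request)) {
        Set pName = $Get(%request.Data("name",1))
    }
    // ... rest of the handler ...
    Quit $$$OK
}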

You may have an issue with the format of the date.

What is the datatype of 

  • somedate
  • anotherdatefield

Are they %Date, %TimeStamp, %UTC, or %PosixTime?

I'm not certain, but does MAX(somedate) cause the value to no longer be in a format that would support the following?

WHERE dateField >= anotherDateField
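
If the formats turn out to differ, one way to take format out of the equation is to normalize both sides explicitly, e.g. (table names hypothetical; CAST is standard InterSystems SQL):

SELECT dateField
FROM MyTable
WHERE CAST(dateField AS TIMESTAMP) >=
      (SELECT CAST(MAX(someDate) AS TIMESTAMP) FROM OtherTable)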

When in Analyzer, you can see the actual query being run, after the query runs, by clicking the Show Query button in the toolbar,

which it seems you have done, since you have:

WHERE source.%ID IN (SELECT _DSsourceId FROM MyTable.Listing WHERE _DSqueryKey = 'en3460403753')

You can copy the SQL query from the listing and paste it into System Explorer -> SQL (first removing the WHERE source.%ID IN (...) portion shown above), and take special note of the Runtime mode of the query.

The Runtime mode has the greatest impact on columns that are 

%Date 

%UTC

as these columns will have different values based on the runtime mode (Logical/Display/ODBC).
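
You can see the effect directly with a dynamic statement, where %SelectMode selects the runtime mode (table name is hypothetical; 0 = Logical, 1 = ODBC, 2 = Display):

Set stmt = ##class(%SQL.Statement).%New()
Set stmt.%SelectMode = 0                    ; Logical mode
Do stmt.%Prepare("SELECT DOB FROM Demo.Person WHERE %ID = 1")
Set rs = stmt.%Execute()
If rs.%Next() Write rs.DOB,!                ; e.g. 45000 (internal +$HOROLOG form)

Set stmt2 = ##class(%SQL.Statement).%New()
Set stmt2.%SelectMode = 2                   ; Display mode
Do stmt2.%Prepare("SELECT DOB FROM Demo.Person WHERE %ID = 1")
Set rs2 = stmt2.%Execute()
If rs2.%Next() Write rs2.DOB,!              ; e.g. 03/15/2023 (formatted for display)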