Hi

If you have a simple SQL INSERT statement of the form:

INSERT into {Table} VALUES ('abc','zyx',123,'',NULL)

then Caché SQL will store the '' as $c(0) in the global record, and the NULL as "".

NULL is a valid keyword in SQL, not only for testing conditions such as "WHERE {column} IS NOT NULL" but also in the context of values, as in my INSERT example, and in Class Methods or Class Queries that are projected as SQL Stored Procedures.

So if you have a Class Method projected as a Stored Procedure and you call the stored procedure as follows:

call MyTable_MyProcedure('abc','xyz',123,'',NULL) 

then the values for the parameters will be

p1 = "abc"

p2 = "xyz"

p3 = 123

p4 = $c(0)

p5 = ""
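This distinction matters inside the method body too. Here is a minimal sketch of a projected method that tests for both cases (the class context and names are mine, for illustration):

ClassMethod MyProcedure(p1 As %String, p2 As %String, p3 As %Integer, p4 As %String, p5 As %String) As %String [ SqlProc ]
{
    // '' (the SQL empty string) arrives as $c(0)
    if p4=$c(0) write "p4 was the SQL empty string",!
    // NULL arrives as the ObjectScript empty string ""
    if p5="" write "p5 was SQL NULL",!
    quit "OK"
}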

Yours

Nigel

Hi

The way that I have dealt with this in the past is as follows:

1) Create a DTL that accepts the incoming HL7 message as its source and a target class of EnsLib.HL7.Message with a message structure of 2.5.1:ACK. I am using HL7 2.5.1 but it will work with any version of HL7 from 2.3 upwards (probably earlier versions as well, but I have not worked with versions earlier than 2.3).

2) When you invoke the Transform() method of a DTL you will notice that there are three parameters:

  1. pRequest (which is your source message)
  2. pResponse (which is the generated target message)
  3. aux

The documentation says that if the transform is invoked from a Business Rule then aux is an object that contains information about the Rule that invoked the Transform, plus a couple of other properties. However, if you are invoking the transform from Caché ObjectScript, aux can be an instance of a class you create.

The way that I use 'aux' is as a mechanism for getting information into the DTL that is not present in either the source or target objects. In this example I want to send in the ACKCode and the ACKMessage.

So my DTL class looks like this (I apologise for having to paste in the class code, but I have never found a way to attach classes to a Community reply):

Class Example.Transformations.CreateNACKDTL Extends Ens.DataTransformDTL [ DependsOn = EnsLib.HL7.Message ]
{

Parameter IGNOREMISSINGSOURCE = 1;

Parameter REPORTERRORS = 1;

Parameter TREATEMPTYREPEATINGFIELDASNULL = 0;

XData DTL [ XMLNamespace = "http://www.intersystems.com/dtl" ]
{
<transform sourceClass='EnsLib.HL7.Message' targetClass='EnsLib.HL7.Message' sourceDocType='2.5.1:ADT_A01' targetDocType='2.5.1:ACK' create='new' language='objectscript' >
<assign value='source.{MSH:FieldSeparator}' property='target.{MSH:FieldSeparator}' action='set' />
<assign value='source.{MSH:EncodingCharacters}' property='target.{MSH:EncodingCharacters}' action='set' />
<assign value='source.{MSH:SendingApplication.NamespaceID}' property='target.{MSH:SendingApplication.NamespaceID}' action='set' />
<assign value='source.{MSH:SendingApplication.UniversalID}' property='target.{MSH:SendingApplication.UniversalID}' action='set' />
<assign value='source.{MSH:SendingApplication.UniversalIDType}' property='target.{MSH:SendingApplication.UniversalIDType}' action='set' />
<assign value='source.{MSH:SendingFacility.NamespaceID}' property='target.{MSH:SendingFacility.NamespaceID}' action='set' />
<assign value='source.{MSH:SendingFacility.UniversalID}' property='target.{MSH:SendingFacility.UniversalID}' action='set' />
<assign value='source.{MSH:SendingFacility.UniversalIDType}' property='target.{MSH:SendingFacility.UniversalIDType}' action='set' />
<assign value='source.{MSH:ReceivingApplication.NamespaceID}' property='target.{MSH:ReceivingApplication.NamespaceID}' action='set' />
<assign value='source.{MSH:ReceivingApplication.UniversalID}' property='target.{MSH:ReceivingApplication.UniversalID}' action='set' />
<assign value='source.{MSH:ReceivingApplication.UniversalIDType}' property='target.{MSH:ReceivingApplication.UniversalIDType}' action='set' />
<assign value='source.{MSH:ReceivingFacility.NamespaceID}' property='target.{MSH:ReceivingFacility.NamespaceID}' action='set' />
<assign value='source.{MSH:ReceivingFacility.UniversalID}' property='target.{MSH:ReceivingFacility.UniversalID}' action='set' />
<assign value='source.{MSH:ReceivingFacility.UniversalIDType}' property='target.{MSH:ReceivingFacility.UniversalIDType}' action='set' />
<assign value='$tr($zdt($h,3),",: ","")' property='target.{MSH:DateTimeOfMessage}' action='set' />
<assign value='source.{MSH:Security}' property='target.{MSH:Security}' action='set' />
<assign value='source.{MSH:MessageControlID}' property='target.{MSH:MessageControlID}' action='set' />
<assign value='"ACK"' property='target.{MSH:MessageType.MessageCode}' action='set' />
<assign value='source.{MSH:MessageType.TriggerEvent}' property='target.{MSH:MessageType.TriggerEvent}' action='set' />
<assign value='"ACK"' property='target.{MSH:MessageType.MessageStructure}' action='set' />
<assign value='source.{MSH:ProcessingID}' property='target.{MSH:ProcessingID}' action='set' />
<assign value='source.{MSH:VersionID}' property='target.{MSH:VersionID}' action='set' />
<assign value='source.{MSH:SequenceNumber}' property='target.{MSH:SequenceNumber}' action='set' />
<assign value='source.{MSH:ContinuationPointer}' property='target.{MSH:ContinuationPointer}' action='set' />
<assign value='source.{MSH:AcceptAcknowledgmentType}' property='target.{MSH:AcceptAcknowledgmentType}' action='set' />
<assign value='source.{MSH:ApplicationAcknowledgmentTyp}' property='target.{MSH:ApplicationAcknowledgmentTyp}' action='set' />
<assign value='source.{MSH:CountryCode}' property='target.{MSH:CountryCode}' action='set' />
<assign value='source.{MSH:PrincipalLanguageOfMessage}' property='target.{MSH:PrincipalLanguageOfMessage}' action='set' />
<assign value='source.{MSH:AltCharsetHandlingScheme}' property='target.{MSH:AltCharsetHandlingScheme}' action='set' />
<assign value='aux.ACKCode' property='target.{MSA:AcknowledgmentCode}' action='set' />
<assign value='aux.ACKMessage' property='target.{MSA:TextMessage}' action='set' />
</transform>
}

}

My AUX class definition looks like this:

Class Example.Transformations.CreateNACKDTL.AUX Extends %Persistent
{

Property ACKCode As %String;

Property ACKMessage As %String(MAXLEN = 200);

}

To generate the HL7 ACK my code reads:

ClassMethod GenerateACKMessage(pRequest As EnsLib.HL7.Message, ByRef pResponse As EnsLib.HL7.Message, pACKCode As %String(VALUELIST=",CA,CE,CR,AA,AE,AR") = "AA", pACKMessage As %String(MAXLEN=500) = "") As %Status
{
    set tSC=$$$OK
    try {
        set aux=##class(Example.Transformations.CreateNACKDTL.AUX).%New()
        set aux.ACKCode=pACKCode,aux.ACKMessage=pACKMessage
        set tSC=##class(Example.Transformations.CreateNACKDTL).Transform(pRequest,.pResponse,.aux) if 'tSC quit
    }
    catch ex {set tSC=ex.AsStatus()}
    quit tSC
}
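Calling it is then a one-liner; the class that holds GenerateACKMessage is hypothetical here, and pRequest is assumed to hold the inbound message:

set tSC=##class(Example.Util.ACKBuilder).GenerateACKMessage(pRequest,.tACK,"AE","Unable to process message")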

Notice that the DTL takes the SendingFacility, SendingApplication, ReceivingFacility and ReceivingApplication and swaps them around.

You should keep the MessageControlID the same as the incoming HL7 Message MessageControlId so that the ACK can be linked to the original HL7 request. 

The MessageTimeStamp can be updated to the current date/time.

The 'aux' mechanism is very useful. Unfortunately the documentation gives it just one line of comment, saying "if the transform is called from ObjectScript aux can contain anything you want", or words to that effect.

So I tested it in a simple example like the one above and it does indeed work, and I now use it in the 30+ DTLs I am working with at the moment.

Nigel

Hi

Even though LabTrak is not really covered in this group, let me give you a little insight into how LabTrak works:

The LabTrak data is stored in a number of globals. There are a couple of key globals that you need to be aware of:

^TEPI

^TDEB

^THOS

^TEPI contains all of the LabTrak Episodes, and within each episode there is a sub-array of Test Sets and within that a sub-array of Test Items.

The global structure was designed before Caché Objects existed, so the globals use delimiters to separate one field from another. Once you can navigate the global structure you will find that the fields either contain data (String, Integer, Boolean etc.) or codes that point to one of about 50 code tables.

All of the logic of LabTrak is written in Caché ObjectScript routines.

Most of the routines are hidden in the sense that the source code is not installed, just the compiled code. However there are some callable entry points into the key areas of the application.  Depending on what you want to do there are appropriate labels that can be called that will retrieve data, insert, update or delete data. You really don't want to play with these routines unless you have been given training by Trak or the InterSystems Trak Sales Engineers.

The data structures (globals) have been mapped to classes, so there are classes for all of the different logical components of the database. You can run SQL queries against these tables, but you absolutely do not want to use INSERT, UPDATE or DELETE SQL statements: all updates to the database are controlled through the entry points I mentioned before.

For basic retrieval of data your best bet is to use the table definitions. You need to create an InterSystems ODBC DSN to connect to the LabTrak database; once connected you can view the various schemas and the tables within them, and then write SQL SELECT statements to get the data you are looking for.

I would recommend that if you want to access data in LabTrak you use the Disaster Recovery mirror of the LabTrak database. The DR database(s) are real-time copies of the production database, and most reports are run against the DR data rather than production. LabTrak is a very extensive and complicated application, finely tuned by the Trak experts to run at optimal efficiency and with proper data integrity. You don't want to run large SQL queries against the production database, as you will potentially affect the performance of the system; it is far better to run them against the DR servers. The system operators would need to give you access to the appropriate servers, and bear in mind that you are working with sensitive, confidential patient data that must be respected at all times.
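For example (the schema, table and column names below are purely illustrative; the real LabTrak names will differ):

SELECT EpisodeNumber, PatientSurname, CollectionDate
FROM SomeLabTrakSchema.Episode
WHERE CollectionDate >= '2024-01-01'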

Every LabTrak site has customer-specific routines that can be edited. These routines contain 'insert', 'update' and 'delete' labels, and you can write code in these labels to pull data from the database and use it, for example, to create HL7 messages or to create entries in a queue that can be processed by an Ensemble Production.

The application supports HL7 interoperability that can be used to pass HL7 data into LabTrak and to generate HL7 result messages to send out from LabTrak, but again you would need the appropriate training to understand how that functionality works.

As has been mentioned in one of the other replies, your best bet is to connect with your LabTrak Project Manager or Sales Engineer to find out more, and to find out what training material and documentation exists.

Good Luck

Nigel

Hi

The type of content most appreciated by members of the community is an article on some aspect of the technology (Caché, Ensemble, IRIS), or on a specific usage of the technology that you have worked with, understand well, and maybe have learnt a few tips and tricks about that are not covered by the core product documentation. This is especially true when you happen to have used one of the more obscure features of the technology in a development project, battled to get it to work, found the documentation to be lacking, found few if any posts on the subject in past community threads, and found no examples in the various samples supplied in the Caché/Ensemble/IRIS "Samples" namespace or in the "Dev" directory under ../intersystems/..../dev.

To give you an example: some years back I was working on a project where we were building a prototype robot. I was writing the Ensemble component that would interface with the outside world and translate those instructions into calls to a Java code base that controlled the motors, sensors and other mechanical components of the robot. The developer I was working with knew all of the Java stuff and I knew all the Ensemble stuff, and to make the two technologies talk to each other we had to make use of the Java Gateway. We read the documentation. It seemed straightforward enough. I had had a lot of experience working with most of the Ensemble adapters, so I was expecting things to go smoothly.

But they didn't. We re-read the documentation. We looked at the examples, we asked questions in the Developer Community, we contacted WRC, but still we could not get it to work. Eventually my colleague found a combination of bits and pieces of the Java Gateway that he merged together, and we got the interface working.

To this day I still don't understand why the gateway did not work the way the documentation said it should, and I don't exactly understand why the solution we eventually put in place did work.

At the time we were still experimenting with the Java Gateway and realised that the documentation only took us so far. It would have been great to have found an article in the Developer Community written by someone who had used the Gateway, had found the specific things that needed to be set up correctly for it to work, had included some code snippets, and so on. If I had found such an article and it had helped us get our Gateway to work (we struggled with it for 2 months; it should have taken 2 days), I would have sent that person a bottle of our famous South African artisan gin and a large packet of the South African delicacy "biltong" (dried beef, kudu, springbok or ostrich meat) as a thank you.

These days the focus is on IRIS and IRIS for Health. There is huge interest in FHIR and the various interoperability options for IHE, PIX, PDQ, HL7, DICOM, CDA and so on.

I have been quite active on the DC for many years, and since the Covid-19 lockdown I have had more time to answer questions. I too am thinking of articles, code solutions and the like that I can write up or package for the Open Exchange applications page. I have even got to the point where I have invested in some lighting gear and a tool called Doodly, which allows you to create animated videos to explain concepts or processes. I hope to start publishing some of these articles in the near future.

So I hope that these observations will encourage you to find good subject material to write up and publish.

Nigel

To add to the above reply: %Status is not just a Boolean TRUE (1) or FALSE (0). It is a complex structure where, if the status is an error, it is represented as 0 followed by a $list of all of the error information.
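A quick way to see this in the Terminal (the error text is illustrative):

set sc=$system.Status.Error(5001,"Something failed")
write $system.Status.GetErrorText(sc),!
zwrite sc  ; shows the underlying 0_$lb(...) structure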

If you want to display the error text in SQL then create a calculated field alongside the field that stores the %Status value. Say the status is stored in MyClass.Status: add a property StatusErrorMessage, flag it as 'Calculated', and implement its Get method:

Property StatusErrorMessage As %String(MAXLEN = 200) [ Calculated ];

Method StatusErrorMessageGet() As %String
{
     quit $system.Status.GetErrorText(..Status)
}

Alternatively, populate a plain stored field from the %OnBeforeSave() callback or an SQL trigger.

Nigel

Hi

The way that I have done this in the past is as follows:

1) Export the classes/DTL's/CSP Pages etc into an Export XML file.

2) Create an array of the strings you want to identify

 e.g.

set array({string1})="",array({string2})=""....array({stringN})=""

ClassMethod FindStrings(pFile As %String = "", ByRef pList As %String) As %Status
{
    set tSC=$$$OK
    try {
        open pFile:("R"):0
        if '$test set tSC=$$$ERROR(5001,"Unable to open file: "_pFile) quit
        for {
            use pFile read line
            set x="" for {
                set x=$o(pList(x)) quit:x=""
                if line[x {
                    // do whatever you want to do when you find a line that
                    // contains one of the string values you are searching for
                }
            }
        }
    }
    catch ex {
        if $ZE["<ENDOFFILE>" {set tSC=$$$OK}
        else {set tSC=ex.AsStatus()}
    }
    close pFile
    quit tSC
}

Call the method as follows:

set sc=##class(MyClass).FindStrings({File_Name},.array)

Yours

Nigel

Hi

Just to pad out that example:

set gbl="^%SYS",x="",file="c:\temp\myfile.txt"

open file:("WNS"):0

else  w !,"Unable to open file" quit

use file

for  set x=$o(@gbl@(x)) q:x=""  zw @$ZR w !

close file

for multiple globals:

set file="c:\temp\text.txt"

open file:("WNS"):0

else  w !,"Unable to open file" quit

use file

for gbl="^ABC","^XYZ","^PQTY","^%Nigel" set x="" for  set x=$o(@gbl@(x)) w:x="" !! q:x=""  zw @$ZR

close file

Nigel

Hi

There are system utilities that allow you to retrieve a list of globals based on a wildcard.

Here is some code that gets a list of Globals in a namespace. You can modify it to suit your needs:

Method GetGlobalList(ByRef globallist As %String(MAXLEN=3000) = "", globalnameprefix As %String) As %Status
{
    set $ztrap="Error",tSC=$$$OK,globallist=""
    set gbl=""
    for {
        set gbl=$o(^$GLOBAL(gbl)) q:gbl=""
        if $e(gbl,1,$l(globalnameprefix))=globalnameprefix set globallist=globallist_gbl_"," // simple prefix match
    }
    set globallist=$e(globallist,1,$l(globallist)-1)
End ;
    quit tSC
Error ;
    set $ztrap="",tSC=$$$ERROR(5001,"Code Error: "_$ze) goto End
}

This uses old style $ztrap error handling and would be better written as a TRY/CATCH
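For example, a TRY/CATCH version might look like this (a sketch that keeps the same behaviour):

ClassMethod GetGlobalList(ByRef globallist As %String = "", globalnameprefix As %String) As %Status
{
    set tSC=$$$OK,globallist=""
    try {
        set gbl=""
        for {
            set gbl=$order(^$GLOBAL(gbl)) quit:gbl=""
            if $extract(gbl,1,$length(globalnameprefix))=globalnameprefix set globallist=globallist_gbl_","
        }
        set globallist=$extract(globallist,1,*-1)
    }
    catch ex {
        set tSC=ex.AsStatus()
    }
    quit tSC
}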

I hope this helps

Yours

Nigel

I agree with Robert. I have seldom if ever seen code that calls the ResultSet Close() method. There may be temporary globals left over in the CacheTemp database, but if I am not mistaken there is a system purge task that runs as part of the standard system tasks to clear down these temporary structures. However, if you want to complete the ResultSet functionality from start to finish then there is no harm in calling the Close() method.

If the servers can 'see' each other (let's call them Server A with Namespace A, and Server B with Namespace B), you do the following:

1) From the Management Portal on Server A use the System->Configuration->ECP Settings->ECP Data Servers and Add a new Server connection to Server B

2) Then, within the System Configuration, use Add Remote Database to create a database definition for the database or databases that Namespace B uses. (Typically a namespace will be linked to one database that contains both the application code (classes and routines) and the application data (globals); however, it is also possible that the application classes and routines live in one database and the data in another.)

3) On Server A create a new Namespace that links to the Database(s) you have mapped in step (2)

4) In the namespace definition of Namespace A on Server A, use Package Mapping to map the class, selection of classes or class package into Namespace A. If the class you want to interact with is linked to an underlying global definition (Data, Index and Stream) then use Global Mapping to map the globals used by the class into Namespace A.

5) The class (call it {PackageB}.{ClassNameB}) will now appear as a class in Namespace A, and if you run the methods of {PackageB}.{ClassNameB} the code will execute in the Caché job on Server A, but any data retrievals or saves will be executed against the global(s) in the globals database of Namespace B on Server B.
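Once the mappings are in place, the remote class is used exactly as if it were local. For example (the names are the placeholders from above):

set obj=##class(PackageB.ClassNameB).%New()
set obj.SomeProperty="abc"
set tSC=obj.%Save()  ; the global write happens on Server B via ECP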

There may be licensing implications to use ECP but I am not an expert on current InterSystems licensing models

The other way would be to expose the methods of Class B in Namespace B on Server B as a Web Service on Server B, and then invoke those methods from Server A by making HTTP calls to the Web Service.

Nigel 

Hi

No, the OnInit() method is called when the Business Service starts up, and OnTearDown() is invoked when the Business Service stops. OnInit() is not aware of any request message at that point. ProcessInput(), and more specifically OnProcessInput(), is the first time you get to see the incoming request message, and it is in OnProcessInput() that you decide what to do with the request HL7 message: whether you route it to a Business Process based on the Ensemble HL7 Passthrough architecture, or pass it to a custom Business Process. I have made the assumption here that your Business Service is a conventional HL7 TCP Inbound Adapter based service.
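For a conventional service the skeleton looks something like this (a sketch; the router target name is a placeholder):

Method OnProcessInput(pInput As EnsLib.HL7.Message, Output pOutput As %RegisteredObject) As %Status
{
    // First point at which the incoming HL7 message is visible.
    // Route it on to a router or process; "MyHL7Router" is a placeholder name.
    quit ..SendRequestSync("MyHL7Router",pInput,.pOutput)
}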

If however it is an HTTP REST service then that is another matter altogether. If it is an HTTP REST service then by default the request will be handled by the %CSP.REST Dispatch class. The basic functionality of the %CSP.REST class is to route HTTP requests to a Business Service. You can inherit the %CSP.REST class into your own REST dispatcher class. 

I have a REST Dispatcher class in Ensemble that emulates the IRIS FHIR End Point functionality.

I have 4 csp applications defined:

/csp/endpoint

/csp/endpoint/endpointA

/csp/endpoint/endpointB

/csp/endpoint/EndPointC

All 4 csp applications invoke my custom Rest.Dispatch class (which inherits from %CSP.REST) 

I have a Business Service Class named BusinessService.MyEndPointService

I create 4 Business Services in my production

End Point General (/csp/endpoint)

End Point A (/csp/endpoint/endpointA)

and so on

In the Rest.Dispatch class I look at the request URL, and based on the URL I invoke the OnRequest() method of the appropriate Business Service, using the production service name.
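A minimal sketch of that dispatch pattern follows. The class, application and service names are mine, and I am using Ens.Director to get a service instance and calling its ProcessInput() method, which achieves the same routing:

Class Rest.Dispatch Extends %CSP.REST
{

XData UrlMap [ XMLNamespace = "http://www.intersystems.com/urlmap" ]
{
<Routes>
<Route Url="/(.*)" Method="POST" Call="Dispatch"/>
</Routes>
}

ClassMethod Dispatch(pPath As %String) As %Status
{
    set tSC=$$$OK
    try {
        // %request.Application is the csp application the caller hit, e.g. /csp/endpoint/endpointA/
        set tService=$case(%request.Application,"/csp/endpoint/endpointA/":"End Point A","/csp/endpoint/endpointB/":"End Point B",:"End Point General")
        set tSC=##class(Ens.Director).CreateBusinessService(tService,.tHost) quit:'tSC
        set tSC=tHost.ProcessInput(%request.Content,.tOutput)
    }
    catch ex {set tSC=ex.AsStatus()}
    quit tSC
}

}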

However, as I am writing this, something in the back of my mind tells me that if you are using the EnsLib.HL7.TCP adapter you can reroute an incoming message to another service, but I would have to go back to the documentation to remind myself of exactly what you can do and how it works.

The most common way that developers normally use is the EnsLib.HL7.MsgRouter architecture, where you create routing rules that test the MSH message structure and route the message to a specific Business Process for that specific message type. This is all handled through the Management Portal->Ensemble->Build set of functions, which allow you to create Business Processes, Business Rules, Transformations and so on.

If you are using HTTP REST and want more information on how I have implemented that, I can send you a far more detailed description of my implementation.

Nigel

Historically Caché and Ensemble did not support WebSockets, and so you could not have two processes using the same (incoming) port. If I remember correctly, IRIS does support WebSockets, and though I can't remember exactly how they work, something in the depths of my mind tells me that WebSockets were aimed at this specific requirement.

Check out the IRIS documentation on WebSockets

Nigel

Hi

Business Services support an OnInit() method and in that method you could write some code like this:

In the Business service add the following Parameters:

Parameter StartPort = 51000;

Parameter LastPort = 51100;

Parameter SETTINGS = "StartPort:Basic,LastPort:Basic";

/// You will need a mapping table global of the form ^BusinessServiceProductionNames({Business_Service_ClassName})={Production_Item_Name}

Method OnInit() As %Status
{
    set tSC=$$$OK
    try {
        set tPort=..#StartPort,found=0
        while 'found {
            if $l($g(^BusinessServicePorts(tPort))) {
                // port already claimed by another service, try the next one
                set tPort=tPort+1
                if tPort>..#LastPort {set tSC=$$$ERROR(5001,"Cannot find available Port") $$$ThrowStatus(tSC)}
                else {continue}
            }
            else {
                set ^BusinessServicePorts(tPort)=$classname(),^BusinessServicePortsIndex($classname())=tPort
                set ..Adapter.Port=tPort,found=1 quit
            }
        }
    }
    catch ex {set tSC=ex.AsStatus()}
    if 'tSC!('found) set ..Adapter.Port=""
    quit tSC
}

Method OnTearDown() As %Status
{
    set tSC=$$$OK
    try {
        if ..Adapter.Port'="" {
            kill ^BusinessServicePorts(..Adapter.Port),^BusinessServicePortsIndex($classname())
        }
    }
    catch ex {set tSC=ex.AsStatus()}
    quit tSC
}

The only thing you would need to do now is notify the client application which port it should use to access a particular Business Service. I would suggest a WebService that accepts a request consisting of a property "BusinessServiceName" and returns a property "BusinessServicePort" populated from ^BusinessServicePortsIndex({Business_Service_ClassName}).

You would need to supply the client with a list of the available Business Service names and what each Business Service does. Again, you could do this with a WebService which the client invokes and which returns a list of Business Services and a description of each one's function.

This solution does make certain assumptions such as the client being willing/able to introduce the code to invoke the WebServices to get the list of Available Services and the Port number for a specific Business Service

The reason you need a 'LastPort' parameter is to ensure that the Business Service will not get into a loop testing every port from 51000 through to whatever the last possible port number is in the TCP port range.

Nigel

Ah, ok

I'm afraid I don't have an easy answer for this one. I just tried creating an abstract class with all of the comments indicating the contents and structure and then inherited it into another new persistent class but all of the /// comments did not load into the new class. I tried compiling the class and then used the View-Class Documentation in the hope that the comments would appear in the documentation but they did not :-( 

Back to the drawing board

Nigel

As with everything in VS Code, you can link code snippets at a USER level or at a WORKSPACE level.

Another trick is to write snippets of code as #define macros in an INCLUDE file and then reference the snippet using the syntax $$${macro_name}.

Then Include the include file in the classes you write.

If you work your way through the InterSystems include files you will find many examples where the ISC developers have done exactly this.
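For a simple one-line macro the pattern looks like this (the macro name and include file are my own invention):

#; MyMacros.inc
#define NowUTC $zdatetime($ztimestamp,3)

Then, in any class that starts with 'Include MyMacros':

write $$$NowUTC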

Here is an example of a #define for a block of code

#define JSONError400(%ErrorResponseMessage) ##Continue
   Set %response.Status=..#HTTP400BADREQUEST ##Continue
   Set:'$D(tErrorResp) tErrorResp=##class(PJH.HST.JSON.Proxy.ErrorStatus).%New() ##Continue
   Set tErrorResp.error=%ErrorResponseMessage_" Contact system administrator" ##Continue
   $$$WriteJSONError
   ;Do ##class(%ZEN.Auxiliary.jsonProvider).%ObjectToJSON(tErrorResp,.tVisit,,"ltw")

The ##Continue at the end of each line tells the .inc preprocessor to continue the macro definition on the next line.

Then in your code you just write the line as:

     $$$JSONError400(%ErrorResponseMessage)

(The macro expands to complete Set commands, so it is not prefixed with 'do'.)

      .......

Nigel

We tend to forget that %SYS contains a whole load of utilities that have been around for years. Some of the utilities were wrapped into classes that could be invoked through $system (i.e. the %SYSTEM Package).

Some of the utilities are still invoked by the Management Portal. 

I answered another issue that came up in the Dev Community about how you can determine which classes and routines in %SYS are InterSystems code and which may have been written by a developer many years ago, when it was commonplace for some applications to write %Routines or %Classes, especially in the MUMPS and very early Caché days when things like package mapping didn't exist.

The global utilities ^%G, ^%GIF and ^%GOF I still use periodically because they have such useful masking features, as pointed out in other messages in this thread. Likewise ^%RO, ^%RI, ^%RD, ^ROUTINE and ^%RDELETE can be useful, particularly if you want to export the .obj compiled routine code rather than the .int version.

The commands ZLOAD, ZPRINT, ZINSERT, ZREMOVE and ZSAVE are useful for writing small bits of routine code, and once in a while I will use ZLOAD and ZPRINT to see what code is in the routine buffer, especially if I am testing some code and get an "<UNDEF> zMyClass.int.1 zMyMethod+10" error status. Normally you open the class in Studio, select View -> Other Code and then find the label and offset, which is fine, but sometimes it is just quicker to execute the commands ZLOAD MyRoutine.int.1  ZPRINT zMyMethod+10.

Utilities like ^LOCKTAB can be useful if you want to see what's in the Lock Table and maybe release a lock. I have used this when the Management Portal fails to load properly because there is a rogue Caché process consuming all available resources.

^RESJOB is useful for killing off unwanted processes, e.g. rogue processes.

The $system.Security APIs can be very useful if you want to manipulate Caché users programmatically. For example, your application may maintain its own user/role classes and UI, but you also want the application roles and users to be created or updated as Caché roles and users.
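For example, creating a user from code might look like this. This is a sketch: it must run in the %SYS namespace, the values are invented, and you should check the Security.Users class documentation for the exact Create() signature on your version:

zn "%SYS"
set tSC=##class(Security.Users).Create("jbloggs","MyAppRole","S3cretPwd","Joe Bloggs")
if 'tSC write $system.Status.GetErrorText(tSC),!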

One of the issues with the Management Portal is that it complies with the CSP session timeout limit. Take the example of attempting to import a large global export file; it can go wrong for two reasons:

1) It only displays the first N global names in the GOF file, so even though you have clicked the "select all" check box it seems to select only those globals that have been displayed, and there is no way of increasing the number of rows that it displays.

2) If the load takes too long, the CSP session will expire, the page becomes unresponsive, and you will find that only the globals that were 'checked' have been loaded. If instead you select 'run in background' and check 'select all', then when it creates the background task it passes 'true' for the 'select all globals' parameter, which gets around the issue of selecting only the globals visible in the form; secondly, the background task has no implicit timeout associated with it, so it will run for as long as it needs to. In reality it is ultimately invoking an entry point in ^%GIF to do the import.

I seem to remember that the Caché and Ensemble documentation used to have a section on "legacy" utilities covering all of the ones I have listed here, along with ^DATABASE, ^JOURNAL and so on, but I don't think that has carried over into the IRIS documentation.

Nigel

Hi

It has been a generally understood convention for many, many years that software houses and developers should not write classes or routines in the MGR (%SYS) database/namespace. The reason is that %SYS contains all of the Caché/Ensemble/IRIS system management classes and routines. By default, any class or routine that is prefixed with the '%' character is mapped to all other namespaces in the Caché/Ensemble/IRIS instance. In earlier versions of Caché many of the system configuration, security and operations utilities were written as Caché ObjectScript routines. Over the years almost all of those routines and classes have been grouped into packages such as %SYSTEM and are accessed via the Management Portal or through $system ($system is a shorthand way of referring to ##class(%SYSTEM.{Classname}).{MethodName}()). Any classes or routines that are not prefixed with a '%' character are not mapped to other namespaces and can only be run in %SYS.

The InterSystems developers used class or routine names that typically reflect the functionality of the class/routine; for example, ^LOCKTAB, ^DATABASE and ^JOURNAL are routines that allow you to manage the Lock Table, databases and journalling. It was therefore always considered unwise to write classes or routines in %SYS, due to the possibility that InterSystems might introduce a new class or routine that coincidentally has the same name as one your developers have written.

The general advice given to developers was as follows:

1)  If you need a routine or class to be accessible across many or all namespaces then create a database and namespace named "{Application}System" i.e. "MyCompanySystem" and then use the Namespace Routine and Package mapping feature of Namespace definitions to map those routines and/or classes to the desired namespaces where access to those routines/classes is required

2) If you absolutely have to write the classes and/or routines in %SYS then prefix the routine/class name with a 'z' (specifically a lowercase 'z'), as in ^%zMyRoutine or ^zMyOtherRoutine. InterSystems developers in turn would never write system classes/routines with a 'z' prefix.

3) All InterSystems routines and classes contain comments right at the start of the routine/class specifically saying that the code has been written by InterSystems, typically reading as follows:

LOCKTAB  ; LOCKTAB Utility  ; JO2010, 2/19/06
         /*
         +--------------------------------------------------------+
         | Copyright 1986-2015 by InterSystems Corporation,       |
         | Cambridge, Massachusetts, U.S.A.                       |
         | All rights reserved.                                   |
         |                                                        |
         | Confidential, unpublished property of InterSystems.    |
         |                                                        |
         | This media contains an authorized copy or copies       |
         | of material copyrighted by InterSystems and is the     |
         | confidential, unpublished property of InterSystems.    |
         | This copyright notice and any other copyright notices  |
         | included in machine readable copies must be reproduced |
         | on all authorized copies.                              |
         +--------------------------------------------------------+
         */

4) Use the %Dictionary.ClassDefinition class queries and methods to identify all classes in the %SYS namespace. When you open the class definition of a system class you will see that there is a property 'System' which, if true, indicates that it is an InterSystems system class (see the example after this list).

5) You can run the utility ^%ROU, which will create a list of all routines in a namespace in a temporary global ^CacheTempUtil("ROU",{Name}). The entry point is the label DIR, as in do DIR^%ROU({directory_name}).

6) The databases 'CACHELIB', which contains all of the Caché utility classes, and 'ENSLIB', which contains the Ensemble utility classes, are READ ONLY databases, so you cannot create routines or classes in them. %SYS on the other hand has READ/WRITE permissions, and so it was not untypical for developers to write routines in %SYS, particularly in very early versions of Caché that did not support namespace package mapping.
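As an example of point (4), something along these lines lists the non-system classes in the current namespace (a sketch; I filter in code to keep the SQL simple):

set tRS=##class(%SQL.Statement).%ExecDirect(,"SELECT Name FROM %Dictionary.ClassDefinition")
while tRS.%Next() {
    set tDef=##class(%Dictionary.ClassDefinition).%OpenId(tRS.Name)
    if $isobject(tDef),'tDef.System write tDef.Name,!
}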

I trust this helps

Nigel