cpf files.

All classes in the Config package modify one of the CPF files (the active one by default). But you also need to save and activate the modified config file. Dmitry's solution does all of that by default, which is fine if you want to add one mapping; but if you want to modify several Config settings, it's better to modify them all first and then activate the changes in one go. For example, here's how to map several packages to %ALL:

#include %syConfig
set namespace = "%ALL"
set namespace = $zcvt(namespace, "U")
for package = "Package1","Package2","Package3" {
    kill p
    set p("Database") = pFrom // pFrom holds the name of the database to map the packages from
    set sc = ##Class(Config.MapPackages).Create(namespace, package,.p,,$$$CPFSave)
}
set sc = ##Class(Config.CPF).Write()
set sc = ##Class(Config.Map).MoveToActive(namespace)
set sc = ##Class(Config.Namespaces).Load(namespace)

You can add the Ens.ProductionMonitorService service to your production, which provides a monitoring service for production status. Its default behavior is to call UpdateProduction once it notices the production is not up to date. Users can override the OnProcessInput method to customize this behavior.

That would be enough to automatically restart the relevant items and keep the production up to date.

Generally, to access and manipulate information about Ensemble and its hosts, use the Ens.Director class.
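For example, a quick sketch with a few of the commonly used Ens.Director methods (state codes are per the Ens.Director documentation):

```objectscript
// Is a production running in this namespace, and which one?
set sc = ##class(Ens.Director).GetProductionStatus(.production, .state)
// state: 1 - running, 2 - stopped, 3 - suspended, 4 - troubled
write "Production: ", production, ", state: ", state, !

// Push configuration changes to the running production
set sc = ##class(Ens.Director).UpdateProduction()
```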

I usually use this proxy operation, which keeps track of the token:

Class Production.AuthOperation Extends Ens.BusinessOperation
{

Parameter ADAPTER = "EnsLib.HTTP.OutboundAdapter";

Property Adapter As EnsLib.HTTP.OutboundAdapter;

/// Token
Property Key As %String;

/// Number of seconds the key is valid
Property KeyValid As %Integer [ InitialExpression = 7200 ];

/// Time until which the key is valid
Property ValidUntil As %TimeStamp;

Parameter INVOCATION = "Queue";

Parameter SETTINGS = "KeyValid:Basic";

/// Get token
Method OnMessage(request As Ens.Request, Output response As Ens.StringResponse) As %Status
{
    #dim sc As %Status = $$$OK
    set now = $horolog
    if $System.SQL.DATEDIFF("second", now, ..ValidUntil)<=0 {
        // Get new key valid period
        set validUntil = $System.SQL.DATEADD("second", ..KeyValid, now)
        
        // Get key from auth server by sending user/pass
        set input = { "user": (..Adapter.%CredentialsObj.Username), "password": (..Adapter.%CredentialsObj.Password) }
        set sc = ..Adapter.Post(.httpResponse,,input.%ToJSON())
        quit:$$$ISERR(sc) sc
        
        // Parse key from response
        set ..Key = {}.%FromJSON(httpResponse.Data).key
        set ..ValidUntil = validUntil
        
    }
    set response = ##class(Ens.StringResponse).%New(..Key)
    quit sc
}

}
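To consume it, send any request to the operation and read the token from the StringValue of the response. A hedged sketch, assuming "Production.AuthOperation" is the configured item name in your production and the caller is a business process or service:

```objectscript
// From inside another business host:
set sc = ..SendRequestSync("Production.AuthOperation", ##class(Ens.Request).%New(), .tokenResponse)
if $$$ISOK(sc) {
    set token = tokenResponse.StringValue
    // attach the token to outgoing requests, e.g. as an Authorization header
}
```

Because the operation caches the key and only re-authenticates after ValidUntil passes, callers can request the token as often as they like.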

What do you mean by overhead from $lb? 

I mean overhead from

  • Converting the string to/from $lb
  • Converting each character to/from its numeric code

 

What is $c(166), some magic number?

Essentially. 166 (dec) = 10100110 (bin): the top three bits 101 denote the fixstr datatype (a byte array whose length is up to 31 bytes), and the low five bits 00110 (bin) = 6 (dec) are the length of the string.
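In ObjectScript terms, a fixstr header byte is just 160 (10100000 in binary) plus the string length, which is where $c(166) for a 6-character string comes from. A small sketch:

```objectscript
set str = "Fringe"
// fixstr header: 101 prefix (= 160) + 5-bit length
set msg = $c(160 + $length(str)) _ str
write $ascii(msg)  // 166
```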

And why do you encode the string as a list of codes? Why not keep it as a string?

@Maks Atygaev, as an advocate of that approach, can clarify his position.

To elaborate a bit: we are currently discussing two approaches to storing intermediary representations of a MessagePack message.

One approach is to store the message as $lb(code1, code2, ..., codeN), so that each character of the message is stored as its corresponding code in the $lb.

The other approach is to store it as-is in a string, so it would look like: $c(166) _ "Fringe".

Let's say we want to encode the string "Fringe"; the two representations would look like this:

  • $listbuild(166, 70, 114, 105, 110, 103, 101)
  • $c(166) _ "Fringe"

 

I maintain that the as-is approach (with $c) is better, as it incurs less overhead than $lb.

There is no way to directly read 166, 70, 114, ... in Caché; you can only read characters and get their codes from $ascii. So there would be considerable overhead in converting to and from the $lb abstraction.
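A sketch of that overhead: the as-is form already is the wire representation, while the $lb form needs a character-by-character $ascii pass on encode and a $char pass on decode:

```objectscript
// As-is representation: already the MessagePack byte stream
set asis = $c(166) _ "Fringe"

// $lb representation: every byte must be converted via $ascii on encode...
set list = ""
for i=1:1:$length(asis) {
    set list = list _ $listbuild($ascii(asis, i))
}

// ...and back via $char on decode
set decoded = ""
for i=1:1:$listlength(list) {
    set decoded = decoded _ $char($list(list, i))
}
write decoded = asis  // 1
```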

2.  Write a routine to copy (duplicate) all the data from old classes to new ones. (which is not trivial at some point)

Why is it not trivial? If you're not changing the class definition, three merges per class (the data, index, and stream globals) should be enough.
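Assuming default storage, that looks like the following (the global names here are hypothetical; check the storage definitions of your classes for the actual ones):

```objectscript
// Copy data, indices and streams from the old extent to the new one
merge ^New.ClassD = ^Old.ClassD  // data global
merge ^New.ClassI = ^Old.ClassI  // index global
merge ^New.ClassS = ^Old.ClassS  // stream global
```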

Also check that the storage definition in the new class is the same as in the old class, only pointing to the new globals.

MANAGEDEXTENT=0 may be easier to do, but it raises the risk of someone accidentally deleting the old globals ("what are these globals that don't correspond to any class? Let's free some space").
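For reference, MANAGEDEXTENT is a class parameter; with 0 the extent's globals are not registered with the extent manager, which is what makes them look "orphaned" to cleanup tools (class name below is illustrative):

```objectscript
Class New.MyClass Extends %Persistent
{

/// 0 - do not register this extent's globals with the extent manager,
/// allowing several classes to point at the same globals
Parameter MANAGEDEXTENT = 0;

}
```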

On previous versions you can use ^%GSIZE utility:

do ^%GSIZE
Directory name: c:\intersystems\cache\mgr\mdx2json\ =>
 
All Globals? No => No
Global ^Git("TSH")
Global ^
1 global selected from 54 available globals.
Show details?? No => Yes
Device:
Right margin: 80 =>

directory: c:\intersystems\cache\mgr\mdx2json\
Page: 1                           GLOBAL SIZE                        16 Nov 2017
                                                                        11:49 AM
      Global        Blocks       Bytes Used  Packing   Contig.
      --------    --------  ---------------  -------   -------
      Git("TSH")         1              132      2 %         0
 
 
      TOTAL         Blocks       Bytes Used  Packing   Contig.
      --------    --------  ---------------  -------   -------
                         1              132      2 %         0
                                        <RETURN> to continue or '^' to STOP:

I have a Caché-compatible SQL script file, and the queries are separated by white space.

How do you escape white space inside a query, then?

 

Anyway, the general approach is:

set file = ##class(%Stream.FileCharacter).%New()
do file.LinkToFile(filename)
while 'file.AtEnd {
  set query = file.ReadLine() // assuming one query per line
  set rs = ##class(%SQL.Statement).%ExecDirect(, query)
  set sc = rs.%DisplayFormatted(format, outfile)
}

Where:

  • filename - file with queries
  • format - one of  XML, HTML, PDF, TXT or CSV.
  • outfile - file to write results to

I assumed queries are separated by newlines.

Also, outfile needs to change between queries, as %DisplayFormatted recreates it.
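Putting those two caveats together, a hedged sketch that derives a fresh output file per query (outdir is a hypothetical writable directory; still assumes one query per line):

```objectscript
set file = ##class(%Stream.FileCharacter).%New()
do file.LinkToFile(filename)
set counter = 0
while 'file.AtEnd {
    set query = file.ReadLine()
    continue:query=""  // skip blank separator lines
    set counter = counter + 1
    set rs = ##class(%SQL.Statement).%ExecDirect(, query)
    // write each result set into its own file: result1, result2, ...
    // (%DisplayFormatted adds the extension matching the chosen format)
    set sc = rs.%DisplayFormatted(format, outdir _ "result" _ counter)
}
```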