Thanks, Robert!

I realize now that the DR site should be in another cloud and at a geographic distance, given its purpose.

DR is implemented as an async mirror member in InterSystems, and I think network performance/bandwidth is the key factor.

All you mentioned here is fair enough, thanks.

And what I realize in the end is that the network bandwidth requirements depend entirely on the transaction load of the PRIMARY.

The hardware requirements of the DR site are an easier matter - the DR site can have the same configuration as the PRIMARY.

Hi Felipe!

Are you able to edit ANY files with VSCode? With ObjectScript files, it should be the same.

Just to make sure: 

1. Check out your ObjectScript files from the repository into a local folder.

2. Open this folder in VSCode.

3. Set up the connection in VSCode to the InterSystems server and namespace where you want your files to be compiled (see the sample settings after this list).

4. Code it!
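For reference, a minimal sketch of what the connection part of the workspace settings (.vscode/settings.json) can look like. I'm assuming the "cos."-prefixed settings of the ObjectScript extension here; the exact key names, port and credentials depend on your extension version and instance, so treat them as placeholders:

{
    // adjust host/port/ns/credentials to your instance;
    // key names may differ by extension version - check the extension's README
    "cos.conn": {
        "active": true,
        "host": "localhost",
        "port": 57772,
        "ns": "USER",
        "username": "_SYSTEM",
        "password": "SYS"
    }
}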

There are also further steps - commit/push/test/patch/deploy - but that is another story.

Hi Felipe!

As @David Reche mentioned, you can run it from the Command Palette, and as you can see, the Command+F7 hotkey can be used for Import+Compile.

I also like the option:

cos.Autocompile=true;

which gives you compilation on every save (Command+S) and which you can set up in the workspace settings of VSCode.
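In the workspace .vscode/settings.json that would look roughly like this (the exact key casing is an assumption to verify against your extension's documentation):

{
    // enables compile-on-save for this workspace
    "cos.Autocompile": true
}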

Also, make sure that you connect to the right server and the right namespace. See the following gif, which illustrates how it works:

What is great about InterSystems objects is that you know exactly what's going on with your data.

You can look into the generated code for classes with Caché storage and see where and when the class writes data into ^OBJ.DSTIME when the class has

Parameter DSTIME As STRING [ Constraint = ",AUTO,MANUAL", Flags = ENUM ] = "AUTO";

So, there is no magic here: the DSTIME=AUTO parameter introduces "sets" into the ^OBJ.DSTIME global at the places where records are created or updated.
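For illustration, the entries it writes look roughly like this (the subscript layout follows the DeepSee documentation for ^OBJ.DSTIME; the class name and Id are made up, and the filing-operation codes are something to verify on your version):

 ; ^OBJ.DSTIME(<class name>, <DSTIME counter, 0 by default>, <object Id>) = <filing operation>
 ; where the filing operation is 0 = update, 1 = insert, 2 = delete, e.g.
 Set ^OBJ.DSTIME("MyApp.Patient", 0, 17) = 1  ; object 17 of MyApp.Patient was created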

It doesn't work for non-standard SQL storage because the DSTIME parameter simply doesn't know anything about your insert/update/delete procedures.

In your case you can forget about DSTIME and place these sets into a global (or inserts into some Record.ChangeHistory class) at the places where your data is being inserted/updated/deleted.
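A minimal sketch of that idea, assuming a hypothetical Record.ChangeHistory class and your own filing code (all names here are made up):

Class Record.ChangeHistory Extends %Persistent
{

Property ClassName As %String;

Property RecordId As %String;

Property Operation As %String(VALUELIST = ",INSERT,UPDATE,DELETE");

Property ChangedAt As %TimeStamp;

}

and then, inside your own insert/update/delete procedures:

 // "MyApp.Record" and "id" are placeholders for your class name and record id
 Set tHist = ##class(Record.ChangeHistory).%New()
 Set tHist.ClassName = "MyApp.Record", tHist.RecordId = id, tHist.Operation = "UPDATE"
 Set tHist.ChangedAt = $ZDateTime($ZTimeStamp, 3)
 Do tHist.%Save()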

Hi Yaniv!

As @Chris Thompson mentioned, you can use DSTIME for automatic tracking of record changes, which are stored in a special global. But I doubt it works for classes with a non-standard storage schema just by adding DSTIME=AUTO.

How do you add/update records in your application? You need to introduce special calls into your "create/update" procedures to track changes. DSTIME=AUTO does exactly the same, adding "tracking" logic to the SQL and object filing code of the class. Maybe you can make those calls yourself with DSTIME=MANUAL mode.
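If MANUAL mode works for you, the sketch would be roughly: declare the parameter in the class and write the ^OBJ.DSTIME entries from your own filing code (the class name, the id variable and the filing-operation codes below are assumptions to verify against the documentation):

Parameter DSTIME = "MANUAL";

and in the same places where your application creates/updates/deletes a record:

 ; 0 = update, 1 = insert, 2 = delete
 Set ^OBJ.DSTIME("MyApp.MyClass", 0, id) = 0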

@Alexander Koblov, do you know if it is possible to use DSTIME somehow in this case?

Another approach which may work in your case is to use auditing to track the changes users make to the database and then analyze this data, e.g. like here. But I have never done it for a non-standard storage schema.

Hi Harkirat!

You have two questions here :) Let me answer the one about limits for a particular user.

Yes, you can limit the data a user sees via the DeepSee interfaces.

Implement the %OnGetFilterSpec callback in your cube class, which introduces an extra filter on all queries to the cube. You can take the $USERNAME or $ROLES of the current user, tie it to a dimension, and thereby filter the data available to a certain user or group of users. All your dashboards and pivots on that cube will then have this extra filter applied to the data source.
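A minimal sketch of such a callback in the cube definition class - the dimension/hierarchy/level names ([UserD].[H1].[User]) are hypothetical and need to match a dimension you actually build from the user/role data:

ClassMethod %OnGetFilterSpec(pFilterSpec As %String) As %String
{
    // Restrict every query against this cube to the member of the
    // hypothetical [UserD] dimension that matches the current user.
    Quit "[UserD].[H1].[User].&["_$USERNAME_"]"
}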

HTH