Customizing DashboardViewer may cost a lot of testing time, with questionable chances of success.

Alternate approach: Trigger your source control manually.

- create a method to export your changes. Example:

 #; find actual changes (timeModified within roughly the last day)
 &SQL(SELECT LIST(documentName) INTO :list
        FROM %DeepSee_Dashboard.Definition
       WHERE timeModified - NOW() + 1 > 0)
 set sc=$system.OBJ.Export(list,"DeepSee.xml")

- next define an Action (in a KPI) that calls your method:
http://localhost:57774/csp/docbook/DocBook.UI.Page.cls?KEY=D2IMP_ch_action

- next add a Control "button" to your main Widget and attach this Action to it to trigger your source management (a sketch of such a KPI follows below).
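
For illustration, a minimal sketch of such a KPI. All names here are hypothetical placeholders: My.SourceControlKPI, the action ExportDashboards, and My.DashboardExport.ExportChanges() (i.e. the export snippet above wrapped in a class method).

Class My.SourceControlKPI Extends %DeepSee.KPI
{

XData KPI [ XMLNamespace = "http://www.intersystems.com/deepsee/kpi" ]
{
<kpi name="SourceControlKPI">
<action name="ExportDashboards" />
</kpi>
}

/// called when a dashboard control fires one of this KPI's actions
ClassMethod %OnDashboardAction(pAction As %String, pContext As %ZEN.proxyObject) As %Status
{
 if pAction="ExportDashboards" {
   // hypothetical wrapper around the export code shown above
   quit ##class(My.DashboardExport).ExportChanges()
 }
 quit $$$OK
}

}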

I admit it's not an automatic, generic solution like one built directly into DashboardViewer (engineering may have a hint how to use that),
but you get the advantage of deciding which dashboard you want to manage and when,
since not every change to a dashboard needs to go to source control immediately.
And it's definitely simpler than doing it out of Studio.

sorry!

Examining the source, it turns out that this parameter made it neither into the call parameters (ZENURL) nor into the settings.
I did find a bunch of comments  // +DTB103 - source control
but with no hint how to use it.
It might be necessary to build a private copy.
Attention, though: line +7 says:
/// This class should be considered as *internal*; subclassing is not supported.

Hi Evgeny,

Class %DeepSee.UserPortal.DashboardViewer has a property saveAction As %ZEN.Datatype.string, described as:

"Container for the current save mode for source control."

http://localhost:57774/csp/documatic/%25CSP.Documatic.cls?PAGE=CLASS&LIBRARY=samples&CLASSNAME=%25DeepSee.UserPortal.DashboardViewer#PROPERTY_saveAction

If you combine this with the standard $system.OBJ.Export(), this may do it.

SAMPLES>s sc=$system.OBJ.Export("B*.DFI","DeepSee.xml")
 
Exporting to XML started on 03/07/2018 09:36:32
Exporting type : Basic Dashboard Demo.dashboard.DFI
Export finished successfully.

A good friend of mine provided this solution, using both JSON types and the internal $LB() types for a more detailed type analysis.
Of course there's no guarantee for the hidden types in $LB().
 

types ; Show COS-Datatypes ; kav ; 2018-03-04
 //
 set obj=##class(Person).%New()
 write "VALUE",?15,"JSON",?22,"$LISTBUILD",!,$TR($J("",32)," ","-"),!
 for val="453","abcd",1234,"abc"_$c(352),-34,1.3,-7.9,$double(1.25),obj {
   write val,?15,$$TypeOf1(val),?22,$$TypeOf2(val,.txt),txt,! }
 quit
 // Return JSON datatype by the documented way
 //
TypeOf1(val) Public
{
  quit [(val)].%GetTypeOf(0) }
 // Return datatype by the undocumented $LB() way
 //
TypeOf2(val,txt) Public
{
  if $length(val)>253 {
    // a long string stores no type byte; distinguish 8/16-bit by $ZISWIDE
    set typ=$ziswide(val)+1
  } else {
    // the 2nd byte of the $LB() encoding holds the (undocumented) type code
    set typ=$a($lb(val),2)
  }
  set txt=$case(typ
    ,1:" 8bitString"
    ,2:" 16bitString"
    ,4:" nonNegativeInteger"
    ,5:" negativeInteger"
    ,6:" nonNegativeFloat"
    ,7:" negativeFloat"
    ,8:" double"
    , :" ??? never seen before")
  quit typ
}
d ^types
VALUE          JSON   $LISTBUILD
--------------------------------
453            string 1 8bitString
abcd           string 1 8bitString
1234           number 4 nonNegativeInteger
abcŠ           string 2 16bitString
-34            number 5 negativeInteger
1.3            number 6 nonNegativeFloat
-7.9           number 7 negativeFloat
1.25           number 8 double
1@User.Person  oref   1 8bitString

 

The code causing the problem in MyClass.2.int most likely looks like this:

Set oid=$select(i%Myproperty="":"",1:$listbuild(i%Myproperty_""))

The generated code expects an object method, but the actual object reference is missing.
This happens when you try to access a property inside a ClassMethod.
The classic mistake:

ClassMethod MyMethod(p1, p2, p3) As %Status
{
 set x=..Myproperty  // fails: no object instance exists in a ClassMethod
}

The correct use:

ClassMethod MyMethod(oref, p1, p2, p3) As %Status
{
 set x=oref.Myproperty  // the object reference is passed in explicitly
}

Whether you run an external backup, a backup from a mirror member (also an asynchronous one), or a Caché backup, you always have to identify
the point in time when your DB is logically consistent. What I mean by that: no open transactions, no open dependencies.

If you know that point in time, you can separate your async mirror member or shadow and run any backup from there.
Or just shut down your async server instance and run snapshots.
But there might also be a time gap between master and async server.
Once the backup is complete, your async server can join again and catch up, however long that may take.
The critical point is to know when the async server has reached consistency.
But that depends on the application.
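
As a sketch in code, under these assumptions: it runs on the separated async member, the documented Backup.General ExternalFreeze()/ExternalThaw() calls fence the snapshot, and IsApplicationConsistent() stands in for whatever application-specific consistency check you would have to supply yourself.

ClassMethod RunExternalBackup() As %Status
{
 // hypothetical application-specific check: no open transactions, no open dependencies
 if '..IsApplicationConsistent() quit $$$ERROR($$$GeneralError,"application not yet consistent")
 // suspend physical database writes while the snapshot runs
 set sc=##class(Backup.General).ExternalFreeze()
 quit:$$$ISERR(sc) sc
 // ... run the external snapshot / file copy here ...
 // resume the write daemon
 quit ##class(Backup.General).ExternalThaw()
}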
 

For a file stream it's a physical file; for a global stream it's a global (or part of one).
If you clear the stream, you delete either the file or the global (or the part of it) that holds the stream.
A stream in Caché describes a sequence of persistent bytes on disk that you work on with a dedicated set of common methods.

This must not be mixed up with a stream of characters on a network connection: if you miss a character there, it's gone.
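
A small sketch of the difference, using the standard stream classes (the file name is just an example):

 // file stream: the bytes live in a physical file
 set fs=##class(%Stream.FileCharacter).%New()
 set fs.Filename="/tmp/demo.txt"
 do fs.Write("some text"),fs.%Save()
 do fs.Clear()  // deletes the file that holds the stream

 // global stream: the bytes live in global nodes
 set gs=##class(%Stream.GlobalCharacter).%New()
 do gs.Write("some text"),gs.%Save()
 do gs.Clear()  // kills the global nodes that hold the stream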

Hi Alexey,
You hit the point: "lightweight"  just documents the relation.  Full stop. No further service.
You have to service it (i.e. delete it) at both ends.
If you use a serial object (OREF + Status), you still have to service both ends.

The "heavy" variant does it centralized at one place.
Though from storage point of view you move out the additional subscript level from array.