You can add SQL from the Add Action menu, place your embedded SQL query there, and read the values into host variables. Then set the SQL return value on the field directly, like below.

XData DTL [ XMLNamespace = "http://www.intersystems.com/dtl" ]
{
<transform sourceClass='EnsLib.HL7.Message' targetClass='EnsLib.HL7.Message' sourceDocType='2.3.1:ADT_A01' targetDocType='2.3.1:ADT_A01' create='new' language='objectscript' >
<assign value='source.{PV1:SetIDPV1}' property='target.{PV1:SetIDPV1}' action='set' disabled='1' />
<sql>
<![CDATA[ select count(id) INTO :value from Sample.Person]]></sql>
<assign value='value' property='target.{PV1:AdmissionType}' action='set' />
</transform>
}

Hello smythe,

There are a couple of ways to import your file into a stream and convert it to Ens.StreamContainer.

First, you can create a %FileCharacterStream object and link your file to it. Then create a stream container object by passing the created stream as the first parameter to %OnNew of the Ens.StreamContainer class.

ClassMethod StreamContainer()
{
    set filename="dir\file" ; your directory and file name (e.g. C:\Intersystems\new13.txt)
    set file = ##class(%FileCharacterStream).%New()
    do file.LinkToFile(filename)
    set strmContainer=##class(Ens.StreamContainer).%New(file)
    zwrite strmContainer.Stream.Read($$$MaxLocalLength)
}

Second, this way is almost similar, but instead of pushing the stream directly into Ens.StreamContainer, you create an SDA object. Import your file through the XMLImportSDAString method to prepare the SDA object. Once you have the SDA object handy, you can do any SDA work with it. This is effectively an implicit step that validates your SDA file structure (keep in mind the file must be in the exact XML format of SDA). Once the SDA object is generated, you can use a stream method to generate a stream from the SDA and then convert it to Ens.StreamContainer. Refer to the code below to implement these steps.

ClassMethod FileToSDAToEnsContainer()
{
    set file = ##class(%FileCharacterStream).%New()
    do file.LinkToFile("C:\FHIRFILE\new13.txt")
    set SDA3Container=##class(HS.SDA3.Container).%New()
    do SDA3Container.XMLImportSDAString(file.Read($$$MaxLocalLength))
    zwrite SDA3Container
    do SDA3Container.ToQuickXMLStream(.stream)
    set strmContainer=##class(Ens.StreamContainer).%New(stream)
    zwrite strmContainer.Stream.Read()
}

No. The GetColumnType() method returns an integer code for the datatype. You can map that integer back to the datatype name manually.

Class Sample.Person Extends %Persistent
{

Property FirstName As %String;

Property LastName As %String;

Property StartDate As %Library.TimeStamp ;

Property Age As %Integer;

Property TestCurrency As %Currency;

Property TestBoolean As %Boolean;

Property TestCharStream As %Stream.GlobalCharacter;

Query TestQuery() As %SQLQuery [ SqlProc ]
{
    
    select firstname,LastName,Age,TestCurrency,TestBoolean,TestCharStream from Sample.Person
}

ClassMethod GetDataTypeOfField()
{
    set result = ##class(%ResultSet).%New("Sample.Person:TestQuery")
    ;	
    for I=1:1:result.GetColumnCount() {
        write "Column Name: ",result.GetColumnName(I),"  "
        write "Datatype number: ",result.GetColumnType(I),"  "
        write "DataType: ",..GetDataType(result.GetColumnType(I)),!!
    }
}
/// Get the datatype name from the integer code
ClassMethod GetDataType(type As %Integer = 0) As %String [ CodeMode = expression ]
{
$Case(type,
        1:"BINARY", 2:"DATE", 3:"DOUBLE", 4:"HANDLE",
        5:"INTEGER", 6:"LIST", 7:"LONGVARCHAR",
        8:"TIME", 9:"TIMESTAMP", 10:"VARCHAR", 11:"STATUS",
        12:"BINARYSTREAM", 13:"CHARACTERSTREAM", 14:"NUMERIC",
        15:"CURRENCY", 16:"BOOLEAN", 17:"OID",
        18:"BIGINT", 19:"FDATE", 20:"FTIMESTAMP",
        :"")
}
}

The output when executing the method GetDataTypeOfField():

IRISMYDEV>d ##class(Sample.Person).GetDataTypeOfField()
Column Name: FirstName  Datatype number: 10  DataType: VARCHAR
 
Column Name: LastName  Datatype number: 10  DataType: VARCHAR
 
Column Name: Age  Datatype number: 5  DataType: INTEGER
 
Column Name: TestCurrency  Datatype number: 15  DataType: CURRENCY
 
Column Name: TestBoolean  Datatype number: 16  DataType: BOOLEAN
 
Column Name: TestCharStream  Datatype number: 13  DataType: CHARACTERSTREAM

Refer to the documentation here.

Hello Scott, 

You have a lot of object properties declared in your class definition, so you should configure the class to handle unexpected fields when extending %JSON.Adaptor; otherwise it throws an error. Make sure %JSONFIELDNAME is added where the JSON field name differs from your property name, and add the %JSONIGNOREINVALIDFIELD parameter to skip unexpected fields while loading JSON into the object. Here is the article about the JSON adaptor.

Parameter %JSONIGNOREINVALIDFIELD As BOOLEAN = 1

Yes, you can control the order of your JSON structure by rearranging the property definitions in the class.
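As a minimal sketch (the class name and JSON field names are hypothetical), a class using %JSON.Adaptor with both of those parameters might look like this:

```objectscript
/// Hypothetical example class; property and JSON field names are illustrative
Class Sample.Member Extends (%RegisteredObject, %JSON.Adaptor)
{

/// Skip JSON fields that have no matching property instead of raising an error
Parameter %JSONIGNOREINVALIDFIELD As BOOLEAN = 1;

/// Map the JSON field "first_name" to this property
Property FirstName As %String(%JSONFIELDNAME = "first_name");

Property LastName As %String;

}
```

With this in place, a call such as `set sc=member.%JSONImport({"first_name":"Jo","unexpected":1})` populates FirstName and quietly ignores the unexpected field instead of failing.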

Hi Prashant,

You can get the value from result.%GetMetadata().columns.GetAt(i).ODBCType; the ODBC type integer identifies the datatype of the column (for example, 12 is VARCHAR). Refer to the ODBC type integers and datatypes here.

ClassMethod GetColumnType() As %String
{
    set statement = ##class(%SQL.Statement).%New()
    set tSC = statement.%PrepareClassQuery("Sample.Person","TestQuery")
    if $$$ISERR(tSC) quit ""
    set result = statement.%Execute()
    #dim meta As %SQL.StatementMetadata
    set meta = result.%GetMetadata()
    write meta.columns.GetAt(1).ODBCType ; fetch the first field's ODBC type
}

Hi @Gautam Rishi ,

You can use GetColumnType(columnNumber) to get the datatype of a particular field in the result set object. The GetColumnType method returns an integer value from 1 to 20; each number represents a unique datatype (10 represents VARCHAR). If the type cannot be determined, 0 is returned.

For dynamic SQL, %SQL.Statement is preferable and suggested instead of %ResultSet.

	set result = ##class(%ResultSet).%New("Class:Query")
	do result.Execute()
	for I=1:1:result.GetColumnCount() {
		write "Column Name: ",result.GetColumnName(I),!
		write "Datatype: ",result.GetColumnType(I),!
	}
Column Name: FirstName
Datatype: 10
Column Name: LastName
Datatype: 10
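Since %SQL.Statement is the preferred interface for dynamic SQL, the same column information can be read from the statement metadata after preparing the query. A minimal sketch, reusing the Sample.Person:TestQuery example from above:

```objectscript
ClassMethod ListColumnTypes()
{
    set statement = ##class(%SQL.Statement).%New()
    set tSC = statement.%PrepareClassQuery("Sample.Person","TestQuery")
    if $$$ISERR(tSC) quit
    // %Metadata is populated by the prepare; each column carries its ODBC type
    set meta = statement.%Metadata
    for i=1:1:meta.columns.Count() {
        set col = meta.columns.GetAt(i)
        write "Column Name: ",col.colName,"  ODBC type: ",col.ODBCType,!
    }
}
```

This avoids executing the query at all when you only need the column names and types.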

Hello David,

Both ObjectScript and SQL transaction rollbacks are captured in the log once you enable the rollback-log setting at System > Configuration > Compatibility Settings > Edit Compatibility Settings.

USER>write $ZV
IRIS for Windows (x86-64) 2023.1 (Build 229) Fri Apr 14 2023 17:36:18 EDT
USER>tstart
 
TL1:USER>set ^TEST=1
 
TL1:USER>trollback

log entry

 

This is nice. Anyway, a few suggestions:

  1. Make sure the commands and string functions follow the same casing convention, whether SET, Set, or set.
  2. Use %DynamicArray instead of the literal constructor [], because accessing the object instantiates its methods such as %Push; literal constructors won't do that.
  3. Comments are essential for understanding the program flow. If you don't want your comments to appear in the INT code, use the #; syntax instead of ; (e.g. #;Test comment).
  4. Use $ListNext to get the list values one by one. It removes the two additional string functions $LISTLENGTH and $LISTGET in your case and is easier to understand.
  5. Use $Increment instead of declaring a questionNumber variable and doing the addition in the code.
  6. Add a conditional check to skip an entry when the value between the | delimiters is empty.

I have attached the code sample below. 


ClassMethod createResponse(data As %String(MAXLEN="")) As %DynamicArray
{
    set items = ##class(%DynamicArray).%New()

    #;1.- Questions split by "|"
    set ptr=0
    set listQuestions = $ListFromString(data, "|")

    #;2.- Iterate
    while $ListNext(listQuestions,ptr,questionAnswer) {
        #;3.- Update variables
        continue:questionAnswer=""
        set question = $Piece(questionAnswer, ":", 1)
        set answer = $ZStrip($Piece(questionAnswer, ":", 2), "<W") // get rid of leading whitespace
        #;4.- Generate item
        set item = {
            "definition": ("question "_($Increment(questionNumber))),
            "text": (question),
            "answer":
            [
                {
                    "valueString": (answer)
                }
            ]
        }
        do items.%Push(item)
    }
    return items
}

Hope this helps  

The HS.SDA3.TimeStamp datatype class accepts both YYYY-MM-DDT00:00:00Z and YYYY-MM-DDT00:00:00 values, but some system methods, such as IsValid and LogicalToXSD, truncate the value if its length exceeds 19 while performing exports. I have verified this, and the change happens whenever it converts to XML. However, the entire value (YYYY-MM-DDT00:00:00Z) persists in the container object. The container object can export both XML and JSON as a stream, for example by using the ToJSON() method. You can run the DTL programmatically to get the container object and verify.

SAMPLES> write container
2@HS.SDA3.Container
SAMPLES> write container.Patient
3@HS.SDA3.Patient
SAMPLES> write container.Patient.BirthTime
1993-12-12T00:00:00Z
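To run a DTL programmatically and inspect the container, you can call the Transform class method that every compiled DTL class exposes. A sketch (the DTL class name is a placeholder, and the ToJSON(.stream) signature is assumed from the description above):

```objectscript
ClassMethod VerifyContainer(source As HS.SDA3.Container)
{
    // MyPkg.MyDTL is a hypothetical DTL class name; substitute your own
    set tSC = ##class(MyPkg.MyDTL).Transform(source,.container)
    if $$$ISERR(tSC) quit
    // The full, untruncated value is retained on the container object
    write container.Patient.BirthTime,!
    // Export the container as a JSON stream (assumed signature)
    do container.ToJSON(.json)
}
```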

I used the DTL-generated container object value as the source for another DTL to generate a FHIR discrete resource. It works; the BirthTime is the same as the expected value:

{
  "resourceType": "Patient",
  "birthDate": "1993-12-12T00:00:00Z"
}

Hello @Smythe Smythee 

There is no difference between an Ensemble and an IRIS instance here. In your case, source.MemberDOB is a date (e.g. 01/01/2000) and the conversion works correctly. Can you check the input of MemberDOB before the conversion and take a quick look at the previous samples?

USER>write $ZV
IRIS for Windows (x86-64) 2023.1 (Build 229) Fri Apr 14 2023 17:36:18 EDT
USER>set DOB="01/01/1999"
 
USER>write $translate($ZDT($ZDTH(DOB),3)," ","T")_"Z"
1999-01-01T00:00:00Z

The same code works in an IRIS Interoperability DTL as well.

<assign value='$translate($ZDT($ZDTH(source.DOB),3)," ","T")_"Z"' property='target.{PID:DateTimeofBirth}' action='set' />

 

AFAIK, no. There is no straightforward way to import JSON from FHIR discrete resources into SDA3 objects via %JSONImport(). Basically, there are various stages involved in converting FHIR to HL7 and vice versa. To achieve this, InterSystems created an intermediary format called SDA. However, more processes are involved whenever you convert a bundle or a discrete resource. For example:

  1. In some instances, the FHIR resource data element name (property) is not the same as the SDA property.
  2. Lots of internal DTLs are invoked at conversion time, and an SDA object is created based on that output. This is common for SDA to FHIR and vice versa.
  3. DTLs are crucial to accomplishing this conversion, and some data elements are not mapped in the standard FHIR-to-SDA or SDA-to-FHIR DTL transformations. In HS.FHIR.DTL.vR4.SDA3.AllergyIntolerance.Allergy, for instance, the "criticality" data element is not mapped to the SDA object; in that case you should create a custom DTL based on the already implemented one to convert the values. So if you haven't added this type of additional property in the SDA extension class, it won't work.
  4. Metadata values and lookup tables vary between SDA and FHIR. SDA has 'A' in the lookup tables for some fields, while FHIR has 'Active'. An internal validation runs against every data element to verify the generated FHIR resource every time.