Ashok Kumar · Sep 15, 2023

You can add an SQL action from the "Add Action" menu, place your embedded SQL query there, and read the result into a host variable. Then assign the host variable to the target field directly, like below:

XData DTL [ XMLNamespace = "http://www.intersystems.com/dtl" ]
{
<transform sourceClass='EnsLib.HL7.Message' targetClass='EnsLib.HL7.Message' sourceDocType='2.3.1:ADT_A01' targetDocType='2.3.1:ADT_A01' create='new' language='objectscript' >
<assign value='source.{PV1:SetIDPV1}' property='target.{PV1:SetIDPV1}' action='set' disabled='1' />
<sql><![CDATA[ select count(id) INTO :value from Sample.Person ]]></sql>
<assign value='value' property='target.{PV1:AdmissionType}' action='set' />
</transform>
}
Ashok Kumar · Sep 12, 2023

Maybe you can try taking a clone of the object with %ConstructClone(). I know it increments the reference count; however, you can keep the old object around for validation.
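A minimal sketch of that idea; the persistent class Sample.Person and its Name property are assumed here purely for illustration:

```objectscript
ClassMethod CloneAndValidate(id As %String)
{
    ; open the stored object (Sample.Person is a hypothetical class)
    set original = ##class(Sample.Person).%OpenId(id)
    ; %ConstructClone returns a new, unsaved in-memory copy
    set clone = original.%ConstructClone()
    ; modify only the clone; the original stays untouched for comparison
    set clone.Name = "Updated"
    ; validate against the original before committing the change
    if clone.Name '= original.Name {
        do clone.%Save()
    }
}
```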
Ashok Kumar · Sep 11, 2023

Hello Smythe,

There are a couple of ways to import your file into a stream and convert it to Ens.StreamContainer.

First, you can create a %FileCharacterStream object and link your file to it. Then create an Ens.StreamContainer object, passing the stream as the first parameter to %OnNew:

ClassMethod StreamContainer()
{
    set filepath = "dir\file" ; your directory and file name (e.g. C:\Intersystems\new13.txt)
    set file = ##class(%FileCharacterStream).%New()
    do file.LinkToFile(filepath)
    set strmContainer = ##class(Ens.StreamContainer).%New(file)
    zwrite strmContainer.Stream.Read($$$MaxLocalLength)
}

The second way is almost the same, but instead of pushing directly into Ens.StreamContainer you first create an SDA object. Import your file through the XMLImportSDAString method to build the SDA object. Once you have the SDA object in hand, you can do any SDA work with it; this is also an implicit validation of your SDA file structure (keep in mind the file must be the exact XML format of SDA). Once the SDA object is generated, you can use a stream method to generate a stream from it and then convert that to Ens.StreamContainer. Refer to the code below, which implements the above steps:

ClassMethod FileToSDAToEnsContainer()
{
    set file = ##class(%FileCharacterStream).%New()
    do file.LinkToFile("C:\FHIRFILE\new13.txt")
    set SDA3Container = ##class(HS.SDA3.Container).%New()
    do SDA3Container.XMLImportSDAString(file.Read($$$MaxLocalLength))
    zwrite SDA3Container
    do SDA3Container.ToQuickXMLStream(.stream)
    set strmContainer = ##class(Ens.StreamContainer).%New(stream)
    zwrite strmContainer.Stream.Read()
}
Ashok Kumar · Sep 11, 2023

No. The GetColumnType() method returns an integer representing the datatype; you can map that integer back to a datatype name manually.

Class Sample.Person Extends %Persistent
{

Property FirstName As %String;

Property LastName As %String;

Property StartDate As %Library.TimeStamp;

Property Age As %Integer;

Property TestCurrency As %Currency;

Property TestBoolean As %Boolean;

Property TestCharStream As %Stream.GlobalCharacter;

Query TestQuery() As %SQLQuery [ SqlProc ]
{
    select FirstName, LastName, Age, TestCurrency, TestBoolean, TestCharStream from Sample.Person
}

ClassMethod GetDataTypeOfField() As %String
{
    set result = ##class(%ResultSet).%New("Sample.Person:TestQuery")
    for I=1:1:result.GetColumnCount() {
        write "Column Name: ", result.GetColumnName(I), " "
        write "Datatype number: ", result.GetColumnType(I), " "
        write "DataType: ", ..GetDataType(result.GetColumnType(I)), !!
    }
}

/// Get the datatype name from the integer
ClassMethod GetDataType(type As %Integer = 0) [ CodeMode = expression ]
{
$Case(type, 1:"BINARY", 2:"DATE", 3:"DOUBLE", 4:"HANDLE", 5:"INTEGER", 6:"LIST", 7:"LONGVARCHAR", 8:"TIME", 9:"TIMESTAMP", 10:"VARCHAR", 11:"STATUS", 12:"BINARYSTREAM", 13:"CHARACTERSTREAM", 14:"NUMERIC", 15:"CURRENCY", 16:"BOOLEAN", 17:"OID", 18:"BIGINT", 19:"FDATE", 20:"FTIMESTAMP", :"")
}

}

The output when executing GetDataTypeOfField():

IRISMYDEV>do ##class(Sample.Person).GetDataTypeOfField()
Column Name: FirstName Datatype number: 10 DataType: VARCHAR
Column Name: LastName Datatype number: 10 DataType: VARCHAR
Column Name: Age Datatype number: 5 DataType: INTEGER
Column Name: TestCurrency Datatype number: 15 DataType: CURRENCY
Column Name: TestBoolean Datatype number: 16 DataType: BOOLEAN
Column Name: TestCharStream Datatype number: 13 DataType: CHARACTERSTREAM

Refer to the documentation for the full list of type codes.
Ashok Kumar · Sep 11, 2023

I agree, it's a wonderful idea to enable a periodic auto-save option; it's a better way to prevent losing implementation work. However, keep in mind to compile the code once the implementation is complete.
Ashok Kumar · Sep 10, 2023

Thank you @Robert Cemper for spending time evaluating IntegrityLog-Web, CSPFileViewDownload, and System-Task-REST, and for providing valuable feedback.
Ashok Kumar · Sep 8, 2023

Hello Scott,

You have a lot of object properties declared in your class definition, so you should set a parameter to handle unexpected fields in the classes extending %JSON.Adaptor; otherwise it throws an error. Make sure %JSONFIELDNAME is set whenever the JSON field name differs from your property name, and add the %JSONIGNOREINVALIDFIELD parameter to tolerate unexpected fields while loading JSON into the object. Here is an article about the JSON adaptor.

Parameter %JSONIGNOREINVALIDFIELD As BOOLEAN = 1;

Yes, you can order your JSON structure by rearranging the properties.
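As a sketch of how those two parameters fit together (the class name Demo.Patient and its fields are made up for illustration):

```objectscript
Class Demo.Patient Extends (%RegisteredObject, %JSON.Adaptor)
{

/// Silently skip JSON fields that have no matching property
Parameter %JSONIGNOREINVALIDFIELD As BOOLEAN = 1;

/// Maps the JSON field "first_name" onto this property
Property FirstName As %String(%JSONFIELDNAME = "first_name");

}
```

With this in place, `set sc = obj.%JSONImport({"first_name":"Ann","unknown":1})` should load FirstName and ignore the unknown field instead of returning an error.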
Ashok Kumar · Sep 8, 2023

Hi Prashant,

You can get the value from %Metadata.columns.GetAt(i).ODBCType; the ODBC type refers to the datatype of the column (12 is VARCHAR). Refer to the ODBC type integers and their datatypes in the documentation.

ClassMethod GetColumnType() As %String
{
    set statement = ##class(%SQL.Statement).%New()
    set tSC = statement.%PrepareClassQuery("Sample.Person","TestQuery")
    set result = statement.%Execute()
    #dim meta As %SQL.StatementMetadata
    set meta = result.%GetMetadata()
    write meta.columns.GetAt(1).ODBCType ; fetch the first column's type
}
Ashok Kumar · Sep 8, 2023

Hi @Gautam Rishi,

You can use GetColumnType(columnNumber) to get the datatype of a particular field in the result set object. The GetColumnType method returns an integer from 1 to 20; each number represents a unique datatype (10 represents VARCHAR). If the type cannot be determined, 0 is returned. For dynamic SQL, %SQL.Statement is preferable to %ResultSet.

set result = ##class(%ResultSet).%New("Class:Query")
while result.Next() {
    for I=1:1:result.GetColumnCount() {
        write "Column Name: ", result.GetColumnName(I), !
        write "Datatype: ", result.GetColumnType(I), !
    }
}

Column Name: FirstName
Datatype: 10
Column Name: LastName
Datatype: 10
Ashok Kumar · Sep 7, 2023

This is really useful when working with long strings. If we don't declare any stream object for the temp variable and pass it to %ToJSON(.temp), a %Stream.FileCharacter stream is created.
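A small sketch of that behavior (the object contents are arbitrary sample data):

```objectscript
ClassMethod ToJSONStreamSketch()
{
    set obj = {"name":"test","values":[1,2,3]}
    ; temp is undefined here, so %ToJSON should create a
    ; %Stream.FileCharacter and serialize the JSON into it,
    ; avoiding MAXSTRING errors on very large objects
    do obj.%ToJSON(.temp)
    write $classname(temp),!
    write temp.Read()
}
```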
Ashok Kumar · Sep 7, 2023

Hello David,

Both ObjectScript and SQL transaction rollbacks are captured in the log once you enable the rollback-log setting at System > Configuration > Compatibility Settings > Edit Compatibility Settings.

USER>write $ZV
IRIS for Windows (x86-64) 2023.1 (Build 229) Fri Apr 14 2023 17:36:18 EDT
USER>tstart

TL1:USER>set ^TEST=1

TL1:USER>trollback

After the trollback, the rollback appears as a log entry.
Ashok Kumar · Sep 7, 2023

This is nice. Anyway, a few suggestions:

Make sure the commands and string functions follow one consistent case, whether SET, Set, or set.
Use %DynamicArray instead of the literal constructor [ ], because it instantiates the object's methods, such as %Push, when accessing it; literal constructors won't.
Comments are essential for understanding the program flow. If you don't want your comments to appear in the INT code, use the #; syntax instead of ; (e.g. #;Test comment).
Use $ListNext to get the list values one by one; it avoids the two additional string functions, $ListLength and $ListGet, used in your case, and is easier to understand.
You can use $Increment instead of declaring a questionNumber variable and doing the addition in code.
Add a conditional check to skip an entry when the value between the | delimiters is empty.

I have attached a code sample below:

ClassMethod createResponse(data As %String(MAXLEN="")) As %DynamicArray
{
    set items = ##class(%DynamicArray).%New()
    #;1.- Questions split by "|"
    set ptr = 0
    set listQuestions = $ListFromString(data, "|")
    #;2.- Iterate
    while $ListNext(listQuestions, ptr, questionAnswer) {
        #;3.- Update variables
        continue:questionAnswer=""
        set question = $Piece(questionAnswer, ":", 1)
        set answer = $ZStrip($Piece(questionAnswer, ":", 2), "<W") ; get rid of leading whitespace
        #;4.- Generate item
        set item = {
            "definition": ("question "_($Increment(questionNumber))),
            "text": (question),
            "answer": [ { "valueString": (answer) } ]
        }
        do items.%Push(item)
    }
    return items
}

Hope this helps.
Ashok Kumar · Sep 5, 2023

The HS.SDA3.TimeStamp datatype class accepts both YYYY-MM-DDT00:00:00Z and YYYY-MM-DDT00:00:00 values, but some system methods, such as IsValid and LogicalToXSD, truncate the value to 19 characters while performing exports. I have verified this, and the change happens whenever it converts to XML. However, the entire value (YYYY-MM-DDT00:00:00Z) persists in the container object. The container object can also export XML and JSON as a stream, for example via the ToJSON() method. You can run the DTL programmatically and inspect the container object to verify:

SAMPLES>write container
2@HS.SDA3.Container
SAMPLES>write container.Patient
3@HS.SDA3.Patient
SAMPLES>write container.Patient.BirthTime
1993-12-12T00:00:00Z

I used the DTL-generated container object as the source for another DTL to generate a FHIR discrete resource. It works; the BirthTime matches the expected value:

{
  "resourceType": "Patient",
  "birthDate": "1993-12-12T00:00:00Z"
}
Ashok Kumar · Sep 5, 2023

Hello @Smythe Smythee,

There is no difference between an Ensemble and an IRIS instance here. In your case, source.MemberDOB is a date (e.g. 01/01/2000) and the conversion works perfectly. Can you check the value of MemberDOB before the conversion, and take a quick look at the earlier samples?

USER>write $ZV
IRIS for Windows (x86-64) 2023.1 (Build 229) Fri Apr 14 2023 17:36:18 EDT
USER>set DOB="01/01/1999"

USER>write $translate($ZDT($ZDTH(DOB),3)," ","T")_"Z"
1999-01-01T00:00:00Z

The same code works in an IRIS Interoperability DTL as well:

<assign value='$translate($ZDT($ZDTH(source.DOB),3)," ","T")_"Z"' property='target.{PID:DateTimeofBirth}' action='set' />
Ashok Kumar · Sep 2, 2023

AFAIK, no. There is no straightforward way to import JSON from FHIR discrete resources into SDA3 objects via %JSONImport(). Basically, there are various stages involved in converting FHIR to HL7 and vice versa, and to bridge them InterSystems created an intermediary format called SDA. More processes are involved whenever you convert a bundle or a discrete resource. For example:

In some instances, the FHIR resource data element name (property) is not the same as the SDA property.
Lots of internal DTLs are invoked at conversion time, and an SDA object is created based on their output. This is common to both SDA-to-FHIR and FHIR-to-SDA; the DTLs are crucial to accomplish the conversion.
Some data elements are not mapped in the standard FHIR-to-SDA or SDA-to-FHIR DTL transformations. In the HS.FHIR.DTL.vR4.SDA3.AllergyIntolerance.Allergy DTL, for example, the "criticality" data element is not mapped to the SDA object. In such cases you should create a custom DTL from the already implemented one to convert the values. So if you haven't added this type of additional property to the SDA extension class, it won't work.
Metadata values and lookup tables vary between SDA and FHIR: SDA has 'A' in the lookup tables for some fields, while FHIR has 'Active'.
An internal validation runs against every data element to verify the generated FHIR resource every time.