First, you can access Ensemble Credentials using the Ens.Config.Credentials class.  To be clear, these are NOT user definitions from the Security module.  They are defined in the Ensemble Management Portal under Ensemble -> Configure -> Credentials.
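As a minimal sketch (assuming you have already created an entry named "MyAppLogin" on that page, and that your version still exposes the Username and Password properties directly), the stored record can be opened and read programmatically like this:

    Set tCred = ##class(Ens.Config.Credentials).%OpenId("MyAppLogin",,.tSC)
    If '$IsObject(tCred) {
        // The entry does not exist or could not be opened; tSC holds the error
        Do $System.Status.DisplayError(tSC)
    } Else {
        Write "Username: ", tCred.Username,!
        // tCred.Password is available here as well; avoid writing it to output in real code
    }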

That should work for you.  I would still like to better understand what is going on in the application that drives this.  You seem to be indicating that this is a user logging into Ensemble.  If you could detail the workflow that is happening and how it relates to Ensemble services, we might be able to advise you better.

Finally,  I want to make you aware that the LDAP interface in InterSystems technologies has a method for using groups to define everything the security model needs.   In fact that is the default method in recent versions.

The best path forward is to get your Sales Engineer (SE) involved in what you are trying to achieve.  That person would be best suited to dig into your requirements and advise you.  If, for some reason, you cannot contact your SE or don't know who that is, send me a private message.  I'd be happy to help out more directly.

Ensemble Credentials are normally used to satisfy security for an Ensemble business host.  This separates the maintenance of security from the maintenance of the actual interfaces, and the application of the security is handled completely by Ensemble in that scenario.  This does not appear to be how you are attempting to use them, so it would help to better understand your use case.  What is the entry path/service that is utilizing delegated authentication?
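For contrast, this is roughly how a business host consumes a credentials entry: the host's Credentials setting names the entry, and the adapter resolves it before your code runs.  The class below is only a sketch from memory (the class name and message handling are placeholders, and you should verify the %CredentialsObj property against your adapter's class reference):

Class Demo.Contact.HTTPOperation Extends Ens.BusinessOperation
{

Parameter ADAPTER = "EnsLib.HTTP.OutboundAdapter";

Method OnMessage(pRequest As Ens.Request, Output pResponse As Ens.Response) As %Status
{
    // The host's Credentials setting names an Ens.Config.Credentials entry.
    // The adapter opens it for us, so the interface code never stores a password.
    $$$TRACE("Calling the remote system as "_..Adapter.%CredentialsObj.Username)
    // ... build and send the HTTP request via the adapter, which applies the credentials ...
    Quit $$$OK
}

}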

No, it is not 'necessary'.  However, I do like to have an environment that more closely matches what one might need in production.  This is both for my own experience and to be able to show InterSystems technology in a manner that might occur for a client.

I do use docker exec, though I choose to go to bash so I have more general access.  I actually wrote a simple cmd file to do this and added it to a menu on my toolbar.

@echo off
REM List the running containers with their names, status, and ports
docker container ls --format "table {{.Names}}\t{{.Status}}\t{{.Ports}}"
echo:
REM Prompt for the container to attach to, then open an interactive bash shell in it
set /P Container=Container name or ID: 
docker exec -it %Container% bash

Let me add my experience to this comment.  I have been wading into the Docker ocean.  I am on Windows and really did not want to run a Linux VM just to get Docker containers (it seemed a bit redundant to me), so Docker for Windows was the way to go.  So far this has worked extremely well for me.  I am running an Ubuntu container with Ensemble added in.  My dockerfile is a simplistic version of the one earlier in these comments.  I am having only one issue, which is getting the SSH daemon to run when the container starts.

 I hope to have all my local instances moved into containers soon.

My feeling is that this will be great for demonstrations, local development, and proofs of concept.   I would agree that for any production use having a straight Linux environment with Docker would be a more robust and stable solution.

Below is a class method that returns the details of a Contact.  Note the WRITE near the end of the method: any output written from your REST class method becomes part of the body of the response, so writing the output of %ToJSON() puts a JSON object in the response body.

ClassMethod ContactDetail(ContactID As %String) As %Status
{
    #dim %response as %CSP.Response
    #dim %request as %CSP.Request
    #dim ContactItem as WebREST.Data.Contact
    #dim ActionItem as WebREST.Data.Actions
    
    set tSC = $System.Status.OK()
    Try {
        if ContactID {
            set ObjSC = ""
            set ContactItem = ##Class(WebREST.Data.Contact).%OpenId(ContactID,,.ObjSC)
            if $System.Status.IsError(ObjSC) {
                // test for object not found error
                if $System.Status.GetErrorCodes(ObjSC) = "5809" {
                    set %response.Status = ..#HTTP404NOTFOUND
                } else {
                    throw ##class(%Exception.StatusException).CreateFromStatus(ObjSC)
                }
            } else {
                set ResponseObj = {}
                set ResponseObj.ContactID = ContactItem.ContactID
                set ResponseObj.FirstName = ContactItem.FirstName
                set ResponseObj.LastName = ContactItem.LastName
                set ResponseObj.Position = ContactItem.Position
                set ResponseObj.Company = ContactItem.Company.Name
                set ResponseObj.Phone = ContactItem.Phone
                set Address = {}
                set Address.Street = ContactItem.Address.Street
                set Address.City = ContactItem.Address.City
                set Address.State = ContactItem.Address.State
                set Address.PostalCode = ContactItem.Address.PostalCode
                set ResponseObj.Address = Address
                set ResponseObj.Agent = ContactItem.Agent.Name
                
                // now load each action into an array
                set Actions = []
                set ActionKey = ""
                do {
                    set ActionResponse = {}
                    set ActionItem = ContactItem.Actions.GetNext(.ActionKey)
                    if ActionItem '= "" {
                        set ActionResponse.ActionType = ActionItem.ActionType
                        set ActionResponse.DueDate = $zdate(ActionItem.DueDate,3)
                        set ActionResponse.Notes = ActionItem.Notes
                        set ActionResponse.Complete = ActionItem.Complete
                        set ActionResponse.Agent = ActionItem.Agent.Name
                        do Actions.%Push(ActionResponse)
                    }
                } while ActionKey '= ""
                set ResponseObj.Actions = Actions
                Write ResponseObj.%ToJSON()
            }
        } else {
            set %response.Status = ..#HTTP400BADREQUEST
        }
    } catch Exception {
        set tSC = Exception.AsStatus()
    }

    Quit tSC
}
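For context, here is the kind of route that would invoke this method from the dispatch class's UrlMap; the URL pattern and names are just illustrative, and %CSP.REST passes the :ContactID token in as the method's argument:

XData UrlMap [ XMLNamespace = "http://www.intersystems.com/urlmap" ]
{
<Routes>
  <!-- GET /contact/123 ends up calling ContactDetail("123") -->
  <Route Url="/contact/:ContactID" Method="GET" Call="ContactDetail" />
</Routes>
}

You may also want to set %response.ContentType to "application/json" before the WRITE so clients see the correct content type.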
 

Some questions first.  

  1. When you say 'localhost', are you implying that you are using the private web server contained within Cache?
  2. Is this the only application being run from the web server vs. localhost?
  3. If not, are the other applications still accessible via the external web server?

Some things to check right off the bat:

  • Enabling auditing and seeing which user is getting the error will help.
  • Use the HTTP trace capability in the CSP Gateway Management pages to see if the request is even making it into Cache.  It would seem so from the error, but it is better to confirm everything.
  • Make sure that the user the CSP Gateway (associated with the web server) uses to communicate with Cache has access to your database.  This would be different from the person logging into your application.  This can be found in the CSP Gateway Management pages for the server.

Another method to consider is Delegated Authentication (http://localhost:57775/csp/docbook/DocBook.UI.Page.cls?KEY=GCAS_delegated).  The benefit here is that your custom authentication code gets run BEFORE the user is granted any kind of access to the system.  The issue with relying on a routine that runs as soon as the user logs in is that the user has already "breached the walls" and is in the system (and consuming a license) before you authenticate them.  If your code fails for any reason, or if there is a hole that allows them to break out of the code, they will likely have complete access to at least that namespace.

With Delegated Authentication you are hooking your custom code into the authentication flow of Cache.  If anything fails in that code, the user simply gets "Access Denied".

One limitation is that you are only allowed a limited amount of code (2,000 or 4,000 commands, if memory serves) to complete your authentication.  After that, delays are inserted into the process.
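For reference, the hook is a ZAUTHENTICATE routine compiled in the %SYS namespace (with the Delegated option enabled on the relevant services).  The sketch below is only a skeleton from memory: CheckUser and MyAppRole are placeholders for your own lookup and role assignment, so treat the details as assumptions to verify against the documentation.

#Include %occErrors
ZAUTHENTICATE(ServiceName, Namespace, Username, Password, Credentials, Properties) PUBLIC {
    // Assume failure; only a successful custom check flips the status to OK
    Set tStatus = $SYSTEM.Status.Error($$$AccessDenied)
    Try {
        // Placeholder check: replace with your own lookup (LDAP, table, web service, ...)
        If $$CheckUser(Username, Password) {
            // Optionally hand attributes back to Cache for the delegated user
            Set Properties("FullName") = Username
            Set Properties("Roles") = "MyAppRole"  // placeholder role name
            Set tStatus = $SYSTEM.Status.OK()
        }
    } Catch ex {
        // Any error in the custom code leaves tStatus as Access Denied
    }
    Quit tStatus
}
CheckUser(Username, Password) {
    // Placeholder: always deny until real logic is added
    Quit 0
}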

Hope that helps.

Robert,

Great history lesson!  I have a question for you though.  As you were there at the beginning, or close to it, perhaps you might have some insight.  I came from a background in MultiValued databases (aka PICK, Universe, Unidata), joining InterSystems in 2008 when they were pushing Cache's ability to migrate those systems.  From the beginning I was amazed at the parallel evolution of both platforms.  In fact, when I was preparing for my first interviews, having not heard of Cache before, I thought it was some derivative of PICK.  Conceptually both MUMPS and PICK share a lot of commonality, differing in implementation of course.  I have long harbored the belief that there had to be some common heritage, some white papers or other IP that influenced both.  Would you have any knowledge of how the original developers of MUMPS arrived at the design concepts they embraced?  Does the name Don Nelson ring a bell?

Thanks again for the history.