Global Environment Values Approach (Qlik Sense)

In a multi-stage Qlik environment (DEV, TEST, PROD etc.) the load scripts usually include some kind of check - "In which environment am I?".

Based on the result different/additional actions might be performed. For example:

  • if the environment is not PROD then mask/scramble the client names
  • use different data connection names (providers) on each environment
  • ...

It would be nice to have something more centralised that can provide this information (and more) to any app on the cluster - something like a global metadata "storage" that can be changed/updated (by the administrators).

In general my approach is to use a dedicated Custom Property, with each value in key:value format. The values can be read through a (dedicated) REST data connection at the beginning of each script. If we keep the keys the same across environments then we can write one script and use it everywhere.

Custom Property

We will create a new custom property (in my case I've named it ENVIRONMENT) and all the global values will be added to this CP.


  • create a new custom property
  • add as many CP values as you want as key:value pairs (I've picked : as a separator but it can be anything). For example:
    • name:DEV
    • central repository ip:
    • central repository host:qlik.central.repo
Created ENVIRONMENT custom property

Data Connection

We will use the Repository Service REST API to get the ENVIRONMENT custom property values. We need to create a new data connection that will be used to extract the data.

During the installation QS creates a few monitor_apps_REST_* data connections which are used by the documents in the Monitoring apps stream (like License Monitor or Operations Monitor). We can use one of these connections as a base and create a new one that returns values only for our custom property.

  • open any application and create a new Data Connection (at this point the connection doesn't need to be of a specific type. We are creating just a placeholder)
  • In QMC -> Data Connection -> monitor_apps_REST_task -> copy the Connection string
  • go back and edit the placeholder connection (in my case Environment)
  • replace the connection string with the one from monitor_apps_REST_task and apply the change
  • go back to the application and edit the Environment connection
  • in the URL replace /task/ with /custompropertydefinition/
  • in the Query parameters section add a new one:
    • Name: filter
    • Value: name eq 'ENVIRONMENT'
  • Save
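After these steps the connection effectively calls the Repository Service endpoint below. The host name is illustrative (I'm reusing the central repository host from earlier; yours will differ), and the xrfkey is whatever your connection string carries:

```
https://qlik.central.repo/qrs/custompropertydefinition/full?filter=name eq 'ENVIRONMENT'&xrfkey=0000000000000000
```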

At this point you can preview the data from the connection and if everything is ok you should see something like this:
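For reference, the response is roughly shaped like this (trimmed to the relevant fields; the id and the exact values will of course differ in your environment):

```json
[
  {
    "id": "some-guid",
    "name": "ENVIRONMENT",
    "valueType": "Text",
    "choiceValues": [
      "name:DEV",
      "central repository host:qlik.central.repo"
    ],
    "objectTypes": [ "App" ]
  }
]
```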


Once all is set we can create a small QS script that gets the raw values and "converts" them to a more readable QS format: a table with two columns, ENV_NAME and ENV_VALUE.

LIB CONNECT TO 'Environment';

RestConnectorMasterTable:
SQL SELECT
    "__KEY_root",
    (SELECT
        "@Value",
        "__FK_choiceValues"
    FROM "choiceValues" FK "__FK_choiceValues" ArrayValueAlias "@Value")
FROM JSON (wrap on) "root" PK "__KEY_root";

Environment:
LOAD
    subfield([@Value], ':', 1) as ENV_NAME,
    subfield([@Value], ':', 2) as ENV_VALUE
RESIDENT RestConnectorMasterTable
WHERE NOT IsNull([__FK_choiceValues]);

DROP TABLE RestConnectorMasterTable;

This script can be placed at the beginning of each app OR, even better, the script can be hosted somewhere and each app (on each environment) can Must_Include it.

Upon reload only one table is returned:
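Once the Environment table is loaded, its values can be pulled into variables and used for the environment checks mentioned at the start. A small sketch (the Mask_Client_Names subroutine is hypothetical - replace it with whatever masking logic you use):

```
// read the value of the 'name' key into a variable
LET vEnvName = Lookup('ENV_VALUE', 'ENV_NAME', 'name', 'Environment');

// everywhere but PROD - mask/scramble the client names
IF '$(vEnvName)' <> 'PROD' THEN
    CALL Mask_Client_Names; // hypothetical subroutine
END IF
```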


Having a centralised place for cluster-wide variables is a nice thing to have. I've seen multiple approaches for dealing with this: specific document prefixes for each env, flat files in each data folder etc. The problem is that these always leave enough room for someone to make a mistake (overwriting the flat file when moving data is my favorite).

Another plus for the custom property and data connection is that these objects are subject to security rules. This means we can grant everyone read access over them while only a very limited audience can edit them, which (might 🤞) lead to fewer errors and less maintenance.


If you have/use Qlik-CLI then the PowerShell script below will do the "dirty" setup for you. The script will:

  • create new custom property with some pre-defined values
  • create new REST data connection using the monitor_apps_REST_task as a template
  • edit the new data connection
    • change the URL
    • add the additional query parameter
# QS Central Repo Address
$computerName = "qlik.central.repo" # or localhost if started on the server itself

# Properties for the new data connection
# ideally the user/pass should be the ones
# under which the QS service is running
$dataConnProps = @{
  name = "Environment"
  userName = "domain\username"
  password = "password"
}

# Properties for the new custom property
# Can populate some values when creating it
# "Objects = App" is just to create the CP (can't create it without it from Qlik-CLI)
$customPropProps = @{
  name = "ENVIRONMENT"
  values = @("name:DEV", "central repository ip:")
  objects = @("App")
}

function createNewDataConnection {
    # Get the existing monitor_apps_REST_task connection details
    $connector = Get-QlikDataConnection -filter "name eq 'monitor_apps_REST_task'"

    # Replace the endpoint so it returns custom property details instead of tasks
    # Add one more query parameter which will return data only for the Environment custom prop
    $newConnectionString = $connector.connectionString.
        Replace("/qrs/task/full", "/qrs/custompropertydefinition/full").
        Replace("queryParameters=xrfkey%20000000000000000", "queryParameters=xrfkey%20000000000000000%1filter%2name eq '$($customPropProps.name)'")

    # Create the connection
    New-QlikDataConnection -name $dataConnProps.name -connectionstring $newConnectionString -type "QvRestConnector.exe" -username $dataConnProps.userName -password $dataConnProps.password
}

function createNewCustomProperty {

    # Create the Environment custom property
    New-QlikCustomProperty -name $customPropProps.name -choiceValues $customPropProps.values -objectTypes $customPropProps.objects
}

function connectToQlik {
    Connect-Qlik -computerName $computerName -TrustAllCerts
}

function main {
    connectToQlik
    createNewCustomProperty
    createNewDataConnection
}

main