I had a problem: I needed to be able to join multiple tables from a source database before importing to ACL. The data was structured in a typical header-items-subitems arrangement, with each layer stored in a different table. I needed information from the items and subitems, but the dates I wanted to filter on were in the header table. ACL’s old IMPORT ODBC command would not work without complicated SELECT SQL statements in the command’s WHERE parameter, which was not a sustainable course of action. I needed to be able to import a JOINed SQL statement.
I had a program to do that, and even better, the program could be scripted. The problem was, the scripting would only work from the local machine, i.e. if it was installed on the server I could not use the program from my laptop. I needed a solution that would allow me to import data using the full power of SQL in both the AX and AN environments. I came up with a solution that I am very proud of for its extendibility, ease of use, incorporation of professional programming techniques, and all-around technical cleverness. Shortly after, AN12 was released with native SQL capabilities, which rendered my solution completely useless. I have decided to share my solution here in the hopes it might inspire others to do…something…and perhaps reveal helpful techniques or ideas.
In short, my solution was to design a message queue. In its basic format, a message queue is an object that allows senders to asynchronously submit requests to a receiver. The receiver processes the messages in a First-In First-Out (FIFO) manner and notifies the sender when it is finished. Message queues are fairly ubiquitous in modern programming and most programming languages have facilities to create them. ACLScript, being a data analytic language and not a general-purpose programming language, does not. So I built my own.
Here is the basic setup: AX runs a script that constantly monitors a particular Windows folder for new text files (let’s call this the ‘Receiver’ script). The client program (the ‘Sender’) creates text files in a shared directory (let’s call it the ‘queue folder’) containing the necessary instructions. I decided to use text files in a shared directory as my queue storage because it did not require any special programming or infrastructure. When a new text file is detected, the Receiver reads the file and makes sure it is in the correct format. Then it performs the instructions, saves any data in a shared location, and changes the status of the text file to a ‘Completion’ state. The Sender constantly monitors the text file it submitted and when it notices that the file has been marked as completed, it imports whatever data was saved in the shared location. Once the task is completed, the Receiver resumes monitoring the queue folder and the Sender proceeds merrily executing the remainder of its script.
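Before diving into the ACLScript, the receiver's half of this handshake can be sketched in Python. The file extensions mirror the design; the function names and the run_task callback are hypothetical stand-ins, not part of the actual implementation:

```python
import glob
import os

def next_queue_file(queue_dir):
    """Return the first pending request (sorted by name), or None if the queue is empty."""
    pending = sorted(glob.glob(os.path.join(queue_dir, "*.queuestart")))
    return pending[0] if pending else None

def process(path, run_task):
    """Claim a request, run it, and publish the completion status."""
    work = path.replace(".queuestart", ".queuework")
    os.rename(path, work)          # claim it: senders now see 'in progress'
    with open(work) as f:
        task = f.read().strip()
    status = run_task(task)        # perform the requested instructions
    with open(work, "a") as f:
        f.write("\n" + status)     # append the completion status
    os.rename(work, work.replace(".queuework", ".queuedone"))
```

The renames are the heart of the trick: because a rename on the same volume is atomic, the sender and receiver never observe a half-written state transition.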
The beauty of this design is that it can be used whether the sender is a client laptop running AN or AX itself. In the latter case, AX ends up sending messages to itself via 2 different scripts. Thus, the sender scripts can be used in any environment and are platform agnostic.
To truly make this process work, it is important that new tasks can easily be added to the Receiver’s repertoire, i.e. it is flexible and extendible. To accomplish this, the text files that are placed in the queue folder contain the name of a script that the Receiver should execute. It is imperative that the Receiver be isolated from errors, so before executing the script it checks for the existence of the script specified in the file. The Receiver’s ‘repertoire’ of scripts is stored in a secured folder on the server and is only accessible by the AX system user. This design allows me to easily add new scripts by simply placing them in the secured folder – no changes have to be made to the core Receiver queue script and the job that runs the script does not have to be restarted. Additions and deletions to the Receiver’s capabilities are completely seamless.
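The existence check at the heart of this design is simple enough to express in a few lines. Here is a Python sketch (the .aclscript extension is from the design above; the function name and folder argument are hypothetical):

```python
import os

def resolve_task(script_name, repertoire_dir):
    """Map a requested task name to a script file in the repertoire folder,
    or return None so that a bad request never crashes the receiver loop."""
    path = os.path.join(repertoire_dir, script_name + ".aclscript")
    return path if os.path.isfile(path) else None
```

Dropping a new .aclscript file into the repertoire folder is the entire deployment step; nothing else has to change.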
If you want to do this on your AX installation because it is so cool, here is what you need:
- A server. Let’s assume the server name is \\IaServer.
- A folder on your AX server to store and share the queue (for this example, let’s assume C:\Queue)
- A folder on your AX server to store the receiver’s repertoire (for this example, let’s assume C:\Repertoire)
- A folder on your AX server to store and share the results of the job (for this example, let’s assume C:\Data)
The Receiver logic comprises 7 scripts, all contained in a single ACL project. These are the scripts that constantly monitor for new queue messages and execute the desired commands. We start with the main script, called ‘_Executor’:
COMMENT
//ANALYTIC TYPE IMPORT Process Queue
Processes commands to the provided queue.
//PARAM p_QueuePath C Queue Path
Enter the path to a folder containing queue files (.queue*)
END

SET ECHO NONE
SET SAFETY OFF
SET LOOP 0
v_CurrentDateTime = "CTODT(DATE(TODAY() 'YYYYMMDD') + TIME(NOW() 'THHMMSS') 'YYYYMMDDTHHMMSS')"
v_ExecutorStart = %v_CurrentDateTime%
DO CreateEmptyTable
DO _1_MasterProcessor WHILE %v_CurrentDateTime% - v_ExecutorStart < 0.95
DELETE v_CurrentDateTime OK
DELETE v_ExecutorStart OK
The only parameter required for the main script is a path to the queue (in this example, C:\Queue). Parametrizing the path allows multiple queues to be run concurrently.
The main script starts by getting the current timestamp and creating a blank table which we can use to write to the queue text files. That script, ‘CreateEmptyTable’ looks like this:
COMMENT
** This script creates an empty table with 1 record and 1 field
** This is meant to be a placeholder from which custom tables can be
** Created from nothing
**
** There are no input values
**
** The output is a table called 'BlankTbl' with 1 record and 1 field
END

COMMENT Create a text file that contains "0"
EXECUTE 'cmd /c echo 0 > BlankTbl.txt'

COMMENT Import the text file we just created. This is the output table
IMPORT PRINT TO BlankTbl "BlankTbl.FIL" FROM "BlankTbl.txt" 0 RECORD "Detail" 0 1 0 TEST 0 0 AT 1,1,0 7 "0" FIELD "Fld" C AT 1,1 SIZE 1,1 DEC 0 WID 1 PIC "" AS ""
OPEN BlankTbl

COMMENT Delete the text file
DELETE "BlankTbl.txt" OK
Once the blank table is created, _1_MasterProcessor runs for the next 22 hours and 48 minutes (95% of a day). This allows the script to be scheduled to re-run on a daily basis so the log does not grow too large (also, resetting every now and then means any unknown instabilities that might arise from constantly running an AX script are mitigated). The code for _1_MasterProcessor looks like this:
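This bounded-loop pattern is worth noting on its own. In Python it would look something like the sketch below; the fraction default mirrors the 0.95 used above, and the step callback is a hypothetical stand-in for one pass of the monitoring logic:

```python
import time

def run_for_fraction_of_day(step, fraction=0.95):
    """Call `step` repeatedly until the given fraction of a day has
    elapsed, then return so a daily scheduler can restart the job
    with a fresh log (and a fresh process)."""
    deadline = time.time() + fraction * 24 * 60 * 60
    while time.time() < deadline:
        step()
```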
COMMENT Get a list of unprocessed queue files. Stop executing if there are none.
DIRECTORY "%p_QueuePath%\*.queuestart" TO Queue_T01
DO _2_ProcessQueueFile IF WRITE1 > 0
EXECUTE "cmd /c TIMEOUT /t 10"
The logic is simple: Check for a new queue file, process it if one exists, and then wait for 10 seconds before starting the process over again. If a queue file is found, _2_ProcessQueueFile executes, which looks like this:
COMMENT Pick the first file and rename it to show it has been selected for processing
OPEN Queue_T01
v_QueueFile = ALLTRIM(REGEXREPLACE(REPLACE(ALLTRIM(File_Name) p_QueuePath+'\' '') '\.queuestart' ''))
RENAME "%p_QueuePath%\%v_QueueFile%.queuestart" "%p_QueuePath%\%v_QueueFile%.queuework" OK
IMPORT DELIMITED TO Queue_T02 "Queue_T02.FIL" FROM "%p_QueuePath%\%v_QueueFile%.queuework" 0 SEPARATOR TAB QUALIFIER NONE CONSECUTIVE STARTLINE 1 FIELD "SentData" C AT 1 DEC 0 WID 1000 PIC "" AS ""

COMMENT Get the script that should be called and check for validity
v_Script = ALLTRIM(SentData)
EXECUTE 'cmd /c IF EXIST "C:\Repertoire\%v_Script%.aclscript" (Exit 0) ELSE Exit 1'
v_ScriptExists = RETURN_CODE
DO Err_ScriptNotValid IF v_ScriptExists = 1
DO _3_RunScript IF v_ScriptExists = 0
DELETE v_ScriptExists OK

COMMENT Set the queue file status to finished
RENAME "%p_QueuePath%\%v_QueueFile%.queuework" "%p_QueuePath%\%v_QueueFile%.queuedone" OK
The first step to process the file is to rename it. By default, queue files have a file type of .queuestart. The processor script renames it to a .queuework file type, which lets all of the involved parties know that it is being worked on. The file is then imported into ACL and the name of the script to execute is extracted (it should be the first record). The processor script makes sure the script exists, and if it does it executes _3_RunScript; otherwise, it runs an error script called Err_ScriptNotValid. Once _3_RunScript finishes executing, the processor renames the queue file to a .queuedone file type and finishes its execution.
The content of Err_ScriptNotValid, which appends an error message to the queue file so the end user knows what happened, is below:
COMMENT If the script is not valid, append a status to the queue file
OPEN BlankTbl
LIST UNFORMATTED 'Script -%v_Script%- Not Valid' TO "%p_QueuePath%\%v_QueueFile%.queuework" APPEND
The content of _3_RunScript is below:
COMMENT Get the parameters that were supplied and run the script
LOCATE RECORD 2
v_Params = ALLTRIM(SentData)
DO "C:\Repertoire\%v_Script%.aclscript"

COMMENT Update the queue file return code based on the results of the script
v_ReturnCode = RETURN_CODE
IF v_ReturnCode <> 0 DO Err_ScriptFailed
OPEN BlankTbl
LIST UNFORMATTED '0' TO "%p_QueuePath%\%v_QueueFile%.queuework" APPEND IF v_ReturnCode = 0
DELETE v_ReturnCode OK
_3_RunScript extracts and provides parameters from record 2. These parameters may be required for the repertoire script that will be run. Once the parameters are extracted, the repertoire script is finally executed.
It is assumed that all of the scripts in the queue’s repertoire utilize the EXECUTE command (otherwise, why go through this process at all?). EXECUTE provides a return code from the called application: either 0 (no errors occurred) or some non-zero value (errors occurred). If a non-zero value is returned, an error message is processed via the script Err_ScriptFailed, which is below:
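This convention is easy to mirror in Python with subprocess, which exposes the same exit code that EXECUTE reports. The failure message matches the one used above; everything else in this sketch is hypothetical:

```python
import subprocess

def run_and_report(command):
    """Run an external command and translate its exit code into the
    queue's status convention: '0' on success, '1 <message>' otherwise."""
    result = subprocess.run(command, shell=True)
    if result.returncode == 0:
        return "0"
    return "1 The script did not execute properly"
```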
COMMENT If the script failed, append a status to the queue file
OPEN BlankTbl
LIST UNFORMATTED '1 The script did not execute properly' TO "%p_QueuePath%\%v_QueueFile%.queuework" APPEND
Those are all of the core scripts that run on the server and allow AX to be a receiver. They do not perform any of the actual work specified in the queue files; their entire function is to listen for and execute queue messages. To perform actual work, we need 2 sets of scripts: one set that allows the client to request work, and one set that gives the server the capability to perform it.
The first set of scripts is run by the client. They communicate with the server and then take the server’s output and perform any necessary processing. I carved out a utility script that handles all of the communications, so I do not have to reinvent the wheel every time I develop a script that requires queue functionality. The utility script, called ‘SubmitQueueItem1’, is below:
COMMENT
** This script submits an item into an AX queue
**
** The inputs are:
** v_QueuePath - The path to the folder containing .queue files
** v_SessionId - A character variable containing the session name
** v_ScriptName - The name of a script to run
** v_Params - A comma delimited list of named parameters. E.g. parm1:Value1,Parm2:Value2
**
** The output is a character variable called v_QueueItemStatus which indicates the final status of the queue item.
END

COMMENT Submit the requested script with parameters
DO CreateEmptyTable
OPEN BlankTbl
LIST UNFORMATTED v_ScriptName TO "%v_QueuePath%\%v_SessionId%.queuetemp"
LIST UNFORMATTED v_Params TO "%v_QueuePath%\%v_SessionId%.queuetemp" APPEND
RENAME "%v_QueuePath%\%v_SessionId%.queuetemp" "%v_QueuePath%\%v_SessionId%.queuestart" OK

COMMENT Wait until the server has finished processing the queue item
v_QueueWorking = T
SET ECHO NONE
DO SubmitQueueItem2 WHILE v_QueueWorking
SET ECHO ON
DELETE v_QueueWorking OK

COMMENT Import the finished queue file and remove the file from the server
IMPORT DELIMITED TO QueueItem_T01 "QueueItem_T01.FIL" FROM "%v_QueuePath%\%v_SessionId%.queuedone" 0 SEPARATOR TAB QUALIFIER NONE CONSECUTIVE STARTLINE 1 FIELD "SentData" C AT 1 DEC 0 WID 1000 PIC "" AS ""
DELETE "%v_QueuePath%\%v_SessionId%.queuedone" OK

COMMENT Get the final status
LOCATE RECORD 3
v_QueueItemStatus = ALLTRIM(SentData)
The script begins by creating an initial queue file in the specified queue folder. Once the file has been built, the script converts the file from a ‘queuetemp’ file type to ‘queuestart’, which acts as a flag to the server that the file is ready to be processed. At this point the script loops SubmitQueueItem2 until a .queuedone file is identified, whereupon it imports the contents of the .queuedone file and deletes that file from the directory. The status is extracted from the .queuedone file and saved in the v_QueueItemStatus variable. For reference, the code for SubmitQueueItem2 is below:
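The sender's half of the handshake can also be sketched in Python. The function name and the timeout are hypothetical; the protocol itself (write, rename, poll, read record 3) is the one just described:

```python
import os
import time

def submit(queue_dir, session_id, script_name, params, timeout_s=600):
    """Write a request, flip it live with an atomic rename, then poll
    until the receiver marks it done. Returns the final status line."""
    tmp = os.path.join(queue_dir, session_id + ".queuetemp")
    start = tmp.replace(".queuetemp", ".queuestart")
    done = tmp.replace(".queuetemp", ".queuedone")
    with open(tmp, "w") as f:
        f.write(script_name + "\n" + params + "\n")
    os.rename(tmp, start)              # the request becomes visible to the receiver
    deadline = time.time() + timeout_s
    while not os.path.exists(done):    # poll, as SubmitQueueItem2 does
        if time.time() > deadline:
            raise TimeoutError("receiver never completed " + session_id)
        time.sleep(0.05)
    with open(done) as f:
        status = f.read().splitlines()[2]  # record 3 holds the final status
    os.remove(done)
    return status.strip()
```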
EXECUTE "cmd /c TIMEOUT /t 10 /nobreak>nul"
EXECUTE 'cmd /c IF EXIST "%v_QueuePath%\%v_SessionId%.queuedone" (Exit 0) ELSE Exit 1'
v_QueueWorking = F IF RETURN_CODE = 0
To use this utility and complete the work on the client side, the script can be called as follows:
COMMENT Submit a queue request
v_QueuePath = "\\IaServer\Queue"
v_SessionId = 'SomeUniqueIdentifier_PerhapsUseTimestamps'
v_ScriptName = "SomeImport"
v_Params = "Param1:Abcd,Param2:1234"
DO SubmitQueueItem1

COMMENT See if the import was successful
ESCAPE ALL IF v_QueueItemStatus <> '0'

COMMENT Import the source files
IMPORT DELIMITED FROM "\\IaServer\Data\SomeFile.del" TO ...
The second set of scripts are run by the server. These scripts take any parameters that are passed via the v_Params variable and do something, in my case import data from a database using SQL. An implementation of this set of scripts might look something like this:
COMMENT
** v_Params - A comma delimited list of named parameters. E.g. parm1:Value1,Parm2:Value2
** The following parameters are required:
** domain - The domain to filter by
** v_QueueFile - The name of the queue file, which will be used as the session ID.
END

v_Filter1 = ALLTRIM(SPLIT(SPLIT(v_Params ',' 1) ':' 2))
v_Filter2 = ALLTRIM(SPLIT(SPLIT(v_Params ',' 2) ':' 2))
EXECUTE 'someSqlApplication "SELECT * FROM a LEFT JOIN b ON a.id = b.id WHERE a.fld1=%v_Filter1% AND a.fld2 = %v_Filter2%" -saveTo "\\IaServer\Data\SomeFile.del"'
Note: I performed a cardinal sin with SQL here and opened myself to an SQL injection attack. Do NOT embed filter values directly into an SQL string like I have here – this is just for demonstration purposes. The application I used allows me to parameterize filter values, which protects me from this avenue of attack. You do not want to be the reason for a breach to your company’s databases.
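For contrast, here is what the safe version looks like with a parameterized query. This is a Python sketch using sqlite3 as a stand-in for whatever database client you actually use; the table and column names echo the placeholder query above, with b.val added purely for illustration:

```python
import sqlite3

def fetch_joined(conn, filter1, filter2):
    """Bind filter values as parameters ('?') instead of splicing them
    into the SQL text, so they can never change the query's structure."""
    sql = """
        SELECT a.id, b.val
        FROM a LEFT JOIN b ON a.id = b.id
        WHERE a.fld1 = ? AND a.fld2 = ?
    """
    return conn.execute(sql, (filter1, filter2)).fetchall()
```

A classic injection string like `x' OR '1'='1` is treated as an ordinary literal here and simply matches nothing.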
That was the last piece in the queue puzzle. It seems like a lot of scripts and a lot of moving pieces, but the design allows me to create and manage only 2 scripts for each import: 1 script for the client that makes a request and does something with the results, and 1 script for the server that performs the requested work. To tie everything together into a nice visual package, the image below summarizes the process and highlights what needs to be created to add new functionality.