I’ve been working on some batch file-based jobs for a project here for OLS. There are two sides to this: sending “clearing” files of transactions in a defined format to a third party, which I will call the extract, and receiving “refresh” or table updates from this third party, which I’ll refer to as the import. The extract file contains financial transaction records, and the import file contains entity information such as merchant information. The Extract File Layout is quite standard and looks something like:
    Detail and Addendum Record(s)
    Detail and Addendum Record(s)
The File Layout for the Import is a list of fields per record; nothing particularly fancy here.
I don’t want to spend too much time on the actual mechanics of the Extract and Import jobs themselves, but rather on the Operational Considerations of these and other jobs we have performed:
Validity – You need to decide how to handle invalid records in a file, or valid records without proper supporting data (transactions for a merchant that wasn’t set up in your system). You can write the bad record off to an exception file and address it later, or you can reject the full file; the right approach depends on the implementation and the requirements. We also mark files with a .bad extension when we detect they are invalid, to help prevent subsequent processing steps, like transmitting a half-baked file. We perform duplicate file checking as well as other validation steps.
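As a rough sketch of that record-level handling, here is the exception-file approach combined with the .bad rename; the pipe-delimited field layout and the merchant lookup are hypothetical, not the actual file spec:

```python
from pathlib import Path

def validate_import_file(path, known_merchants):
    """Split an import file into good records and exceptions.

    Hypothetical format: pipe-delimited lines of merchant_id|name|amount.
    """
    src = Path(path)
    good, exceptions = [], []
    for lineno, line in enumerate(src.read_text().splitlines(), start=1):
        fields = line.split("|")
        if len(fields) != 3 or not fields[0]:
            exceptions.append((lineno, line, "malformed record"))
        elif fields[0] not in known_merchants:
            # Valid shape, but no supporting merchant setup in our system.
            exceptions.append((lineno, line, "unknown merchant"))
        else:
            good.append(fields)
    if not good:
        # Nothing usable: mark the whole file .bad so downstream steps
        # (like transmission) skip it.
        src.rename(src.with_name(src.name + ".bad"))
    return good, exceptions
```

The exceptions list can then be written off to a side file and worked later, while the good records continue through the job.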
Completeness – You need to make sure that you read in the entire file or extract a complete file. Monitoring controls help here, such as checking the number of lines and the file size of an Extract file, and checking that the last line of the file is a specific record such as a File Trailer. Reconciliation between hash totals and amounts is also a good practice. On the import side you can count the number of lines or records, compare them against the totals in a trailer record, and compare that to what was actually imported.
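The trailer reconciliation could look something like this; the "D"/"T" record types and pipe-delimited amounts in cents are assumptions for illustration, not the real layout:

```python
def reconcile_trailer(lines):
    """Check record count and amount hash total against the trailer.

    Hypothetical layout: detail lines 'D|merchant|amount_cents',
    final line 'T|record_count|amount_total_cents'.
    """
    *details, trailer = lines
    t_type, t_count, t_total = trailer.split("|")
    if t_type != "T":
        raise ValueError("file does not end with a trailer record")
    count = len(details)
    total = sum(int(d.split("|")[2]) for d in details)
    if count != int(t_count):
        raise ValueError(f"record count mismatch: {count} != {t_count}")
    if total != int(t_total):
        raise ValueError(f"hash total mismatch: {total} != {t_total}")
    return count, total
```

A truncated transfer will usually fail both the trailer-presence check and the count check, which is exactly what you want before importing anything.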
Timeliness – Some extracts take minutes and others hours; scheduling and monitoring the process is essential to deliver data to other parties on a timely basis. Monitoring “check-points” in the process, as well as the percentage of records completed, helps detect problems proactively. Collect job performance metrics: it is valuable to keep track of and chart the total run time of each job and compare it to its history, to detect slowdowns or to correlate increases or decreases in processing time with external events or transaction growth.
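A minimal sketch of that run-time comparison, assuming you keep a history of prior run times; the 1.5x slowdown threshold is an arbitrary assumption an operator would tune:

```python
import statistics

def runtime_alert(history_secs, latest_secs, factor=1.5):
    """Flag a job run that is much slower than its recent history.

    history_secs: run times (seconds) of prior successful runs.
    factor: slowdown multiple over the median that triggers an alert
    (assumed value; tune per job).
    """
    baseline = statistics.median(history_secs)
    return latest_secs > baseline * factor
```

Using the median rather than the mean keeps one historical outlier from masking a real slowdown.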
Delivery – Consideration must be given to the delivery of the file. File transfer procedures should address file naming conventions, steps to upload a complete file (upload under a .filepart extension, then rename it to the full name once the transfer completes), secure delivery, local or remote archiving, compression, and any file-level encryption. It is also good practice to reconnect to the file server and perform a directory listing of the files that you uploaded, to confirm that they transferred successfully.
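The .filepart upload-then-rename step might be sketched like this against the standard ftplib FTP interface; the remote file name is hypothetical, and a real job would do the confirming directory listing on a fresh connection:

```python
def safe_upload(ftp, local_path, remote_name):
    """Upload under a temporary .filepart name, rename on completion,
    then list the directory to confirm the file landed.

    `ftp` is assumed to expose the ftplib.FTP interface
    (storbinary, rename, nlst).
    """
    part_name = remote_name + ".filepart"
    with open(local_path, "rb") as fh:
        ftp.storbinary(f"STOR {part_name}", fh)
    # Rename only after a complete transfer, so consumers never see a
    # half-written file under its final name.
    ftp.rename(part_name, remote_name)
    if remote_name not in ftp.nlst():
        raise RuntimeError(f"{remote_name} missing after upload")
```

The same pattern applies to any transfer mechanism (SFTP, shared drops): write under a temporary name, rename atomically, then verify.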
Security – While account numbers and the like are encrypted in databases (column level, file level, internal database level), the file specifications don’t allow for encrypted card numbers, so both file-level asymmetric encryption using the public key of the recipient and transport-level encryption to send the file (see Delivery above) need to be considered. Archived files stored on disk also need to be encrypted.
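File-level asymmetric encryption is typically delegated to a tool like GnuPG rather than written by hand. A sketch that builds the gpg command for encrypting to the recipient's public key (the recipient id and file name are hypothetical, and dry_run keeps it from shelling out):

```python
import subprocess

def encrypt_for_recipient(path, recipient, dry_run=True):
    """Build (and optionally run) a gpg command that encrypts a file
    to the recipient's public key. Assumes the key is already in the
    local keyring.
    """
    cmd = [
        "gpg", "--batch", "--yes",
        "--encrypt", "--recipient", recipient,
        "--output", path + ".gpg",
        path,
    ]
    if not dry_run:
        subprocess.run(cmd, check=True)
    return cmd
```

Only the recipient's private key can decrypt the result, so even the transmitting systems never need the cleartext once the file is sealed.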
Troubleshooting/Restart Procedures – You need to develop procedures to support the following:
· re-sending failed files
· re-running the extract or import process for a specific date
· preventing duplicate or invalid files or data.
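One way to sketch the duplicate-prevention piece is a registry of content hashes for files already processed; the registry file name and JSON format are assumptions for illustration:

```python
import hashlib
import json
from pathlib import Path

def already_processed(path, registry_path="processed.json"):
    """Return True if this file's content was processed before;
    otherwise record it and return False.

    Hashing the content (not just the name) catches a re-sent file
    that arrives under a new name.
    """
    digest = hashlib.sha256(Path(path).read_bytes()).hexdigest()
    reg = Path(registry_path)
    seen = json.loads(reg.read_text()) if reg.exists() else []
    if digest in seen:
        return True
    seen.append(digest)
    reg.write_text(json.dumps(seen))
    return False
```

The same registry makes re-runs for a specific date safe: a deliberate re-send can be allowed by removing that file's entry, while accidental duplicates are rejected.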
The End is Just the Beginning – Going live is just the start of a process that has no end; it requires daily care and maintenance. These processes and controls need to work in harmony on a continuous basis, and they must be able to be enhanced based upon the results of monitoring and other operational tasks.