Log cleanup and archival
When an activity (for example, a Transaction or a Process Flow) executes in Adeptia Suite, corresponding logs are created. During Process Flow execution, temporary files called repository files, such as source files, intermediate XML data files, and target formatted data files, are also generated to store intermediate data. These logs and repository files can cause issues if they accumulate over a long period of time.
Adeptia Suite has a cleanup task, scheduled to run daily, that cleans up logs and temporary repository files older than a specified number of days (the retain time). When the cleanup task runs for the log database (at 3:00 A.M. by default), the logs and repository files whose retain time has expired are moved (archived) to the log archive database table; if log archival is not enabled, they are deleted permanently. When the cleanup task runs for the log archive database table (at 5:00 A.M. by default), the archived logs and repository files whose retain time has expired are deleted permanently.
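The archival decision described above can be summarized with a small conceptual sketch (this is not Adeptia code; the function name and signature are illustrative):

```python
def cleanup_action(age_days: int, retain_days: int, archival_enabled: bool) -> str:
    """Decide what the daily cleanup task does with a log or repository file.

    age_days         -- age of the log/repository file in days
    retain_days      -- configured retain time (e.g. abpm.logs.retainTime)
    archival_enabled -- whether log archival is turned on
    """
    if age_days <= retain_days:
        return "keep"      # retain time has not yet expired
    if archival_enabled:
        return "archive"   # moved to the log archive database table
    return "delete"        # deleted permanently from the log database
```

With the default retain time of five days, a seven-day-old log is archived when archival is enabled, and deleted permanently otherwise.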
You can define the number of days for which the logs and repository files are retained in the log and log archive database tables by setting the respective retain time properties:
- By default, logs and repository files older than five days are cleaned up (moved to the log archive database table if log archival is enabled, otherwise deleted from the log database table). You can change this default by setting the property abpm.logs.retainTime.
- By default, logs and repository files older than fifteen days are deleted permanently from the log archive database table. You can change this default by setting the property abpm.archive.logRetainTime.
- Log cleanup does not delete logs and repository files for Process Flows that are in the running or waiting state.
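For example, to keep logs for ten days before archival and keep archived logs for thirty days, the two retain-time properties could be set as follows in server-configure.properties (the values shown are illustrative; the defaults are 5 and 15):

```properties
# Days to retain logs/repository files in the log database (default: 5)
abpm.logs.retainTime=10

# Days to retain archived logs in the log archive database table (default: 15)
abpm.archive.logRetainTime=30
```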
The times at which cleanup starts for the log and log archive database tables are governed by the following properties, available in the server-configure.properties file located at <AdeptiaServerInstallFolder>/AdeptiaServer/ServerKernel/etc:
- To schedule the log cleanup, set the time using a cron expression as the value of the property abpm.appmanagement.logCleanupCronExpression. The default value corresponds to 3:00 A.M.
- To schedule the archived log cleanup, set the time using a cron expression as the value of the property abpm.appmanagement.archiveLogCleanupCronExpression. The default value corresponds to 5:00 A.M.
For example, the cron expression 0 0 20 * * ? schedules the log or archived log cleanup at 8:00 P.M. For more information, refer to the cron expression documentation.
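Assuming the default times of 3:00 A.M. and 5:00 A.M., the two scheduling properties would contain cron expressions like the following (the values shown are illustrative):

```properties
# Run log cleanup daily at 3:00 A.M.
abpm.appmanagement.logCleanupCronExpression=0 0 3 * * ?

# Run archived-log cleanup daily at 5:00 A.M.
abpm.appmanagement.archiveLogCleanupCronExpression=0 0 5 * * ?
```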
To know more about the retain time and the other properties that govern log archival and cleanup, refer to the properties documentation page.
Some folders that store temporary data are also cleaned up, either after successful execution of a Process Flow or during log cleanup. The following table lists the location of these folders, their purpose, and when they are cleaned up.
| Location | Purpose | Cleanup Process |
|---|---|---|
| ServerKernel/web/Repository | Stores repository files created during Process Flow execution | Cleaned during log cleanup |
| ServerKernel/web/Archive | Stores the repository files archived during log cleanup | Cleaned during archived log cleanup |
| ServerKernel/web/Attachments | Stores web service and MIME/binary attachment files | Cleaned during log cleanup |
| ServerKernel/logs/applicationlogs | Stores logs of Process Flow execution and of activity creation and modification | Cleaned during log cleanup |
| ServerKernel/web/edi | Stores the source and target files displayed in the EDI logs' FileIn and FileOut links | Cleaned during log cleanup |
| ServerKernel/web/Archive/edi | Stores the source and target files displayed in the EDI logs' FileIn and FileOut links after cleanup | Cleaned during archived log cleanup |
| ServerKernel/web/routing | Stores the source and target files displayed in the B2B (Non-EDI) logs' FileIn and FileOut links | Cleaned during log cleanup |
| ServerKernel/web sapIdocLocation | Stores the IDoc files received by Adeptia Suite | Cleaned during log cleanup |
| ServerKernel/temp Repository | Stores intermediate files during Process Flow execution | Cleaned after Process Flow execution |
| ServerKernel/Recovery | Stores the intermediate files required to re-execute a Process Flow that aborted due to a system failure | Cleaned automatically after the Process Flow executes successfully following a restart |
| ServerKernel/Re-processing | Stores the intermediate files required to re-execute a Process Flow that aborted due to a manual error | Cleaned automatically after Process Flow execution |
Using separate database server for archived logs
If you use Adeptia Suite to process a large number of files every day, it is recommended that you use a separate database server for log archival. Follow the steps below to create the tables for log archival on a separate database.
Important
Before you follow the steps to create a separate database for archived logs, ensure that you have set the property abpm.logs.archival.database to 2.
- Create a database (for example, Adeptia_Logs_Archive on SQL Server) on the database server where you want to archive the logs.
- On this database, run the initialize-log-<database server name>.sql script located in the .../AdeptiaServer-x.x/ServerKernel/etc folder. This creates the tables where the archived logs will be stored (for example, for a database created on SQL Server, run the initialize-log-sqlserver.sql script; for a database created on Oracle, run the initialize-log-oracle.sql script).
- Run the create-indexes-<database server name>.sql script located in the .../AdeptiaServer-x.x/ServerKernel/etc folder. This applies indexes to the tables created in the previous step (for example, for a database created on SQL Server, run the create-indexes-sqlserver.sql script; for a database created on Oracle, run the create-indexes-oracle.sql script).
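For a SQL Server archive database, the two script-execution steps above might look like the following from the command line (a sketch only; the server name, credentials, database name, and installation folder are placeholders to replace with your own):

```shell
# Placeholder connection details -- replace with your own values
SERVER=dbhost
DB=Adeptia_Logs_Archive
USER=adeptia
ETC=/opt/AdeptiaServer-x.x/ServerKernel/etc

# Step 2: create the archive log tables
sqlcmd -S "$SERVER" -d "$DB" -U "$USER" -i "$ETC/initialize-log-sqlserver.sql"

# Step 3: apply indexes to the tables created above
sqlcmd -S "$SERVER" -d "$DB" -U "$USER" -i "$ETC/create-indexes-sqlserver.sql"
```

For an Oracle archive database, the equivalent approach would run the initialize-log-oracle.sql and create-indexes-oracle.sql scripts through your Oracle client instead.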