Recently, I built a specific BAM flow: a generic solution for monitoring the technical flows on a BizTalk platform.
All you have to do is map your data to two activities with the Tracking Profile Editor (TPE):
- InputHalfFlow
- OutputHalfFlow
Then a custom job (inspired by the SSIS package that is auto-generated when you create a new activity) archives the data in another database.
This job uses the following stored procedures, which keep the tracking running continuously (see the sketch after this list):
- BAM_ForceCompleteActivity
- BAM_Metadata_SpawnPartition
- BAM_CopyToArchiveTable (Specific)
- BAM_Metadata_EndArchiving
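For reference, here is a minimal sketch of the job's step order. I am assuming each procedure takes the activity name as its only parameter; the real signatures on your platform may differ (and BAM_ForceCompleteActivity and BAM_CopyToArchiveTable are custom here), so check them in BAMPrimaryImport before reusing this:

```sql
USE BAMPrimaryImport;

DECLARE @ActivityName nvarchar(128) = N'InputHalfFlow';

-- 1. Force the still-open instances to complete so they become archivable.
EXEC BAM_ForceCompleteActivity @ActivityName;

-- 2. Detach the current partition: a new <Activity>_<GUID> table is
--    created and registered in bam_Metadata_Partitions.
EXEC BAM_Metadata_SpawnPartition @ActivityName;

-- 3. Copy the detached partition into the archive database (custom step).
EXEC BAM_CopyToArchiveTable @ActivityName;

-- 4. Mark the partition as archived so it can be dropped.
EXEC BAM_Metadata_EndArchiving @ActivityName;
```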
The interesting stored procedure is BAM_Metadata_SpawnPartition: it creates a new table with the same structure as your activity table, named with your activity name PLUS a GUID (like MyActivity_<GUID>), and saves the name of this instance table in the bam_Metadata_Partitions table. This lets the process know whether an archiving run is in progress, using the ArchivingInProgress flag and the ArchivedTime date.
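You can query that metadata directly to see the spawned partitions. In this sketch, only ArchivingInProgress and ArchivedTime come from the description above; the other column names are my assumptions, so check the actual schema of bam_Metadata_Partitions on your version:

```sql
USE BAMPrimaryImport;

SELECT ActivityName,          -- assumed: which activity the partition belongs to
       PartitionTableName,    -- assumed: the <Activity>_<GUID> instance table
       ArchivingInProgress,   -- flag: an archive run currently owns this partition
       ArchivedTime           -- NULL until the partition has been archived
FROM   dbo.bam_Metadata_Partitions
ORDER BY ArchivedTime DESC;
```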
In my case, I need to feed an ArchivingTable that functional users consult through Reporting Services, so I need the data visible as soon as possible.
My job is therefore enabled and scheduled to run every 20 seconds.
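With SQL Server Agent, such a schedule can be declared like this (the job name is hypothetical; only the frequency parameters matter here):

```sql
-- Run the archiving job every 20 seconds, all day, every day.
EXEC msdb.dbo.sp_add_jobschedule
    @job_name             = N'BAM_Archive_HalfFlow', -- hypothetical job name
    @name                 = N'Every 20 seconds',
    @freq_type            = 4,   -- daily
    @freq_interval        = 1,   -- every 1 day
    @freq_subday_type     = 2,   -- sub-day unit: seconds
    @freq_subday_interval = 20;  -- every 20 of that unit
```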
As a result, one new row lands in the bam_Metadata_Partitions table every 20 seconds:
- 4,320 rows per day
- 129,600 rows per month
- more than 1,500,000 rows per year
Because of the archive script process, no job purges this table, so you have to keep an eye on its growth.
In my case, disk space is a concern, so I built a job that purges everything archived more than one or two weeks ago.
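A minimal sketch of that purge, assuming a 14-day retention; the archive database, table, and timestamp column names are specific to my platform, so adapt them:

```sql
DECLARE @Cutoff datetime = DATEADD(DAY, -14, GETDATE());

-- Purge the archived data that functional users no longer need.
DELETE FROM BAMArchive.dbo.ArchivingTable   -- custom archive table (assumed name)
WHERE  ArchivedTime < @Cutoff;              -- assumed timestamp column

-- Trim bam_Metadata_Partitions too, keeping only rows whose partition
-- was archived inside the retention window.
DELETE FROM BAMPrimaryImport.dbo.bam_Metadata_Partitions
WHERE  ArchivedTime IS NOT NULL
  AND  ArchivedTime < @Cutoff;
```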