When planning delta imports from SQL, thought needs to be given to clearing down the delta tables. If you have any type of code running in response to the imported data then you have to accept that import errors will occur, some of them not even your fault!
What should happen to the rows in the Delta table that were not imported due to errors? Ideally you will let these rows stay put, giving MIIS another go at them on the next Delta Import. But what, then, should you do with the rows that were successfully imported?
The solution will depend on the number of objects you’re dealing with, how often you manage a Full Import, and what the repercussions are of missed imports. In a system where a complete set of Full Imports can be run overnight, you may well find that a few missed deltas during the day are not significant. Alternatively, if the Fulls are run weekly or less often, you are completely dependent on the reliability of your Deltas – those failed imports must get another chance.
Some possible solutions: optimistic, pessimistic and best-odds.
The optimistic approach: assume all Delta Imports were successful, and clear out the Delta table as an automatic step before generating the new one.
This approach works fine in simple systems where Full Imports can be run fairly regularly (at least once a day) to mop up any missed imports. A big problem with it is that an Import must be run after every run of the DTS. This can quickly turn into a complete pain when you’re troubleshooting.
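As a rough sketch, the optimistic clear-down is just the first step of the DTS package, run before the new delta rows are generated. The table name here (DeltaTable) is invented for illustration – use whatever your own package calls it:

```sql
-- Optimistic approach: assume the previous Delta Import worked,
-- so the old delta rows can simply be thrown away before the
-- DTS regenerates them. (DeltaTable is a hypothetical name.)
TRUNCATE TABLE DeltaTable;

-- ...then repopulate DeltaTable as usual, e.g. by comparing the
-- source data against a snapshot taken at the last import.
```

Note this is exactly why an Import must follow every DTS run: once the table is truncated, any rows MIIS never saw are gone for good.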
The pessimistic approach: each row in the Delta table is independently checked for import success, and only removed once the import is confirmed.
You could perhaps write some code that checks through the import log file after each import, and only deletes rows from the Delta table that look correct in the log.
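One way that might look in practice: suppose your log-checking script writes the anchor of every successfully imported object into a staging table, and the clean-up is then a simple join. Both table names here (DeltaTable, ConfirmedImports) and the Anchor column are made up for the sketch:

```sql
-- Pessimistic approach: only delete delta rows whose import the
-- log parser confirmed. ConfirmedImports is a hypothetical staging
-- table populated from the import log after each run.
DELETE d
FROM   DeltaTable d
       INNER JOIN ConfirmedImports c
               ON c.Anchor = d.Anchor;

-- Anything still left in DeltaTable gets another go on the next
-- Delta Import.
TRUNCATE TABLE ConfirmedImports;
```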
On my MIIS course, the lovely Hugh Simpson-Wells suggested an extra MA where objects are sync’d back from the metaverse just so you can compare them with the source data. This is not something I’ve ever tried, but I guess it would be the way to go if you needed 100% verification on those imports.
The best-odds approach that I adopted was to do a simple check for errors following the Delta Import task. If any errors occurred, I keep the entire Delta table; otherwise it is cleared.
The main downside of this approach is that a single (possibly trivial) error will prevent the clearing, and will cause your delta table to grow and grow. MIIS will not be bothered by being given the same import data over and over, but it does slow everything down. You can also run into problems when the same object appears multiple times in the table with conflicting instructions (like a “Delete” followed by a “Modify”). Adding a de-duping step to your DTS helps, but the only real fix is to ensure that import errors are sorted out ASAP.
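A de-duping step could be sketched along these lines: keep only the most recent instruction per object, assuming the delta table has an Anchor column identifying the object and a RowID (or timestamp) showing insertion order. Those column names are invented for the example:

```sql
-- De-dupe: for each object (Anchor), keep only the newest row, so a
-- stale "Delete" can't sit alongside a later "Modify" for the same
-- object. RowID is assumed to increase with each insert.
DELETE d
FROM   DeltaTable d
WHERE  d.RowID < (SELECT MAX(d2.RowID)
                  FROM   DeltaTable d2
                  WHERE  d2.Anchor = d.Anchor);
```

Run this as a DTS step just before the Delta Import so MIIS only ever sees one instruction per object.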
I’ve only sketched the mechanics here, as this is really a philosophising post. There’s more to come on how to write a VBScript to run your MIIS jobs, check for errors, and then fire off whatever else you need to do.