Batch import creates index duplicates - how to update indexes

Dear all,

I'm running into a problem while creating an automated import mechanism that should keep existing MultiTerm 2011 termbases up to date.

Using the .NET SDK, termbases with existing content should be updated to match the state of XML files whose entries carry MultiTerm concept IDs (e.g. <concept>233</concept>):

ImportDefinition oImpDef = oImpDefs.Add("CustomImport2", "test", importDefinition);
oImpDef.ProcessImport(MultiTermIX.MtTaskType.mtScript);

The import definition is set to:
- fast import
- synchronize entries on entry number
- entry number does not exist in the target termbase -> add import entry as new
- entry number exists in the target termbase -> overwrite existing entry with import entry
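For reference, an entry in the import XML looks roughly like this (simplified sketch; the exact element structure and field names depend on the termbase definition, so treat this as illustrative only):

```xml
<mtf>
  <conceptGrp>
    <!-- concept ID used for "synchronize entries on entry number" -->
    <concept>233</concept>
    <languageGrp>
      <language type="English" lang="EN"/>
      <termGrp>
        <term>example term</term>
      </termGrp>
    </languageGrp>
  </conceptGrp>
</mtf>
```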

However, after a repeated import of the same XML file, the number of entries stays the same (as expected), but the term indexes grow with duplicates on each import (I would expect them to stay the same as well, since the XML hasn't changed).

I would have expected the termbase reorganization that apparently runs automatically after the import to clean this up, but it does not. Triggering a reorganization via the API

oTb.Reorganise();

also has no effect. Strangely, if I reorganize the termbase using the MultiTerm 2011 client, the "duplicates" in the term indexes disappear.
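For completeness, the full sequence I run is essentially the following (a minimal sketch using only the calls shown above; oImpDefs and oTb are assumed to be obtained from an already-open termbase via the SDK, and the definition name, label, and importDefinition file are placeholders):

// Register the import definition (name and label are placeholders)
ImportDefinition oImpDef = oImpDefs.Add("CustomImport2", "test", importDefinition);

// Run the import with the settings listed above
oImpDef.ProcessImport(MultiTermIX.MtTaskType.mtScript);

// Explicit reorganization afterwards - no visible effect on the term indexes
oTb.Reorganise();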

Any suggestions on what is missing here? It would be great to have this process automated end-to-end, without having to reorganize the termbases manually one by one...

Thank you for looking at this,

Michal