Hi Isomorphic,
I want to use the BatchUploader as an easy, manual interface: export data from another system, import it here.
The items to import carry an ID from the previous system that is also used to identify them here (unique constraint in the DB, unique validator in the .ds.xml).
My upload DataSource is rather virtual: it accepts the rows and inserts them into the main table and its child tables in its add OperationBinding.
I can use a type="serverCustom" validator there to check the new records against the existing ones, even before committing them.
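For that existing-records check I have something along these lines in mind (a minimal sketch only; "MainTable", "legacyId" and the class name are placeholders, and I'm assuming the DMI-style parameter injection for the condition() signature here, so the exact signature should be checked against the serverCustom validator docs):

```java
// Sketch: a serverCustom validator (declared via <serverObject> in the .ds.xml)
// that rejects a value already present in the persisted main table.
// "MainTable" and the class name are placeholders for my real DataSource.
package com.myproject.validation;

import java.util.HashMap;
import java.util.List;
import java.util.Map;

import com.isomorphic.datasource.DSRequest;
import com.isomorphic.datasource.DSResponse;

public class LegacyIdIsUnique {

    // Parameters are assumed to be injected DMI-style; signature may differ.
    public boolean condition(Object value, String fieldName, Map record)
            throws Exception {
        // Let a requiredness validator deal with missing values
        if (value == null) return true;

        // Fetch from the real target DataSource, not the virtual upload DS
        Map<String, Object> criteria = new HashMap<>();
        criteria.put(fieldName, value);
        DSRequest fetch = new DSRequest("MainTable", "fetch");
        fetch.setCriteria(criteria);
        DSResponse response = fetch.execute();

        // Valid only if no persisted record already uses this ID
        List<?> existing = response.getDataList();
        return existing == null || existing.isEmpty();
    }
}
```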
But how do I make sure that there are no duplicate records within the .csv itself? Those records only exist in memory in the client-side BatchUploader.
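The only idea I have so far is to scan the rest of the queue server-side once the rows are finally committed, e.g. in the add DMI of the upload DataSource (again just a sketch; "legacyId" is a placeholder, and I'm assuming the queued sibling requests are reachable via RPCManager.getRequests()):

```java
// Sketch: in the add DMI, count how often this row's legacyId occurs across
// the whole queued batch and fail the request if it appears more than once.
package com.myproject.dmi;

import java.util.Map;

import com.isomorphic.datasource.DSRequest;
import com.isomorphic.datasource.DSResponse;
import com.isomorphic.rpc.RPCManager;

public class UploadDMI {

    public DSResponse add(DSRequest dsRequest, RPCManager rpcManager)
            throws Exception {
        Object myLegacyId = ((Map) dsRequest.getValues()).get("legacyId");

        // Look at every queued add request, including this one
        int occurrences = 0;
        for (Object obj : rpcManager.getRequests()) {
            if (!(obj instanceof DSRequest)) continue;
            DSRequest other = (DSRequest) obj;
            if (!"add".equals(other.getOperationType())) continue;
            Object otherId = ((Map) other.getValues()).get("legacyId");
            if (myLegacyId != null && myLegacyId.equals(otherId)) occurrences++;
        }

        if (occurrences > 1) {
            // Reject every row that shares the ID, not just the later ones
            DSResponse errorResponse = new DSResponse();
            errorResponse.setFailure();
            errorResponse.addError("legacyId",
                    "Duplicate ID within the uploaded file");
            return errorResponse;
        }

        // ...insert into main table and child tables as before...
        return dsRequest.execute();
    }
}
```

But that feels late in the process: ideally the duplicates would already be flagged in the BatchUploader grid before commit, which is exactly the part I don't know how to do.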
Thank you & Best regards
Blama