Because you can override the validate() method, as covered in the BatchUploader docs, you can catch any error case you like and report errors.
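As a starting point, here is a minimal sketch of that pattern, assuming a custom server-side DataSource registered via serverConstructor in the .ds.xml. The class name, field name, and the exact validate() signature are illustrative assumptions, so verify them against the BatchUploader docs and the server javadoc for your version:

```java
// Sketch only: a custom server-side DataSource adding its own validation.
// Register it in upload.ds.xml via serverConstructor="com.myapp.UploadDS".
// NOTE: the validate() signature below is an assumption based on the docs'
// description of an overridable hook; verify it against your version.
package com.myapp;

import java.util.HashMap;
import java.util.Map;

import com.isomorphic.datasource.BasicDataSource;

public class UploadDS extends BasicDataSource {

    @Override
    public Map validate(Map record) {
        // run the declarative validators from the .ds.xml first (assumed call)
        Map errors = super.validate(record);
        if (errors == null) errors = new HashMap();

        // example of a custom error case: reject over-long strings
        Object name = record.get("name");
        if (name instanceof String && ((String) name).length() > 40) {
            errors.put("name", "Value is longer than 40 characters");
        }
        return errors.isEmpty() ? null : errors;
    }
}
```

Errors reported this way are flagged against the offending rows in the BatchUploader grid, where the user can correct them before committing.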
The data in the BatchUploader exists only in the browser, in the grid, so there are no server-side artifacts to clean up unless you create something temporary, in which case you would obviously need to clean that up yourself.
If you run into some *other* way in which the BatchUploader doesn't seem to handle your use case, consider the new DataImport APIs added in 5.0, which allow you to reuse our CSV parsing functionality without using the rest of the BatchUploader UI.
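A rough sketch of that route, with the caveat that the DataImport package, constructor, and method signature shown here are assumptions from memory; consult the 5.0 server javadoc for the actual API:

```java
// Sketch only: reusing the server-side CSV parsing without the BatchUploader
// UI. The import, constructor, and importDataFile() signature below are
// assumptions -- consult the 5.0 javadoc for the real DataImport API.
import java.io.File;
import java.util.List;
import java.util.Map;

import com.isomorphic.tools.DataImport; // package is an assumption

public class CsvParseOnly {
    public static void main(String[] args) throws Exception {
        DataImport importer = new DataImport();                       // assumed constructor
        List rows = importer.importDataFile(new File("upload.csv"));  // assumed signature

        for (Object r : rows) {
            Map row = (Map) r; // one Map per CSV row, keyed by column name
            // hand each row to your own validation / staging logic here
            System.out.println(row);
        }
    }
}
```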
Best practice: Batch uploader or custom solution?
Hi Isomorphic,
I'm using 4.1p and have a question regarding the best way of solving this use case:
I have a base table with three detail tables (configuration, address, contact).
Currently, I generate a base table entry in a DynamicForm in a Window with a TabSet. After adding the base table entry I enable the other tabs, tell them the new PK, and create the child-table entries in their respective DynamicForms (roughly as in the sketch below).
Some of the FormItems in these forms are SelectItems holding FK values, where a descriptive field is shown via optionDataSource.
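In sketch form (a simplified SmartGWT-style version, with placeholder names like baseForm and baseId), the flow is roughly:

```java
// Simplified SmartGWT-style sketch of the Window/TabSet flow; form, tab,
// and field names ("baseForm", "baseId", tab indexes) are placeholders.
import com.smartgwt.client.data.DSCallback;
import com.smartgwt.client.data.DSRequest;
import com.smartgwt.client.data.DSResponse;
import com.smartgwt.client.data.Record;
import com.smartgwt.client.widgets.form.DynamicForm;
import com.smartgwt.client.widgets.tab.TabSet;

public class BaseEntryFlow {

    void saveBaseEntry(final DynamicForm baseForm, final TabSet tabs,
                       final DynamicForm configForm,
                       final DynamicForm addressForm,
                       final DynamicForm contactForm) {
        baseForm.saveData(new DSCallback() {
            public void execute(DSResponse response, Object rawData, DSRequest request) {
                // the saved record comes back with the newly generated PK
                Record saved = response.getData()[0];
                String pk = saved.getAttribute("id");

                // unlock the child-table tabs (indexes are placeholders)
                tabs.enableTab(1);
                tabs.enableTab(2);
                tabs.enableTab(3);

                // hand the new PK to each child form as the FK value
                configForm.setValue("baseId", pk);
                addressForm.setValue("baseId", pk);
                contactForm.setValue("baseId", pk);
            }
        });
    }
}
```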
Now I need to generate those entries from an Excel/CSV upload (if both are possible, great; if only CSV, no problem either), where one row holds the base-table data as well as the child-table data.
Obviously the provided data can be wrong in many ways: missing fields, too-long data strings, wrong date formats, unmatched validators, and(!) wrong descriptive names for FK fields.
An example of "wrong descriptive names for FK fields":
id name
1 green
2 yellow
3 red
Used value in the .csv: "gren" (not the ID, obviously, since the user doesn't care about IDs, and not matching any existing value).
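In plain Java terms, the lookup problem I need handled looks like this sketch (table values hard-coded just for illustration):

```java
// Sketch of the lookup problem: map the descriptive CSV value to the FK id,
// and treat anything without a match ("gren") as a row validation error.
import java.util.HashMap;
import java.util.Map;

public class FkLookup {
    static final Map<String, Integer> COLORS = new HashMap<String, Integer>();
    static {
        COLORS.put("green", 1);
        COLORS.put("yellow", 2);
        COLORS.put("red", 3);
    }

    /** @return the FK id, or null when the CSV value matches no known name */
    static Integer resolveColor(String csvValue) {
        return csvValue == null ? null : COLORS.get(csvValue.trim().toLowerCase());
    }

    public static void main(String[] args) {
        System.out.println(resolveColor("green")); // 1
        System.out.println(resolveColor("gren"));  // null -> flag as error
    }
}
```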
Can the built-in BatchUploader handle all these error cases?
I'd create a new upload.ds.xml with all the needed fields from the base and child tables and use that as the target (Way 1).
My hope is that the erroneous entries then get marked and I can correct them in the uploader.
After I've corrected all entries, I hit "commit". Most likely I'd use a VIEW as the real target in the DB and distribute the data to the respective tables via an INSTEAD OF trigger (see the sketch below).
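On the client, I imagine Way 1 as little more than wiring the uploader to that DataSource. This sketch assumes the standard BatchUploader component; the package and setter names should be verified against the actual SmartGWT version:

```java
// Sketch of Way 1 on the client: point a BatchUploader at the combined
// upload DataSource defined in upload.ds.xml. The package and setter are
// assumptions -- verify against your SmartGWT version.
import com.smartgwt.client.data.DataSource;
import com.smartgwt.client.widgets.BatchUploader;

public class UploadScreen {
    public BatchUploader buildUploader() {
        BatchUploader uploader = new BatchUploader();
        // rows are parsed into the grid and validated against this DataSource;
        // errors show up per row before anything is committed
        uploader.setUploadDataSource(DataSource.get("upload"));
        return uploader;
    }
}
```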
The other way (Way 2) would be to get the data into a DB staging table somehow. I don't know yet how I'd do that, but I'm thinking of a file upload, a CLOB field, processing/error checking with PL/SQL, and an edit mode where the user can correct entries.
So my questions are:
- What do you suggest? Way 1 (also regarding the post-processing, if you have an opinion on that), Way 2, or something else?
- Can the BatchUploader handle all the error cases I listed?
- Can the BatchUploader be configured so that the list is not destroyed after "commit" and only the correct records are saved, so that the user can run correct/upload cycles?
- For information purposes: after the upload, the data lives only in the client and is not persisted anywhere until I hit "save" in the uploader, correct?
Thank you & Best regards,
Blama