Hi Isomorphic,
As written here, I have a problem with BatchUploader and the amount of data I can transfer.
It is clear that BatchUploader will eventually hit a limit, be it 1,000 or 10,000 records, because the browser is blocked during the upload, and this leads to a bad user experience.
So the solution is to turn to background processing. For me, this would consist of the following steps (a rough server-side sketch follows the list):
- Upload the CSV to the DB as a file
- Process it and insert the data into a temporary table (by "temporary table" I mean a normal DB table that is used for the import instead of the production table; I don't mean something like this)
- Display the temporary data and errors (like BatchUploader does after clicking "Upload", but already persisted in the DB and not just in-memory)
- Possible error correction by the user
- Transfer the data from the temporary table to the production table
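To make the second step concrete, here is a minimal server-side sketch. It assumes a DMI method importCsvToStaging on a hypothetical class ImportDMI, a staging DataSource called "orderImport" and a request parameter "csvPath" carrying the path of the uploaded file; all of these names are made up, and the naive CSV parsing and missing per-row error handling are placeholders only.

```java
// Server-side DMI sketch; all names (ImportDMI, importCsvToStaging,
// the "orderImport" staging DataSource, the "csvPath" parameter) are hypothetical.
import com.isomorphic.datasource.DSRequest;
import com.isomorphic.datasource.DSResponse;

import java.io.BufferedReader;
import java.io.FileReader;
import java.util.HashMap;
import java.util.Map;

public class ImportDMI {

    public DSResponse importCsvToStaging(DSRequest dsRequest) throws Exception {
        // Path of the file that was uploaded in the first step.
        String csvPath = (String) dsRequest.getValues().get("csvPath");

        try (BufferedReader reader = new BufferedReader(new FileReader(csvPath))) {
            String[] header = reader.readLine().split(",");
            String line;
            while ((line = reader.readLine()) != null) {
                // Naive split for brevity - a real import should use a proper CSV parser.
                String[] cells = line.split(",");
                Map<String, Object> row = new HashMap<>();
                for (int i = 0; i < header.length && i < cells.length; i++) {
                    row.put(header[i].trim(), cells[i].trim());
                }
                // Insert into the staging table through its DataSource so that
                // declared validators run; per-row error recording is omitted here.
                DSRequest addRequest = new DSRequest("orderImport", "add");
                addRequest.setValues(row);
                addRequest.execute();
            }
        }

        DSResponse response = new DSResponse();
        response.setSuccess();
        return response;
    }
}
```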
I can think of two solutions (a rough client/server sketch follows the list):
- A "normal" custom DSRequest with showPrompt:false, where the request only returns once processing is finished (perhaps without a Callback, perhaps with a Callback + Notification in 12.1), while the user continues his or her work.
- A "normal" custom DSRequest that starts a new server-side thread and returns immediately.
I did not find related threads here (only this one) and am also not aware of a docs topic on this.
What is your suggested solution for this use case?
Best regards
Blama