Hi Isomorphic,
please see this thread and the performance issue there.
While writing it, I had another idea for how to improve BatchUploader.
For my main table I can currently import only about 1000 rows; otherwise the response takes too long and the browser stops waiting for it.
The problem here is the first (upload) part of BatchUploader, not the second (commit) part.
The queries issued server-side are mostly the same:
- lookup country "Germany" to 455
- lookup country "Austria" to 446
- lookup country "Switzerland" to 447
- lookup temperature "cold" to 1
- lookup temperature "warm" to 2
- lookup temperature "hot" to 3
- lookup product "ABC" to 1001
- lookup product "DEF" to 1002
- ...
If you maintained a cache of
"DataSource" - "key value" - "resulting id"
during a run of BatchUploader, you would need to run every distinct country lookup only once, every distinct temperature lookup only once, and so on (see the sketch below).
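To illustrate the idea, here is a minimal sketch of such a per-run cache. This is not actual SmartClient server API; LookupCache, getId() and lookupIdFromDataSource() are made-up names, and the real implementation would presumably issue a DSRequest fetch against the related DataSource where the placeholder method sits.

import java.util.HashMap;
import java.util.Map;

// Sketch only: a per-run cache mapping (DataSource name, display value) -> id.
// LookupCache and lookupIdFromDataSource() are hypothetical names, not SmartClient API.
public class LookupCache {

    // Key: "<dataSourceName>|<displayValue>", value: the resolved id
    private final Map<String, Object> cache = new HashMap<>();

    public Object getId(String dataSourceName, String displayValue) {
        String key = dataSourceName + "|" + displayValue;
        // computeIfAbsent runs the expensive lookup only once per distinct
        // (DataSource, display value) pair during a single upload run.
        return cache.computeIfAbsent(key,
                k -> lookupIdFromDataSource(dataSourceName, displayValue));
    }

    // Placeholder for the real server-side query (e.g. one fetch against
    // the related DataSource that returns the matching record's id).
    private Object lookupIdFromDataSource(String dataSourceName, String displayValue) {
        return null; // ...issue one fetch and return the id...
    }
}

With something like this, 1000 CSV rows with three lookup columns would cause only as many queries as there are distinct values per column, instead of up to 3000.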
I assume this is an easy and valuable enhancement that would significantly speed up BatchUploader, increasing the maximum number of records that can be imported from a CSV.
Best regards
Blama