    Saving 300 records using SaveAllEdits() function

    Hi there, I have created an import function that uses setEditValues() to populate a grid. When saving the grid data, I'm calling saveAllEdits() to perform the save.
    After importing 300 records into the grid, I found that saveAllEdits() takes a long time to complete.
    Am I using the correct function to save a large number of records?

    #2
    Hi,

    The first thing I would check is whether all the edits are being sent in a single queue. That is definitely preferable.
    Then I would check what is actually taking so long, so have a look at the server logs.
    And how much time are we talking about per record?
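    For reference, a manually queued save can be sketched like this (illustrative only; "myGrid" is a hypothetical ListGrid, and saveAllEdits() normally queues its requests on its own):

    ```javascript
    // Illustrative sketch: send all per-record DSRequests in one HTTP turnaround
    isc.RPCManager.startQueue();   // begin queuing DSRequests
    myGrid.saveAllEdits();         // each edited row becomes one queued request
    isc.RPCManager.sendQueue();    // transmit the whole queue in a single HTTP request
    ```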

    Also, more generally: do you know about BatchImporter?
    And which version of SmartClient are you using?

    Best regards
    Blama



      #3
      Hi,
      All the edits are being sent in a queue.
      I checked the SmartClient reference for BatchImporter, but couldn't find it.
      My SmartClient version is 12.1p.



        #4
        Ah sorry, it’s BatchUploader:
        https://smartclient.com/smartclient-....BatchUploader

        https://smartclient.com/smartclient-...id=batchUpload



          #5
          It's unclear why you are initiating a save of 300 records from the browser. Clearly, the user is not hand-entering 300 complete records' worth of data, so this seems like some kind of data schema issue.

          You also haven't said anything about what is taking time. Is it client-side, network, or server-side time?

          As far as how to approach this:

          1) if you don't need to be able to show errors and have the end user correct validation issues record by record, you can just make direct calls to dataSource.addData()/updateData(), and this will skip most of the client-side processing
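          A minimal sketch of this approach (names are hypothetical; assumes the grid is bound to a DataSource with ID "records" and the imported rows are in an array):

          ```javascript
          // Illustrative sketch: save directly through the DataSource,
          // skipping the grid's edit-value machinery, and queue the requests
          // so all 300 travel in one HTTP round trip.
          var ds = isc.DataSource.get("records"); // "records" is a hypothetical DataSource ID
          isc.RPCManager.startQueue();
          for (var i = 0; i < importedRows.length; i++) {
              ds.addData(importedRows[i]);        // one queued "add" DSRequest per record
          }
          isc.RPCManager.sendQueue();             // transmit all adds at once
          ```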

          2) if the end user is doing something like uploading a file with many new records in it, then, as Blama said, the BatchUploader (note: Uploader, not Importer) is designed for exactly this

          3) if the use case is that the same edit is being applied to a large batch of records, you can use an "update" operation with operationBinding.allowMultiUpdate: true; then the only data sent to the server is the criteria identifying the records plus the specific field(s) to change. This approach eliminates essentially all client-side processing and makes it possible to optimize the server side as well (e.g. one SQL statement instead of 300)
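          A sketch of such an operationBinding in a .ds.xml file (DataSource and field names are hypothetical):

          ```xml
          <DataSource ID="records" serverType="sql" tableName="records">
            <fields>
              <field name="id"      type="sequence" primaryKey="true"/>
              <field name="status"  type="text"/>
              <field name="batchId" type="text"/>
            </fields>
            <operationBindings>
              <!-- one "update" request may affect every record matching its criteria -->
              <operationBinding operationType="update" operationId="bulkUpdate"
                                allowMultiUpdate="true"/>
            </operationBindings>
          </DataSource>
          ```

          The client then issues a single "update" DSRequest whose criteria select the batch and whose values carry only the changed field(s).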

          4) if it's not quite a normal update, you might use an operationType:"custom" operation backed by custom server code, since, again, the end user is clearly not hand-entering 300 complete records that are all saved at once with no intermediate saves, so there is presumably some way to reduce the volume of data saved in one shot.
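          If going the custom-operation route, the binding might look like this (class and method names are hypothetical; the server-side logic is up to you):

          ```xml
          <operationBinding operationType="custom" operationId="importBatch">
            <!-- hypothetical DMI: receives the batch and saves it server-side -->
            <serverObject className="com.example.ImportDMI" methodName="importBatch"/>
          </operationBinding>
          ```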

