    SmartGWT with Spark SQL / Hadoop / Parquet?

    Has anyone tried using SmartGWT with Spark SQL (which sits on top of Hadoop/Parquet as the storage backend)? I would love to hear about your experience with this configuration.




    #2
    We haven't tried this here at Isomorphic, but the "generic" SQL Driver is worth a shot.



      #3
      Hi, I actually tried connecting via a Spark Thrift Server (STS) (c.f. here). I could connect to it, but then I got the following error message when my code tried to fetch (the query is defined in a static DataSource).

      Code:
      === 2017-05-13 18:06:16,980 [3-47] WARN RequestContext - dsRequest.execute() failed:
      java.sql.SQLException: Method not supported
        at org.apache.hive.jdbc.HiveBaseResultSet.absolute(HiveBaseResultSet.java:70)   
        at org.apache.commons.dbcp.DelegatingResultSet.absolute(DelegatingResultSet.java:373)
        at com.isomorphic.sql.SQLDataSource.executeWindowedSelect(SQLDataSource.java:2970)
        at com.isomorphic.sql.SQLDataSource.SQLExecute(SQLDataSource.java:2024)
      I gather the driver does not implement the method absolute() because the ResultSet type is TYPE_FORWARD_ONLY (see the JavaDoc here).
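To illustrate the limitation: on a TYPE_FORWARD_ONLY cursor the only legal movement is next(), so the only way to emulate absolute(n) is to advance one row at a time. A minimal sketch of that idea in plain Java, using an Iterator as a stand-in for the forward-only cursor (the helper name rowAt is hypothetical, not SmartGWT or JDBC API):

```java
import java.util.Iterator;
import java.util.List;

public class ForwardOnlySkip {
    // Emulates ResultSet.absolute(position) on a forward-only cursor:
    // advance one row at a time until the requested position is reached.
    static <T> T rowAt(Iterator<T> cursor, int position) {
        T row = null;
        for (int i = 0; i < position && cursor.hasNext(); i++) {
            row = cursor.next();  // the only movement a forward-only cursor allows
        }
        return row;
    }

    public static void main(String[] args) {
        List<String> rows = List.of("r1", "r2", "r3", "r4", "r5");
        System.out.println(rowAt(rows.iterator(), 3)); // prints r3
    }
}
```

This is essentially what a server must do to discard rows before the requested page when the driver cannot jump directly to an offset.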

      Is there a way to work around the problem?
      Last edited by leeyuiwah; 14 May 2017, 10:44.



        #4
        Hi, I found another article (here) which suggested setting sqlPaging to dropAtServer. So here is my static DataSource XML file:

        Code:
            <operationBindings>
                <operationBinding operationType="fetch" progressiveLoading="false"
                                  sqlPaging="dropAtServer" />
            </operationBindings>
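For context, a minimal complete .ds.xml around that binding might look like the sketch below. The DataSource ID PERF2_APP and dbName perf2 come from the log and settings in this thread; the tableName and field definitions are hypothetical placeholders:

```xml
<DataSource ID="PERF2_APP" serverType="sql" dbName="perf2" tableName="some_table">
    <fields>
        <field name="id"   type="integer" primaryKey="true" />
        <field name="name" type="text" />
    </fields>
    <operationBindings>
        <operationBinding operationType="fetch" progressiveLoading="false"
                          sqlPaging="dropAtServer" />
    </operationBindings>
</DataSource>
```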
        This got me past the error message posted in #3 (Method not supported from HiveBaseResultSet.absolute()), but then I got this error:

        Code:
        === 2017-05-14 12:37:33,463 [5-49] WARN  DSTransaction - Exception thrown during onSuccess processing in class com.isomorphic.datasource.DSRequest (DataSource PERF2_APP) - changes might not have been committed
        java.sql.SQLException: Method not supported
                at org.apache.hive.jdbc.HiveConnection.commit(HiveConnection.java:742)
                at org.apache.commons.dbcp.DelegatingConnection.commit(DelegatingConnection.java:334)
                at com.isomorphic.sql.SQLTransaction.commitTransaction(SQLTransaction.java:307)
                at com.isomorphic.sql.SQLDataSource.commit(SQLDataSource.java:4673)
                at com.isomorphic.datasource.DSRequest.commit(DSRequest.java:5078)
                at com.isomorphic.datasource.DSTransaction.onSuccess(DSTransaction.java:450)
                at com.isomorphic.datasource.DSTransaction.complete(DSTransaction.java:350)
                at com.isomorphic.rpc.RPCManager.completeResponse(RPCManager.java:1385)
                at com.isomorphic.rpc.RPCManager.send(RPCManager.java:694)
                at com.isomorphic.servlet.IDACall.processRPCTransaction(IDACall.java:187)
                at com.isomorphic.servlet.IDACall.processRequest(IDACall.java:152)
                at com.isomorphic.servlet.IDACall._processRequest(IDACall.java:119)
                at com.isomorphic.servlet.IDACall.doPost(IDACall.java:79)



          #5
          Oh, I found an answer to my question in #4 above: configure autoJoinTransactions in the server.properties file.

          Code:
          sql.perf2.autoJoinTransactions: false
          Thanks for your help. For anyone else who wants to try this, below are my other settings in server.properties:

          Code:
          # "perf2" is a name I picked, but it can be any name
          # (note: .properties files do not support inline # comments, so the notes are on their own lines)
          sql.defaultDatabase: perf2

          sql.perf2.driver.networkProtocol: tcp
          # important
          sql.perf2.driver: org.apache.hive.jdbc.HiveDriver
          # important
          sql.perf2.database.type: generic
          # important
          sql.perf2.autoJoinTransactions: false
          # important
          sql.perf2.interface.type: driverManager
          # important -- pick your host:port
          sql.perf2.driver.url: jdbc:hive2://host:port
          # important -- pick your username
          sql.perf2.driver.user: someuser
          sql.perf2.interface.credentialsInURL: true
          sql.perf2.driver.databaseName: someDb
          sql.perf2.driver.context:
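The commit failure in #4 happens because HiveConnection.commit() throws "Method not supported"; setting autoJoinTransactions to false keeps SmartGWT from ever calling commit() on that connection. The guard is conceptually just "don't commit on a driver that reports no transaction support". A self-contained sketch of that idea (commitIfSupported and the proxy-based stand-in connection are hypothetical illustrations, not SmartGWT internals):

```java
import java.lang.reflect.Proxy;
import java.sql.Connection;
import java.sql.DatabaseMetaData;
import java.sql.SQLException;

public class CommitGuard {
    // Only commit when the driver claims transaction support; this is the
    // behavior that sql.<db>.autoJoinTransactions: false effectively buys you.
    static boolean commitIfSupported(Connection conn) throws SQLException {
        if (conn.getMetaData().supportsTransactions()) {
            conn.commit();
            return true;
        }
        return false; // driver (e.g. Hive JDBC) reports no transaction support
    }

    // Stand-in for a Hive connection, built with dynamic proxies purely for
    // this sketch: metadata reports no transactions, commit() throws.
    static Connection fakeHiveConnection() {
        DatabaseMetaData meta = (DatabaseMetaData) Proxy.newProxyInstance(
            CommitGuard.class.getClassLoader(),
            new Class<?>[] { DatabaseMetaData.class },
            (p, m, a) -> m.getName().equals("supportsTransactions") ? false : null);
        return (Connection) Proxy.newProxyInstance(
            CommitGuard.class.getClassLoader(),
            new Class<?>[] { Connection.class },
            (p, m, a) -> {
                if (m.getName().equals("getMetaData")) return meta;
                if (m.getName().equals("commit"))
                    throw new SQLException("Method not supported");
                return null;
            });
    }

    public static void main(String[] args) throws SQLException {
        // commit is skipped, so no SQLException is thrown
        System.out.println(commitIfSupported(fakeHiveConnection())); // prints false
    }
}
```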
