This is an automated email from the ASF dual-hosted git repository.

jackylk pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/carbondata.git


The following commit(s) were added to refs/heads/master by this push:
     new a5ebd5f  [CARBONDATA-3737] support prestodb and prestosql
a5ebd5f is described below

commit a5ebd5f35832fc245a85d1fb1938dc0b7a7ef80b
Author: ajantha-bhat <ajanthab...@gmail.com>
AuthorDate: Thu Feb 27 10:58:11 2020 +0530

    [CARBONDATA-3737] support prestodb and prestosql
    
    Why is this PR needed?
    
    Currently, carbondata supports only prestodb. As presto has split into two 
communities, prestodb and prestosql, it is necessary for carbondata to support 
users from both communities.
    Hence, support prestodb-0.217 and prestosql-316 integration in carbon.
    
    What changes were proposed in this PR?
    
    Keep a single presto module with subfolders for common, prestodb, and 
prestosql code (similar to the latest spark integration structure).
    Support two profiles, prestodb and prestosql.
    Make prestosql the default profile, along with the spark-2.3 default profile.
    
    Does this PR introduce any user interface change?
    Yes. (updated the documents)
    
    Is any new testcase added?
    No (current testcases are enough for prestodb and prestosql integration).
    Manually ran UT for both prestodb and prestosql.
    
    This closes #3641
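    
    For reference, the two profile combinations described above can be sketched 
as shell commands (spark-2.4 / hadoop-2.7.2 are only the example versions used 
in the updated guides; the mvn invocations are shown, not executed here):
    
    ```shell
    # Hedged sketch: build carbondata against either presto community.
    # Swap in the spark/hadoop versions actually used in your cluster.
    PRESTODB_BUILD="mvn -DskipTests -Pspark-2.4 -Pprestodb -Dspark.version=2.4.4 -Dhadoop.version=2.7.2 clean package"
    PRESTOSQL_BUILD="mvn -DskipTests -Pspark-2.4 -Pprestosql -Dspark.version=2.4.4 -Dhadoop.version=2.7.2 clean package"
    
    # Print the command for the profile you want, then run it from the
    # carbondata source checkout.
    echo "$PRESTODB_BUILD"
    echo "$PRESTOSQL_BUILD"
    ```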
---
 docs/{presto-guide.md => prestodb-guide.md}        |  10 +-
 docs/{presto-guide.md => prestosql-guide.md}       |  47 ++++---
 docs/quick-start-guide.md                          | 142 +------------------
 integration/presto/pom.xml                         | 156 ++++++++++++++++-----
 .../presto/ColumnarVectorWrapperDirect.java        |  38 ++---
 .../presto/CarbondataColumnConstraint.java         |  16 +--
 .../presto/CarbondataConnectorFactory.java         |   0
 .../apache/carbondata/presto/CarbondataModule.java |   0
 .../carbondata/presto/CarbondataPageSource.java    |   0
 .../presto/CarbondataPageSourceProvider.java       |   0
 .../apache/carbondata/presto/CarbondataPlugin.java |   0
 .../carbondata/presto/CarbondataSplitManager.java  |   2 +-
 .../apache/carbondata/presto/PrestoFilterUtil.java |  32 ++---
 .../carbondata/presto/impl/CarbonTableReader.java  | 104 ++++++--------
 .../presto/readers/BooleanStreamReader.java        |   0
 .../presto/readers/ByteStreamReader.java           |   0
 .../presto/readers/DecimalSliceStreamReader.java   |   0
 .../presto/readers/DoubleStreamReader.java         |   0
 .../presto/readers/FloatStreamReader.java          |   0
 .../presto/readers/IntegerStreamReader.java        |   0
 .../presto/readers/LongStreamReader.java           |   0
 .../presto/readers/ObjectStreamReader.java         |   0
 .../presto/readers/PrestoVectorBlockBuilder.java   |   0
 .../presto/readers/ShortStreamReader.java          |   0
 .../presto/readers/SliceStreamReader.java          |   0
 .../presto/readers/TimestampStreamReader.java      |   0
 .../presto/CarbondataColumnConstraint.java         |  18 +--
 .../presto/CarbondataConnectorFactory.java         | 142 ++++++++++---------
 .../apache/carbondata/presto/CarbondataModule.java | 105 +++++++-------
 .../carbondata/presto/CarbondataPageSource.java    |  53 ++++---
 .../presto/CarbondataPageSourceProvider.java       |  38 ++---
 .../apache/carbondata/presto/CarbondataPlugin.java |   4 +-
 .../carbondata/presto/CarbondataSplitManager.java  |  84 ++++++-----
 .../apache/carbondata/presto/PrestoFilterUtil.java |  50 +++----
 .../carbondata/presto/impl/CarbonTableReader.java  | 114 +++++++--------
 .../presto/readers/BooleanStreamReader.java        |   8 +-
 .../presto/readers/ByteStreamReader.java           |   9 +-
 .../presto/readers/DecimalSliceStreamReader.java   |  16 +--
 .../presto/readers/DoubleStreamReader.java         |   8 +-
 .../presto/readers/FloatStreamReader.java          |   8 +-
 .../presto/readers/IntegerStreamReader.java        |   8 +-
 .../presto/readers/LongStreamReader.java           |   8 +-
 .../presto/readers/ObjectStreamReader.java         |   8 +-
 .../presto/readers/PrestoVectorBlockBuilder.java   |   2 +-
 .../presto/readers/ShortStreamReader.java          |   8 +-
 .../presto/readers/SliceStreamReader.java          |  12 +-
 .../presto/readers/TimestampStreamReader.java      |   8 +-
 .../carbondata/presto/server/PrestoServer.scala    |   0
 .../carbondata/presto/server/PrestoServer.scala    |  16 +--
 integration/spark-common-cluster-test/pom.xml      |   4 +-
 .../apache/spark/sql/common/util/QueryTest.scala   |   4 +-
 pom.xml                                            |  92 +++++++++++-
 52 files changed, 706 insertions(+), 668 deletions(-)

diff --git a/docs/presto-guide.md b/docs/prestodb-guide.md
similarity index 96%
copy from docs/presto-guide.md
copy to docs/prestodb-guide.md
index 0c49f35..9bd9a89 100644
--- a/docs/presto-guide.md
+++ b/docs/prestodb-guide.md
@@ -16,7 +16,7 @@
 -->
 
 
-# Presto guide
+# Prestodb guide
 This tutorial provides a quick introduction to using current 
integration/presto module.
 
 
@@ -234,12 +234,14 @@ Now you can use the Presto CLI on the coordinator to 
query data sources in the c
   ```
   $ git clone https://github.com/apache/carbondata
   $ cd carbondata
-  $ mvn -DskipTests -P{spark-version} -Dspark.version={spark-version-number} 
-Dhadoop.version={hadoop-version-number} clean package
+  $ mvn -DskipTests -P{spark-version} -P{prestodb/prestosql} 
-Dspark.version={spark-version-number} -Dhadoop.version={hadoop-version-number} 
clean package
   ```
   Replace the spark and hadoop version with the version used in your cluster.
-  For example, if you are using Spark 2.4.4, you would like to compile using:
+  For example, with the prestodb profile and Spark 2.4.4,
+  you would compile using:
+  
   ```
-  mvn -DskipTests -Pspark-2.4 -Dspark.version=2.4.4 -Dhadoop.version=2.7.2 
clean package
+  mvn -DskipTests -Pspark-2.4 -Pprestodb -Dspark.version=2.4.4 
-Dhadoop.version=2.7.2 clean package
   ```
 
   Secondly: Create a folder named 'carbondata' under $PRESTO_HOME$/plugin and
diff --git a/docs/presto-guide.md b/docs/prestosql-guide.md
similarity index 84%
rename from docs/presto-guide.md
rename to docs/prestosql-guide.md
index 0c49f35..b42a1b8 100644
--- a/docs/presto-guide.md
+++ b/docs/prestosql-guide.md
@@ -16,7 +16,7 @@
 -->
 
 
-# Presto guide
+# Prestosql guide
 This tutorial provides a quick introduction to using current 
integration/presto module.
 
 
@@ -32,32 +32,32 @@ 
https://github.com/apache/carbondata/blob/master/integration/presto/pom.xml
 and look for ```<presto.version>```
 
 _Example:_ 
-  `<presto.version>0.217</presto.version>`
-This means current version of carbon supports presto 0.217 version.   
+  `<presto.version>316</presto.version>`
+This means the current version of carbon supports presto version 316.
 
 _Note:_
 Currently carbondata supports only one version of presto, cannot handle 
multiple versions at same time. If user wish to use older version of presto, 
then need to use older version of carbon (other old branches, say branch-1.5 
and check the supported presto version in it's pom.xml file in 
integration/presto/)
 
-  1. Download that version of Presto (say 0.217) using below command:
+  1. Download that version of Presto (say 316) using below command:
   ```
-  wget 
https://repo1.maven.org/maven2/com/facebook/presto/presto-server/0.217/presto-server-0.217.tar.gz
+  wget 
https://repo1.maven.org/maven2/io/prestosql/presto-server/316/presto-server-316.tar.gz
   ```
 
-  2. Extract Presto tar file: `tar zxvf presto-server-0.217.tar.gz`.
+  2. Extract Presto tar file: `tar zxvf presto-server-316.tar.gz`.
 
-  3. Download the Presto CLI of the same presto server version (say 0.217) for 
the coordinator and name it presto.
+  3. Download the Presto CLI of the same presto server version (say 316) for 
the coordinator and name it presto.
 
   ```
-    wget 
https://repo1.maven.org/maven2/com/facebook/presto/presto-cli/0.217/presto-cli-0.217-executable.jar
+    wget 
https://repo1.maven.org/maven2/io/prestosql/presto-cli/316/presto-cli-316-executable.jar
 
-    mv presto-cli-0.217-executable.jar presto
+    mv presto-cli-316-executable.jar presto
 
     chmod +x presto
   ```
 
  ### Create Configuration Files
 
-  1. Create `etc` folder in presto-server-0.217 directory.
+  1. Create `etc` folder in presto-server-316 directory.
   2. Create `config.properties`, `jvm.config`, `log.properties`, and 
`node.properties` files.
   3. Install uuid to generate a node.id.
 
@@ -91,7 +91,7 @@ Currently carbondata supports only one version of presto, 
cannot handle multiple
 
 ##### Contents of your log.properties file
   ```
-  com.facebook.presto=INFO
+  io.prestosql=INFO
   ```
 
  The default minimum level is `INFO`. There are four levels: `DEBUG`, `INFO`, 
`WARN` and `ERROR`.
@@ -148,12 +148,12 @@ Then, `query.max-memory=<30GB * number of nodes>`.
 ### Start Presto Server on all nodes
 
 ```
-./presto-server-0.217/bin/launcher start
+./presto-server-316/bin/launcher start
 ```
 To run it as a background process.
 
 ```
-./presto-server-0.217/bin/launcher run
+./presto-server-316/bin/launcher run
 ```
 To run it in foreground.
 
@@ -176,8 +176,8 @@ Now you can use the Presto CLI on the coordinator to query 
data sources in the c
 ## Presto Single Node Setup for Carbondata
 
 ### Config presto server
-* Download presto server (0.217 is suggested and supported) : 
https://repo1.maven.org/maven2/com/facebook/presto/presto-server/
-* Finish presto configuration following 
https://prestodb.io/docs/current/installation/deployment.html.
+* Download presto server (316 is suggested and supported) : 
https://repo1.maven.org/maven2/io/prestosql/presto-server/
+* Finish presto configuration following 
https://prestosql.io/docs/current/installation/deployment.html.
   A configuration example:
   
  **config.properties**
@@ -218,8 +218,8 @@ Now you can use the Presto CLI on the coordinator to query 
data sources in the c
   
   **log.properties**
   ```
-  com.facebook.presto=DEBUG
-  com.facebook.presto.server.PluginManager=DEBUG
+  io.prestosql=DEBUG
+  io.prestosql.server.PluginManager=DEBUG
   ```
   
   **node.properties**
@@ -234,12 +234,13 @@ Now you can use the Presto CLI on the coordinator to 
query data sources in the c
   ```
   $ git clone https://github.com/apache/carbondata
   $ cd carbondata
-  $ mvn -DskipTests -P{spark-version} -Dspark.version={spark-version-number} 
-Dhadoop.version={hadoop-version-number} clean package
+  $ mvn -DskipTests -P{spark-version} -P{prestodb/prestosql} 
-Dspark.version={spark-version-number} -Dhadoop.version={hadoop-version-number} 
clean package
   ```
   Replace the spark and hadoop version with the version used in your cluster.
-  For example, if you are using Spark 2.4.4, you would like to compile using:
+  For example, with the prestosql profile and Spark 2.4.4,
+  you would compile using:
   ```
-  mvn -DskipTests -Pspark-2.4 -Dspark.version=2.4.4 -Dhadoop.version=2.7.2 
clean package
+  mvn -DskipTests -Pspark-2.4 -Pprestosql -Dspark.version=2.4.4 
-Dhadoop.version=2.7.2 clean package
   ```
 
   Secondly: Create a folder named 'carbondata' under $PRESTO_HOME$/plugin and
@@ -254,7 +255,7 @@ Now you can use the Presto CLI on the coordinator to query 
data sources in the c
   hive.metastore.uri=thrift://<host>:<port>
   ```
   Carbondata becomes one of the supported format of presto hive plugin, so the 
configurations and setup is similar to hive connector of presto.
-  Please refer <a>https://prestodb.io/docs/current/connector/hive.html</a> for 
more details.
+  Please refer to <a>https://prestosql.io/docs/current/connector/hive.html</a> 
for more details.
   
   **Note**: Since carbon can work only with hive metastore, it is necessary 
that spark also connects to same metastore db for creating tables and updating 
tables.
   All the operations done on spark will be reflected in presto immediately. 
@@ -273,7 +274,7 @@ Now you can use the Presto CLI on the coordinator to query 
data sources in the c
     hive.s3.endpoint={value}
    ```
    
-   Please refer <a>https://prestodb.io/docs/current/connector/hive.html</a> 
for more details on S3 integration.
+   Please refer to <a>https://prestosql.io/docs/current/connector/hive.html</a> 
for more details on S3 integration.
     
 ### Generate CarbonData file
 
@@ -282,7 +283,7 @@ Load data statement in Spark can be used to create 
carbondata tables. And then y
 carbondata files.
 
 ### Query carbondata in CLI of presto
-* Download presto cli client of version 0.217 : 
https://repo1.maven.org/maven2/com/facebook/presto/presto-cli
+* Download presto cli client of version 316 : 
https://repo1.maven.org/maven2/io/prestosql/presto-cli/
 
 * Start CLI:
   
diff --git a/docs/quick-start-guide.md b/docs/quick-start-guide.md
index f9f467c..4635cdb 100644
--- a/docs/quick-start-guide.md
+++ b/docs/quick-start-guide.md
@@ -405,146 +405,14 @@ Example
 either with 
[Spark](#installing-and-configuring-carbondata-to-run-locally-with-spark-shell) 
or [SDK](./sdk-guide.md) or [C++ SDK](./csdk-guide.md).
 Once the table is created,it can be queried from Presto.**
 
+Please refer to the presto guides linked below.
 
-### Installing Presto
+prestodb guide  - [prestodb](./prestodb-guide.md)
 
-1. Download the 0.210 version of Presto using:
-`wget 
https://repo1.maven.org/maven2/com/facebook/presto/presto-server/0.210/presto-server-0.210.tar.gz`
+prestosql guide - [prestosql](./prestosql-guide.md)
 
-2. Extract Presto tar file: `tar zxvf presto-server-0.210.tar.gz`.
-
-3. Download the Presto CLI for the coordinator and name it presto.
-
-```
-wget 
https://repo1.maven.org/maven2/com/facebook/presto/presto-cli/0.210/presto-cli-0.210-executable.jar
-
-mv presto-cli-0.210-executable.jar presto
-
-chmod +x presto
-```
-
-### Create Configuration Files
-
-1. Create `etc` folder in presto-server-0.210 directory.
-2. Create `config.properties`, `jvm.config`, `log.properties`, and 
`node.properties` files.
-3. Install uuid to generate a node.id.
-
-  ```
-  sudo apt-get install uuid
-
-  uuid
-  ```
-
-
-##### Contents of your node.properties file
-
-```
-node.environment=production
-node.id=<generated uuid>
-node.data-dir=/home/ubuntu/data
-```
-
-##### Contents of your jvm.config file
-
-```
--server
--Xmx16G
--XX:+UseG1GC
--XX:G1HeapRegionSize=32M
--XX:+UseGCOverheadLimit
--XX:+ExplicitGCInvokesConcurrent
--XX:+HeapDumpOnOutOfMemoryError
--XX:OnOutOfMemoryError=kill -9 %p
-```
-
-##### Contents of your log.properties file
-
-```
-com.facebook.presto=INFO
-```
-
- The default minimum level is `INFO`. There are four levels: `DEBUG`, `INFO`, 
`WARN` and `ERROR`.
-
-### Coordinator Configurations
-
-##### Contents of your config.properties
-
-```
-coordinator=true
-node-scheduler.include-coordinator=false
-http-server.http.port=8086
-query.max-memory=5GB
-query.max-total-memory-per-node=5GB
-query.max-memory-per-node=3GB
-memory.heap-headroom-per-node=1GB
-discovery-server.enabled=true
-discovery.uri=http://localhost:8086
-task.max-worker-threads=4
-optimizer.dictionary-aggregation=true
-optimizer.optimize-hash-generation = false
-```
-The options `node-scheduler.include-coordinator=false` and `coordinator=true` 
indicate that the node is the coordinator and tells the coordinator not to do 
any of the computation work itself and to use the workers.
-
-**Note**: It is recommended to set `query.max-memory-per-node` to half of the 
JVM config max memory, though the workload is highly concurrent, lower value 
for `query.max-memory-per-node` is to be used.
-
-Also relation between below two configuration-properties should be like:
-If, `query.max-memory-per-node=30GB`
-Then, `query.max-memory=<30GB * number of nodes>`.
-
-### Worker Configurations
-
-##### Contents of your config.properties
-
-```
-coordinator=false
-http-server.http.port=8086
-query.max-memory=5GB
-query.max-memory-per-node=2GB
-discovery.uri=<coordinator_ip>:8086
-```
-
-**Note**: `jvm.config` and `node.properties` files are same for all the nodes 
(worker + coordinator). All the nodes should have different 
`node.id`.(generated by uuid command).
-
-### Catalog Configurations
-
-1. Create a folder named `catalog` in etc directory of presto on all the nodes 
of the cluster including the coordinator.
-
-##### Configuring Carbondata in Presto
-1. Create a file named `carbondata.properties` in the `catalog` folder and set 
the required properties on all the nodes.
-
-### Add Plugins
-
-1. Create a directory named `carbondata` in plugin directory of presto.
-2. Copy `carbondata` jars to `plugin/carbondata` directory on all nodes.
-
-### Start Presto Server on all nodes
-
-```
-./presto-server-0.210/bin/launcher start
-```
-To run it as a background process.
-
-```
-./presto-server-0.210/bin/launcher run
-```
-To run it in foreground.
-
-### Start Presto CLI
-
-```
-./presto
-```
-To connect to carbondata catalog use the following command:
-
-```
-./presto --server <coordinator_ip>:8086 --catalog carbondata --schema 
<schema_name>
-```
-Execute the following command to ensure the workers are connected.
-
-```
-select * from system.runtime.nodes;
-```
-Now you can use the Presto CLI on the coordinator to query data sources in the 
catalog using the Presto workers.
+Once presto with carbondata is installed as per the above guides,
+you can use the Presto CLI on the coordinator to query data sources in the 
catalog using the Presto workers.
 
 List the schemas(databases) available
 
diff --git a/integration/presto/pom.xml b/integration/presto/pom.xml
index 110e4de..0a02994 100644
--- a/integration/presto/pom.xml
+++ b/integration/presto/pom.xml
@@ -31,7 +31,6 @@
   <packaging>presto-plugin</packaging>
 
   <properties>
-    <presto.version>0.217</presto.version>
     <httpcore.version>4.4.9</httpcore.version>
     <dev.path>${basedir}/../../dev</dev.path>
     <jacoco.append>true</jacoco.append>
@@ -228,7 +227,7 @@
       </exclusions>
     </dependency>
     <dependency>
-      <groupId>com.facebook.presto</groupId>
+      <groupId>${presto.groupid}</groupId>
       <artifactId>presto-tests</artifactId>
       <scope>test</scope>
       <version>${presto.version}</version>
@@ -354,7 +353,7 @@
 
     <!--presto integrated-->
     <dependency>
-      <groupId>com.facebook.presto</groupId>
+      <groupId>${presto.groupid}</groupId>
       <artifactId>presto-spi</artifactId>
       <version>${presto.version}</version>
       <scope>provided</scope>
@@ -370,14 +369,9 @@
       </exclusions>
     </dependency>
     <dependency>
-      <groupId>commons-lang</groupId>
-      <artifactId>commons-lang</artifactId>
-      <version>2.5</version>
-    </dependency>
-    <dependency>
-      <groupId>com.facebook.presto.hadoop</groupId>
-      <artifactId>hadoop-apache2</artifactId>
-      <version>2.7.4-3</version>
+      <groupId>${presto.hadoop.groupid}</groupId>
+      <artifactId>${presto.hadoop.artifactid}</artifactId>
+      <version>${presto.hadoop.version}</version>
       <exclusions>
         <exclusion>
           <groupId>org.antlr</groupId>
@@ -389,9 +383,8 @@
         </exclusion>
       </exclusions>
     </dependency>
-
     <dependency>
-      <groupId>com.facebook.presto</groupId>
+      <groupId>${presto.groupid}</groupId>
       <artifactId>presto-jdbc</artifactId>
       <version>${presto.version}</version>
       <exclusions>
@@ -406,7 +399,7 @@
       </exclusions>
     </dependency>
     <dependency>
-      <groupId>com.facebook.presto</groupId>
+      <groupId>${presto.groupid}</groupId>
       <artifactId>presto-hive</artifactId>
       <version>${presto.version}</version>
       <exclusions>
@@ -432,7 +425,7 @@
     <dependency>
       <groupId>io.airlift</groupId>
       <artifactId>slice</artifactId>
-      <version>0.31</version>
+      <version>${airlift.version}</version>
       <scope>provided</scope>
       <exclusions>
         <exclusion>
@@ -460,6 +453,10 @@
       <scope>test</scope>
       <exclusions>
         <exclusion>
+          <groupId>io.prestosql.hadoop</groupId>
+          <artifactId>hadoop-apache</artifactId>
+        </exclusion>
+        <exclusion>
           <groupId>org.apache.hadoop</groupId>
           <artifactId>hadoop-client</artifactId>
         </exclusion>
@@ -475,7 +472,7 @@
       <groupId>com.google.code.findbugs</groupId>
       <artifactId>jsr305</artifactId>
       <version>3.0.2</version>
-      <scope>provided</scope>
+      <scope>${presto.depndency.scope}</scope>
     </dependency>
     <dependency>
       <groupId>org.glassfish.hk2</groupId>
@@ -589,7 +586,12 @@
           <failIfNoTests>false</failIfNoTests>
         </configuration>
       </plugin>
-
+      <plugin>
+        <groupId>${presto.mvn.plugin.groupid}</groupId>
+        <artifactId>presto-maven-plugin</artifactId>
+        <version>${presto.mvn.plugin.version}</version>
+        <extensions>true</extensions>
+      </plugin>
       <plugin>
         <groupId>org.apache.maven.plugins</groupId>
         <artifactId>maven-checkstyle-plugin</artifactId>
@@ -632,26 +634,6 @@
       </plugin>
 
       <plugin>
-        <groupId>com.ning.maven.plugins</groupId>
-        <artifactId>maven-duplicate-finder-plugin</artifactId>
-        <configuration>
-          <skip>true</skip>
-        </configuration>
-      </plugin>
-      <plugin>
-        <groupId>io.takari.maven.plugins</groupId>
-        <artifactId>presto-maven-plugin</artifactId>
-        <version>0.1.12</version>
-        <extensions>true</extensions>
-      </plugin>
-      <plugin>
-        <groupId>pl.project13.maven</groupId>
-        <artifactId>git-commit-id-plugin</artifactId>
-        <configuration>
-          <skip>true</skip>
-        </configuration>
-      </plugin>
-      <plugin>
         <groupId>org.scalatest</groupId>
         <artifactId>scalatest-maven-plugin</artifactId>
         <version>1.0</version>
@@ -688,5 +670,105 @@
         <maven.test.skip>true</maven.test.skip>
       </properties>
     </profile>
+    <profile>
+      <id>prestodb</id>
+      <activation>
+        <activeByDefault>true</activeByDefault>
+      </activation>
+      <build>
+        <plugins>
+          <plugin>
+            <groupId>org.apache.maven.plugins</groupId>
+            <artifactId>maven-compiler-plugin</artifactId>
+            <configuration>
+              <excludes>
+                <exclude>src/main/prestosql</exclude>
+              </excludes>
+            </configuration>
+          </plugin>
+          <plugin>
+            <groupId>org.codehaus.mojo</groupId>
+            <artifactId>build-helper-maven-plugin</artifactId>
+            <version>3.0.0</version>
+            <executions>
+              <execution>
+                <id>add-test-source</id>
+                <phase>generate-sources</phase>
+                <goals>
+                  <goal>add-test-source</goal>
+                </goals>
+                <configuration>
+                  <sources>
+                    <source>src/test/prestodb</source>
+                  </sources>
+                </configuration>
+              </execution>
+              <execution>
+                <id>add-source</id>
+                <phase>generate-sources</phase>
+                <goals>
+                  <goal>add-source</goal>
+                </goals>
+                <configuration>
+                  <sources>
+                    <source>src/main/prestodb</source>
+                  </sources>
+                </configuration>
+              </execution>
+            </executions>
+          </plugin>
+        </plugins>
+      </build>
+    </profile>
+    <profile>
+      <id>prestosql</id>
+      <activation>
+        <activeByDefault>true</activeByDefault>
+      </activation>
+      <build>
+        <plugins>
+          <plugin>
+            <groupId>org.apache.maven.plugins</groupId>
+            <artifactId>maven-compiler-plugin</artifactId>
+            <configuration>
+              <excludes>
+                <exclude>src/main/prestodb</exclude>
+              </excludes>
+            </configuration>
+          </plugin>
+          <plugin>
+            <groupId>org.codehaus.mojo</groupId>
+            <artifactId>build-helper-maven-plugin</artifactId>
+            <version>3.0.0</version>
+            <executions>
+              <execution>
+                <id>add-test-source</id>
+                <phase>generate-sources</phase>
+                <goals>
+                  <goal>add-test-source</goal>
+                </goals>
+                <configuration>
+                  <sources>
+                    <source>src/test/prestosql</source>
+                  </sources>
+                </configuration>
+              </execution>
+              <execution>
+                <id>add-source</id>
+                <phase>generate-sources</phase>
+                <goals>
+                  <goal>add-source</goal>
+                </goals>
+                <configuration>
+                  <sources>
+                    <source>src/main/prestosql</source>
+                  </sources>
+                </configuration>
+              </execution>
+            </executions>
+          </plugin>
+        </plugins>
+      </build>
+    </profile>
   </profiles>
 </project>
\ No newline at end of file
diff --git 
a/integration/presto/src/main/java/org/apache/carbondata/presto/ColumnarVectorWrapperDirect.java
 
b/integration/presto/src/main/java/org/apache/carbondata/presto/ColumnarVectorWrapperDirect.java
index ce6965d..5cbcdcb 100644
--- 
a/integration/presto/src/main/java/org/apache/carbondata/presto/ColumnarVectorWrapperDirect.java
+++ 
b/integration/presto/src/main/java/org/apache/carbondata/presto/ColumnarVectorWrapperDirect.java
@@ -43,22 +43,22 @@ class ColumnarVectorWrapperDirect implements 
CarbonColumnVector, SequentialFill
 
   private CarbonColumnVector dictionaryVector;
 
-  private BitSet nullBitset;
+  private BitSet nullBitSet;
 
   ColumnarVectorWrapperDirect(CarbonColumnVectorImpl columnVector) {
     this.columnVector = columnVector;
     this.dictionaryVector = columnVector.getDictionaryVector();
-    this.nullBitset = new BitSet();
+    this.nullBitSet = new BitSet();
   }
 
   @Override
   public void setNullBits(BitSet nullBits) {
-    this.nullBitset = nullBits;
+    this.nullBitSet = nullBits;
   }
 
   @Override
   public void putBoolean(int rowId, boolean value) {
-    if (nullBitset.get(rowId)) {
+    if (nullBitSet.get(rowId)) {
       columnVector.putNull(rowId);
     } else {
       columnVector.putBoolean(rowId, value);
@@ -67,7 +67,7 @@ class ColumnarVectorWrapperDirect implements 
CarbonColumnVector, SequentialFill
 
   @Override
   public void putFloat(int rowId, float value) {
-    if (nullBitset.get(rowId)) {
+    if (nullBitSet.get(rowId)) {
       columnVector.putNull(rowId);
     } else {
       columnVector.putFloat(rowId, value);
@@ -76,7 +76,7 @@ class ColumnarVectorWrapperDirect implements 
CarbonColumnVector, SequentialFill
 
   @Override
   public void putShort(int rowId, short value) {
-    if (nullBitset.get(rowId)) {
+    if (nullBitSet.get(rowId)) {
       columnVector.putNull(rowId);
     } else {
       columnVector.putShort(rowId, value);
@@ -91,7 +91,7 @@ class ColumnarVectorWrapperDirect implements 
CarbonColumnVector, SequentialFill
 
   @Override
   public void putInt(int rowId, int value) {
-    if (nullBitset.get(rowId)) {
+    if (nullBitSet.get(rowId)) {
       columnVector.putNull(rowId);
     } else {
       columnVector.putInt(rowId, value);
@@ -105,7 +105,7 @@ class ColumnarVectorWrapperDirect implements 
CarbonColumnVector, SequentialFill
 
   @Override
   public void putLong(int rowId, long value) {
-    if (nullBitset.get(rowId)) {
+    if (nullBitSet.get(rowId)) {
       columnVector.putNull(rowId);
     } else {
       columnVector.putLong(rowId, value);
@@ -119,7 +119,7 @@ class ColumnarVectorWrapperDirect implements 
CarbonColumnVector, SequentialFill
 
   @Override
   public void putDecimal(int rowId, BigDecimal value, int precision) {
-    if (nullBitset.get(rowId)) {
+    if (nullBitSet.get(rowId)) {
       columnVector.putNull(rowId);
     } else {
       columnVector.putDecimal(rowId, value, precision);
@@ -129,7 +129,7 @@ class ColumnarVectorWrapperDirect implements 
CarbonColumnVector, SequentialFill
   @Override
   public void putDecimals(int rowId, int count, BigDecimal value, int 
precision) {
     for (int i = 0; i < count; i++) {
-      if (nullBitset.get(rowId)) {
+      if (nullBitSet.get(rowId)) {
         columnVector.putNull(rowId);
       } else {
         columnVector.putDecimal(rowId, value, precision);
@@ -140,7 +140,7 @@ class ColumnarVectorWrapperDirect implements 
CarbonColumnVector, SequentialFill
 
   @Override
   public void putDouble(int rowId, double value) {
-    if (nullBitset.get(rowId)) {
+    if (nullBitSet.get(rowId)) {
       columnVector.putNull(rowId);
     } else {
       columnVector.putDouble(rowId, value);
@@ -154,7 +154,7 @@ class ColumnarVectorWrapperDirect implements 
CarbonColumnVector, SequentialFill
 
   @Override
   public void putByteArray(int rowId, byte[] value) {
-    if (nullBitset.get(rowId)) {
+    if (nullBitSet.get(rowId)) {
       columnVector.putNull(rowId);
     } else {
       columnVector.putByteArray(rowId, value);
@@ -170,7 +170,7 @@ class ColumnarVectorWrapperDirect implements 
CarbonColumnVector, SequentialFill
 
   @Override
   public void putByteArray(int rowId, int offset, int length, byte[] value) {
-    if (nullBitset.get(rowId)) {
+    if (nullBitSet.get(rowId)) {
       columnVector.putNull(rowId);
     } else {
       columnVector.putByteArray(rowId, offset, length, value);
@@ -263,7 +263,7 @@ class ColumnarVectorWrapperDirect implements 
CarbonColumnVector, SequentialFill
   @Override
   public void putFloats(int rowId, int count, float[] src, int srcIndex) {
     for (int i = 0; i < count; i++) {
-      if (nullBitset.get(rowId)) {
+      if (nullBitSet.get(rowId)) {
         columnVector.putNull(rowId);
       } else {
         columnVector.putFloat(rowId, src[i]);
@@ -275,7 +275,7 @@ class ColumnarVectorWrapperDirect implements 
CarbonColumnVector, SequentialFill
   @Override
   public void putShorts(int rowId, int count, short[] src, int srcIndex) {
     for (int i = 0; i < count; i++) {
-      if (nullBitset.get(rowId)) {
+      if (nullBitSet.get(rowId)) {
         columnVector.putNull(rowId);
       } else {
         columnVector.putShort(rowId, src[i]);
@@ -287,7 +287,7 @@ class ColumnarVectorWrapperDirect implements 
CarbonColumnVector, SequentialFill
   @Override
   public void putInts(int rowId, int count, int[] src, int srcIndex) {
     for (int i = 0; i < count; i++) {
-      if (nullBitset.get(rowId)) {
+      if (nullBitSet.get(rowId)) {
         columnVector.putNull(rowId);
       } else {
         columnVector.putInt(rowId, src[i]);
@@ -299,7 +299,7 @@ class ColumnarVectorWrapperDirect implements 
CarbonColumnVector, SequentialFill
   @Override
   public void putLongs(int rowId, int count, long[] src, int srcIndex) {
     for (int i = 0; i < count; i++) {
-      if (nullBitset.get(rowId)) {
+      if (nullBitSet.get(rowId)) {
         columnVector.putNull(rowId);
       } else {
         columnVector.putLong(rowId, src[i]);
@@ -311,7 +311,7 @@ class ColumnarVectorWrapperDirect implements 
CarbonColumnVector, SequentialFill
   @Override
   public void putDoubles(int rowId, int count, double[] src, int srcIndex) {
     for (int i = 0; i < count; i++) {
-      if (nullBitset.get(rowId)) {
+      if (nullBitSet.get(rowId)) {
         columnVector.putNull(rowId);
       } else {
         columnVector.putDouble(rowId, src[i]);
@@ -323,7 +323,7 @@ class ColumnarVectorWrapperDirect implements 
CarbonColumnVector, SequentialFill
   @Override
   public void putBytes(int rowId, int count, byte[] src, int srcIndex) {
     for (int i = 0; i < count; i++) {
-      if (nullBitset.get(rowId)) {
+      if (nullBitSet.get(rowId)) {
         columnVector.putNull(rowId);
       } else {
         columnVector.putByte(rowId, src[i]);
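The hunks above apply the same rename (`nullBitset` to `nullBitSet`) across one repeated pattern: every batched `put*` method checks the null bitset per row before writing into the column vector. A minimal standalone sketch of that pattern (class name `NullAwareFill` and the `Float[]` stand-in for `CarbonColumnVector` are illustrative; the per-row `rowId` increment is assumed from lines elided by the diff):

```java
import java.util.Arrays;
import java.util.BitSet;

public class NullAwareFill {
  // Copies `count` floats from src into a boxed destination, honoring the
  // null bitset: a set bit marks the row null instead of copying the value.
  static Float[] putFloats(BitSet nullBitSet, float[] src, int rowId, int count) {
    Float[] dest = new Float[rowId + count];
    for (int i = 0; i < count; i++) {
      if (nullBitSet.get(rowId)) {
        dest[rowId] = null;        // columnVector.putNull(rowId)
      } else {
        dest[rowId] = src[i];      // columnVector.putFloat(rowId, src[i])
      }
      rowId++;                     // assumed, not shown in the hunk context
    }
    return dest;
  }

  public static void main(String[] args) {
    BitSet nulls = new BitSet();
    nulls.set(1);
    Float[] out = putFloats(nulls, new float[] {1f, 2f, 3f}, 0, 3);
    System.out.println(Arrays.toString(out)); // [1.0, null, 3.0]
  }
}
```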
diff --git a/integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataColumnConstraint.java b/integration/presto/src/main/prestodb/org/apache/carbondata/presto/CarbondataColumnConstraint.java
similarity index 86%
copy from integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataColumnConstraint.java
copy to integration/presto/src/main/prestodb/org/apache/carbondata/presto/CarbondataColumnConstraint.java
index d6ccf1c..30d5cf6 100755
--- a/integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataColumnConstraint.java
+++ b/integration/presto/src/main/prestodb/org/apache/carbondata/presto/CarbondataColumnConstraint.java
@@ -34,20 +34,20 @@ import static com.google.common.base.MoreObjects.toStringHelper;
  */
 public class CarbondataColumnConstraint {
   private final String name;
-  private final boolean invertedindexed;
+  private final boolean invertedIndexed;
   private Optional<Domain> domain;
 
   @JsonCreator public CarbondataColumnConstraint(@JsonProperty("name") String name,
       @JsonProperty("domain") Optional<Domain> domain,
-      @JsonProperty("invertedindexed") boolean invertedindexed) {
+      @JsonProperty("invertedIndexed") boolean invertedIndexed) {
     this.name = requireNonNull(name, "name is null");
-    this.invertedindexed = requireNonNull(invertedindexed, "invertedIndexed is null");
+    this.invertedIndexed = requireNonNull(invertedIndexed, "invertedIndexed is null");
     this.domain = requireNonNull(domain, "domain is null");
   }
 
   @JsonProperty
-  public boolean isInvertedindexed() {
-    return invertedindexed;
+  public boolean isInvertedIndexed() {
+    return invertedIndexed;
   }
 
   @JsonProperty
@@ -67,7 +67,7 @@ public class CarbondataColumnConstraint {
 
   @Override
   public int hashCode() {
-    return Objects.hash(name, domain, invertedindexed);
+    return Objects.hash(name, domain, invertedIndexed);
   }
 
   @Override
@@ -82,12 +82,12 @@ public class CarbondataColumnConstraint {
 
     CarbondataColumnConstraint other = (CarbondataColumnConstraint) obj;
     return Objects.equals(this.name, other.name) && Objects.equals(this.domain, other.domain)
-        && Objects.equals(this.invertedindexed, other.invertedindexed);
+        && Objects.equals(this.invertedIndexed, other.invertedIndexed);
   }
 
   @Override
   public String toString() {
-    return toStringHelper(this).add("name", this.name).add("invertedindexed", this.invertedindexed)
+    return toStringHelper(this).add("name", this.name).add("invertedindexed", this.invertedIndexed)
         .add("domain", this.domain).toString();
   }
 }
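The rename above touches the field, the `@JsonProperty` name, `equals()`, `hashCode()`, and `toString()` together. That coupling matters: `equals()` and `hashCode()` must switch to the camelCase field in lockstep, or two equal constraints could hash differently. A minimal sketch of the contract being preserved (`ConstraintSketch` is an illustrative stand-in, not the connector class, and Jackson wiring is omitted):

```java
import java.util.Objects;
import java.util.Optional;

public class ConstraintSketch {
  private final String name;
  private final boolean invertedIndexed;
  private final Optional<String> domain;

  ConstraintSketch(String name, Optional<String> domain, boolean invertedIndexed) {
    this.name = Objects.requireNonNull(name, "name is null");
    this.domain = Objects.requireNonNull(domain, "domain is null");
    this.invertedIndexed = invertedIndexed;
  }

  // hashCode and equals must use the same set of fields, in the same spelling,
  // so that a.equals(b) implies a.hashCode() == b.hashCode().
  @Override public int hashCode() {
    return Objects.hash(name, domain, invertedIndexed);
  }

  @Override public boolean equals(Object obj) {
    if (this == obj) {
      return true;
    }
    if (!(obj instanceof ConstraintSketch)) {
      return false;
    }
    ConstraintSketch other = (ConstraintSketch) obj;
    return Objects.equals(name, other.name)
        && Objects.equals(domain, other.domain)
        && invertedIndexed == other.invertedIndexed;
  }
}
```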
diff --git a/integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataConnectorFactory.java b/integration/presto/src/main/prestodb/org/apache/carbondata/presto/CarbondataConnectorFactory.java
similarity index 100%
copy from integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataConnectorFactory.java
copy to integration/presto/src/main/prestodb/org/apache/carbondata/presto/CarbondataConnectorFactory.java
diff --git a/integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataModule.java b/integration/presto/src/main/prestodb/org/apache/carbondata/presto/CarbondataModule.java
similarity index 100%
copy from integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataModule.java
copy to integration/presto/src/main/prestodb/org/apache/carbondata/presto/CarbondataModule.java
diff --git a/integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataPageSource.java b/integration/presto/src/main/prestodb/org/apache/carbondata/presto/CarbondataPageSource.java
similarity index 100%
copy from integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataPageSource.java
copy to integration/presto/src/main/prestodb/org/apache/carbondata/presto/CarbondataPageSource.java
diff --git a/integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataPageSourceProvider.java b/integration/presto/src/main/prestodb/org/apache/carbondata/presto/CarbondataPageSourceProvider.java
similarity index 100%
copy from integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataPageSourceProvider.java
copy to integration/presto/src/main/prestodb/org/apache/carbondata/presto/CarbondataPageSourceProvider.java
diff --git a/integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataPlugin.java b/integration/presto/src/main/prestodb/org/apache/carbondata/presto/CarbondataPlugin.java
similarity index 100%
copy from integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataPlugin.java
copy to integration/presto/src/main/prestodb/org/apache/carbondata/presto/CarbondataPlugin.java
diff --git a/integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataSplitManager.java b/integration/presto/src/main/prestodb/org/apache/carbondata/presto/CarbondataSplitManager.java
similarity index 98%
copy from integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataSplitManager.java
copy to integration/presto/src/main/prestodb/org/apache/carbondata/presto/CarbondataSplitManager.java
index 0147b75..d6caa4f 100755
--- a/integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataSplitManager.java
+++ b/integration/presto/src/main/prestodb/org/apache/carbondata/presto/CarbondataSplitManager.java
@@ -133,7 +133,7 @@ public class CarbondataSplitManager extends HiveSplitManager {
     try {
 
       List<CarbonLocalMultiBlockSplit> splits =
-          carbonTableReader.getInputSplits2(cache, filters, predicate, configuration);
+          carbonTableReader.getInputSplits(cache, filters, predicate, configuration);
 
       ImmutableList.Builder<ConnectorSplit> cSplits = ImmutableList.builder();
       long index = 0;
diff --git a/integration/presto/src/main/java/org/apache/carbondata/presto/PrestoFilterUtil.java b/integration/presto/src/main/prestodb/org/apache/carbondata/presto/PrestoFilterUtil.java
similarity index 95%
copy from integration/presto/src/main/java/org/apache/carbondata/presto/PrestoFilterUtil.java
copy to integration/presto/src/main/prestodb/org/apache/carbondata/presto/PrestoFilterUtil.java
index 70955bb..0b9a7c8 100755
--- a/integration/presto/src/main/java/org/apache/carbondata/presto/PrestoFilterUtil.java
+++ b/integration/presto/src/main/prestodb/org/apache/carbondata/presto/PrestoFilterUtil.java
@@ -279,42 +279,42 @@ public class PrestoFilterUtil {
     return finalFilters;
   }
 
-  private static Object convertDataByType(Object rawdata, HiveType type) {
+  private static Object convertDataByType(Object rawData, HiveType type) {
     if (type.equals(HiveType.HIVE_INT) || type.equals(HiveType.HIVE_SHORT)) {
-      return Integer.valueOf(rawdata.toString());
+      return Integer.valueOf(rawData.toString());
     } else if (type.equals(HiveType.HIVE_LONG)) {
-      return rawdata;
+      return rawData;
     } else if (type.equals(HiveType.HIVE_STRING)) {
-      if (rawdata instanceof Slice) {
-        return ((Slice) rawdata).toStringUtf8();
+      if (rawData instanceof Slice) {
+        return ((Slice) rawData).toStringUtf8();
       } else {
-        return rawdata;
+        return rawData;
       }
     } else if (type.equals(HiveType.HIVE_BOOLEAN)) {
-      return rawdata;
+      return rawData;
     } else if (type.equals(HiveType.HIVE_DATE)) {
       Calendar c = Calendar.getInstance();
       c.setTime(new Date(0));
-      c.add(Calendar.DAY_OF_YEAR, ((Long) rawdata).intValue());
+      c.add(Calendar.DAY_OF_YEAR, ((Long) rawData).intValue());
       Date date = c.getTime();
       return date.getTime() * 1000;
     }
     else if (type.getTypeInfo() instanceof DecimalTypeInfo) {
-      if (rawdata instanceof Double) {
-        return new BigDecimal((Double) rawdata);
-      } else if (rawdata instanceof Long) {
-        return new BigDecimal(new BigInteger(String.valueOf(rawdata)),
+      if (rawData instanceof Double) {
+        return new BigDecimal((Double) rawData);
+      } else if (rawData instanceof Long) {
+        return new BigDecimal(new BigInteger(String.valueOf(rawData)),
             ((DecimalTypeInfo) type.getTypeInfo()).getScale());
-      } else if (rawdata instanceof Slice) {
-        return new BigDecimal(Decimals.decodeUnscaledValue((Slice) rawdata),
+      } else if (rawData instanceof Slice) {
+        return new BigDecimal(Decimals.decodeUnscaledValue((Slice) rawData),
             ((DecimalTypeInfo) type.getTypeInfo()).getScale());
       }
     }
     else if (type.equals(HiveType.HIVE_TIMESTAMP)) {
-      return (Long) rawdata * 1000;
+      return (Long) rawData * 1000;
     }
 
-    return rawdata;
+    return rawData;
   }
 
   /**
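The `HIVE_DATE` branch of `convertDataByType` above is worth a note: Presto hands dates over as a day offset from the Unix epoch, and the code expands that offset via `Calendar` and multiplies by 1000 (milliseconds to microseconds). A standalone sketch of that conversion (the real code uses a default-timezone `Calendar`; UTC is pinned here for determinism, which is an assumption, not the commit's behavior):

```java
import java.util.Calendar;
import java.util.Date;
import java.util.TimeZone;

public class DateConversionSketch {
  // Converts a days-since-epoch offset to microseconds since the epoch,
  // mirroring the HIVE_DATE branch: epoch + N days, then millis * 1000.
  static long daysToMicros(long epochDays) {
    Calendar c = Calendar.getInstance(TimeZone.getTimeZone("UTC"));
    c.setTime(new Date(0));                       // start at the Unix epoch
    c.add(Calendar.DAY_OF_YEAR, (int) epochDays); // roll forward N days
    return c.getTime().getTime() * 1000;          // millis -> micros
  }

  public static void main(String[] args) {
    // one day after the epoch = 86_400_000 ms = 86_400_000_000 us
    System.out.println(daysToMicros(1));
  }
}
```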
diff --git a/integration/presto/src/main/java/org/apache/carbondata/presto/impl/CarbonTableReader.java b/integration/presto/src/main/prestodb/org/apache/carbondata/presto/impl/CarbonTableReader.java
similarity index 86%
copy from integration/presto/src/main/java/org/apache/carbondata/presto/impl/CarbonTableReader.java
copy to integration/presto/src/main/prestodb/org/apache/carbondata/presto/impl/CarbonTableReader.java
index d648392..ac80288 100755
--- a/integration/presto/src/main/java/org/apache/carbondata/presto/impl/CarbonTableReader.java
+++ b/integration/presto/src/main/prestodb/org/apache/carbondata/presto/impl/CarbonTableReader.java
@@ -89,20 +89,17 @@ import static org.apache.hadoop.fs.s3a.Constants.SECRET_KEY;
  */
 public class CarbonTableReader {
 
-  // default PathFilter, accepts files in carbondata format (with .carbondata extension).
-  private static final PathFilter DefaultFilter = new PathFilter() {
-    @Override
-    public boolean accept(Path path) {
-      return CarbonTablePath.isCarbonDataFile(path.getName());
-    }
-  };
   public CarbonTableConfig config;
+
   /**
    * A cache for Carbon reader, with this cache,
    * metadata of a table is only read from file system once.
    */
   private AtomicReference<Map<SchemaTableName, CarbonTableCacheModel>> carbonCache;
 
+  /**
+   * unique query id used for query statistics
+   */
   private String queryId;
 
   /**
@@ -240,9 +237,21 @@ public class CarbonTableReader {
     return null;
   }
 
-  public List<CarbonLocalMultiBlockSplit> getInputSplits2(CarbonTableCacheModel tableCacheModel,
-      Expression filters, TupleDomain<HiveColumnHandle> constraints, Configuration config)
-      throws IOException {
+  /**
+   * Get carbon multi-block input splits
+   *
+   * @param tableCacheModel cached table
+   * @param filters carbonData filters
+   * @param constraints presto filters
+   * @param config hadoop conf
+   * @return list of multi-block splits
+   * @throws IOException
+   */
+  public List<CarbonLocalMultiBlockSplit> getInputSplits(
+      CarbonTableCacheModel tableCacheModel,
+      Expression filters,
+      TupleDomain<HiveColumnHandle> constraints,
+      Configuration config) throws IOException {
     List<CarbonLocalInputSplit> result = new ArrayList<>();
     List<CarbonLocalMultiBlockSplit> multiBlockSplitList = new ArrayList<>();
     CarbonTable carbonTable = tableCacheModel.getCarbonTable();
@@ -262,18 +271,13 @@ public class CarbonTableReader {
     PartitionInfo partitionInfo = carbonTable.getPartitionInfo();
     LoadMetadataDetails[] loadMetadataDetails = null;
     if (partitionInfo != null && partitionInfo.getPartitionType() == PartitionType.NATIVE_HIVE) {
-      try {
-        loadMetadataDetails = SegmentStatusManager.readTableStatusFile(
-            CarbonTablePath.getTableStatusFilePath(carbonTable.getTablePath()));
-      } catch (IOException e) {
-        LOGGER.error(e.getMessage(), e);
-        throw e;
-      }
+      loadMetadataDetails = SegmentStatusManager.readTableStatusFile(
+          CarbonTablePath.getTableStatusFilePath(carbonTable.getTablePath()));
       filteredPartitions = findRequiredPartitions(constraints, carbonTable, loadMetadataDetails);
     }
     try {
       CarbonTableInputFormat.setTableInfo(config, tableInfo);
-      CarbonTableInputFormat carbonTableInputFormat =
+      CarbonTableInputFormat<Object> carbonTableInputFormat =
           createInputFormat(jobConf, carbonTable.getAbsoluteTableIdentifier(),
               new DataMapFilter(carbonTable, filters, true), filteredPartitions);
       Job job = Job.getInstance(jobConf);
@@ -290,76 +294,56 @@ public class CarbonTableReader {
               gson.toJson(carbonInputSplit.getDetailInfo()),
               carbonInputSplit.getFileFormat().ordinal()));
         }
-
         // Use block distribution
-        List<List<CarbonLocalInputSplit>> inputSplits = new ArrayList(
-            result.stream().map(x -> (CarbonLocalInputSplit) x).collect(Collectors.groupingBy(
-                carbonInput -> {
-                  if (FileFormat.ROW_V1.equals(carbonInput.getFileFormat())) {
-                    return carbonInput.getSegmentId().concat(carbonInput.getPath())
-                      .concat(carbonInput.getStart() + "");
-                  }
-                  return carbonInput.getSegmentId().concat(carbonInput.getPath());
-                })).values());
-        if (inputSplits != null) {
-          for (int j = 0; j < inputSplits.size(); j++) {
-            multiBlockSplitList.add(new CarbonLocalMultiBlockSplit(inputSplits.get(j),
-                inputSplits.get(j).stream().flatMap(f -> Arrays.stream(getLocations(f))).distinct()
-                    .toArray(String[]::new)));
-          }
+        List<List<CarbonLocalInputSplit>> inputSplits =
+            new ArrayList<>(result.stream().collect(Collectors.groupingBy(carbonInput -> {
+              if (FileFormat.ROW_V1.equals(carbonInput.getFileFormat())) {
+                return carbonInput.getSegmentId().concat(carbonInput.getPath())
+                    .concat(carbonInput.getStart() + "");
+              }
+              return carbonInput.getSegmentId().concat(carbonInput.getPath());
+            })).values());
+        // TODO : try to optimize the below logic as it may slow down for huge splits
+        for (int j = 0; j < inputSplits.size(); j++) {
+          multiBlockSplitList.add(new CarbonLocalMultiBlockSplit(inputSplits.get(j),
+              inputSplits.get(j).stream().flatMap(f -> Arrays.stream(getLocations(f))).distinct()
+                  .toArray(String[]::new)));
         }
         LOGGER.error("Size fo MultiblockList   " + multiBlockSplitList.size());
-
       }
-
     } catch (IOException e) {
       throw new RuntimeException(e);
     }
-
     return multiBlockSplitList;
   }
 
   /**
    * Returns list of partition specs to query based on the domain constraints
    *
-   * @param constraints
-   * @param carbonTable
+   * @param constraints presto filter
+   * @param carbonTable carbon table
    * @throws IOException
    */
   private List<PartitionSpec> findRequiredPartitions(TupleDomain<HiveColumnHandle> constraints,
       CarbonTable carbonTable, LoadMetadataDetails[] loadMetadataDetails) throws IOException {
     Set<PartitionSpec> partitionSpecs = new HashSet<>();
-    List<PartitionSpec> prunePartitions = new ArrayList();
-
     for (LoadMetadataDetails loadMetadataDetail : loadMetadataDetails) {
       SegmentFileStore segmentFileStore = null;
-      try {
-        segmentFileStore =
-            new SegmentFileStore(carbonTable.getTablePath(), loadMetadataDetail.getSegmentFile());
-        partitionSpecs.addAll(segmentFileStore.getPartitionSpecs());
-
-      } catch (IOException e) {
-        LOGGER.error(e.getMessage(), e);
-        throw e;
-      }
+      segmentFileStore =
+          new SegmentFileStore(carbonTable.getTablePath(), loadMetadataDetail.getSegmentFile());
+      partitionSpecs.addAll(segmentFileStore.getPartitionSpecs());
     }
     List<String> partitionValuesFromExpression =
         PrestoFilterUtil.getPartitionFilters(carbonTable, constraints);
-
-    List<PartitionSpec> partitionSpecList = partitionSpecs.stream().filter(
-        partitionSpec -> CollectionUtils
-            .isSubCollection(partitionValuesFromExpression, partitionSpec.getPartitions()))
+    return partitionSpecs.stream().filter(partitionSpec -> CollectionUtils
+        .isSubCollection(partitionValuesFromExpression, partitionSpec.getPartitions()))
         .collect(Collectors.toList());
-
-    prunePartitions.addAll(partitionSpecList);
-
-    return prunePartitions;
   }
 
   private CarbonTableInputFormat<Object> createInputFormat(Configuration conf,
       AbsoluteTableIdentifier identifier, DataMapFilter dataMapFilter,
       List<PartitionSpec> filteredPartitions) {
-    CarbonTableInputFormat format = new CarbonTableInputFormat<Object>();
+    CarbonTableInputFormat<Object> format = new CarbonTableInputFormat<>();
     CarbonTableInputFormat
         .setTablePath(conf, identifier.appendWithLocalPrefix(identifier.getTablePath()));
     CarbonTableInputFormat.setFilterPredicates(conf, dataMapFilter);
@@ -407,7 +391,7 @@ public class CarbonTableReader {
    * @return
    */
   private String[] getLocations(CarbonLocalInputSplit cis) {
-    return cis.getLocations().toArray(new String[cis.getLocations().size()]);
+    return cis.getLocations().toArray(new String[0]);
   }
 
   public void setQueryId(String queryId) {
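The refactored block-distribution code in `getInputSplits` buckets the flat split list by `segmentId + path` (plus start offset for `ROW_V1` files) via `Collectors.groupingBy`, and each bucket becomes one multi-block split. An illustrative sketch of that grouping step (`Split` is a stand-in for `CarbonLocalInputSplit`; the `ROW_V1` offset variant is omitted):

```java
import java.util.Arrays;
import java.util.Collection;
import java.util.List;
import java.util.stream.Collectors;

public class SplitGrouping {
  static class Split {
    final String segmentId;
    final String path;
    Split(String segmentId, String path) {
      this.segmentId = segmentId;
      this.path = path;
    }
  }

  // Buckets splits that point at the same data file within the same segment,
  // mirroring the groupingBy key used in getInputSplits.
  static Collection<List<Split>> groupByBlock(List<Split> splits) {
    return splits.stream()
        .collect(Collectors.groupingBy(s -> s.segmentId.concat(s.path)))
        .values();
  }

  public static void main(String[] args) {
    List<Split> splits = Arrays.asList(
        new Split("0", "/t/part-0.carbondata"),
        new Split("0", "/t/part-0.carbondata"), // same block, same bucket
        new Split("1", "/t/part-1.carbondata"));
    System.out.println(groupByBlock(splits).size()); // 2 buckets
  }
}
```

Note also `getLocations` switching to `toArray(new String[0])`: with modern JVMs the zero-length-array form is at least as fast as presizing and reads more simply.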
diff --git a/integration/presto/src/main/java/org/apache/carbondata/presto/readers/BooleanStreamReader.java b/integration/presto/src/main/prestodb/org/apache/carbondata/presto/readers/BooleanStreamReader.java
similarity index 100%
copy from integration/presto/src/main/java/org/apache/carbondata/presto/readers/BooleanStreamReader.java
copy to integration/presto/src/main/prestodb/org/apache/carbondata/presto/readers/BooleanStreamReader.java
diff --git a/integration/presto/src/main/java/org/apache/carbondata/presto/readers/ByteStreamReader.java b/integration/presto/src/main/prestodb/org/apache/carbondata/presto/readers/ByteStreamReader.java
similarity index 100%
copy from integration/presto/src/main/java/org/apache/carbondata/presto/readers/ByteStreamReader.java
copy to integration/presto/src/main/prestodb/org/apache/carbondata/presto/readers/ByteStreamReader.java
diff --git a/integration/presto/src/main/java/org/apache/carbondata/presto/readers/DecimalSliceStreamReader.java b/integration/presto/src/main/prestodb/org/apache/carbondata/presto/readers/DecimalSliceStreamReader.java
similarity index 100%
copy from integration/presto/src/main/java/org/apache/carbondata/presto/readers/DecimalSliceStreamReader.java
copy to integration/presto/src/main/prestodb/org/apache/carbondata/presto/readers/DecimalSliceStreamReader.java
diff --git a/integration/presto/src/main/java/org/apache/carbondata/presto/readers/DoubleStreamReader.java b/integration/presto/src/main/prestodb/org/apache/carbondata/presto/readers/DoubleStreamReader.java
similarity index 100%
copy from integration/presto/src/main/java/org/apache/carbondata/presto/readers/DoubleStreamReader.java
copy to integration/presto/src/main/prestodb/org/apache/carbondata/presto/readers/DoubleStreamReader.java
diff --git a/integration/presto/src/main/java/org/apache/carbondata/presto/readers/FloatStreamReader.java b/integration/presto/src/main/prestodb/org/apache/carbondata/presto/readers/FloatStreamReader.java
similarity index 100%
copy from integration/presto/src/main/java/org/apache/carbondata/presto/readers/FloatStreamReader.java
copy to integration/presto/src/main/prestodb/org/apache/carbondata/presto/readers/FloatStreamReader.java
diff --git a/integration/presto/src/main/java/org/apache/carbondata/presto/readers/IntegerStreamReader.java b/integration/presto/src/main/prestodb/org/apache/carbondata/presto/readers/IntegerStreamReader.java
similarity index 100%
copy from integration/presto/src/main/java/org/apache/carbondata/presto/readers/IntegerStreamReader.java
copy to integration/presto/src/main/prestodb/org/apache/carbondata/presto/readers/IntegerStreamReader.java
diff --git a/integration/presto/src/main/java/org/apache/carbondata/presto/readers/LongStreamReader.java b/integration/presto/src/main/prestodb/org/apache/carbondata/presto/readers/LongStreamReader.java
similarity index 100%
copy from integration/presto/src/main/java/org/apache/carbondata/presto/readers/LongStreamReader.java
copy to integration/presto/src/main/prestodb/org/apache/carbondata/presto/readers/LongStreamReader.java
diff --git a/integration/presto/src/main/java/org/apache/carbondata/presto/readers/ObjectStreamReader.java b/integration/presto/src/main/prestodb/org/apache/carbondata/presto/readers/ObjectStreamReader.java
similarity index 100%
copy from integration/presto/src/main/java/org/apache/carbondata/presto/readers/ObjectStreamReader.java
copy to integration/presto/src/main/prestodb/org/apache/carbondata/presto/readers/ObjectStreamReader.java
diff --git a/integration/presto/src/main/java/org/apache/carbondata/presto/readers/PrestoVectorBlockBuilder.java b/integration/presto/src/main/prestodb/org/apache/carbondata/presto/readers/PrestoVectorBlockBuilder.java
similarity index 100%
copy from integration/presto/src/main/java/org/apache/carbondata/presto/readers/PrestoVectorBlockBuilder.java
copy to integration/presto/src/main/prestodb/org/apache/carbondata/presto/readers/PrestoVectorBlockBuilder.java
diff --git a/integration/presto/src/main/java/org/apache/carbondata/presto/readers/ShortStreamReader.java b/integration/presto/src/main/prestodb/org/apache/carbondata/presto/readers/ShortStreamReader.java
similarity index 100%
copy from integration/presto/src/main/java/org/apache/carbondata/presto/readers/ShortStreamReader.java
copy to integration/presto/src/main/prestodb/org/apache/carbondata/presto/readers/ShortStreamReader.java
diff --git a/integration/presto/src/main/java/org/apache/carbondata/presto/readers/SliceStreamReader.java b/integration/presto/src/main/prestodb/org/apache/carbondata/presto/readers/SliceStreamReader.java
similarity index 100%
copy from integration/presto/src/main/java/org/apache/carbondata/presto/readers/SliceStreamReader.java
copy to integration/presto/src/main/prestodb/org/apache/carbondata/presto/readers/SliceStreamReader.java
diff --git a/integration/presto/src/main/java/org/apache/carbondata/presto/readers/TimestampStreamReader.java b/integration/presto/src/main/prestodb/org/apache/carbondata/presto/readers/TimestampStreamReader.java
similarity index 100%
copy from integration/presto/src/main/java/org/apache/carbondata/presto/readers/TimestampStreamReader.java
copy to integration/presto/src/main/prestodb/org/apache/carbondata/presto/readers/TimestampStreamReader.java
diff --git a/integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataColumnConstraint.java b/integration/presto/src/main/prestosql/org/apache/carbondata/presto/CarbondataColumnConstraint.java
similarity index 84%
rename from integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataColumnConstraint.java
rename to integration/presto/src/main/prestosql/org/apache/carbondata/presto/CarbondataColumnConstraint.java
index d6ccf1c..0fbc4f5 100755
--- a/integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataColumnConstraint.java
+++ b/integration/presto/src/main/prestosql/org/apache/carbondata/presto/CarbondataColumnConstraint.java
@@ -22,10 +22,10 @@ import java.util.Optional;
 
 import static java.util.Objects.requireNonNull;
 
-import com.facebook.presto.spi.predicate.Domain;
 import com.fasterxml.jackson.annotation.JsonCreator;
 import com.fasterxml.jackson.annotation.JsonProperty;
 import com.fasterxml.jackson.annotation.JsonSetter;
+import io.prestosql.spi.predicate.Domain;
 
 import static com.google.common.base.MoreObjects.toStringHelper;
 
@@ -34,20 +34,20 @@ import static com.google.common.base.MoreObjects.toStringHelper;
  */
 public class CarbondataColumnConstraint {
   private final String name;
-  private final boolean invertedindexed;
+  private final boolean invertedIndexed;
   private Optional<Domain> domain;
 
   @JsonCreator public CarbondataColumnConstraint(@JsonProperty("name") String name,
       @JsonProperty("domain") Optional<Domain> domain,
-      @JsonProperty("invertedindexed") boolean invertedindexed) {
+      @JsonProperty("invertedIndexed") boolean invertedIndexed) {
     this.name = requireNonNull(name, "name is null");
-    this.invertedindexed = requireNonNull(invertedindexed, "invertedIndexed is null");
+    this.invertedIndexed = requireNonNull(invertedIndexed, "invertedIndexed is null");
     this.domain = requireNonNull(domain, "domain is null");
   }
 
   @JsonProperty
-  public boolean isInvertedindexed() {
-    return invertedindexed;
+  public boolean isInvertedIndexed() {
+    return invertedIndexed;
   }
 
   @JsonProperty
@@ -67,7 +67,7 @@ public class CarbondataColumnConstraint {
 
   @Override
   public int hashCode() {
-    return Objects.hash(name, domain, invertedindexed);
+    return Objects.hash(name, domain, invertedIndexed);
   }
 
   @Override
@@ -82,12 +82,12 @@ public class CarbondataColumnConstraint {
 
     CarbondataColumnConstraint other = (CarbondataColumnConstraint) obj;
     return Objects.equals(this.name, other.name) && Objects.equals(this.domain, other.domain)
-        && Objects.equals(this.invertedindexed, other.invertedindexed);
+        && Objects.equals(this.invertedIndexed, other.invertedIndexed);
   }
 
   @Override
   public String toString() {
-    return toStringHelper(this).add("name", this.name).add("invertedindexed", this.invertedindexed)
+    return toStringHelper(this).add("name", this.name).add("invertedindexed", this.invertedIndexed)
         .add("domain", this.domain).toString();
   }
 }
diff --git a/integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataConnectorFactory.java b/integration/presto/src/main/prestosql/org/apache/carbondata/presto/CarbondataConnectorFactory.java
similarity index 62%
rename from integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataConnectorFactory.java
rename to integration/presto/src/main/prestosql/org/apache/carbondata/presto/CarbondataConnectorFactory.java
index 1284237..0f5a61f 100755
--- a/integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataConnectorFactory.java
+++ b/integration/presto/src/main/prestosql/org/apache/carbondata/presto/CarbondataConnectorFactory.java
@@ -17,11 +17,7 @@
 
 package org.apache.carbondata.presto;
 
-import java.lang.management.ManagementFactory;
-import java.lang.reflect.Constructor;
-import java.lang.reflect.Field;
-import java.lang.reflect.Method;
-import java.lang.reflect.Modifier;
+import java.lang.reflect.*;
 import java.util.Map;
 import java.util.Optional;
 import java.util.Set;
@@ -32,40 +32,6 @@ import org.apache.carbondata.hadoop.api.CarbonTableInputFormat;
 import org.apache.carbondata.hadoop.api.CarbonTableOutputFormat;
 import org.apache.carbondata.presto.impl.CarbonTableConfig;
 
-import com.facebook.presto.hive.HiveAnalyzeProperties;
-import com.facebook.presto.hive.HiveConnector;
-import com.facebook.presto.hive.HiveConnectorFactory;
-import com.facebook.presto.hive.HiveMetadataFactory;
-import com.facebook.presto.hive.HiveProcedureModule;
-import com.facebook.presto.hive.HiveSchemaProperties;
-import com.facebook.presto.hive.HiveSessionProperties;
-import com.facebook.presto.hive.HiveStorageFormat;
-import com.facebook.presto.hive.HiveTableProperties;
-import com.facebook.presto.hive.HiveTransactionManager;
-import com.facebook.presto.hive.NodeVersion;
-import com.facebook.presto.hive.RebindSafeMBeanServer;
-import com.facebook.presto.hive.authentication.HiveAuthenticationModule;
-import com.facebook.presto.hive.metastore.HiveMetastoreModule;
-import com.facebook.presto.hive.s3.HiveS3Module;
-import com.facebook.presto.hive.security.HiveSecurityModule;
-import com.facebook.presto.hive.security.PartitionsAwareAccessControl;
-import com.facebook.presto.spi.NodeManager;
-import com.facebook.presto.spi.PageIndexerFactory;
-import com.facebook.presto.spi.PageSorter;
-import com.facebook.presto.spi.classloader.ThreadContextClassLoader;
-import com.facebook.presto.spi.connector.Connector;
-import com.facebook.presto.spi.connector.ConnectorAccessControl;
-import com.facebook.presto.spi.connector.ConnectorContext;
-import com.facebook.presto.spi.connector.ConnectorNodePartitioningProvider;
-import com.facebook.presto.spi.connector.ConnectorPageSinkProvider;
-import com.facebook.presto.spi.connector.ConnectorPageSourceProvider;
-import com.facebook.presto.spi.connector.ConnectorSplitManager;
-import com.facebook.presto.spi.connector.classloader.ClassLoaderSafeConnectorPageSinkProvider;
-import com.facebook.presto.spi.connector.classloader.ClassLoaderSafeConnectorPageSourceProvider;
-import com.facebook.presto.spi.connector.classloader.ClassLoaderSafeConnectorSplitManager;
-import com.facebook.presto.spi.connector.classloader.ClassLoaderSafeNodePartitioningProvider;
-import com.facebook.presto.spi.procedure.Procedure;
-import com.facebook.presto.spi.type.TypeManager;
 import com.google.common.collect.ImmutableSet;
 import com.google.inject.Injector;
 import com.google.inject.Key;
@@ -75,6 +37,44 @@ import io.airlift.bootstrap.LifeCycleManager;
 import io.airlift.event.client.EventModule;
 import io.airlift.json.JsonModule;
 import io.airlift.units.DataSize;
+import io.prestosql.plugin.base.jmx.MBeanServerModule;
+import io.prestosql.plugin.hive.ConnectorObjectNameGeneratorModule;
+import io.prestosql.plugin.hive.HiveAnalyzeProperties;
+import io.prestosql.plugin.hive.HiveCatalogName;
+import io.prestosql.plugin.hive.HiveConnector;
+import io.prestosql.plugin.hive.HiveConnectorFactory;
+import io.prestosql.plugin.hive.HiveMetadataFactory;
+import io.prestosql.plugin.hive.HiveProcedureModule;
+import io.prestosql.plugin.hive.HiveSchemaProperties;
+import io.prestosql.plugin.hive.HiveSessionProperties;
+import io.prestosql.plugin.hive.HiveStorageFormat;
+import io.prestosql.plugin.hive.HiveTableProperties;
+import io.prestosql.plugin.hive.HiveTransactionManager;
+import io.prestosql.plugin.hive.NodeVersion;
+import io.prestosql.plugin.hive.authentication.HiveAuthenticationModule;
+import io.prestosql.plugin.hive.gcs.HiveGcsModule;
+import io.prestosql.plugin.hive.metastore.HiveMetastoreModule;
+import io.prestosql.plugin.hive.s3.HiveS3Module;
+import io.prestosql.plugin.hive.security.HiveSecurityModule;
+import io.prestosql.plugin.hive.security.SystemTableAwareAccessControl;
+import io.prestosql.spi.NodeManager;
+import io.prestosql.spi.PageIndexerFactory;
+import io.prestosql.spi.PageSorter;
+import io.prestosql.spi.VersionEmbedder;
+import io.prestosql.spi.classloader.ThreadContextClassLoader;
+import io.prestosql.spi.connector.Connector;
+import io.prestosql.spi.connector.ConnectorAccessControl;
+import io.prestosql.spi.connector.ConnectorContext;
+import io.prestosql.spi.connector.ConnectorNodePartitioningProvider;
+import io.prestosql.spi.connector.ConnectorPageSinkProvider;
+import io.prestosql.spi.connector.ConnectorPageSourceProvider;
+import io.prestosql.spi.connector.ConnectorSplitManager;
+import io.prestosql.spi.connector.classloader.ClassLoaderSafeConnectorPageSinkProvider;
+import io.prestosql.spi.connector.classloader.ClassLoaderSafeConnectorPageSourceProvider;
+import io.prestosql.spi.connector.classloader.ClassLoaderSafeConnectorSplitManager;
+import io.prestosql.spi.connector.classloader.ClassLoaderSafeNodePartitioningProvider;
+import io.prestosql.spi.procedure.Procedure;
+import io.prestosql.spi.type.TypeManager;
 import org.weakref.jmx.guice.MBeanModule;
 import sun.reflect.ConstructorAccessor;
 
@@ -89,6 +89,14 @@ public class CarbondataConnectorFactory extends HiveConnectorFactory {
 
   private final ClassLoader classLoader;
 
+  static {
+    try {
+      setCarbonEnum();
+    } catch (Exception e) {
+      throw new RuntimeException(e);
+    }
+  }
+
  public CarbondataConnectorFactory(String connectorName, ClassLoader classLoader) {
     super(connectorName, classLoader, Optional.empty());
     this.classLoader = requireNonNull(classLoader, "classLoader is null");
@@ -100,28 +108,36 @@ public class CarbondataConnectorFactory extends HiveConnectorFactory {
     requireNonNull(config, "config is null");
 
    try (ThreadContextClassLoader ignored = new ThreadContextClassLoader(classLoader)) {
-      Bootstrap app = new Bootstrap(new EventModule(), new MBeanModule(), new JsonModule(),
-          new CarbondataModule(catalogName), new HiveS3Module(catalogName),
-          new HiveMetastoreModule(catalogName, Optional.ofNullable(null)), new HiveSecurityModule(),
-          new HiveAuthenticationModule(), new HiveProcedureModule(), binder -> {
-        javax.management.MBeanServer platformMBeanServer =
-            ManagementFactory.getPlatformMBeanServer();
-        binder.bind(javax.management.MBeanServer.class)
-            .toInstance(new RebindSafeMBeanServer(platformMBeanServer));
-        binder.bind(NodeVersion.class)
-            .toInstance(new NodeVersion(context.getNodeManager().getCurrentNode().getVersion()));
-        binder.bind(NodeManager.class).toInstance(context.getNodeManager());
-        binder.bind(TypeManager.class).toInstance(context.getTypeManager());
-        binder.bind(PageIndexerFactory.class).toInstance(context.getPageIndexerFactory());
-        binder.bind(PageSorter.class).toInstance(context.getPageSorter());
-        configBinder(binder).bindConfig(CarbonTableConfig.class);
-      });
-
-      Injector injector =
-          app.strictConfig().doNotInitializeLogging().setRequiredConfigurationProperties(config)
-              .initialize();
-
-      setCarbonEnum();
+      Bootstrap app = new Bootstrap(
+          new EventModule(),
+          new MBeanModule(),
+          new ConnectorObjectNameGeneratorModule(catalogName),
+          new JsonModule(),
+          new CarbondataModule(catalogName),
+          new HiveS3Module(),
+          new HiveGcsModule(),
+          new HiveMetastoreModule(Optional.ofNullable(null)),
+          new HiveSecurityModule(),
+          new HiveAuthenticationModule(),
+          new HiveProcedureModule(),
+          new MBeanServerModule(),
+          binder -> {
+            binder.bind(NodeVersion.class).toInstance(
+                new NodeVersion(context.getNodeManager().getCurrentNode().getVersion()));
+            binder.bind(NodeManager.class).toInstance(context.getNodeManager());
+            binder.bind(VersionEmbedder.class).toInstance(context.getVersionEmbedder());
+            binder.bind(TypeManager.class).toInstance(context.getTypeManager());
+            binder.bind(PageIndexerFactory.class).toInstance(context.getPageIndexerFactory());
+            binder.bind(PageSorter.class).toInstance(context.getPageSorter());
+            binder.bind(HiveCatalogName.class).toInstance(new HiveCatalogName(catalogName));
+            configBinder(binder).bindConfig(CarbonTableConfig.class);
+          });
+
+      Injector injector = app
+          .strictConfig()
+          .doNotInitializeLogging()
+          .setRequiredConfigurationProperties(config)
+          .initialize();
 
      LifeCycleManager lifeCycleManager = injector.getInstance(LifeCycleManager.class);
      HiveMetadataFactory metadataFactory = injector.getInstance(HiveMetadataFactory.class);
@@ -140,7 +156,7 @@ public class CarbondataConnectorFactory extends HiveConnectorFactory {
       HiveAnalyzeProperties hiveAnalyzeProperties =
           injector.getInstance(HiveAnalyzeProperties.class);
       ConnectorAccessControl accessControl =
-          new PartitionsAwareAccessControl(injector.getInstance(ConnectorAccessControl.class));
+          new SystemTableAwareAccessControl(injector.getInstance(ConnectorAccessControl.class));
      Set<Procedure> procedures = injector.getInstance(Key.get(new TypeLiteral<Set<Procedure>>() {
       }));
 
@@ -164,7 +180,7 @@ public class CarbondataConnectorFactory extends HiveConnectorFactory {
    *
    * @throws Exception
    */
-  private void setCarbonEnum() throws Exception {
+  private static void setCarbonEnum() throws Exception {
     for (HiveStorageFormat format : HiveStorageFormat.values()) {
       if (format.name().equals("CARBON")) {
         return;
diff --git a/integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataModule.java b/integration/presto/src/main/prestosql/org/apache/carbondata/presto/CarbondataModule.java
similarity index 63%
rename from integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataModule.java
rename to integration/presto/src/main/prestosql/org/apache/carbondata/presto/CarbondataModule.java
index 81b2476..b8c3c0b 100755
--- a/integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataModule.java
+++ b/integration/presto/src/main/prestosql/org/apache/carbondata/presto/CarbondataModule.java
@@ -23,58 +23,56 @@ import static java.util.Objects.requireNonNull;
 
 import org.apache.carbondata.presto.impl.CarbonTableReader;
 
-import com.facebook.presto.hive.CoercionPolicy;
-import com.facebook.presto.hive.DirectoryLister;
-import com.facebook.presto.hive.FileFormatDataSourceStats;
-import com.facebook.presto.hive.GenericHiveRecordCursorProvider;
-import com.facebook.presto.hive.HadoopDirectoryLister;
-import com.facebook.presto.hive.HdfsConfiguration;
-import com.facebook.presto.hive.HdfsConfigurationUpdater;
-import com.facebook.presto.hive.HdfsEnvironment;
-import com.facebook.presto.hive.HiveAnalyzeProperties;
-import com.facebook.presto.hive.HiveClientConfig;
-import com.facebook.presto.hive.HiveClientModule;
-import com.facebook.presto.hive.HiveCoercionPolicy;
-import com.facebook.presto.hive.HiveConnectorId;
-import com.facebook.presto.hive.HiveEventClient;
-import com.facebook.presto.hive.HiveFileWriterFactory;
-import com.facebook.presto.hive.HiveHdfsConfiguration;
-import com.facebook.presto.hive.HiveLocationService;
-import com.facebook.presto.hive.HiveMetadataFactory;
-import com.facebook.presto.hive.HiveNodePartitioningProvider;
-import com.facebook.presto.hive.HivePageSinkProvider;
-import com.facebook.presto.hive.HivePageSourceFactory;
-import com.facebook.presto.hive.HivePartitionManager;
-import com.facebook.presto.hive.HiveRecordCursorProvider;
-import com.facebook.presto.hive.HiveSessionProperties;
-import com.facebook.presto.hive.HiveSplitManager;
-import com.facebook.presto.hive.HiveTableProperties;
-import com.facebook.presto.hive.HiveTransactionManager;
-import com.facebook.presto.hive.HiveTypeTranslator;
-import com.facebook.presto.hive.HiveWriterStats;
-import com.facebook.presto.hive.LocationService;
-import com.facebook.presto.hive.NamenodeStats;
-import com.facebook.presto.hive.OrcFileWriterConfig;
-import com.facebook.presto.hive.OrcFileWriterFactory;
-import com.facebook.presto.hive.ParquetFileWriterConfig;
-import com.facebook.presto.hive.PartitionUpdate;
-import com.facebook.presto.hive.RcFileFileWriterFactory;
-import com.facebook.presto.hive.TableParameterCodec;
-import com.facebook.presto.hive.TransactionalMetadata;
-import com.facebook.presto.hive.TypeTranslator;
-import com.facebook.presto.hive.orc.DwrfPageSourceFactory;
-import com.facebook.presto.hive.orc.OrcPageSourceFactory;
-import com.facebook.presto.hive.parquet.ParquetPageSourceFactory;
-import com.facebook.presto.hive.rcfile.RcFilePageSourceFactory;
-import com.facebook.presto.spi.connector.ConnectorNodePartitioningProvider;
-import com.facebook.presto.spi.connector.ConnectorPageSinkProvider;
-import com.facebook.presto.spi.connector.ConnectorPageSourceProvider;
-import com.facebook.presto.spi.connector.ConnectorSplitManager;
 import com.google.inject.Binder;
 import com.google.inject.Scopes;
 import com.google.inject.TypeLiteral;
 import com.google.inject.multibindings.Multibinder;
 import io.airlift.event.client.EventClient;
+import io.prestosql.plugin.hive.CachingDirectoryLister;
+import io.prestosql.plugin.hive.CoercionPolicy;
+import io.prestosql.plugin.hive.DirectoryLister;
+import io.prestosql.plugin.hive.DynamicConfigurationProvider;
+import io.prestosql.plugin.hive.FileFormatDataSourceStats;
+import io.prestosql.plugin.hive.GenericHiveRecordCursorProvider;
+import io.prestosql.plugin.hive.HdfsConfiguration;
+import io.prestosql.plugin.hive.HdfsConfigurationInitializer;
+import io.prestosql.plugin.hive.HdfsEnvironment;
+import io.prestosql.plugin.hive.HiveAnalyzeProperties;
+import io.prestosql.plugin.hive.HiveCoercionPolicy;
+import io.prestosql.plugin.hive.HiveConfig;
+import io.prestosql.plugin.hive.HiveEventClient;
+import io.prestosql.plugin.hive.HiveFileWriterFactory;
+import io.prestosql.plugin.hive.HiveHdfsConfiguration;
+import io.prestosql.plugin.hive.HiveLocationService;
+import io.prestosql.plugin.hive.HiveMetadataFactory;
+import io.prestosql.plugin.hive.HiveModule;
+import io.prestosql.plugin.hive.HiveNodePartitioningProvider;
+import io.prestosql.plugin.hive.HivePageSinkProvider;
+import io.prestosql.plugin.hive.HivePageSourceFactory;
+import io.prestosql.plugin.hive.HivePartitionManager;
+import io.prestosql.plugin.hive.HiveRecordCursorProvider;
+import io.prestosql.plugin.hive.HiveSessionProperties;
+import io.prestosql.plugin.hive.HiveSplitManager;
+import io.prestosql.plugin.hive.HiveTableProperties;
+import io.prestosql.plugin.hive.HiveTransactionManager;
+import io.prestosql.plugin.hive.HiveTypeTranslator;
+import io.prestosql.plugin.hive.HiveWriterStats;
+import io.prestosql.plugin.hive.LocationService;
+import io.prestosql.plugin.hive.NamenodeStats;
+import io.prestosql.plugin.hive.OrcFileWriterConfig;
+import io.prestosql.plugin.hive.OrcFileWriterFactory;
+import io.prestosql.plugin.hive.ParquetFileWriterConfig;
+import io.prestosql.plugin.hive.PartitionUpdate;
+import io.prestosql.plugin.hive.RcFileFileWriterFactory;
+import io.prestosql.plugin.hive.TransactionalMetadata;
+import io.prestosql.plugin.hive.TypeTranslator;
+import io.prestosql.plugin.hive.orc.OrcPageSourceFactory;
+import io.prestosql.plugin.hive.parquet.ParquetPageSourceFactory;
+import io.prestosql.plugin.hive.rcfile.RcFilePageSourceFactory;
+import io.prestosql.spi.connector.ConnectorNodePartitioningProvider;
+import io.prestosql.spi.connector.ConnectorPageSinkProvider;
+import io.prestosql.spi.connector.ConnectorPageSourceProvider;
+import io.prestosql.spi.connector.ConnectorSplitManager;
 
 import static com.google.inject.multibindings.Multibinder.newSetBinder;
 import static io.airlift.configuration.ConfigBinder.configBinder;
@@ -86,26 +84,25 @@ import static org.weakref.jmx.guice.ExportBinder.newExporter;
 /**
  * Binds all necessary classes needed for this module.
  */
-public class CarbondataModule extends HiveClientModule {
+public class CarbondataModule extends HiveModule {
 
   private final String connectorId;
 
   public CarbondataModule(String connectorId) {
-    super(connectorId);
     this.connectorId = requireNonNull(connectorId, "connector id is null");
   }
 
   @Override
   public void configure(Binder binder) {
-    binder.bind(HiveConnectorId.class).toInstance(new HiveConnectorId(connectorId));
     binder.bind(TypeTranslator.class).toInstance(new HiveTypeTranslator());
     binder.bind(CoercionPolicy.class).to(HiveCoercionPolicy.class).in(Scopes.SINGLETON);
 
-    binder.bind(HdfsConfigurationUpdater.class).in(Scopes.SINGLETON);
+    binder.bind(HdfsConfigurationInitializer.class).in(Scopes.SINGLETON);
+    newSetBinder(binder, DynamicConfigurationProvider.class);
     binder.bind(HdfsConfiguration.class).to(HiveHdfsConfiguration.class).in(Scopes.SINGLETON);
     binder.bind(HdfsEnvironment.class).in(Scopes.SINGLETON);
-    binder.bind(DirectoryLister.class).to(HadoopDirectoryLister.class).in(Scopes.SINGLETON);
-    configBinder(binder).bindConfig(HiveClientConfig.class);
+    binder.bind(DirectoryLister.class).to(CachingDirectoryLister.class).in(Scopes.SINGLETON);
+    configBinder(binder).bindConfig(HiveConfig.class);
 
     binder.bind(HiveSessionProperties.class).in(Scopes.SINGLETON);
     binder.bind(HiveTableProperties.class).in(Scopes.SINGLETON);
@@ -128,7 +125,6 @@ public class CarbondataModule extends HiveClientModule {
         .in(Scopes.SINGLETON);
     binder.bind(HivePartitionManager.class).in(Scopes.SINGLETON);
     binder.bind(LocationService.class).to(HiveLocationService.class).in(Scopes.SINGLETON);
-    binder.bind(TableParameterCodec.class).in(Scopes.SINGLETON);
     binder.bind(HiveMetadataFactory.class).in(Scopes.SINGLETON);
     binder.bind(new TypeLiteral<Supplier<TransactionalMetadata>>() {
     }).to(HiveMetadataFactory.class).in(Scopes.SINGLETON);
@@ -152,7 +148,6 @@ public class CarbondataModule extends HiveClientModule {
     Multibinder<HivePageSourceFactory> pageSourceFactoryBinder =
         newSetBinder(binder, HivePageSourceFactory.class);
     pageSourceFactoryBinder.addBinding().to(OrcPageSourceFactory.class).in(Scopes.SINGLETON);
-    pageSourceFactoryBinder.addBinding().to(DwrfPageSourceFactory.class).in(Scopes.SINGLETON);
     pageSourceFactoryBinder.addBinding().to(ParquetPageSourceFactory.class).in(Scopes.SINGLETON);
     pageSourceFactoryBinder.addBinding().to(RcFilePageSourceFactory.class).in(Scopes.SINGLETON);
 
diff --git a/integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataPageSource.java b/integration/presto/src/main/prestosql/org/apache/carbondata/presto/CarbondataPageSource.java
similarity index 91%
rename from integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataPageSource.java
rename to integration/presto/src/main/prestosql/org/apache/carbondata/presto/CarbondataPageSource.java
index 7ac2ca5..d3ae8f5 100644
--- a/integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataPageSource.java
+++ b/integration/presto/src/main/prestosql/org/apache/carbondata/presto/CarbondataPageSource.java
@@ -53,17 +53,19 @@ import org.apache.carbondata.presto.impl.CarbonLocalMultiBlockSplit;
 import org.apache.carbondata.presto.readers.PrestoVectorBlockBuilder;
 import org.apache.carbondata.processing.loading.exception.CarbonDataLoadingException;
 
-import com.facebook.presto.hadoop.$internal.com.google.common.base.Throwables;
-import com.facebook.presto.hive.HiveColumnHandle;
-import com.facebook.presto.hive.HiveSplit;
-import com.facebook.presto.spi.ColumnHandle;
-import com.facebook.presto.spi.ConnectorPageSource;
-import com.facebook.presto.spi.Page;
-import com.facebook.presto.spi.PrestoException;
-import com.facebook.presto.spi.block.Block;
-import com.facebook.presto.spi.block.LazyBlock;
-import com.facebook.presto.spi.block.LazyBlockLoader;
+import com.google.common.base.Throwables;
 import com.google.common.collect.ImmutableList;
+import io.prestosql.plugin.hive.HiveColumnHandle;
+import io.prestosql.plugin.hive.HiveSplit;
+import io.prestosql.plugin.hive.HiveTableHandle;
+import io.prestosql.spi.Page;
+import io.prestosql.spi.PrestoException;
+import io.prestosql.spi.block.Block;
+import io.prestosql.spi.block.LazyBlock;
+import io.prestosql.spi.block.LazyBlockLoader;
+import io.prestosql.spi.connector.ColumnHandle;
+import io.prestosql.spi.connector.ConnectorPageSource;
+import io.prestosql.spi.connector.ConnectorTableHandle;
 import org.apache.hadoop.conf.Configuration;
 import org.apache.hadoop.mapred.JobConf;
 import org.apache.hadoop.mapreduce.TaskAttemptID;
@@ -87,6 +89,7 @@ class CarbondataPageSource implements ConnectorPageSource {
   private Configuration hadoopConf;
   private FileFormat fileFormat;
   private List<ColumnHandle> columnHandles;
+  ConnectorTableHandle tableHandle;
   private int columnCount = 0;
   private boolean closed;
   private long sizeOfData = 0;
@@ -107,13 +110,15 @@ class CarbondataPageSource implements ConnectorPageSource {
   private boolean isFrstPage = true;
 
  CarbondataPageSource(CarbonTable carbonTable, String queryId, HiveSplit split,
-      List<ColumnHandle> columnHandles, Configuration hadoopConf, boolean isDirectVectorFill) {
+      List<ColumnHandle> columnHandles, ConnectorTableHandle tableHandle, Configuration hadoopConf,
+      boolean isDirectVectorFill) {
     this.carbonTable = carbonTable;
     this.queryId = queryId;
     this.split = split;
     this.columnHandles = columnHandles;
     this.hadoopConf = hadoopConf;
     this.isDirectVectorFill = isDirectVectorFill;
+    this.tableHandle = tableHandle;
     initialize();
   }
 
@@ -130,11 +135,12 @@ class CarbondataPageSource implements ConnectorPageSource {
 
   private void initializeForColumnar() {
     readSupport = new CarbonPrestoDecodeReadSupport();
-    vectorReader = createReaderForColumnar(split, columnHandles, readSupport, hadoopConf);
+    vectorReader =
+        createReaderForColumnar(split, columnHandles, tableHandle, readSupport, hadoopConf);
   }
 
   private void initializeForRow() {
-    QueryModel queryModel = createQueryModel(split, columnHandles, hadoopConf);
+    QueryModel queryModel = createQueryModel(split, tableHandle, columnHandles, hadoopConf);
     rowReader = new StreamRecordReader(queryModel, false);
     List<ProjectionDimension> queryDimension = queryModel.getProjectionDimensions();
     List<ProjectionMeasure> queryMeasures = queryModel.getProjectionMeasures();
@@ -322,7 +328,8 @@ class CarbondataPageSource implements ConnectorPageSource {
       }
       nanoEnd = System.nanoTime();
     } catch (Exception e) {
-      throw Throwables.propagate(e);
+      Throwables.throwIfUnchecked(e);
+      throw new RuntimeException(e);
     }
 
   }
@@ -344,9 +351,9 @@ class CarbondataPageSource implements ConnectorPageSource {
    * Create vector reader using the split.
    */
  private PrestoCarbonVectorizedRecordReader createReaderForColumnar(HiveSplit carbonSplit,
-      List<? extends ColumnHandle> columns, CarbonPrestoDecodeReadSupport readSupport,
-      Configuration conf) {
-    QueryModel queryModel = createQueryModel(carbonSplit, columns, conf);
+      List<? extends ColumnHandle> columns, ConnectorTableHandle tableHandle,
+      CarbonPrestoDecodeReadSupport readSupport, Configuration conf) {
+    QueryModel queryModel = createQueryModel(carbonSplit, tableHandle, columns, conf);
     if (isDirectVectorFill) {
       queryModel.setDirectVectorFill(true);
       queryModel.setPreFetchData(false);
@@ -371,7 +378,7 @@ class CarbondataPageSource implements ConnectorPageSource {
    * @param columns
    * @return
    */
-  private QueryModel createQueryModel(HiveSplit carbondataSplit,
+  private QueryModel createQueryModel(HiveSplit carbondataSplit, ConnectorTableHandle tableHandle,
       List<? extends ColumnHandle> columns, Configuration conf) {
 
     try {
@@ -384,9 +391,13 @@ class CarbondataPageSource implements ConnectorPageSource {
       conf.set(CarbonTableInputFormat.INPUT_DIR, carbonTablePath);
       conf.set("query.id", queryId);
       JobConf jobConf = new JobConf(conf);
-      CarbonTableInputFormat carbonTableInputFormat = createInputFormat(jobConf, carbonTable,
-          new DataMapFilter(carbonTable,
-              PrestoFilterUtil.parseFilterExpression(carbondataSplit.getEffectivePredicate())),
+      HiveTableHandle hiveTable = (HiveTableHandle) tableHandle;
+      CarbonTableInputFormat carbonTableInputFormat = createInputFormat(
+          jobConf,
+          carbonTable,
+          new DataMapFilter(
+              carbonTable,
+              PrestoFilterUtil.parseFilterExpression(hiveTable.getCompactEffectivePredicate())),
           carbonProjection);
       TaskAttemptContextImpl hadoopAttemptContext =
           new TaskAttemptContextImpl(jobConf, new TaskAttemptID("", 1, TaskType.MAP, 0, 0));
diff --git a/integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataPageSourceProvider.java b/integration/presto/src/main/prestosql/org/apache/carbondata/presto/CarbondataPageSourceProvider.java
similarity index 77%
rename from integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataPageSourceProvider.java
rename to integration/presto/src/main/prestosql/org/apache/carbondata/presto/CarbondataPageSourceProvider.java
index 1f646d9..eaf0b92 100644
--- a/integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataPageSourceProvider.java
+++ b/integration/presto/src/main/prestosql/org/apache/carbondata/presto/CarbondataPageSourceProvider.java
@@ -28,20 +28,21 @@ import org.apache.carbondata.presto.impl.CarbonTableReader;
 
 import static org.apache.carbondata.presto.Types.checkType;
 
-import com.facebook.presto.hive.HdfsEnvironment;
-import com.facebook.presto.hive.HiveClientConfig;
-import com.facebook.presto.hive.HivePageSourceFactory;
-import com.facebook.presto.hive.HivePageSourceProvider;
-import com.facebook.presto.hive.HiveRecordCursorProvider;
-import com.facebook.presto.hive.HiveSplit;
-import com.facebook.presto.spi.ColumnHandle;
-import com.facebook.presto.spi.ConnectorPageSource;
-import com.facebook.presto.spi.ConnectorSession;
-import com.facebook.presto.spi.ConnectorSplit;
-import com.facebook.presto.spi.SchemaTableName;
-import com.facebook.presto.spi.connector.ConnectorTransactionHandle;
-import com.facebook.presto.spi.type.TypeManager;
 import com.google.inject.Inject;
+import io.prestosql.plugin.hive.HdfsEnvironment;
+import io.prestosql.plugin.hive.HiveConfig;
+import io.prestosql.plugin.hive.HivePageSourceFactory;
+import io.prestosql.plugin.hive.HivePageSourceProvider;
+import io.prestosql.plugin.hive.HiveRecordCursorProvider;
+import io.prestosql.plugin.hive.HiveSplit;
+import io.prestosql.spi.connector.ColumnHandle;
+import io.prestosql.spi.connector.ConnectorPageSource;
+import io.prestosql.spi.connector.ConnectorSession;
+import io.prestosql.spi.connector.ConnectorSplit;
+import io.prestosql.spi.connector.ConnectorTableHandle;
+import io.prestosql.spi.connector.ConnectorTransactionHandle;
+import io.prestosql.spi.connector.SchemaTableName;
+import io.prestosql.spi.type.TypeManager;
 import org.apache.hadoop.conf.Configuration;
 import org.apache.hadoop.fs.Path;
 
@@ -57,26 +58,27 @@ public class CarbondataPageSourceProvider extends HivePageSourceProvider {
   private HdfsEnvironment hdfsEnvironment;
 
   @Inject public CarbondataPageSourceProvider(
-      HiveClientConfig hiveClientConfig,
+      HiveConfig hiveConfig,
       HdfsEnvironment hdfsEnvironment,
       Set<HiveRecordCursorProvider> cursorProviders,
       Set<HivePageSourceFactory> pageSourceFactories,
       TypeManager typeManager,
       CarbonTableReader carbonTableReader) {
-    super(hiveClientConfig, hdfsEnvironment, cursorProviders, pageSourceFactories, typeManager);
+    super(hiveConfig, hdfsEnvironment, cursorProviders, pageSourceFactories, typeManager);
     this.carbonTableReader = requireNonNull(carbonTableReader, "carbonTableReader is null");
     this.hdfsEnvironment = hdfsEnvironment;
   }
 
   @Override
  public ConnectorPageSource createPageSource(ConnectorTransactionHandle transactionHandle,
-      ConnectorSession session, ConnectorSplit split, List<ColumnHandle> columns) {
+      ConnectorSession session, ConnectorSplit split, ConnectorTableHandle table,
+      List<ColumnHandle> columns) {
     HiveSplit carbonSplit =
         checkType(split, HiveSplit.class, "split is not class HiveSplit");
     this.queryId = carbonSplit.getSchema().getProperty("queryId");
     if (this.queryId == null) {
       // Fall back to hive pagesource.
-      return super.createPageSource(transactionHandle, session, split, columns);
+      return super.createPageSource(transactionHandle, session, split, table, columns);
     }
     Configuration configuration = this.hdfsEnvironment.getConfiguration(
        new HdfsEnvironment.HdfsContext(session, carbonSplit.getDatabase(), carbonSplit.getTable()),
@@ -86,7 +88,7 @@ public class CarbondataPageSourceProvider extends HivePageSourceProvider {
    boolean isDirectVectorFill = carbonTableReader.config.getPushRowFilter() == null ||
         carbonTableReader.config.getPushRowFilter().equalsIgnoreCase("false");
     return new CarbondataPageSource(
-        carbonTable, queryId, carbonSplit, columns, configuration, isDirectVectorFill);
+        carbonTable, queryId, carbonSplit, columns, table, configuration, isDirectVectorFill);
   }
 
   /**
diff --git a/integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataPlugin.java b/integration/presto/src/main/prestosql/org/apache/carbondata/presto/CarbondataPlugin.java
similarity index 92%
rename from integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataPlugin.java
rename to integration/presto/src/main/prestosql/org/apache/carbondata/presto/CarbondataPlugin.java
index 5ed8224..fa4e1b6 100755
--- a/integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataPlugin.java
+++ b/integration/presto/src/main/prestosql/org/apache/carbondata/presto/CarbondataPlugin.java
@@ -19,9 +19,9 @@ package org.apache.carbondata.presto;
 
 import org.apache.carbondata.core.datastore.impl.FileFactory;
 
-import com.facebook.presto.spi.Plugin;
-import com.facebook.presto.spi.connector.ConnectorFactory;
 import com.google.common.collect.ImmutableList;
+import io.prestosql.spi.Plugin;
+import io.prestosql.spi.connector.ConnectorFactory;
 
 public class CarbondataPlugin implements Plugin {
 
diff --git a/integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataSplitManager.java b/integration/presto/src/main/prestosql/org/apache/carbondata/presto/CarbondataSplitManager.java
similarity index 70%
rename from integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataSplitManager.java
rename to integration/presto/src/main/prestosql/org/apache/carbondata/presto/CarbondataSplitManager.java
index 0147b75..93ebe42 100755
--- a/integration/presto/src/main/java/org/apache/carbondata/presto/CarbondataSplitManager.java
+++ b/integration/presto/src/main/prestosql/org/apache/carbondata/presto/CarbondataSplitManager.java
@@ -42,30 +42,32 @@ import org.apache.carbondata.presto.impl.CarbonLocalMultiBlockSplit;
 import org.apache.carbondata.presto.impl.CarbonTableCacheModel;
 import org.apache.carbondata.presto.impl.CarbonTableReader;
 
-import com.facebook.presto.hive.CoercionPolicy;
-import com.facebook.presto.hive.DirectoryLister;
-import com.facebook.presto.hive.ForHiveClient;
-import com.facebook.presto.hive.HdfsEnvironment;
-import com.facebook.presto.hive.HiveClientConfig;
-import com.facebook.presto.hive.HiveColumnHandle;
-import com.facebook.presto.hive.HiveSplit;
-import com.facebook.presto.hive.HiveSplitManager;
-import com.facebook.presto.hive.HiveTableLayoutHandle;
-import com.facebook.presto.hive.HiveTransactionHandle;
-import com.facebook.presto.hive.NamenodeStats;
-import com.facebook.presto.hive.metastore.SemiTransactionalHiveMetastore;
-import com.facebook.presto.hive.metastore.Table;
-import com.facebook.presto.spi.ConnectorSession;
-import com.facebook.presto.spi.ConnectorSplit;
-import com.facebook.presto.spi.ConnectorSplitSource;
-import com.facebook.presto.spi.ConnectorTableLayoutHandle;
-import com.facebook.presto.spi.FixedSplitSource;
-import com.facebook.presto.spi.HostAddress;
-import com.facebook.presto.spi.SchemaTableName;
-import com.facebook.presto.spi.TableNotFoundException;
-import com.facebook.presto.spi.connector.ConnectorTransactionHandle;
-import com.facebook.presto.spi.predicate.TupleDomain;
 import com.google.common.collect.ImmutableList;
+import io.prestosql.plugin.hive.CoercionPolicy;
+import io.prestosql.plugin.hive.DirectoryLister;
+import io.prestosql.plugin.hive.ForHive;
+import io.prestosql.plugin.hive.HdfsEnvironment;
+import io.prestosql.plugin.hive.HiveColumnHandle;
+import io.prestosql.plugin.hive.HiveConfig;
+import io.prestosql.plugin.hive.HivePartitionManager;
+import io.prestosql.plugin.hive.HiveSplit;
+import io.prestosql.plugin.hive.HiveSplitManager;
+import io.prestosql.plugin.hive.HiveTableHandle;
+import io.prestosql.plugin.hive.HiveTransactionHandle;
+import io.prestosql.plugin.hive.NamenodeStats;
+import io.prestosql.plugin.hive.metastore.SemiTransactionalHiveMetastore;
+import io.prestosql.plugin.hive.metastore.Table;
+import io.prestosql.spi.HostAddress;
+import io.prestosql.spi.VersionEmbedder;
+import io.prestosql.spi.connector.ConnectorSession;
+import io.prestosql.spi.connector.ConnectorSplit;
+import io.prestosql.spi.connector.ConnectorSplitSource;
+import io.prestosql.spi.connector.ConnectorTableHandle;
+import io.prestosql.spi.connector.ConnectorTransactionHandle;
+import io.prestosql.spi.connector.FixedSplitSource;
+import io.prestosql.spi.connector.SchemaTableName;
+import io.prestosql.spi.connector.TableNotFoundException;
+import io.prestosql.spi.predicate.TupleDomain;
 import org.apache.hadoop.conf.Configuration;
 import org.apache.hadoop.fs.Path;
 
@@ -81,24 +83,31 @@ public class CarbondataSplitManager extends HiveSplitManager {
  private final Function<HiveTransactionHandle, SemiTransactionalHiveMetastore> metastoreProvider;
   private final HdfsEnvironment hdfsEnvironment;
 
-  @Inject public CarbondataSplitManager(HiveClientConfig hiveClientConfig,
+  @Inject public CarbondataSplitManager(
+      HiveConfig hiveConfig,
      Function<HiveTransactionHandle, SemiTransactionalHiveMetastore> metastoreProvider,
-      NamenodeStats namenodeStats, HdfsEnvironment hdfsEnvironment, DirectoryLister directoryLister,
-      @ForHiveClient ExecutorService executorService, CoercionPolicy coercionPolicy,
+      HivePartitionManager partitionManager,
+      NamenodeStats namenodeStats,
+      HdfsEnvironment hdfsEnvironment,
+      DirectoryLister directoryLister,
+      @ForHive ExecutorService executorService,
+      VersionEmbedder versionEmbedder,
+      CoercionPolicy coercionPolicy,
       CarbonTableReader reader) {
-    super(hiveClientConfig, metastoreProvider, namenodeStats, hdfsEnvironment, directoryLister,
-        executorService, coercionPolicy);
+    super(hiveConfig, metastoreProvider, partitionManager, namenodeStats, hdfsEnvironment,
+        directoryLister, executorService, versionEmbedder, coercionPolicy);
     this.carbonTableReader = requireNonNull(reader, "client is null");
    this.metastoreProvider = requireNonNull(metastoreProvider, "metastore is null");
    this.hdfsEnvironment = requireNonNull(hdfsEnvironment, "hdfsEnvironment is null");
   }
 
+  @Override
  public ConnectorSplitSource getSplits(ConnectorTransactionHandle transactionHandle,
-      ConnectorSession session, ConnectorTableLayoutHandle layoutHandle,
+      ConnectorSession session, ConnectorTableHandle tableHandle,
       SplitSchedulingStrategy splitSchedulingStrategy) {
 
-    HiveTableLayoutHandle layout = (HiveTableLayoutHandle) layoutHandle;
-    SchemaTableName schemaTableName = layout.getSchemaTableName();
+    HiveTableHandle hiveTable = (HiveTableHandle) tableHandle;
+    SchemaTableName schemaTableName = hiveTable.getSchemaTableName();
 
     // get table metadata
     SemiTransactionalHiveMetastore metastore =
@@ -107,7 +116,7 @@ public class CarbondataSplitManager extends HiveSplitManager {
        metastore.getTable(schemaTableName.getSchemaName(), schemaTableName.getTableName())
            .orElseThrow(() -> new TableNotFoundException(schemaTableName));
    if (!table.getStorage().getStorageFormat().getInputFormat().contains("carbon")) {
-      return super.getSplits(transactionHandle, session, layoutHandle, splitSchedulingStrategy);
+      return super.getSplits(transactionHandle, session, tableHandle, splitSchedulingStrategy);
     }
     String location = table.getStorage().getLocation();
 
@@ -120,7 +129,7 @@ public class CarbondataSplitManager extends HiveSplitManager {
 
     carbonTableReader.setQueryId(queryId);
     TupleDomain<HiveColumnHandle> predicate =
-        (TupleDomain<HiveColumnHandle>) layout.getCompactEffectivePredicate();
+        (TupleDomain<HiveColumnHandle>) hiveTable.getCompactEffectivePredicate();
     Configuration configuration = this.hdfsEnvironment.getConfiguration(
         new HdfsEnvironment.HdfsContext(session, schemaTableName.getSchemaName(),
             schemaTableName.getTableName()), new Path(location));
@@ -133,7 +142,7 @@ public class CarbondataSplitManager extends HiveSplitManager {
     try {
 
       List<CarbonLocalMultiBlockSplit> splits =
-          carbonTableReader.getInputSplits2(cache, filters, predicate, configuration);
+          carbonTableReader.getInputSplits(cache, filters, predicate, configuration);
 
       ImmutableList.Builder<ConnectorSplit> cSplits = ImmutableList.builder();
       long index = 0;
@@ -148,9 +157,10 @@ public class CarbondataSplitManager extends HiveSplitManager {
         properties.setProperty("queryId", queryId);
         properties.setProperty("index", String.valueOf(index));
         cSplits.add(new HiveSplit(schemaTableName.getSchemaName(), schemaTableName.getTableName(),
-            schemaTableName.getTableName(), "", 0, 0, 0, properties, new ArrayList(),
-            getHostAddresses(split.getLocations()), OptionalInt.empty(), false, predicate,
-            new HashMap<>(), Optional.empty(), false));
+            schemaTableName.getTableName(), cache.getCarbonTable().getTablePath(), 0, 0, 0,
+            properties, new ArrayList(), getHostAddresses(split.getLocations()),
+            OptionalInt.empty(), false, new HashMap<>(),
+            Optional.empty(), false));
       }
 
       statisticRecorder.logStatisticsAsTableDriver();
diff --git a/integration/presto/src/main/java/org/apache/carbondata/presto/PrestoFilterUtil.java b/integration/presto/src/main/prestosql/org/apache/carbondata/presto/PrestoFilterUtil.java
similarity index 92%
rename from integration/presto/src/main/java/org/apache/carbondata/presto/PrestoFilterUtil.java
rename to integration/presto/src/main/prestosql/org/apache/carbondata/presto/PrestoFilterUtil.java
index 70955bb..e5449f0 100755
--- a/integration/presto/src/main/java/org/apache/carbondata/presto/PrestoFilterUtil.java
+++ b/integration/presto/src/main/prestosql/org/apache/carbondata/presto/PrestoFilterUtil.java
@@ -47,19 +47,19 @@ import org.apache.carbondata.core.scan.expression.conditional.ListExpression;
 import org.apache.carbondata.core.scan.expression.logical.AndExpression;
 import org.apache.carbondata.core.scan.expression.logical.OrExpression;
 
-import com.facebook.presto.hive.HiveColumnHandle;
-import com.facebook.presto.hive.HiveType;
-import com.facebook.presto.spi.PrestoException;
-import com.facebook.presto.spi.predicate.Domain;
-import com.facebook.presto.spi.predicate.Range;
-import com.facebook.presto.spi.predicate.TupleDomain;
-import com.facebook.presto.spi.type.Decimals;
-import com.facebook.presto.spi.type.Type;
 import io.airlift.slice.Slice;
+import io.prestosql.plugin.hive.HiveColumnHandle;
+import io.prestosql.plugin.hive.HiveType;
+import io.prestosql.spi.PrestoException;
+import io.prestosql.spi.predicate.Domain;
+import io.prestosql.spi.predicate.Range;
+import io.prestosql.spi.predicate.TupleDomain;
+import io.prestosql.spi.type.Decimals;
+import io.prestosql.spi.type.Type;
 import org.apache.hadoop.hive.serde2.typeinfo.DecimalTypeInfo;
 
-import static com.facebook.presto.spi.StandardErrorCode.NOT_SUPPORTED;
 import static com.google.common.base.Preconditions.checkArgument;
+import static io.prestosql.spi.StandardErrorCode.NOT_SUPPORTED;
 
 /**
  * PrestoFilterUtil create the carbonData Expression from the presto-domain
@@ -279,42 +279,42 @@ public class PrestoFilterUtil {
     return finalFilters;
   }
 
-  private static Object convertDataByType(Object rawdata, HiveType type) {
+  private static Object convertDataByType(Object rawData, HiveType type) {
     if (type.equals(HiveType.HIVE_INT) || type.equals(HiveType.HIVE_SHORT)) {
-      return Integer.valueOf(rawdata.toString());
+      return Integer.valueOf(rawData.toString());
     } else if (type.equals(HiveType.HIVE_LONG)) {
-      return rawdata;
+      return rawData;
     } else if (type.equals(HiveType.HIVE_STRING)) {
-      if (rawdata instanceof Slice) {
-        return ((Slice) rawdata).toStringUtf8();
+      if (rawData instanceof Slice) {
+        return ((Slice) rawData).toStringUtf8();
       } else {
-        return rawdata;
+        return rawData;
       }
     } else if (type.equals(HiveType.HIVE_BOOLEAN)) {
-      return rawdata;
+      return rawData;
     } else if (type.equals(HiveType.HIVE_DATE)) {
       Calendar c = Calendar.getInstance();
       c.setTime(new Date(0));
-      c.add(Calendar.DAY_OF_YEAR, ((Long) rawdata).intValue());
+      c.add(Calendar.DAY_OF_YEAR, ((Long) rawData).intValue());
       Date date = c.getTime();
       return date.getTime() * 1000;
     }
     else if (type.getTypeInfo() instanceof DecimalTypeInfo) {
-      if (rawdata instanceof Double) {
-        return new BigDecimal((Double) rawdata);
-      } else if (rawdata instanceof Long) {
-        return new BigDecimal(new BigInteger(String.valueOf(rawdata)),
+      if (rawData instanceof Double) {
+        return new BigDecimal((Double) rawData);
+      } else if (rawData instanceof Long) {
+        return new BigDecimal(new BigInteger(String.valueOf(rawData)),
             ((DecimalTypeInfo) type.getTypeInfo()).getScale());
-      } else if (rawdata instanceof Slice) {
-        return new BigDecimal(Decimals.decodeUnscaledValue((Slice) rawdata),
+      } else if (rawData instanceof Slice) {
+        return new BigDecimal(Decimals.decodeUnscaledValue((Slice) rawData),
             ((DecimalTypeInfo) type.getTypeInfo()).getScale());
       }
     }
     else if (type.equals(HiveType.HIVE_TIMESTAMP)) {
-      return (Long) rawdata * 1000;
+      return (Long) rawData * 1000;
     }
 
-    return rawdata;
+    return rawData;
   }
 
   /**
diff --git a/integration/presto/src/main/java/org/apache/carbondata/presto/impl/CarbonTableReader.java b/integration/presto/src/main/prestosql/org/apache/carbondata/presto/impl/CarbonTableReader.java
similarity index 84%
rename from integration/presto/src/main/java/org/apache/carbondata/presto/impl/CarbonTableReader.java
rename to integration/presto/src/main/prestosql/org/apache/carbondata/presto/impl/CarbonTableReader.java
index d648392..2cb58f7 100755
--- a/integration/presto/src/main/java/org/apache/carbondata/presto/impl/CarbonTableReader.java
+++ b/integration/presto/src/main/prestosql/org/apache/carbondata/presto/impl/CarbonTableReader.java
@@ -58,12 +58,12 @@ import org.apache.carbondata.hadoop.api.CarbonInputFormat;
 import org.apache.carbondata.hadoop.api.CarbonTableInputFormat;
 import org.apache.carbondata.presto.PrestoFilterUtil;
 
-import com.facebook.presto.hadoop.$internal.com.google.gson.Gson;
-import com.facebook.presto.hadoop.$internal.org.apache.commons.collections.CollectionUtils;
-import com.facebook.presto.hive.HiveColumnHandle;
-import com.facebook.presto.spi.SchemaTableName;
-import com.facebook.presto.spi.predicate.TupleDomain;
+import com.google.gson.Gson;
 import com.google.inject.Inject;
+import io.prestosql.plugin.hive.HiveColumnHandle;
+import io.prestosql.spi.connector.SchemaTableName;
+import io.prestosql.spi.predicate.TupleDomain;
+import org.apache.commons.collections.CollectionUtils;
 import org.apache.hadoop.conf.Configuration;
 import org.apache.hadoop.fs.Path;
 import org.apache.hadoop.fs.PathFilter;
@@ -89,20 +89,17 @@ import static org.apache.hadoop.fs.s3a.Constants.SECRET_KEY;
  */
 public class CarbonTableReader {
 
-  // default PathFilter, accepts files in carbondata format (with .carbondata extension).
-  private static final PathFilter DefaultFilter = new PathFilter() {
-    @Override
-    public boolean accept(Path path) {
-      return CarbonTablePath.isCarbonDataFile(path.getName());
-    }
-  };
   public CarbonTableConfig config;
+
   /**
    * A cache for Carbon reader, with this cache,
    * metadata of a table is only read from file system once.
    */
   private AtomicReference<Map<SchemaTableName, CarbonTableCacheModel>> carbonCache;
 
+  /**
+   * unique query id used for query statistics
+   */
   private String queryId;
 
   /**
@@ -240,9 +237,21 @@ public class CarbonTableReader {
     return null;
   }
 
-  public List<CarbonLocalMultiBlockSplit> getInputSplits2(CarbonTableCacheModel tableCacheModel,
-      Expression filters, TupleDomain<HiveColumnHandle> constraints, Configuration config)
-      throws IOException {
+  /**
+   * Get carbon multi-block input splits
+   *
+   * @param tableCacheModel cached table
+   * @param filters carbonData filters
+   * @param constraints presto filters
+   * @param config hadoop conf
+   * @return list of multi-block splits
+   * @throws IOException
+   */
+  public List<CarbonLocalMultiBlockSplit> getInputSplits(
+      CarbonTableCacheModel tableCacheModel,
+      Expression filters,
+      TupleDomain<HiveColumnHandle> constraints,
+      Configuration config) throws IOException {
     List<CarbonLocalInputSplit> result = new ArrayList<>();
     List<CarbonLocalMultiBlockSplit> multiBlockSplitList = new ArrayList<>();
     CarbonTable carbonTable = tableCacheModel.getCarbonTable();
@@ -262,18 +271,13 @@ public class CarbonTableReader {
     PartitionInfo partitionInfo = carbonTable.getPartitionInfo();
     LoadMetadataDetails[] loadMetadataDetails = null;
     if (partitionInfo != null && partitionInfo.getPartitionType() == PartitionType.NATIVE_HIVE) {
-      try {
-        loadMetadataDetails = SegmentStatusManager.readTableStatusFile(
-            CarbonTablePath.getTableStatusFilePath(carbonTable.getTablePath()));
-      } catch (IOException e) {
-        LOGGER.error(e.getMessage(), e);
-        throw e;
-      }
+      loadMetadataDetails = SegmentStatusManager.readTableStatusFile(
+          CarbonTablePath.getTableStatusFilePath(carbonTable.getTablePath()));
       filteredPartitions = findRequiredPartitions(constraints, carbonTable, loadMetadataDetails);
     }
     try {
       CarbonTableInputFormat.setTableInfo(config, tableInfo);
-      CarbonTableInputFormat carbonTableInputFormat =
+      CarbonTableInputFormat<Object> carbonTableInputFormat =
           createInputFormat(jobConf, carbonTable.getAbsoluteTableIdentifier(),
               new DataMapFilter(carbonTable, filters, true), filteredPartitions);
       Job job = Job.getInstance(jobConf);
@@ -290,76 +294,56 @@ public class CarbonTableReader {
               gson.toJson(carbonInputSplit.getDetailInfo()),
               carbonInputSplit.getFileFormat().ordinal()));
         }
-
         // Use block distribution
-        List<List<CarbonLocalInputSplit>> inputSplits = new ArrayList(
-            result.stream().map(x -> (CarbonLocalInputSplit) x).collect(Collectors.groupingBy(
-                carbonInput -> {
-                  if (FileFormat.ROW_V1.equals(carbonInput.getFileFormat())) {
-                    return carbonInput.getSegmentId().concat(carbonInput.getPath())
-                      .concat(carbonInput.getStart() + "");
-                  }
-                  return carbonInput.getSegmentId().concat(carbonInput.getPath());
-                })).values());
-        if (inputSplits != null) {
-          for (int j = 0; j < inputSplits.size(); j++) {
-            multiBlockSplitList.add(new CarbonLocalMultiBlockSplit(inputSplits.get(j),
-                inputSplits.get(j).stream().flatMap(f -> Arrays.stream(getLocations(f))).distinct()
-                    .toArray(String[]::new)));
-          }
+        List<List<CarbonLocalInputSplit>> inputSplits =
+            new ArrayList<>(result.stream().collect(Collectors.groupingBy(carbonInput -> {
+              if (FileFormat.ROW_V1.equals(carbonInput.getFileFormat())) {
+                return carbonInput.getSegmentId().concat(carbonInput.getPath())
+                    .concat(carbonInput.getStart() + "");
+              }
+              return carbonInput.getSegmentId().concat(carbonInput.getPath());
+            })).values());
+        // TODO: try to optimize the below logic as it may slow down for huge splits
+        for (int j = 0; j < inputSplits.size(); j++) {
+          multiBlockSplitList.add(new CarbonLocalMultiBlockSplit(inputSplits.get(j),
+              inputSplits.get(j).stream().flatMap(f -> Arrays.stream(getLocations(f))).distinct()
+                  .toArray(String[]::new)));
         }
         LOGGER.error("Size fo MultiblockList   " + multiBlockSplitList.size());
-
       }
-
     } catch (IOException e) {
       throw new RuntimeException(e);
     }
-
     return multiBlockSplitList;
   }
 
   /**
    * Returns list of partition specs to query based on the domain constraints
    *
-   * @param constraints
-   * @param carbonTable
+   * @param constraints presto filter
+   * @param carbonTable carbon table
    * @throws IOException
    */
   private List<PartitionSpec> findRequiredPartitions(TupleDomain<HiveColumnHandle> constraints,
       CarbonTable carbonTable, LoadMetadataDetails[] loadMetadataDetails) throws IOException {
     Set<PartitionSpec> partitionSpecs = new HashSet<>();
-    List<PartitionSpec> prunePartitions = new ArrayList();
-
     for (LoadMetadataDetails loadMetadataDetail : loadMetadataDetails) {
       SegmentFileStore segmentFileStore = null;
-      try {
-        segmentFileStore =
-            new SegmentFileStore(carbonTable.getTablePath(), loadMetadataDetail.getSegmentFile());
-        partitionSpecs.addAll(segmentFileStore.getPartitionSpecs());
-
-      } catch (IOException e) {
-        LOGGER.error(e.getMessage(), e);
-        throw e;
-      }
+      segmentFileStore =
+          new SegmentFileStore(carbonTable.getTablePath(), loadMetadataDetail.getSegmentFile());
+      partitionSpecs.addAll(segmentFileStore.getPartitionSpecs());
     }
     List<String> partitionValuesFromExpression =
         PrestoFilterUtil.getPartitionFilters(carbonTable, constraints);
-
-    List<PartitionSpec> partitionSpecList = partitionSpecs.stream().filter(
-        partitionSpec -> CollectionUtils
-            .isSubCollection(partitionValuesFromExpression, partitionSpec.getPartitions()))
+    return partitionSpecs.stream().filter(partitionSpec -> CollectionUtils
+        .isSubCollection(partitionValuesFromExpression, partitionSpec.getPartitions()))
         .collect(Collectors.toList());
-
-    prunePartitions.addAll(partitionSpecList);
-
-    return prunePartitions;
   }
 
   private CarbonTableInputFormat<Object> createInputFormat(Configuration conf,
       AbsoluteTableIdentifier identifier, DataMapFilter dataMapFilter,
       List<PartitionSpec> filteredPartitions) {
-    CarbonTableInputFormat format = new CarbonTableInputFormat<Object>();
+    CarbonTableInputFormat<Object> format = new CarbonTableInputFormat<>();
     CarbonTableInputFormat
         .setTablePath(conf, identifier.appendWithLocalPrefix(identifier.getTablePath()));
     CarbonTableInputFormat.setFilterPredicates(conf, dataMapFilter);
@@ -407,7 +391,7 @@ public class CarbonTableReader {
    * @return
    */
   private String[] getLocations(CarbonLocalInputSplit cis) {
-    return cis.getLocations().toArray(new String[cis.getLocations().size()]);
+    return cis.getLocations().toArray(new String[0]);
   }
 
   public void setQueryId(String queryId) {
diff --git a/integration/presto/src/main/java/org/apache/carbondata/presto/readers/BooleanStreamReader.java b/integration/presto/src/main/prestosql/org/apache/carbondata/presto/readers/BooleanStreamReader.java
similarity index 92%
rename from integration/presto/src/main/java/org/apache/carbondata/presto/readers/BooleanStreamReader.java
rename to integration/presto/src/main/prestosql/org/apache/carbondata/presto/readers/BooleanStreamReader.java
index a1433a2..f667935 100644
--- a/integration/presto/src/main/java/org/apache/carbondata/presto/readers/BooleanStreamReader.java
+++ b/integration/presto/src/main/prestosql/org/apache/carbondata/presto/readers/BooleanStreamReader.java
@@ -20,10 +20,10 @@ package org.apache.carbondata.presto.readers;
 import org.apache.carbondata.core.metadata.datatype.DataType;
 import org.apache.carbondata.core.scan.result.vector.impl.CarbonColumnVectorImpl;
 
-import com.facebook.presto.spi.block.Block;
-import com.facebook.presto.spi.block.BlockBuilder;
-import com.facebook.presto.spi.type.BooleanType;
-import com.facebook.presto.spi.type.Type;
+import io.prestosql.spi.block.Block;
+import io.prestosql.spi.block.BlockBuilder;
+import io.prestosql.spi.type.BooleanType;
+import io.prestosql.spi.type.Type;
 
 public class BooleanStreamReader extends CarbonColumnVectorImpl
     implements PrestoVectorBlockBuilder {
diff --git a/integration/presto/src/main/java/org/apache/carbondata/presto/readers/ByteStreamReader.java b/integration/presto/src/main/prestosql/org/apache/carbondata/presto/readers/ByteStreamReader.java
similarity index 92%
rename from integration/presto/src/main/java/org/apache/carbondata/presto/readers/ByteStreamReader.java
rename to integration/presto/src/main/prestosql/org/apache/carbondata/presto/readers/ByteStreamReader.java
index 6f69c82..8be3f9d 100644
--- a/integration/presto/src/main/java/org/apache/carbondata/presto/readers/ByteStreamReader.java
+++ b/integration/presto/src/main/prestosql/org/apache/carbondata/presto/readers/ByteStreamReader.java
@@ -14,16 +14,15 @@
  * See the License for the specific language governing permissions and
  * limitations under the License.
  */
-
 package org.apache.carbondata.presto.readers;
 
 import org.apache.carbondata.core.metadata.datatype.DataType;
 import org.apache.carbondata.core.scan.result.vector.impl.CarbonColumnVectorImpl;
 
-import com.facebook.presto.spi.block.Block;
-import com.facebook.presto.spi.block.BlockBuilder;
-import com.facebook.presto.spi.type.AbstractType;
-import com.facebook.presto.spi.type.TinyintType;
+import io.prestosql.spi.block.Block;
+import io.prestosql.spi.block.BlockBuilder;
+import io.prestosql.spi.type.AbstractType;
+import io.prestosql.spi.type.TinyintType;
 
 /**
  * Class for Reading the Byte(tiny int) value and setting it in Block
diff --git a/integration/presto/src/main/java/org/apache/carbondata/presto/readers/DecimalSliceStreamReader.java b/integration/presto/src/main/prestosql/org/apache/carbondata/presto/readers/DecimalSliceStreamReader.java
similarity index 92%
rename from integration/presto/src/main/java/org/apache/carbondata/presto/readers/DecimalSliceStreamReader.java
rename to integration/presto/src/main/prestosql/org/apache/carbondata/presto/readers/DecimalSliceStreamReader.java
index 50e1e06..7bfaf19 100644
--- a/integration/presto/src/main/java/org/apache/carbondata/presto/readers/DecimalSliceStreamReader.java
+++ b/integration/presto/src/main/prestosql/org/apache/carbondata/presto/readers/DecimalSliceStreamReader.java
@@ -25,19 +25,19 @@ import static java.math.RoundingMode.HALF_UP;
 import org.apache.carbondata.core.metadata.datatype.DataType;
 import org.apache.carbondata.core.scan.result.vector.impl.CarbonColumnVectorImpl;
 
-import com.facebook.presto.spi.block.Block;
-import com.facebook.presto.spi.block.BlockBuilder;
-import com.facebook.presto.spi.type.DecimalType;
-import com.facebook.presto.spi.type.Decimals;
-import com.facebook.presto.spi.type.Type;
 import io.airlift.slice.Slice;
+import io.prestosql.spi.block.Block;
+import io.prestosql.spi.block.BlockBuilder;
+import io.prestosql.spi.type.DecimalType;
+import io.prestosql.spi.type.Decimals;
+import io.prestosql.spi.type.Type;
 
-import static com.facebook.presto.spi.type.Decimals.encodeUnscaledValue;
-import static com.facebook.presto.spi.type.Decimals.isShortDecimal;
-import static com.facebook.presto.spi.type.Decimals.rescale;
 import static com.google.common.base.Preconditions.checkArgument;
 import static com.google.common.base.Preconditions.checkState;
 import static io.airlift.slice.Slices.utf8Slice;
+import static io.prestosql.spi.type.Decimals.encodeUnscaledValue;
+import static io.prestosql.spi.type.Decimals.isShortDecimal;
+import static io.prestosql.spi.type.Decimals.rescale;
 
 /**
  * Reader for DecimalValues
diff --git a/integration/presto/src/main/java/org/apache/carbondata/presto/readers/DoubleStreamReader.java b/integration/presto/src/main/prestosql/org/apache/carbondata/presto/readers/DoubleStreamReader.java
similarity index 92%
rename from integration/presto/src/main/java/org/apache/carbondata/presto/readers/DoubleStreamReader.java
rename to integration/presto/src/main/prestosql/org/apache/carbondata/presto/readers/DoubleStreamReader.java
index ebb9372..ed793ed 100644
--- a/integration/presto/src/main/java/org/apache/carbondata/presto/readers/DoubleStreamReader.java
+++ b/integration/presto/src/main/prestosql/org/apache/carbondata/presto/readers/DoubleStreamReader.java
@@ -20,10 +20,10 @@ package org.apache.carbondata.presto.readers;
 import org.apache.carbondata.core.metadata.datatype.DataType;
 import org.apache.carbondata.core.scan.result.vector.impl.CarbonColumnVectorImpl;
 
-import com.facebook.presto.spi.block.Block;
-import com.facebook.presto.spi.block.BlockBuilder;
-import com.facebook.presto.spi.type.DoubleType;
-import com.facebook.presto.spi.type.Type;
+import io.prestosql.spi.block.Block;
+import io.prestosql.spi.block.BlockBuilder;
+import io.prestosql.spi.type.DoubleType;
+import io.prestosql.spi.type.Type;
 
 /**
  * Class for Reading the Double value and setting it in Block
diff --git a/integration/presto/src/main/java/org/apache/carbondata/presto/readers/FloatStreamReader.java b/integration/presto/src/main/prestosql/org/apache/carbondata/presto/readers/FloatStreamReader.java
similarity index 92%
rename from integration/presto/src/main/java/org/apache/carbondata/presto/readers/FloatStreamReader.java
rename to integration/presto/src/main/prestosql/org/apache/carbondata/presto/readers/FloatStreamReader.java
index 0bbdb66..c814947 100644
--- a/integration/presto/src/main/java/org/apache/carbondata/presto/readers/FloatStreamReader.java
+++ b/integration/presto/src/main/prestosql/org/apache/carbondata/presto/readers/FloatStreamReader.java
@@ -20,10 +20,10 @@ package org.apache.carbondata.presto.readers;
 import org.apache.carbondata.core.metadata.datatype.DataType;
 import org.apache.carbondata.core.scan.result.vector.impl.CarbonColumnVectorImpl;
 
-import com.facebook.presto.spi.block.Block;
-import com.facebook.presto.spi.block.BlockBuilder;
-import com.facebook.presto.spi.type.AbstractType;
-import com.facebook.presto.spi.type.RealType;
+import io.prestosql.spi.block.Block;
+import io.prestosql.spi.block.BlockBuilder;
+import io.prestosql.spi.type.AbstractType;
+import io.prestosql.spi.type.RealType;
 
 /**
  * Class for Reading the Float(real) value and setting it in Block
diff --git a/integration/presto/src/main/java/org/apache/carbondata/presto/readers/IntegerStreamReader.java b/integration/presto/src/main/prestosql/org/apache/carbondata/presto/readers/IntegerStreamReader.java
similarity index 92%
rename from integration/presto/src/main/java/org/apache/carbondata/presto/readers/IntegerStreamReader.java
rename to integration/presto/src/main/prestosql/org/apache/carbondata/presto/readers/IntegerStreamReader.java
index 688908b..1393d03 100644
--- a/integration/presto/src/main/java/org/apache/carbondata/presto/readers/IntegerStreamReader.java
+++ b/integration/presto/src/main/prestosql/org/apache/carbondata/presto/readers/IntegerStreamReader.java
@@ -20,10 +20,10 @@ package org.apache.carbondata.presto.readers;
 import org.apache.carbondata.core.metadata.datatype.DataType;
 import org.apache.carbondata.core.scan.result.vector.impl.CarbonColumnVectorImpl;
 
-import com.facebook.presto.spi.block.Block;
-import com.facebook.presto.spi.block.BlockBuilder;
-import com.facebook.presto.spi.type.IntegerType;
-import com.facebook.presto.spi.type.Type;
+import io.prestosql.spi.block.Block;
+import io.prestosql.spi.block.BlockBuilder;
+import io.prestosql.spi.type.IntegerType;
+import io.prestosql.spi.type.Type;
 
 public class IntegerStreamReader extends CarbonColumnVectorImpl
     implements PrestoVectorBlockBuilder {
diff --git a/integration/presto/src/main/java/org/apache/carbondata/presto/readers/LongStreamReader.java b/integration/presto/src/main/prestosql/org/apache/carbondata/presto/readers/LongStreamReader.java
similarity index 92%
rename from integration/presto/src/main/java/org/apache/carbondata/presto/readers/LongStreamReader.java
rename to integration/presto/src/main/prestosql/org/apache/carbondata/presto/readers/LongStreamReader.java
index c5e1285..12ef777 100644
--- a/integration/presto/src/main/java/org/apache/carbondata/presto/readers/LongStreamReader.java
+++ b/integration/presto/src/main/prestosql/org/apache/carbondata/presto/readers/LongStreamReader.java
@@ -20,10 +20,10 @@ package org.apache.carbondata.presto.readers;
 import org.apache.carbondata.core.metadata.datatype.DataType;
 import org.apache.carbondata.core.scan.result.vector.impl.CarbonColumnVectorImpl;
 
-import com.facebook.presto.spi.block.Block;
-import com.facebook.presto.spi.block.BlockBuilder;
-import com.facebook.presto.spi.type.BigintType;
-import com.facebook.presto.spi.type.Type;
+import io.prestosql.spi.block.Block;
+import io.prestosql.spi.block.BlockBuilder;
+import io.prestosql.spi.type.BigintType;
+import io.prestosql.spi.type.Type;
 
 public class LongStreamReader extends CarbonColumnVectorImpl implements PrestoVectorBlockBuilder {
 
diff --git a/integration/presto/src/main/java/org/apache/carbondata/presto/readers/ObjectStreamReader.java b/integration/presto/src/main/prestosql/org/apache/carbondata/presto/readers/ObjectStreamReader.java
similarity index 91%
rename from integration/presto/src/main/java/org/apache/carbondata/presto/readers/ObjectStreamReader.java
rename to integration/presto/src/main/prestosql/org/apache/carbondata/presto/readers/ObjectStreamReader.java
index e18e839..5855f57 100644
--- a/integration/presto/src/main/java/org/apache/carbondata/presto/readers/ObjectStreamReader.java
+++ b/integration/presto/src/main/prestosql/org/apache/carbondata/presto/readers/ObjectStreamReader.java
@@ -20,10 +20,10 @@ package org.apache.carbondata.presto.readers;
 import org.apache.carbondata.core.metadata.datatype.DataType;
 import org.apache.carbondata.core.scan.result.vector.impl.CarbonColumnVectorImpl;
 
-import com.facebook.presto.spi.block.Block;
-import com.facebook.presto.spi.block.BlockBuilder;
-import com.facebook.presto.spi.type.IntegerType;
-import com.facebook.presto.spi.type.Type;
+import io.prestosql.spi.block.Block;
+import io.prestosql.spi.block.BlockBuilder;
+import io.prestosql.spi.type.IntegerType;
+import io.prestosql.spi.type.Type;
 
 /**
  * Class to read the Object Stream
diff --git a/integration/presto/src/main/java/org/apache/carbondata/presto/readers/PrestoVectorBlockBuilder.java b/integration/presto/src/main/prestosql/org/apache/carbondata/presto/readers/PrestoVectorBlockBuilder.java
similarity index 95%
rename from integration/presto/src/main/java/org/apache/carbondata/presto/readers/PrestoVectorBlockBuilder.java
rename to integration/presto/src/main/prestosql/org/apache/carbondata/presto/readers/PrestoVectorBlockBuilder.java
index 001e4c4..2d0d9d7 100644
--- a/integration/presto/src/main/java/org/apache/carbondata/presto/readers/PrestoVectorBlockBuilder.java
+++ b/integration/presto/src/main/prestosql/org/apache/carbondata/presto/readers/PrestoVectorBlockBuilder.java
@@ -17,7 +17,7 @@
 
 package org.apache.carbondata.presto.readers;
 
-import com.facebook.presto.spi.block.Block;
+import io.prestosql.spi.block.Block;
 
 public interface PrestoVectorBlockBuilder {
 
diff --git a/integration/presto/src/main/java/org/apache/carbondata/presto/readers/ShortStreamReader.java b/integration/presto/src/main/prestosql/org/apache/carbondata/presto/readers/ShortStreamReader.java
similarity index 92%
rename from integration/presto/src/main/java/org/apache/carbondata/presto/readers/ShortStreamReader.java
rename to integration/presto/src/main/prestosql/org/apache/carbondata/presto/readers/ShortStreamReader.java
index 62d1590..fc95484 100644
--- a/integration/presto/src/main/java/org/apache/carbondata/presto/readers/ShortStreamReader.java
+++ b/integration/presto/src/main/prestosql/org/apache/carbondata/presto/readers/ShortStreamReader.java
@@ -20,10 +20,10 @@ package org.apache.carbondata.presto.readers;
 import org.apache.carbondata.core.metadata.datatype.DataType;
 import org.apache.carbondata.core.scan.result.vector.impl.CarbonColumnVectorImpl;
 
-import com.facebook.presto.spi.block.Block;
-import com.facebook.presto.spi.block.BlockBuilder;
-import com.facebook.presto.spi.type.SmallintType;
-import com.facebook.presto.spi.type.Type;
+import io.prestosql.spi.block.Block;
+import io.prestosql.spi.block.BlockBuilder;
+import io.prestosql.spi.type.SmallintType;
+import io.prestosql.spi.type.Type;
 
 public class ShortStreamReader extends CarbonColumnVectorImpl implements PrestoVectorBlockBuilder {
 
diff --git a/integration/presto/src/main/java/org/apache/carbondata/presto/readers/SliceStreamReader.java b/integration/presto/src/main/prestosql/org/apache/carbondata/presto/readers/SliceStreamReader.java
similarity index 94%
rename from integration/presto/src/main/java/org/apache/carbondata/presto/readers/SliceStreamReader.java
rename to integration/presto/src/main/prestosql/org/apache/carbondata/presto/readers/SliceStreamReader.java
index 51deb54..e103b4a 100644
--- a/integration/presto/src/main/java/org/apache/carbondata/presto/readers/SliceStreamReader.java
+++ b/integration/presto/src/main/prestosql/org/apache/carbondata/presto/readers/SliceStreamReader.java
@@ -24,13 +24,13 @@ import org.apache.carbondata.core.scan.result.vector.CarbonDictionary;
 import org.apache.carbondata.core.scan.result.vector.impl.CarbonColumnVectorImpl;
 import org.apache.carbondata.core.util.ByteUtil;
 
-import com.facebook.presto.spi.block.Block;
-import com.facebook.presto.spi.block.BlockBuilder;
-import com.facebook.presto.spi.block.DictionaryBlock;
-import com.facebook.presto.spi.block.VariableWidthBlock;
-import com.facebook.presto.spi.type.Type;
-import com.facebook.presto.spi.type.VarcharType;
 import io.airlift.slice.Slices;
+import io.prestosql.spi.block.Block;
+import io.prestosql.spi.block.BlockBuilder;
+import io.prestosql.spi.block.DictionaryBlock;
+import io.prestosql.spi.block.VariableWidthBlock;
+import io.prestosql.spi.type.Type;
+import io.prestosql.spi.type.VarcharType;
 
 import static io.airlift.slice.Slices.wrappedBuffer;
 
diff --git a/integration/presto/src/main/java/org/apache/carbondata/presto/readers/TimestampStreamReader.java b/integration/presto/src/main/prestosql/org/apache/carbondata/presto/readers/TimestampStreamReader.java
similarity index 92%
rename from integration/presto/src/main/java/org/apache/carbondata/presto/readers/TimestampStreamReader.java
rename to integration/presto/src/main/prestosql/org/apache/carbondata/presto/readers/TimestampStreamReader.java
index e95455c..1ad15ea 100644
--- a/integration/presto/src/main/java/org/apache/carbondata/presto/readers/TimestampStreamReader.java
+++ b/integration/presto/src/main/prestosql/org/apache/carbondata/presto/readers/TimestampStreamReader.java
@@ -20,10 +20,10 @@ package org.apache.carbondata.presto.readers;
 import org.apache.carbondata.core.metadata.datatype.DataType;
 import org.apache.carbondata.core.scan.result.vector.impl.CarbonColumnVectorImpl;
 
-import com.facebook.presto.spi.block.Block;
-import com.facebook.presto.spi.block.BlockBuilder;
-import com.facebook.presto.spi.type.TimestampType;
-import com.facebook.presto.spi.type.Type;
+import io.prestosql.spi.block.Block;
+import io.prestosql.spi.block.BlockBuilder;
+import io.prestosql.spi.type.TimestampType;
+import io.prestosql.spi.type.Type;
 
 public class TimestampStreamReader extends CarbonColumnVectorImpl
     implements PrestoVectorBlockBuilder {
diff --git a/integration/presto/src/test/scala/org/apache/carbondata/presto/server/PrestoServer.scala b/integration/presto/src/test/prestodb/org/apache/carbondata/presto/server/PrestoServer.scala
similarity index 100%
copy from integration/presto/src/test/scala/org/apache/carbondata/presto/server/PrestoServer.scala
copy to integration/presto/src/test/prestodb/org/apache/carbondata/presto/server/PrestoServer.scala
diff --git a/integration/presto/src/test/scala/org/apache/carbondata/presto/server/PrestoServer.scala b/integration/presto/src/test/prestosql/org/apache/carbondata/presto/server/PrestoServer.scala
similarity index 93%
rename from integration/presto/src/test/scala/org/apache/carbondata/presto/server/PrestoServer.scala
rename to integration/presto/src/test/prestosql/org/apache/carbondata/presto/server/PrestoServer.scala
index 672e90f..0f17715 100644
--- a/integration/presto/src/test/scala/org/apache/carbondata/presto/server/PrestoServer.scala
+++ b/integration/presto/src/test/prestosql/org/apache/carbondata/presto/server/PrestoServer.scala
@@ -23,14 +23,14 @@ import java.util.{Locale, Optional, Properties}
 import scala.collection.JavaConverters._
 import scala.util.{Failure, Success, Try}
 
-import com.facebook.presto.Session
-import com.facebook.presto.execution.QueryIdGenerator
-import com.facebook.presto.jdbc.PrestoStatement
-import com.facebook.presto.metadata.SessionPropertyManager
-import com.facebook.presto.spi.`type`.TimeZoneKey.UTC_KEY
-import com.facebook.presto.spi.security.Identity
-import com.facebook.presto.tests.DistributedQueryRunner
 import com.google.common.collect.ImmutableMap
+import io.prestosql.Session
+import io.prestosql.execution.QueryIdGenerator
+import io.prestosql.jdbc.PrestoStatement
+import io.prestosql.metadata.SessionPropertyManager
+import io.prestosql.spi.`type`.TimeZoneKey.UTC_KEY
+import io.prestosql.spi.security.Identity
+import io.prestosql.tests.DistributedQueryRunner
 import org.slf4j.{Logger, LoggerFactory}
 
 import org.apache.carbondata.presto.CarbondataPlugin
@@ -145,7 +145,7 @@ class PrestoServer {
    * @return
    */
   private def createJdbcConnection(dbName: String) = {
-    val JDBC_DRIVER = "com.facebook.presto.jdbc.PrestoDriver"
+    val JDBC_DRIVER = "io.prestosql.jdbc.PrestoDriver"
     var DB_URL : String = null
     if (dbName == null) {
       DB_URL = "jdbc:presto://localhost:8086/carbondata/default"
diff --git a/integration/spark-common-cluster-test/pom.xml b/integration/spark-common-cluster-test/pom.xml
index 87214d1..12426ff 100644
--- a/integration/spark-common-cluster-test/pom.xml
+++ b/integration/spark-common-cluster-test/pom.xml
@@ -81,9 +81,9 @@
       <scope>test</scope>
     </dependency>
     <dependency>
-      <groupId>com.facebook.presto</groupId>
+      <groupId>${presto.groupid}</groupId>
       <artifactId>presto-jdbc</artifactId>
-      <version>0.210</version>
+      <version>${presto.version}</version>
       <exclusions>
         <exclusion>
           <groupId>org.antlr</groupId>
diff --git a/integration/spark-common-cluster-test/src/test/scala/org/apache/spark/sql/common/util/QueryTest.scala b/integration/spark-common-cluster-test/src/test/scala/org/apache/spark/sql/common/util/QueryTest.scala
index ab267f5..38f168c 100644
--- a/integration/spark-common-cluster-test/src/test/scala/org/apache/spark/sql/common/util/QueryTest.scala
+++ b/integration/spark-common-cluster-test/src/test/scala/org/apache/spark/sql/common/util/QueryTest.scala
@@ -24,7 +24,7 @@ import java.util.{Locale, Properties}
 import scala.collection.JavaConversions._
 import scala.util.{Failure, Success, Try}
 
-import com.facebook.presto.jdbc.PrestoStatement
+import io.prestosql.jdbc.{PrestoConnection, PrestoStatement}
 import org.apache.spark.sql.carbondata.execution.datasources.CarbonFileIndexReplaceRule
 import org.apache.spark.sql.catalyst.plans._
 import org.apache.spark.sql.catalyst.util._
@@ -258,7 +258,7 @@ object QueryTest {
      * @return
      */
     private def createJdbcConnection(dbName: String, url: String) = {
-      val JDBC_DRIVER = "com.facebook.presto.jdbc.PrestoDriver"
+      val JDBC_DRIVER = "io.prestosql.jdbc.PrestoDriver"
       var DB_URL : String = null
       if (StringUtils.isEmpty(dbName)) {
         DB_URL = "jdbc:presto://"+ url + "/carbondata/default"
diff --git a/pom.xml b/pom.xml
index 77b5b48..7b8b8b7 100644
--- a/pom.xml
+++ b/pom.xml
@@ -121,6 +121,7 @@
   </modules>
 
   <properties>
+    <!--    default properties-->
     <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
     <snappy.version>1.1.2.6</snappy.version>
     <hadoop.version>2.7.2</hadoop.version>
@@ -136,6 +137,16 @@
     <spark.master.url>local[2]</spark.master.url>
     <hdfs.url>local</hdfs.url>
     <presto.jdbc.url>localhost:8086</presto.jdbc.url>
+    <!--    prestosql by default-->
+    <presto.version>316</presto.version>
+    <presto.groupid>io.prestosql</presto.groupid>
+    <presto.hadoop.groupid>io.prestosql.hadoop</presto.hadoop.groupid>
+    <presto.hadoop.artifactid>hadoop-apache</presto.hadoop.artifactid>
+    <presto.hadoop.version>3.2.0-2</presto.hadoop.version>
+    <airlift.version>0.36</airlift.version>
+    <presto.mvn.plugin.groupid>io.prestosql</presto.mvn.plugin.groupid>
+    <presto.mvn.plugin.version>6</presto.mvn.plugin.version>
+    <presto.depndency.scope>compile</presto.depndency.scope>
     <!--todo:this can be enabled when presto tests need to be run-->
     <!--<spark.hadoop.hive.metastore.uris>thrift://localhost:8086</spark.hadoop.hive.metastore.uris>-->
     <suite.name>org.apache.carbondata.cluster.sdv.suite.SDVSuites</suite.name>
@@ -557,8 +568,6 @@
                 <sourceDirectory>${basedir}/integration/spark/src/main/java</sourceDirectory>
                 <sourceDirectory>${basedir}/integration/hive/src/main/scala</sourceDirectory>
                 <sourceDirectory>${basedir}/integration/hive/src/main/java</sourceDirectory>
-                <sourceDirectory>${basedir}/integration/presto/src/main/scala</sourceDirectory>
-                <sourceDirectory>${basedir}/integration/presto/src/main/java</sourceDirectory>
                 <sourceDirectory>${basedir}/streaming/src/main/java</sourceDirectory>
                 <sourceDirectory>${basedir}/streaming/src/main/scala</sourceDirectory>
                 <sourceDirectory>${basedir}/sdk/sdk/src/main/java</sourceDirectory>
@@ -604,8 +613,6 @@
                 <sourceDirectory>${basedir}/integration/spark/src/main/java</sourceDirectory>
                 <sourceDirectory>${basedir}/integration/hive/src/main/scala</sourceDirectory>
                 <sourceDirectory>${basedir}/integration/hive/src/main/java</sourceDirectory>
-                <sourceDirectory>${basedir}/integration/presto/src/main/scala</sourceDirectory>
-                <sourceDirectory>${basedir}/integration/presto/src/main/java</sourceDirectory>
                 <sourceDirectory>${basedir}/streaming/src/main/java</sourceDirectory>
                 <sourceDirectory>${basedir}/streaming/src/main/scala</sourceDirectory>
                 <sourceDirectory>${basedir}/sdk/sdk/src/main/java</sourceDirectory>
@@ -667,6 +674,83 @@
         <script.exetension>.bat</script.exetension>
       </properties>
     </profile>
+    <!--    prestodb-->
+    <profile>
+      <id>prestodb</id>
+      <properties>
+        <presto.version>0.217</presto.version>
+        <scala.version>2.11.8</scala.version>
+        <airlift.version>0.31</airlift.version>
+        <presto.groupid>com.facebook.presto</presto.groupid>
+        <presto.hadoop.groupid>com.facebook.presto.hadoop</presto.hadoop.groupid>
+        <presto.hadoop.artifactid>hadoop-apache2</presto.hadoop.artifactid>
+        <presto.hadoop.version>2.7.4-3</presto.hadoop.version>
+        <presto.mvn.plugin.groupid>io.takari.maven.plugins</presto.mvn.plugin.groupid>
+        <presto.mvn.plugin.version>0.1.12</presto.mvn.plugin.version>
+        <presto.depndency.scope>provided</presto.depndency.scope>
+      </properties>
+      <build>
+        <plugins>
+          <plugin>
+            <groupId>org.eluder.coveralls</groupId>
+            <artifactId>coveralls-maven-plugin</artifactId>
+            <version>4.3.0</version>
+            <configuration>
+              <repoToken>opPwqWW41vYppv6KISea3u1TJvE1ugJ5Y</repoToken>
+              <sourceEncoding>UTF-8</sourceEncoding>
+              <jacocoReports>
+                <jacocoReport>${basedir}/target/carbondata-coverage-report/carbondata-coverage-report.xml
+                </jacocoReport>
+              </jacocoReports>
+              <sourceDirectories>
+                <sourceDirectory>${basedir}/integration/presto/src/main/scala</sourceDirectory>
+                <sourceDirectory>${basedir}/integration/presto/src/main/java</sourceDirectory>
+              </sourceDirectories>
+            </configuration>
+          </plugin>
+        </plugins>
+      </build>
+    </profile>
+    <!--    prestosql-->
+    <profile>
+      <id>prestosql</id>
+      <activation>
+        <activeByDefault>true</activeByDefault>
+      </activation>
+      <properties>
+        <presto.version>316</presto.version>
+        <scala.version>2.11.8</scala.version>
+        <airlift.version>0.36</airlift.version>
+        <presto.groupid>io.prestosql</presto.groupid>
+        <presto.hadoop.groupid>io.prestosql.hadoop</presto.hadoop.groupid>
+        <presto.hadoop.artifactid>hadoop-apache</presto.hadoop.artifactid>
+        <presto.hadoop.version>3.2.0-2</presto.hadoop.version>
+        <presto.mvn.plugin.groupid>io.prestosql</presto.mvn.plugin.groupid>
+        <presto.mvn.plugin.version>6</presto.mvn.plugin.version>
+        <presto.depndency.scope>compile</presto.depndency.scope>
+      </properties>
+      <build>
+        <plugins>
+          <plugin>
+            <groupId>org.eluder.coveralls</groupId>
+            <artifactId>coveralls-maven-plugin</artifactId>
+            <version>4.3.0</version>
+            <configuration>
+              <repoToken>opPwqWW41vYppv6KISea3u1TJvE1ugJ5Y</repoToken>
+              <sourceEncoding>UTF-8</sourceEncoding>
+              <jacocoReports>
+                <jacocoReport>${basedir}/target/carbondata-coverage-report/carbondata-coverage-report.xml
+                </jacocoReport>
+              </jacocoReports>
+              <sourceDirectories>
+                <sourceDirectory>${basedir}/integration/presto/src/main/scala</sourceDirectory>
+                <sourceDirectory>${basedir}/integration/presto/src/main/java</sourceDirectory>
+              </sourceDirectories>
+            </configuration>
+          </plugin>
+        </plugins>
+      </build>
+    </profile>
   </profiles>
 
 </project>
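Note for downstream JDBC clients: the PrestoServer.scala and QueryTest.scala hunks above switch the driver class from com.facebook.presto.jdbc.PrestoDriver to io.prestosql.jdbc.PrestoDriver depending on the active profile. A minimal sketch of selecting the class name per flavor (PrestoDriverSelector and the "prestodb"/"prestosql" flavor strings are illustrative helpers, not part of this commit):

```java
// Hypothetical helper illustrating the JDBC driver class rename between the
// two presto flavors supported by the new prestodb/prestosql profiles.
public class PrestoDriverSelector {

    /** Returns the JDBC driver class name for the given integration flavor. */
    static String driverFor(String flavor) {
        switch (flavor) {
            case "prestodb":
                return "com.facebook.presto.jdbc.PrestoDriver";
            case "prestosql":
                return "io.prestosql.jdbc.PrestoDriver";
            default:
                throw new IllegalArgumentException("unknown flavor: " + flavor);
        }
    }

    public static void main(String[] args) {
        // prints io.prestosql.jdbc.PrestoDriver (the new default profile)
        System.out.println(driverFor("prestosql"));
    }
}
```

In both test utilities the class name is then passed to Class.forName(...) before opening the jdbc:presto:// connection, so only this one string differs between the two builds.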
