Github user Hexiaoqiao commented on a diff in the pull request:
https://github.com/apache/incubator-carbondata/pull/611#discussion_r104309680
--- Diff: docs/installation-guide.md ---
@@ -40,42 +40,46 @@ followed by :
### Procedure
-* [Build the CarbonData](https://cwiki.apache.org/confluence/display/CARBONDATA/Building+CarbonData+And+IDE+Configuration) project and get the assembly jar from "./assembly/target/scala-2.10/carbondata_xxx.jar" and put in the ``"<SPARK_HOME>/carbonlib"`` folder.
+1. [Build the CarbonData](https://github.com/apache/incubator-carbondata/blob/master/build/README.md) project and get the assembly jar from `./assembly/target/scala-2.1x/carbondata_xxx.jar`.
- NOTE: Create the carbonlib folder if it does not exists inside ``"<SPARK_HOME>"`` path.
+2. Copy `./assembly/target/scala-2.1x/carbondata_xxx.jar` to `<SPARK_HOME>/carbonlib` folder.
-* Add the carbonlib folder path in the Spark classpath. (Edit ``"<SPARK_HOME>/conf/spark-env.sh"`` file and modify the value of SPARK_CLASSPATH by appending ``"<SPARK_HOME>/carbonlib/*"`` to the existing value)
+ **NOTE**: Create the carbonlib folder if it does not exist inside `<SPARK_HOME>` path.
-* Copy the carbon.properties.template to ``"<SPARK_HOME>/conf/carbon.properties"`` folder from "./conf/" of CarbonData repository.
+3. Add the carbonlib folder path in the Spark classpath. (Edit `<SPARK_HOME>/conf/spark-env.sh` file and modify the value of `SPARK_CLASSPATH` by appending `<SPARK_HOME>/carbonlib/*` to the existing value)
-* Copy the "carbonplugins" folder to ``"<SPARK_HOME>/carbonlib"`` folder from "./processing/" folder of CarbonData repository.
+4. Copy the `./conf/carbon.properties.template` file from CarbonData repository to `<SPARK_HOME>/conf/` folder and rename the file to `carbon.properties`.
- NOTE: carbonplugins will contain .kettle folder.
+5. Copy the `./processing/carbonplugins` folder from CarbonData repository to `<SPARK_HOME>/carbonlib/` folder.
+
+ **NOTE**: carbonplugins will contain .kettle folder.
+
+6. Repeat Step 2 to Step 5 in all the nodes of the cluster.
-* In Spark node, configure the properties mentioned in the following table in ``"<SPARK_HOME>/conf/spark-defaults.conf"`` file.
+7. In Spark node[master], configure the properties mentioned in the following table in `<SPARK_HOME>/conf/spark-defaults.conf` file.
-| Property | Value | Description |
-|---------------------------------|-----------------------------------------------------------------|---------------------------------------------------------------------------------------------------------------------------------------------------------|
-| carbon.kettle.home | $SPARK_HOME /carbonlib/carbonplugins | Path that will be used by CarbonData internally to create graph for loading the data |
-| spark.driver.extraJavaOptions | -Dcarbon.properties.filepath=$SPARK_HOME/conf/carbon.properties | A string of extra JVM options to pass to the driver. For instance, GC settings or other logging. |
-| spark.executor.extraJavaOptions | -Dcarbon.properties.filepath=$SPARK_HOME/conf/carbon.properties | A string of extra JVM options to pass to executors. For instance, GC settings or other logging. NOTE: You can enter multiple values separated by space. |
+ | Property | Value | Description |
+ |---------------------------------|-----------------------------------------------------------------|---------------------------------------------------------------------------------------------------------------------------------------------------------|
+ | carbon.kettle.home | "<SPARK_HOME>"/carbonlib/carbonplugins | Path that will be used by CarbonData internally to create graph for loading the data |
--- End diff --
@sraghunandan
Please use `$SPARK_HOME` rather than `"<SPARK_HOME>"`, because markdown treats `<SPARK_HOME>` as an HTML tag/annotation and will not show it in the rendered page.
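
For example, just a sketch of the suggested wording (only the placeholder is swapped, the description is unchanged), the row could read:

```markdown
 | carbon.kettle.home | $SPARK_HOME/carbonlib/carbonplugins | Path that will be used by CarbonData internally to create graph for loading the data |
```

This also matches the `spark.driver.extraJavaOptions` and `spark.executor.extraJavaOptions` rows, which already used `$SPARK_HOME` in the previous version of the table.

If it helps readers, the resulting entries in `<SPARK_HOME>/conf/spark-defaults.conf` would then look roughly like this (values taken from the table above; if `$SPARK_HOME` is not resolved in a given setup, substitute the actual Spark installation path):

```
carbon.kettle.home                $SPARK_HOME/carbonlib/carbonplugins
spark.driver.extraJavaOptions     -Dcarbon.properties.filepath=$SPARK_HOME/conf/carbon.properties
spark.executor.extraJavaOptions   -Dcarbon.properties.filepath=$SPARK_HOME/conf/carbon.properties
```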