+1, Please raise an Issue for improvement
On Thu, Dec 22, 2016 at 7:24 AM, Kumar Vishal
wrote:
> Hi Sujith,
> +1 I think this will be a good optimization for dictionary columns.
>
> -Regards
> Kumar Vishal
>
> On Mon, Dec 12, 2016 at 3:26 AM, sujith chacko <
>
Gin-zhj created CARBONDATA-554:
--
Summary: Maven build fails when running the command "mvn clean install
-DskipTests"
Key: CARBONDATA-554
URL: https://issues.apache.org/jira/browse/CARBONDATA-554
Project:
Please provide the executor-side log.
--
View this message in context:
http://apache-carbondata-mailing-list-archive.1130556.n5.nabble.com/same-query-and-I-change-the-value-than-throw-a-error-tp4811p4893.html
Sent from the Apache CarbonData Mailing List archive at Nabble.com.
Hi 251469031,
Thanks for showing interest in Carbon. For your question, please refer to the
explanation below.
scala> val dataFilePath = new File("hdfs://master:9000/carbondata/sample.csv").getCanonicalPath
dataFilePath: String = /home/hadoop/carbondata/hdfs:/master:9000/carbondata/sample.csv
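The mangling above can be reproduced without Spark or Carbon at all. This is a minimal sketch (plain JDK, not Carbon code): java.io.File knows nothing about URL schemes, so it treats "hdfs://..." as a relative filesystem path, and getCanonicalPath() collapses the double slash and resolves it against the current working directory.

```java
import java.io.File;
import java.io.IOException;

// Illustration only: shows why getCanonicalPath() destroys an HDFS URL.
public class CanonicalPathDemo {
    public static void main(String[] args) throws IOException {
        String canonical =
            new File("hdfs://master:9000/carbondata/sample.csv").getCanonicalPath();
        // "hdfs://..." is treated as a relative path: the "//" is collapsed
        // to "/" and the working directory is prepended.
        System.out.println(canonical);
        // The scheme's double slash is gone, so any "hdfs://" prefix check
        // performed later can no longer match.
        System.out.println(canonical.startsWith("hdfs://")); // false
    }
}
```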
If
Rahul Kumar created CARBONDATA-553:
--
Summary: Create integration test-case for dataframe API
Key: CARBONDATA-553
URL: https://issues.apache.org/jira/browse/CARBONDATA-553
Project: CarbonData
Please find the following item in the carbon.properties file and give it a
proper path (hdfs://master:9000/):
carbon.ddl.base.hdfs.url
During loading, Carbon will combine this URL with the data file path.
BTW, it would be better to provide the version number.
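How that combination behaves can be sketched as follows. This is a hypothetical illustration, not Carbon's actual implementation: `resolve` is an invented helper, and the joining rules (absolute hdfs:// paths pass through, relative paths are appended to the base URL) are assumptions about the described behavior.

```java
// Hypothetical sketch of how carbon.ddl.base.hdfs.url could be combined
// with the data file path given in a LOAD statement.
public class BaseHdfsUrlDemo {
    static String resolve(String baseHdfsUrl, String dataPath) {
        if (dataPath.startsWith("hdfs://")) {
            return dataPath; // already a full HDFS URL, use as-is
        }
        // join base and relative path, avoiding a doubled '/'
        String base = baseHdfsUrl.endsWith("/")
            ? baseHdfsUrl.substring(0, baseHdfsUrl.length() - 1)
            : baseHdfsUrl;
        String rel = dataPath.startsWith("/") ? dataPath : "/" + dataPath;
        return base + rel;
    }

    public static void main(String[] args) {
        System.out.println(resolve("hdfs://master:9000/", "carbondata/sample.csv"));
        // -> hdfs://master:9000/carbondata/sample.csv
    }
}
```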
I think the root cause is metadata lock type.
Please add the "carbon.lock.type" configuration to carbon.properties as
follows.
#Local mode
carbon.lock.type=LOCALLOCK
#Cluster mode
carbon.lock.type=HDFSLOCK
Hi team,
Looks like I've hit the same problem where the dictionary file is locked.
Could you share what changes you made to the configuration?
ERROR 23-12 09:55:26,222 - Executor task launch worker-0
java.lang.RuntimeException: Dictionary file vehsyspwrmod is locked for
updation. Please try after
Well, in the source code of CarbonData, the file type is determined as:
if (property.startsWith(CarbonUtil.HDFS_PREFIX)) {
  storeDefaultFileType = FileType.HDFS;
}
and CarbonUtil.HDFS_PREFIX = "hdfs://",
but when I run the following script, the dataFilePath is still local:
Hi,
This is because you are using cluster mode, but the input file is a local file.
1. If you use cluster mode, please load files from HDFS.
2. If you just want to load local files, please use local mode.
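The file-type check quoted earlier in this thread, together with the getCanonicalPath mangling, explains the fallback to local mode. Below is a simplified sketch (the `detect` helper and `FileType` enum are illustrative stand-ins, not Carbon's actual classes; only the prefix constant and the startsWith check come from the quoted source).

```java
// Simplified version of the prefix check quoted above: a path that no
// longer starts with "hdfs://" silently falls back to the local file type.
public class FileTypeDemo {
    static final String HDFS_PREFIX = "hdfs://";

    enum FileType { LOCAL, HDFS }

    static FileType detect(String path) {
        return path.startsWith(HDFS_PREFIX) ? FileType.HDFS : FileType.LOCAL;
    }

    public static void main(String[] args) throws java.io.IOException {
        // The raw URL matches the prefix ...
        System.out.println(detect("hdfs://master:9000/carbondata/sample.csv")); // HDFS
        // ... but the path mangled by getCanonicalPath does not.
        String mangled = new java.io.File(
            "hdfs://master:9000/carbondata/sample.csv").getCanonicalPath();
        System.out.println(detect(mangled)); // LOCAL
    }
}
```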
李寅威 wrote
> Hi,
>
> when i run the following script:
>
>
> scala>val dataFilePath = new
>
Hi,
When I run the following script:
scala>val dataFilePath = new File("/carbondata/pt/sample.csv").getCanonicalPath
scala>cc.sql(s"load data inpath '$dataFilePath' into table test_table")
it turns out:
org.apache.carbondata.processing.etl.DataLoadingException: The input file does
not