[jira] [Updated] (SQOOP-3374) Assigning HDFS path to --bindir is giving error "java.lang.reflect.InvocationTargetException"

2018-09-03 Amit Joshi (JIRA)


 [ 
https://issues.apache.org/jira/browse/SQOOP-3374?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Amit Joshi updated SQOOP-3374:
--
Priority: Blocker  (was: Minor)

> Assigning HDFS path to --bindir is giving error 
> "java.lang.reflect.InvocationTargetException"
> -
>
> Key: SQOOP-3374
> URL: https://issues.apache.org/jira/browse/SQOOP-3374
> Project: Sqoop
>  Issue Type: Wish
>  Components: sqoop2-api
>Reporter: Amit Joshi
>Priority: Blocker
>
> When I try to assign an HDFS directory path to --bindir in my Sqoop 
> command, it throws the error "java.lang.reflect.InvocationTargetException".
> My Sqoop command looks like this:
> sqoop import -connect connection_string --username username --password-file 
> file_path --query 'select * from EDW_PROD.RXCLM_LINE_FACT_DENIED 
> PARTITION(RXCLM_LINE_FACTP201808) where $CONDITIONS' --as-parquetfile 
> --compression-codec org.apache.hadoop.io.compress.SnappyCodec --append 
> --target-dir target_dir *-bindir hdfs://user/projects/* --split-by RX_ID 
> --null-string '/N' --null-non-string '/N' --fields-terminated-by ',' -m 10
>  
> It creates a folder named "hdfs:" in my home directory.
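
A hypothetical sketch of why a literal "hdfs:" folder appears: --bindir is the output directory for the generated code on the local filesystem, and if the hdfs:// URI is passed to java.io.File as an ordinary path (an assumption about the internals, though it matches the "hdfs:" folder reported above), the scheme prefix simply becomes the first directory component. The class name below is illustrative only:

{code}
import java.io.File;

public class BindirPathDemo {
  public static void main(String[] args) {
    // java.io.File does not interpret URI schemes, so "hdfs://user/projects/"
    // is treated as a relative local path whose first component is "hdfs:".
    File binDir = new File("hdfs://user/projects/");
    binDir.mkdirs();  // creates hdfs:/user/projects under the working directory
    System.out.println(binDir.getAbsolutePath());
  }
}
{code}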





[jira] [Updated] (SQOOP-3374) Assigning HDFS path to --bindir is giving error "java.lang.reflect.InvocationTargetException"

2018-08-29 Amit Joshi (JIRA)


 [ 
https://issues.apache.org/jira/browse/SQOOP-3374?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Amit Joshi updated SQOOP-3374:
--
Description: 
When I try to assign an HDFS directory path to --bindir in my Sqoop 
command, it throws the error "java.lang.reflect.InvocationTargetException".

My Sqoop command looks like this:

sqoop import -connect connection_string --username username --password-file 
file_path --query 'select * from EDW_PROD.RXCLM_LINE_FACT_DENIED 
PARTITION(RXCLM_LINE_FACTP201808) where $CONDITIONS' --as-parquetfile 
--compression-codec org.apache.hadoop.io.compress.SnappyCodec --append 
--target-dir target_dir *-bindir hdfs://user/projects/* --split-by RX_ID 
--null-string '/N' --null-non-string '/N' --fields-terminated-by ',' -m 10

 

It creates a folder named "hdfs:" in my home directory.

  was:
When I try to assign an HDFS directory path to --bindir in my Sqoop 
command, it throws the error "java.lang.reflect.InvocationTargetException".

My Sqoop command looks like this:

sqoop import -connect connection_string --username username --password-file 
file_path --query 'select * from EDW_PROD.RXCLM_LINE_FACT_DENIED 
PARTITION(RXCLM_LINE_FACTP201808) where $CONDITIONS' --as-parquetfile 
--compression-codec org.apache.hadoop.io.compress.SnappyCodec --append 
--target-dir target_dir *-bindir hdfs://user/projects/* --split-by RX_ID 
--null-string '/N' --null-non-string '/N' --fields-terminated-by ',' -m 10


> Assigning HDFS path to --bindir is giving error 
> "java.lang.reflect.InvocationTargetException"
> -
>
> Key: SQOOP-3374
> URL: https://issues.apache.org/jira/browse/SQOOP-3374
> Project: Sqoop
>  Issue Type: Wish
>  Components: sqoop2-api
>Reporter: Amit Joshi
>Priority: Minor
>
> When I try to assign an HDFS directory path to --bindir in my Sqoop 
> command, it throws the error "java.lang.reflect.InvocationTargetException".
> My Sqoop command looks like this:
> sqoop import -connect connection_string --username username --password-file 
> file_path --query 'select * from EDW_PROD.RXCLM_LINE_FACT_DENIED 
> PARTITION(RXCLM_LINE_FACTP201808) where $CONDITIONS' --as-parquetfile 
> --compression-codec org.apache.hadoop.io.compress.SnappyCodec --append 
> --target-dir target_dir *-bindir hdfs://user/projects/* --split-by RX_ID 
> --null-string '/N' --null-non-string '/N' --fields-terminated-by ',' -m 10
>  
> It creates a folder named "hdfs:" in my home directory.





[jira] [Updated] (SQOOP-3374) Assigning HDFS path to --bindir is giving error "java.lang.reflect.InvocationTargetException"

2018-08-29 Amit Joshi (JIRA)


 [ 
https://issues.apache.org/jira/browse/SQOOP-3374?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Amit Joshi updated SQOOP-3374:
--
Description: 
When I try to assign an HDFS directory path to --bindir in my Sqoop 
command, it throws the error "java.lang.reflect.InvocationTargetException".

My Sqoop command looks like this:

sqoop import -connect connection_string --username username --password-file 
file_path --query 'select * from EDW_PROD.RXCLM_LINE_FACT_DENIED 
PARTITION(RXCLM_LINE_FACTP201808) where $CONDITIONS' --as-parquetfile 
--compression-codec org.apache.hadoop.io.compress.SnappyCodec --append 
--target-dir target_dir *-bindir hdfs://user/projects/* --split-by RX_ID 
--null-string '/N' --null-non-string '/N' --fields-terminated-by ',' -m 10

  was:
When I try to assign an HDFS directory path to --bindir in my Sqoop 
command, it throws the error "java.lang.reflect.InvocationTargetException".

My Sqoop command looks like this:

sqoop import --connect connection_string --username username --password-file 
file_path --query 'select * from EDW_PROD.RXCLM_LINE_FACT_DENIED 
PARTITION(RXCLM_LINE_FACTP201808) where $CONDITIONS' --as-parquetfile 
--compression-codec org.apache.hadoop.io.compress.SnappyCodec --append 
--target-dir target_dir *--bindir hdfs://user/projects/* --split-by RX_ID 
--null-string '/N' --null-non-string '/N' --fields-terminated-by ',' -m 10


> Assigning HDFS path to --bindir is giving error 
> "java.lang.reflect.InvocationTargetException"
> -
>
> Key: SQOOP-3374
> URL: https://issues.apache.org/jira/browse/SQOOP-3374
> Project: Sqoop
>  Issue Type: Wish
>  Components: sqoop2-api
>Reporter: Amit Joshi
>Priority: Minor
>
> When I try to assign an HDFS directory path to --bindir in my Sqoop 
> command, it throws the error "java.lang.reflect.InvocationTargetException".
> My Sqoop command looks like this:
> sqoop import -connect connection_string --username username --password-file 
> file_path --query 'select * from EDW_PROD.RXCLM_LINE_FACT_DENIED 
> PARTITION(RXCLM_LINE_FACTP201808) where $CONDITIONS' --as-parquetfile 
> --compression-codec org.apache.hadoop.io.compress.SnappyCodec --append 
> --target-dir target_dir *-bindir hdfs://user/projects/* --split-by RX_ID 
> --null-string '/N' --null-non-string '/N' --fields-terminated-by ',' -m 10





[jira] [Updated] (SQOOP-3374) Assigning HDFS path to --bindir is giving error "java.lang.reflect.InvocationTargetException"

2018-08-29 Amit Joshi (JIRA)


 [ 
https://issues.apache.org/jira/browse/SQOOP-3374?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Amit Joshi updated SQOOP-3374:
--
Description: 
When I try to assign an HDFS directory path to --bindir in my Sqoop 
command, it throws the error "java.lang.reflect.InvocationTargetException".

My Sqoop command looks like this:

sqoop import --connect connection_string --username username --password-file 
file_path --query 'select * from EDW_PROD.RXCLM_LINE_FACT_DENIED 
PARTITION(RXCLM_LINE_FACTP201808) where $CONDITIONS' --as-parquetfile 
--compression-codec org.apache.hadoop.io.compress.SnappyCodec --append 
--target-dir target_dir *--bindir hdfs://user/projects/* --split-by RX_ID 
--null-string '/N' --null-non-string '/N' --fields-terminated-by ',' -m 10

  was:
When I try to assign an HDFS directory path to --bindir in my Sqoop 
command, it throws the error "java.lang.reflect.InvocationTargetException".

I am passing the following value:

--bindir hdfs://user/projects/


> Assigning HDFS path to --bindir is giving error 
> "java.lang.reflect.InvocationTargetException"
> -
>
> Key: SQOOP-3374
> URL: https://issues.apache.org/jira/browse/SQOOP-3374
> Project: Sqoop
>  Issue Type: Wish
>  Components: sqoop2-api
>Reporter: Amit Joshi
>Priority: Minor
>
> When I try to assign an HDFS directory path to --bindir in my Sqoop 
> command, it throws the error "java.lang.reflect.InvocationTargetException".
> My Sqoop command looks like this:
> sqoop import --connect connection_string --username username --password-file 
> file_path --query 'select * from EDW_PROD.RXCLM_LINE_FACT_DENIED 
> PARTITION(RXCLM_LINE_FACTP201808) where $CONDITIONS' --as-parquetfile 
> --compression-codec org.apache.hadoop.io.compress.SnappyCodec --append 
> --target-dir target_dir *--bindir hdfs://user/projects/* --split-by RX_ID 
> --null-string '/N' --null-non-string '/N' --fields-terminated-by ',' -m 10





[jira] [Commented] (SQOOP-3042) Sqoop does not clear compile directory under /tmp/sqoop-/compile automatically

2018-08-29 Amit Joshi (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3042?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16596043#comment-16596043
 ] 

Amit Joshi commented on SQOOP-3042:
---

[~dvoros] Sure. Created a new ticket 
[SQOOP-3374|https://issues.apache.org/jira/browse/SQOOP-3374]

> Sqoop does not clear compile directory under /tmp/sqoop-/compile 
> automatically
> 
>
> Key: SQOOP-3042
> URL: https://issues.apache.org/jira/browse/SQOOP-3042
> Project: Sqoop
>  Issue Type: Bug
>Affects Versions: 1.4.6
>Reporter: Eric Lin
>Assignee: Eric Lin
>Priority: Critical
>  Labels: patch
> Fix For: 3.0.0
>
> Attachments: SQOOP-3042.1.patch, SQOOP-3042.2.patch, 
> SQOOP-3042.4.patch, SQOOP-3042.5.patch, SQOOP-3042.6.patch, 
> SQOOP-3042.7.patch, SQOOP-3042.9.patch
>
>
> After running Sqoop, all the temp files generated by ClassWriter are left 
> behind on disk, so anyone can check those Java files to see the schema of 
> those tables that Sqoop has been interacting with. By default, the directory 
> is under /tmp/sqoop-/compile.
> In the class org.apache.sqoop.SqoopOptions, in the method getNonceJarDir(), I 
> can see that we did add a "deleteOnExit" call on the temp dir:
> {code}
> for (int attempts = 0; attempts < MAX_DIR_CREATE_ATTEMPTS; attempts++) {
>   hashDir = new File(baseDir, RandomHash.generateMD5String());
>   while (hashDir.exists()) {
> hashDir = new File(baseDir, RandomHash.generateMD5String());
>   }
>   if (hashDir.mkdirs()) {
> // We created the directory. Use it.
> // If this directory is not actually filled with files, delete it
> // when the JVM quits.
> hashDir.deleteOnExit();
> break;
>   }
> }
> {code}
> However, I believe the deletion fails because the directory is not empty.
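
Since File.delete() (and therefore deleteOnExit()) cannot remove a non-empty directory, one possible workaround is a JVM shutdown hook that deletes the compile directory recursively. This is only a sketch of that idea, not necessarily what the attached patches do; the class name and path are illustrative:

{code}
import java.io.File;

public class CompileDirCleanup {

  // deleteOnExit() cannot remove a non-empty directory, so delete the
  // children first and the directory itself last.
  static void deleteRecursively(File f) {
    File[] children = f.listFiles();
    if (children != null) {
      for (File child : children) {
        deleteRecursively(child);
      }
    }
    f.delete();
  }

  public static void main(String[] args) {
    File hashDir = new File("/tmp/sqoop-demo/compile");
    hashDir.mkdirs();
    // Register a shutdown hook so the whole tree is removed when the JVM
    // quits, even if generated .java/.class/.jar files are still inside it.
    Runtime.getRuntime().addShutdownHook(
        new Thread(() -> deleteRecursively(hashDir)));
  }
}
{code}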





[jira] [Created] (SQOOP-3374) Assigning HDFS path to --bindir is giving error "java.lang.reflect.InvocationTargetException"

2018-08-29 Amit Joshi (JIRA)
Amit Joshi created SQOOP-3374:
-

 Summary: Assigning HDFS path to --bindir is giving error 
"java.lang.reflect.InvocationTargetException"
 Key: SQOOP-3374
 URL: https://issues.apache.org/jira/browse/SQOOP-3374
 Project: Sqoop
  Issue Type: Wish
  Components: sqoop2-api
Reporter: Amit Joshi


When I try to assign an HDFS directory path to --bindir in my Sqoop 
command, it throws the error "java.lang.reflect.InvocationTargetException".

I am passing the following value:

--bindir hdfs://user/projects/





[jira] [Commented] (SQOOP-3042) Sqoop does not clear compile directory under /tmp/sqoop-/compile automatically

2018-08-28 Amit Joshi (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3042?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16594982#comment-16594982
 ] 

Amit Joshi commented on SQOOP-3042:
---

[~dvoros] Thanks, but I am getting an error while assigning an HDFS directory 
path to the --bindir option.

The error is java.lang.reflect.InvocationTargetException.

> Sqoop does not clear compile directory under /tmp/sqoop-/compile 
> automatically
> 
>
> Key: SQOOP-3042
> URL: https://issues.apache.org/jira/browse/SQOOP-3042
> Project: Sqoop
>  Issue Type: Bug
>Affects Versions: 1.4.6
>Reporter: Eric Lin
>Assignee: Eric Lin
>Priority: Critical
>  Labels: patch
> Fix For: 3.0.0
>
> Attachments: SQOOP-3042.1.patch, SQOOP-3042.2.patch, 
> SQOOP-3042.4.patch, SQOOP-3042.5.patch, SQOOP-3042.6.patch, 
> SQOOP-3042.7.patch, SQOOP-3042.9.patch
>
>
> After running Sqoop, all the temp files generated by ClassWriter are left 
> behind on disk, so anyone can check those Java files to see the schema of 
> those tables that Sqoop has been interacting with. By default, the directory 
> is under /tmp/sqoop-/compile.
> In the class org.apache.sqoop.SqoopOptions, in the method getNonceJarDir(), I 
> can see that we did add a "deleteOnExit" call on the temp dir:
> {code}
> for (int attempts = 0; attempts < MAX_DIR_CREATE_ATTEMPTS; attempts++) {
>   hashDir = new File(baseDir, RandomHash.generateMD5String());
>   while (hashDir.exists()) {
> hashDir = new File(baseDir, RandomHash.generateMD5String());
>   }
>   if (hashDir.mkdirs()) {
> // We created the directory. Use it.
> // If this directory is not actually filled with files, delete it
> // when the JVM quits.
> hashDir.deleteOnExit();
> break;
>   }
> }
> {code}
> However, I believe the deletion fails because the directory is not empty.





[jira] [Commented] (SQOOP-3042) Sqoop does not clear compile directory under /tmp/sqoop-/compile automatically

2018-08-28 Amit Joshi (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3042?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16594785#comment-16594785
 ] 

Amit Joshi commented on SQOOP-3042:
---

Is there a Sqoop property available to configure the compile directory?

> Sqoop does not clear compile directory under /tmp/sqoop-/compile 
> automatically
> 
>
> Key: SQOOP-3042
> URL: https://issues.apache.org/jira/browse/SQOOP-3042
> Project: Sqoop
>  Issue Type: Bug
>Affects Versions: 1.4.6
>Reporter: Eric Lin
>Assignee: Eric Lin
>Priority: Critical
>  Labels: patch
> Fix For: 3.0.0
>
> Attachments: SQOOP-3042.1.patch, SQOOP-3042.2.patch, 
> SQOOP-3042.4.patch, SQOOP-3042.5.patch, SQOOP-3042.6.patch, 
> SQOOP-3042.7.patch, SQOOP-3042.9.patch
>
>
> After running Sqoop, all the temp files generated by ClassWriter are left 
> behind on disk, so anyone can check those Java files to see the schema of 
> those tables that Sqoop has been interacting with. By default, the directory 
> is under /tmp/sqoop-/compile.
> In the class org.apache.sqoop.SqoopOptions, in the method getNonceJarDir(), I 
> can see that we did add a "deleteOnExit" call on the temp dir:
> {code}
> for (int attempts = 0; attempts < MAX_DIR_CREATE_ATTEMPTS; attempts++) {
>   hashDir = new File(baseDir, RandomHash.generateMD5String());
>   while (hashDir.exists()) {
> hashDir = new File(baseDir, RandomHash.generateMD5String());
>   }
>   if (hashDir.mkdirs()) {
> // We created the directory. Use it.
> // If this directory is not actually filled with files, delete it
> // when the JVM quits.
> hashDir.deleteOnExit();
> break;
>   }
> }
> {code}
> However, I believe the deletion fails because the directory is not empty.





[jira] [Commented] (SQOOP-3042) Sqoop does not clear compile directory under /tmp/sqoop-/compile automatically

2018-08-28 Amit Joshi (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3042?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16594781#comment-16594781
 ] 

Amit Joshi commented on SQOOP-3042:
---

[~dvoros] Thanks for this info.

> Sqoop does not clear compile directory under /tmp/sqoop-/compile 
> automatically
> 
>
> Key: SQOOP-3042
> URL: https://issues.apache.org/jira/browse/SQOOP-3042
> Project: Sqoop
>  Issue Type: Bug
>Affects Versions: 1.4.6
>Reporter: Eric Lin
>Assignee: Eric Lin
>Priority: Critical
>  Labels: patch
> Fix For: 3.0.0
>
> Attachments: SQOOP-3042.1.patch, SQOOP-3042.2.patch, 
> SQOOP-3042.4.patch, SQOOP-3042.5.patch, SQOOP-3042.6.patch, 
> SQOOP-3042.7.patch, SQOOP-3042.9.patch
>
>
> After running Sqoop, all the temp files generated by ClassWriter are left 
> behind on disk, so anyone can check those Java files to see the schema of 
> those tables that Sqoop has been interacting with. By default, the directory 
> is under /tmp/sqoop-/compile.
> In the class org.apache.sqoop.SqoopOptions, in the method getNonceJarDir(), I 
> can see that we did add a "deleteOnExit" call on the temp dir:
> {code}
> for (int attempts = 0; attempts < MAX_DIR_CREATE_ATTEMPTS; attempts++) {
>   hashDir = new File(baseDir, RandomHash.generateMD5String());
>   while (hashDir.exists()) {
> hashDir = new File(baseDir, RandomHash.generateMD5String());
>   }
>   if (hashDir.mkdirs()) {
> // We created the directory. Use it.
> // If this directory is not actually filled with files, delete it
> // when the JVM quits.
> hashDir.deleteOnExit();
> break;
>   }
> }
> {code}
> However, I believe the deletion fails because the directory is not empty.





[jira] [Commented] (SQOOP-3042) Sqoop does not clear compile directory under /tmp/sqoop-/compile automatically

2018-08-28 Amit Joshi (JIRA)


[ 
https://issues.apache.org/jira/browse/SQOOP-3042?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16594714#comment-16594714
 ] 

Amit Joshi commented on SQOOP-3042:
---

Hi guys, may I know which version of Sqoop this change has been incorporated into?

> Sqoop does not clear compile directory under /tmp/sqoop-/compile 
> automatically
> 
>
> Key: SQOOP-3042
> URL: https://issues.apache.org/jira/browse/SQOOP-3042
> Project: Sqoop
>  Issue Type: Bug
>Affects Versions: 1.4.6
>Reporter: Eric Lin
>Assignee: Eric Lin
>Priority: Critical
>  Labels: patch
> Fix For: 3.0.0
>
> Attachments: SQOOP-3042.1.patch, SQOOP-3042.2.patch, 
> SQOOP-3042.4.patch, SQOOP-3042.5.patch, SQOOP-3042.6.patch, 
> SQOOP-3042.7.patch, SQOOP-3042.9.patch
>
>
> After running sqoop, all the temp files generated by ClassWriter are left 
> behind on disk, so anyone can check those JAVA files to see the schema of 
> those tables that Sqoop has been interacting with. By default, the directory 
> is under /tmp/sqoop-/compile.
> In class org.apache.sqoop.SqoopOptions, function getNonceJarDir(), I can see 
> that we did add "deleteOnExit" on the temp dir:
> {code}
> for (int attempts = 0; attempts < MAX_DIR_CREATE_ATTEMPTS; attempts++) {
>   hashDir = new File(baseDir, RandomHash.generateMD5String());
>   while (hashDir.exists()) {
> hashDir = new File(baseDir, RandomHash.generateMD5String());
>   }
>   if (hashDir.mkdirs()) {
> // We created the directory. Use it.
> // If this directory is not actually filled with files, delete it
> // when the JVM quits.
> hashDir.deleteOnExit();
> break;
>   }
> }
> {code}
> However, I believe it failed to delete due to directory is not empty.


