In all probability Sqoop has lost track of the file's path on HDFS.

If the file is there, you can create a Hive external table over it and do
an INSERT ... SELECT from that table into the target Hive table.
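A minimal sketch of that approach (table names, column definitions, and the field delimiter are placeholders; note that the external table's LOCATION must be the directory containing the file, and the path here is taken from the error message below):

```sql
-- Point an external table at the directory holding the file on HDFS
CREATE EXTERNAL TABLE staging_currentlocation (
  id INT,
  location STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION 'hdfs://server_name.local:8020/user/root/';

-- Copy the rows into the managed target table
INSERT INTO TABLE target_currentlocation
SELECT id, location FROM staging_currentlocation;
```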

You can also bcp the data out of the MSSQL table, scp the flat file to an
edge node, put it into HDFS, and load it from there into the Hive table.
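That route could look roughly like this (hosts, credentials, table and file names are all placeholders, and the target Hive table is assumed to exist):

```shell
# 1. Export the SQL Server table to a comma-delimited flat file with bcp
bcp dbo.CurrentLocation out /tmp/currentlocation.csv \
    -S sqlserver_host -U sql_user -P sql_password -c -t','

# 2. Copy the file to a Hadoop edge node
scp /tmp/currentlocation.csv hadoop_edge_node:/tmp/

# 3. On the edge node, put the file into HDFS
hdfs dfs -mkdir -p /user/root/currentlocation
hdfs dfs -put /tmp/currentlocation.csv /user/root/currentlocation/

# 4. Load it into the Hive table
hive -e "LOAD DATA INPATH '/user/root/currentlocation/currentlocation.csv' \
         INTO TABLE target_currentlocation;"
```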

There are other ways of going through JDBC as well, for example with Spark.
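As a sketch of the Spark/JDBC route (connection URL, credentials, and table names are placeholders; the SQL Server JDBC driver jar is assumed to be on the classpath):

```python
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("mssql_to_hive")
         .enableHiveSupport()
         .getOrCreate())

# Read the SQL Server table over JDBC into a DataFrame
df = (spark.read.format("jdbc")
      .option("url", "jdbc:sqlserver://sqlserver_host:1433;databaseName=mydb")
      .option("dbtable", "dbo.CurrentLocation")
      .option("user", "sql_user")
      .option("password", "sql_password")
      .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
      .load())

# Write it out as a Hive table
df.write.mode("overwrite").saveAsTable("target_currentlocation")
```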

HTH


Dr Mich Talebzadeh



LinkedIn:
https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com


*Disclaimer:* Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.



On 14 September 2016 at 11:34, Priyanka Raghuvanshi <priyan...@winjit.com>
wrote:

>
> We are importing SQL Server data into Hive using Sqoop. Usually it works,
> but in one scenario it throws the following exception:
>
>
> FAILED: SemanticException Line 2:17 Invalid path
> ''hdfs://server_name.local:8020/user/root/_STC_CurrentLocation'': No
> files matching path hdfs://server_name.local:8020/user/root/_STC_CurrentLocation
>
> We can access that file at "server_ip:50070/user/root/_STC_CurrentLocation"
> through the HDFS web browser.
>
>
>
>
