Try lower case webhdfs as in: 'https://localhost:8443/gateway/cluster1/webhdfs/v1/?op=GETHOMEDIRECTORY'
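
For example, once there is an authentication provider inside the <gateway> element of cluster1.xml (see larry's note below about the missing providers, and the sketch at the bottom of this mail), a request with the demo LDAP guest user would look roughly like this - guest/guest-password is just the demo LDAP default and an assumption about your setup, so substitute whatever your provider expects:

curl -ivku guest:guest-password 'https://localhost:8443/gateway/cluster1/webhdfs/v1/?op=GETHOMEDIRECTORY'
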
On Thu, Jul 2, 2015 at 2:59 PM, Aneela Saleem <[email protected]> wrote:

> Dear larry,
>
> The gateway.log file contains the following logs:
>
> 2015-07-02 23:43:16,391 INFO hadoop.gateway (GatewayServer.java:handleCreateDeployment(427)) - Deploying topology cluster1 to /home/hduser/knox-0.6.0/bin/../data/deployments/cluster1.war.14e501479a0
> 2015-07-02 23:43:16,392 INFO hadoop.gateway (DeploymentFactory.java:createDeployment(82)) - Configured services directory is /home/hduser/knox-0.6.0/bin/../data/services
> 2015-07-02 23:43:16,428 INFO hadoop.gateway (DefaultGatewayServices.java:initializeContribution(180)) - Credential store found for the cluster: cluster1 - no need to create one.
> 2015-07-02 23:44:40,453 WARN hadoop.gateway (GatewayFilter.java:doFilter(152)) - Failed to match path /WEBHDFS/v1?op=GETHOMEDIRECTORY
> 2015-07-02 23:44:51,722 WARN hadoop.gateway (GatewayFilter.java:doFilter(152)) - Failed to match path /WEBHDFS/v1?op=GETHOMEDIRECTORY
> 2015-07-02 23:45:30,783 WARN hadoop.gateway (GatewayFilter.java:doFilter(152)) - Failed to match path /WEBHDFS/v1?op=GETHOMEDIRECTORY
>
> I'm just a beginner. I'm trying to configure a Knox topology to connect to HDFS, using the default services provided, and I'm validating the connection to those services.
>
> On Thu, Jul 2, 2015 at 11:50 PM, larry mccay <[email protected]> wrote:
>
>> Please check the {GATEWAY_HOME}/logs/gateway.log file for errors during deployment.
>>
>> I notice that you have no providers described in cluster1.xml - this may be the root of your problem. I don't think that I have ever even tried that.
>>
>> What are you expecting the authentication behavior to be there?
>>
>> On Thu, Jul 2, 2015 at 2:27 PM, Aneela Saleem <[email protected]> wrote:
>>
>>> ${KNOX_HOME}/knox/conf/topologies has the following contents:
>>>
>>> admin.xml  cluster1.xml  README  sandbox.xml
>>>
>>> cluster1.xml is the topology descriptor file created by me. It has the following contents:
>>>
>>> <topology>
>>>     <gateway>
>>>     </gateway>
>>>
>>>     <service>
>>>         <role>NAMENODE</role>
>>>         <url>hdfs://namenode-host:8020</url>
>>>     </service>
>>>
>>>     <service>
>>>         <role>JOBTRACKER</role>
>>>         <url>rpc://jobtracker-host:8050</url>
>>>     </service>
>>>
>>>     <service>
>>>         <role>RESOURCEMANAGER</role>
>>>         <url>http://red3:8088/ws</url>
>>>     </service>
>>>
>>>     <service>
>>>         <role>WEBHDFS</role>
>>>         <url>http://localhost:50070/webhdfs</url>
>>>     </service>
>>>
>>>     <service>
>>>         <role>WEBHCAT</role>
>>>         <url>http://webcat-host:50111/templeton</url>
>>>     </service>
>>>
>>>     <service>
>>>         <role>OOZIE</role>
>>>         <url>http://oozie-host:11000/oozie</url>
>>>     </service>
>>>
>>>     <service>
>>>         <role>WEBHBASE</role>
>>>         <url>http://webhbase-host:60080</url>
>>>     </service>
>>> </topology>
>>>
>>> admin.xml has the default contents; I did not make any changes.
>>>
>>> On Thu, Jul 2, 2015 at 11:15 PM, Steve Howard <[email protected]> wrote:
>>>
>>>> To be clear, the error isn't hostname not found (that just means it wasn't in the DNS cache on your computer); it's that the URL can't be found by the Knox server.
>>>>
>>>> Can you post the contents of your ${KNOX_HOME}/knox/conf/topologies directory?
>>>>
>>>> On Thu, Jul 2, 2015 at 1:50 PM, Aneela Saleem <[email protected]> wrote:
>>>>
>>>>> Hi,
>>>>>
>>>>> I'm trying to connect to HDFS through the Knox gateway. I run the following command:
>>>>>
>>>>> curl -vk https://localhost:8443/gateway/cluster1/WEBHDFS/v1?op=GETHOMEDIRECTORY
>>>>>
>>>>> When I run this command I get the following error:
>>>>>
>>>>> * Hostname was NOT found in DNS cache
>>>>> *   Trying 127.0.0.1...
>>>>> * Connected to localhost (127.0.0.1) port 8443 (#0)
>>>>> * successfully set certificate verify locations:
>>>>> *   CAfile: none
>>>>>     CApath: /etc/ssl/certs
>>>>> * SSLv3, TLS handshake, Client hello (1):
>>>>> * SSLv3, TLS handshake, Server hello (2):
>>>>> * SSLv3, TLS handshake, CERT (11):
>>>>> * SSLv3, TLS handshake, Server key exchange (12):
>>>>> * SSLv3, TLS handshake, Server finished (14):
>>>>> * SSLv3, TLS handshake, Client key exchange (16):
>>>>> * SSLv3, TLS change cipher, Client hello (1):
>>>>> * SSLv3, TLS handshake, Finished (20):
>>>>> * SSLv3, TLS change cipher, Client hello (1):
>>>>> * SSLv3, TLS handshake, Finished (20):
>>>>> * SSL connection using ECDHE-RSA-DES-CBC3-SHA
>>>>> * Server certificate:
>>>>> *   subject: C=US; ST=Test; L=Test; O=Hadoop; OU=Test; CN=localhost
>>>>> *   start date: 2015-06-29 21:39:18 GMT
>>>>> *   expire date: 2016-06-28 21:39:18 GMT
>>>>> *   issuer: C=US; ST=Test; L=Test; O=Hadoop; OU=Test; CN=localhost
>>>>> *   SSL certificate verify result: self signed certificate (18), continuing anyway.
>>>>> > GET /gateway/cluster1/WEBHDFS/v1?op=GETHOMEDIRECTORY HTTP/1.1
>>>>> > User-Agent: curl/7.35.0
>>>>> > Host: localhost:8443
>>>>> > Accept: */*
>>>>> >
>>>>> < HTTP/1.1 404 Not Found
>>>>> < Content-Length: 0
>>>>> * Server Jetty(8.1.14.v20131031) is not blacklisted
>>>>> < Server: Jetty(8.1.14.v20131031)
>>>>> <
>>>>> * Connection #0 to host localhost left intact
>>>>>
>>>>> Can anyone please help me in tackling this issue?
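
For reference, below the quote: the kind of provider section larry is referring to goes inside the currently empty <gateway> element of cluster1.xml. This is only a sketch, modeled on the sandbox.xml topology that ships with Knox, and it assumes the bundled demo LDAP server on localhost:33389 with its default users - the realm class, user DN template, and LDAP URL all need to match your environment:

<gateway>
    <!-- Authentication via Apache Shiro against an LDAP server.
         Values below assume the Knox demo LDAP (started with bin/ldap.sh). -->
    <provider>
        <role>authentication</role>
        <name>ShiroProvider</name>
        <enabled>true</enabled>
        <param>
            <name>sessionTimeout</name>
            <value>30</value>
        </param>
        <param>
            <name>main.ldapRealm</name>
            <value>org.apache.hadoop.gateway.shirorealm.KnoxLdapRealm</value>
        </param>
        <param>
            <name>main.ldapRealm.userDnTemplate</name>
            <value>uid={0},ou=people,dc=hadoop,dc=apache,dc=org</value>
        </param>
        <param>
            <name>main.ldapRealm.contextFactory.url</name>
            <value>ldap://localhost:33389</value>
        </param>
        <param>
            <name>main.ldapRealm.contextFactory.authenticationMechanism</name>
            <value>simple</value>
        </param>
        <param>
            <name>urls./**</name>
            <value>authcBasic</value>
        </param>
    </provider>
    <!-- Pass the authenticated user's identity on to the backing Hadoop services. -->
    <provider>
        <role>identity-assertion</role>
        <name>Default</name>
        <enabled>true</enabled>
    </provider>
</gateway>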
