No problem, good to hear it works. For the future: this error means the bundle importing javax.servlet could not be resolved and was therefore not started. You can use the list command (or list -t 0, if you want to see the system bundles too) to see the state of each bundle. In this case the bundle was probably in the Installed state, not Resolved or Active. You can use this command next time to check the state of your bundles; it helps you identify the potential problem.
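On the console, that check looks roughly like this (a sketch only: command names vary slightly between Karaf versions, e.g. diag is bundle:diag on Karaf 3.x, and the bundle id is whatever list prints for your bundle):

```
karaf@root()> list -t 0          # list all bundles, including system bundles, with their states
karaf@root()> diag <bundle-id>   # explain why a bundle is stuck in Installed (unresolved requirements)
```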
Regards
Krzysztof

On 26.03.2015 13:05, developm...@mobigov.com wrote:
> I have found the issue. The bundle that was importing my hadoop bundle
> was using javax.servlet version 3.1 for websockets, but the jetty bundle
> required javax.servlet >= 2.5 and < 3.0, so that was being imported as
> well. That was causing a conflict that didn't get logged. I increased
> the range on the jetty bundle and things seem to be OK now. Thank you
> very much for taking your time to try and help me.
>
> David Daniel
>
> On 2015-03-26 10:45, developm...@mobigov.com wrote:
>> I believe so. Here is the applicable export for the hadoop-core bundle:
>>
>> org.apache.hadoop.security;uses:="org.apache.hadoop.fs.permission,org.apache.hadoop.http,org.apache.hadoop.conf,org.apache.hadoop.security.authentication.server,org.apache.hadoop.security.token,org.apache.commons.logging,org.apache.hadoop.fs,org.apache.hadoop.io,org.apache.hadoop.util,org.apache.hadoop.security.authentication.util,javax.servlet,javax.servlet.http,javax.security.auth.kerberos,org.mortbay.jetty,org.mortbay.io,org.mortbay.jetty.security,javax.net.ssl,org.apache.hadoop.ipc,javax.security.sasl,javax.security.auth.callback,org.apache.commons.codec.binary,sun.net.dns,sun.net.util,org.apache.hadoop.security.authentication.client,org.apache.hadoop.net,javax.security.auth,org.apache.hadoop.security.authorize,org.apache.hadoop.metrics2,org.apache.hadoop.metrics2.lib,javax.security.auth.login,com.sun.security.auth.module,javax.security.auth.spi,com.sun.security.auth";version="1.2.1"
>>
>> Here is the exports output.
>>
>> karaf@root()> exports | grep org.apache.hadoop.security
>> org.apache.hadoop.security.authentication.client | 1.2.1 | 1870 | org.apache.servicemix.bundles.hadoop-core
>> org.apache.hadoop.security.authentication.server | 1.2.1 | 1870 | org.apache.servicemix.bundles.hadoop-core
>> org.apache.hadoop.security.authentication.util | 1.2.1 | 1870 | org.apache.servicemix.bundles.hadoop-core
>> org.apache.hadoop.security.authorize | 1.2.1 | 1870 | org.apache.servicemix.bundles.hadoop-core
>> org.apache.hadoop.security.token.delegation | 1.2.1 | 1870 | org.apache.servicemix.bundles.hadoop-core
>> org.apache.hadoop.security.token | 1.2.1 | 1870 | org.apache.servicemix.bundles.hadoop-core
>> org.apache.hadoop.security | 1.2.1 | 1870 | org.apache.servicemix.bundles.hadoop-core
>> karaf@root()>
>>
>> Thank you for the information on the inner classes and the exports
>> command. I will make sure that everything in the uses part is exported
>> as well.
>>
>> Thanks for your help,
>>
>> David Daniel
>>
>> On 2015-03-26 08:47, Sobkowiak Krzysztof wrote:
>>> Hi
>>>
>>> I can see you import the concrete version (1.2.1), not a range. Could
>>> you check whether any bundle in your ServiceMix exports the package
>>> with this version?
>>>
>>>     exports | grep org.apache.hadoop.security
>>>
>>> There should be no problem with inner classes (as long as the inner
>>> classes are public).
>>>
>>> Regards
>>> Krzysztof
>>>
>>> On 25.03.2015 22:24, developm...@mobigov.com wrote:
>>> importing inner classes. So I think I am almost done, but when I try
>>> to connect, my bundle can't seem to find a class. I imported the
>>> package, but it is an inner class and I am not sure how to import that.
>>> unable to find LoginModule class:
>>> org.apache.hadoop.security.UserGroupInformation$HadoopLoginModule
>>> is the error I am getting. In my import statement, should I have a *
>>> or something to get all inner classes?
>>> org.apache.hadoop.security;version=1.2.1 is my import statement
>>> currently.
>>>
>>> On 2015-03-25 19:32, Krzysztof Sobkowiak wrote:
>>> There is also a recipe in the Karaf Cookbook:
>>> https://github.com/jgoodyear/ApacheKarafCookbook/tree/master/chapter9/chapter-9-recipe1
>>> It defines a feature for Hadoop too. It is for Karaf 3.0.x (also
>>> ServiceMix 6.0.x), but it can be a good start for you.
>>>
>>> Regards
>>> Krzysztof
>>
>> On 25.03.2015 20:26, Krzysztof Sobkowiak wrote:
>> Hi
>>
>> I think Camel has a ready-to-use feature containing the Hadoop client:
>>
>> karaf@root> features:info camel-hdfs2
>> Description of camel-hdfs2 2.14.1 feature
>> ----------------------------------------------------------------
>> The camel-hdfs2 feature can only run if you have libsnappyjava.dylib in java.library.path
>> ----------------------------------------------------------------
>> Feature has no configuration
>> Feature has no configuration files
>> Feature depends on:
>>   camel-core 2.14.1
>> Feature contains followed bundles:
>> mvn:commons-lang/commons-lang/2.6 start-level=50
>> mvn:com.google.guava/guava/17.0 start-level=50
>> mvn:com.google.protobuf/protobuf-java/2.5.0 start-level=50
>> mvn:org.apache.servicemix.bundles/org.apache.servicemix.bundles.guice/3.0_1 start-level=50
>> mvn:org.apache.servicemix.bundles/org.apache.servicemix.bundles.jsch/0.1.51_1 start-level=50
>> mvn:org.apache.servicemix.bundles/org.apache.servicemix.bundles.paranamer/2.4_1 start-level=50
>> mvn:org.apache.servicemix.bundles/org.apache.servicemix.bundles.avro/1.7.3_1 start-level=50
>> mvn:org.apache.commons/commons-compress/1.5 start-level=50
>> mvn:org.apache.commons/commons-math3/3.3 start-level=50
>> mvn:commons-cli/commons-cli/1.2 start-level=50
>> mvn:commons-configuration/commons-configuration/1.9 start-level=50
>> mvn:org.apache.servicemix.bundles/org.apache.servicemix.bundles.commons-httpclient/3.1_7 start-level=50
>> mvn:io.netty/netty/3.9.4.Final start-level=50
>> mvn:org.codehaus.jackson/jackson-core-asl/1.9.12 start-level=50
>> mvn:org.codehaus.jackson/jackson-mapper-asl/1.9.12 start-level=50
>> mvn:org.xerial.snappy/snappy-java/1.1.0.1 start-level=50
>> mvn:commons-codec/commons-codec/1.9 start-level=50
>> mvn:commons-collections/commons-collections/3.2.1 start-level=50
>> mvn:commons-io/commons-io/1.4 start-level=50
>> mvn:commons-net/commons-net/3.3 start-level=50
>> mvn:org.apache.zookeeper/zookeeper/3.4.6 start-level=50
>> mvn:org.apache.servicemix.bundles/org.apache.servicemix.bundles.xmlenc/0.52_1 start-level=50
>> mvn:org.apache.servicemix.bundles/org.apache.servicemix.bundles.xerces/2.11.0_1 start-level=50
>> mvn:org.apache.servicemix.bundles/org.apache.servicemix.bundles.xmlresolver/1.2_5 start-level=50
>> mvn:org.apache.camel/camel-hdfs2/2.14.1 start-level=50
>> mvn:org.apache.servicemix.bundles/org.apache.servicemix.bundles.hadoop-client/2.3.0_2 start-level=50
>>
>> You can find the definition here:
>> https://github.com/apache/camel/blob/master/platforms/karaf/features/src/main/resources/features.xml#L625-L654
>> It contains all necessary dependencies (and probably some more). It
>> contains a bit older version of the Hadoop client. If you are on
>> ServiceMix, you can simply install this feature:
>>
>>     karaf@root> features:install camel-hdfs2
>>
>> If you need the newer version, you must build your own feature. You
>> probably have to upgrade some other dependencies too.
>>
>> Jean-Baptiste has also written a blog post about Hadoop on Karaf and
>> implemented some features for this:
>> http://blog.nanthrax.net/2013/07/apache-hadoop-and-karaf-article-1-karaf-as-hdfs-client/
>> Please ping him to check how up to date this feature is.
>> Regards
>> Krzysztof
>>
>> On 25.03.2015 19:14, developm...@mobigov.com wrote:
>>> Hello,
>>>
>>> I have the hadoop-core and hadoop-client bundles loading into my
>>> project. In the console I can see that they are exporting the package
>>> that I need:
>>>
>>> Symbolic Name        org.apache.servicemix.bundles.hadoop-client
>>> Version              2.4.1.1
>>> Bundle Location      mvn:org.apache.servicemix.bundles/org.apache.servicemix.bundles.hadoop-client/2.4.1_1
>>> Last Modification    Wed Mar 25 14:03:16 EDT 2015
>>> Bundle Documentation http://www.apache.org/
>>> Vendor               The Apache Software Foundation
>>> Description          This OSGi bundle wraps 2.4.1 jar files.
>>> Start Level          80
>>> Exported Packages    org.apache.hadoop,version=2.4.1
>>>                      org.apache.hadoop.classification,version=2.4.1
>>>                      org.apache.hadoop.classification.tools,version=2.4.1
>>>                      org.apache.hadoop.conf,version=2.4.1
>>>                      org.apache.hadoop.filecache,version=2.4.1
>>>                      org.apache.hadoop.fs,version=2.4.1
>>>                      org.apache.hadoop.fs.ftp,version=2.4.1
>>>                      org.apache.hadoop.fs.local,version=2.4.1
>>>                      org.apache.hadoop.fs.permission,version=2.4.1
>>>                      org.apache.hadoop.fs.s3,version=2.4.1
>>>                      org.apache.hadoop.fs.s3native,version=2.4.1
>>>                      org.apache.hadoop.fs.shell,version=2.4.1
>>>                      org.apache.hadoop.fs.viewfs,version=2.4.1
>>>                      org.apache.hadoop.ha,version=2.4.1
>>>                      org.apache.hadoop.ha.proto,version=2.4.1
>>>                      org.apache.hadoop.ha.protocolPB,version=2.4.1
>>>                      org.apache.hadoop.hdfs,version=2.4.1
>>>
>>> If I extract the client jar I can see the class I am trying to create,
>>> but when I call code that uses the class I get this exception:
>>>
>>>     java.lang.ClassNotFoundException: Class org.apache.hadoop.hdfs.DistributedFileSystem not found
>>>
>>> Could this be because I am missing a dependency somewhere down the
>>> line that hadoop-client has marked as optional?
>>>
>>> Thanks for any help,
>>> David Daniel
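A side note on the inner-class question discussed in this thread: no wildcard is needed. A nested class is compiled into the same package as its outer class, and the '$' in org.apache.hadoop.security.UserGroupInformation$HadoopLoginModule is just the JVM binary-name separator, so importing the org.apache.hadoop.security package already covers it. A small self-contained sketch (the classes below are hypothetical stand-ins, not the Hadoop ones):

```java
// Demonstrates that a nested class is addressed by its binary name
// (Outer$Inner) and needs no separate package import: it lives in the
// same package as its outer class.
public class OuterDemo {

    // Stand-in for an inner LoginModule-style class.
    public static class InnerLoginModule {
        public String id() {
            return "inner";
        }
    }

    public static void main(String[] args) throws Exception {
        // The binary name uses '$' between outer and nested class names,
        // exactly like UserGroupInformation$HadoopLoginModule.
        String binaryName = "OuterDemo$InnerLoginModule";
        Class<?> clazz = Class.forName(binaryName);
        System.out.println(clazz.getName());

        InnerLoginModule m =
                (InnerLoginModule) clazz.getDeclaredConstructor().newInstance();
        System.out.println(m.id());
    }
}
```

Note that JAAS instantiates the LoginModule by name through a classloader, so in OSGi the package must also be resolvable from whichever bundle performs the lookup (which is why the thread's real problem turned out to be resolution, not import syntax).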
--
Krzysztof Sobkowiak

JEE & OSS Architect
Senior Solution Architect @ Capgemini SSC <http://www.pl.capgemini-sdm.com/en/>
Apache ServiceMix <http://servicemix.apache.org/> Committer & PMC
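As a postscript: the resolution described at the top of the thread (widening the import range so the consumer and Jetty can wire to the same javax.servlet package) is done in the bundle manifest. A hedged sketch; the ranges below are illustrative, not the exact values used:

```
Import-Package: javax.servlet;version="[2.5,4.0)",
 javax.servlet.http;version="[2.5,4.0)",
 org.apache.hadoop.security;version="[1.2,1.3)"
```

In OSGi semantics a bare version=1.2.1 on an import means "1.2.1 or higher", while a range like [2.5,3.0) bounds both ends; two bundles with non-overlapping ranges on the same package is exactly the kind of silent uses-constraint conflict hit here.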