Re: [PROPOSAL] Karaf Decanter monitoring
Hey, The collection definitely sounds like a perfect idea for a Karaf sub-project to me. Besides the great potential of the components, I especially like the fitting name. +1 Kind regards, Andreas On Oct 14, 2014 5:13 PM, Jean-Baptiste Onofré j...@nanthrax.net wrote: Hi all, First of all, sorry for this long e-mail ;) Some weeks ago, I blogged about the usage of ELK (Logstash/Elasticsearch/Kibana) with Karaf, Camel, ActiveMQ, etc. to provide a monitoring dashboard (to know what's happening in Karaf and be able to store it for a long period): http://blog.nanthrax.net/2014/03/apache-karaf-cellar-camel-activemq-monitoring-with-elk-elasticsearch-logstash-and-kibana/ While this solution works fine, it has some drawbacks: - it requires additional middleware on the machines: in addition to Karaf itself, we have to install Logstash, Elasticsearch nodes, and the Kibana console - it's not usable out of the box: you need at least to configure Logstash (with the different input/output plugins) and Kibana (to create the dashboards that you need) - it doesn't cover all the monitoring needs, especially in terms of SLA: we want to be able to raise alerts depending on events (for instance, when a regex matches a log message, when a feature is uninstalled, when a JMX metric is greater than a given value, etc.) Actually, Karaf (and related projects) already provides most (all) of the data required for monitoring. However, it would be very helpful to have a glue layer, ready to use and more user friendly, including storage of the metrics/monitoring data. So I started a prototype of a monitoring solution for Karaf and the applications running in it. The purpose is to be very extensible, flexible, and easy to install and use. In terms of architecture, we have the following components: 1/ Collectors & SLA Policies The collectors are services responsible for harvesting monitoring data. We have two kinds of collectors: - the polling collectors are invoked periodically by a scheduler. 
- the event-driven collectors react to events. Two collectors are already available: - the JMX collector is a polling collector which harvests all MBean attributes - the Log collector is an event-driven collector, implementing a PaxAppender, which reacts when a log message occurs We plan the following additional collectors: - a Camel Tracer collector would be an event-driven collector acting as a Camel interceptor, allowing us to trace any Exchange in Camel. The mechanism is very dynamic (thanks to OSGi services), so it's possible to add a new custom collector (user/custom implementation). The collectors are also responsible for checking the SLA. As the SLA policies are tied to the collected data, it makes sense that the collector validates the SLA and calls/delegates the alert to the SLA services. 2/ Scheduler The scheduler service is responsible for calling the polling collectors, gathering the harvested data, and delegating it to the dispatcher. We already have a simple scheduler (just a thread), but we plan a Quartz scheduler (for advanced cron/trigger configuration) and another one leveraging the Karaf scheduler. 3/ Dispatcher The dispatcher is called by the scheduler or the event-driven collectors to dispatch the collected data to the appenders. 4/ Appenders The appender services are responsible for sending/storing the collected data to target systems. For now, we have two appenders: - a log appender which just logs the collected data - an elasticsearch appender which sends the collected data to an Elasticsearch instance. For now, it uses an external Elasticsearch, but I'm working on an elasticsearch feature allowing us to embed Elasticsearch in Karaf (it's mostly done). 
We plan the following other appenders: - redis to send the collected data to Redis - jdbc to store the collected data in a database - jms to send the collected data to a JMS broker (like ActiveMQ) - camel to send the collected data to a Camel direct-vm/vm endpoint of a route (it would create an internal route) 5/ Console/Kibana The console is composed of two parts: - an AngularJS/Bootstrap layer allowing you to configure the SLA and global settings - an embedded Kibana instance with pre-configured dashboards (when the elasticsearch appender is used). We will have a set of ready-made Lucene queries and a kind of Karaf/Camel/ActiveMQ/CXF dashboard template. The Kibana instance will be embedded in Karaf (not external). Of course, we have ready-to-use features, allowing you to very easily install the modules that you want. I named the prototype Karaf Decanter. I don't have a preference about the name, or the location of the code (it could be a Karaf subproject like Cellar or Cave, or directly in the Karaf codebase). Thoughts ? Regards JB -- Jean-Baptiste Onofré jbono...@apache.org http://blog.nanthrax.net Talend - http://www.talend.com
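[Editor's note] The collector/dispatcher/appender pipeline described in the proposal can be sketched in a few lines of plain Java. All interface and class names below are hypothetical illustrations of the described architecture, not the actual Decanter API:

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Minimal sketch of the proposed architecture; every name here is
// hypothetical, not the real Decanter API.
interface PollingCollector {
    Map<String, Object> collect();          // harvests monitoring data
}

interface Appender {
    void append(Map<String, Object> data);  // sends/stores collected data
}

// The dispatcher fans collected data out to every registered appender.
class Dispatcher {
    private final List<Appender> appenders;
    Dispatcher(List<Appender> appenders) { this.appenders = appenders; }
    void dispatch(Map<String, Object> data) {
        appenders.forEach(a -> a.append(data));
    }
}

public class DecanterSketch {
    public static void main(String[] args) {
        // stand-in for the JMX polling collector
        PollingCollector jmxLike = () -> {
            Map<String, Object> m = new LinkedHashMap<>();
            m.put("heapUsed", 1234L);       // pretend MBean attribute
            return m;
        };
        // stand-in for the log appender
        Appender logAppender = data -> System.out.println("collected: " + data);
        Dispatcher dispatcher = new Dispatcher(List.of(logAppender));
        // the scheduler would invoke this periodically
        dispatcher.dispatch(jmxLike.collect());
    }
}
```

In the real system the collectors and appenders would be OSGi services, which is what makes the set of both extensible at runtime.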
Re: Finding classes in a Bundle
Hey Philipp, It kind of depends on your use case. One option is e.g. to use a bundle listener to get each bundle. On the bundles themselves you have options to load resources and classes [1]. This might be an option. But as said, it depends quite heavily on your use case. Kind regards, Andreas [1] http://www.osgi.org/javadoc/r4v43/core/org/osgi/framework/Bundle.html On Tue, Nov 19, 2013 at 6:20 PM, Philipp Hoenisch philipp.hoeni...@gmail.com wrote: Hey guys, I'm not sure if this belongs here or on the Felix mailing list; anyway, I'll give it a try. Versions I use: karaf: 2.3.2, felix: 4.0.3. I'm trying to find a class which is located in one of the packages without knowing its name; however, I do know that it is annotated with a specific annotation. This class is auto-generated by a maven plugin. What I try to do is find this class (located in the same bundle) and create an instance at runtime, i.e. right after the BundleActivator was started. As the package name is known I tried a simple approach: Thread.currentThread().getContextClassLoader().getResources(PACKAGENAME); and then iterated over the files. The problem is that the ClassLoaders work a bit differently in OSGi, right? Can anyone point me in the right direction how I can find a particular class? thank you in advance, best regards, Philipp -- View this message in context: http://karaf.922171.n3.nabble.com/Finding-classes-in-a-Bundle-tp4030368.html Sent from the Karaf - User mailing list archive at Nabble.com.
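[Editor's note] The Bundle API referenced in [1] offers `findEntries` and `loadClass` for exactly this. A sketch: the OSGi part needs a running framework, so it is shown in comments; the entry-path-to-class-name conversion is plain Java and runnable. Package and annotation names are hypothetical:

```java
// Sketch of scanning a bundle for a class carrying a given annotation.
// Inside the BundleActivator's start(BundleContext ctx) you could do:
//
//   Bundle bundle = ctx.getBundle();
//   Enumeration<URL> entries = bundle.findEntries("/com/example/gen", "*.class", true);
//   while (entries != null && entries.hasMoreElements()) {
//       String name = toClassName(entries.nextElement().getPath());
//       Class<?> c = bundle.loadClass(name);  // uses the bundle's own class loader
//       if (c.isAnnotationPresent(MyAnnotation.class)) { /* instantiate it */ }
//   }
public class BundleClassScanner {

    // "/com/example/gen/MyGenerated.class" -> "com.example.gen.MyGenerated"
    static String toClassName(String entryPath) {
        String p = entryPath.startsWith("/") ? entryPath.substring(1) : entryPath;
        return p.substring(0, p.length() - ".class".length()).replace('/', '.');
    }

    public static void main(String[] args) {
        System.out.println(toClassName("/com/example/gen/MyGenerated.class"));
    }
}
```

Using `bundle.loadClass` rather than the thread context class loader avoids the OSGi class-loading pitfall Philipp hit: each bundle has its own class loader, and only the owning bundle is guaranteed to see its internal packages.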
Re: Maven error with Karaf Pax-exam
OK, I've just rechecked the code. The default host is: m_host = InetAddress.getLocalHost().getHostName(); The default port is the first free port between 21413 and 21511. You can set a custom port or host by adding a systemProperty to the options array during configuration, setting org.ops4j.pax.exam.rbc.Constants.RMI_PORT_PROPERTY, org.ops4j.pax.exam.rbc.Constants.RMI_HOST_PROPERTY or org.ops4j.pax.exam.rbc.Constants.RMI_NAME_PROPERTY, or simply setting a real system property for them. I hope this helps. Kind regards, Andreas On Mon, Jun 10, 2013 at 10:06 PM, Charles Moulliard ch0...@gmail.com wrote: Code using it -- http://grepcode.com/file/repo1.maven.org/maven2/org.apache.karaf.tooling.exam/org.apache.karaf.tooling.exam.container/2.3.0/org/apache/karaf/tooling/exam/container/internal/KarafTestContainer.java On Mon, Jun 10, 2013 at 9:52 PM, Andreas Pieber anpie...@gmail.com wrote: I don't know by heart but iirc this var is directly set in the pax exam code. I've never thought that someone would like to overwrite it :-) you can check the paxexam-karaf code directly or hope that I (or someone else, jb?) find the time within the week to check. On Jun 10, 2013 9:48 PM, Charles Moulliard ch0...@gmail.com wrote: Found the issue. The bundle was not deployed ;-) Is it possible to find which port number or host should be used by pax exam rbc? I tried that without success -- systemProperty("org.ops4j.pax.exam.rbc.rmi.host").value("127.0.0.1"), On Mon, Jun 10, 2013 at 9:43 PM, Andreas Pieber anpie...@gmail.com wrote: Based on what does spring define its search path? I don't know what I could possibly have done wrong while writing the framework which could invoke such errors. Basically pax exam karaf does almost the same as you do when you start it via the shell. 
Kind regards, Andreas On Jun 10, 2013 7:04 PM, Charles Moulliard ch0...@gmail.com wrote: How can we fix this issue where the schema is searched at the root of where pax exam has unpacked the project and not inside the org.kie/kie-spring bundle ? java.io.FileNotFoundException: /Users/chmoulli/JBoss/Code/droolsjbpm-oss/droolsjbpm-integration/drools-osgi/drools-karaf-itest/target/exam/unpack/69bbc140-6a26-4c31-a694-6e9c0987556f/org/kie/spring/kie-spring-2.0.0.xsd (No such file or directory) at com.sun.org.apache.xerces.internal.util.ErrorHandlerWrapper.createSAXParseException(ErrorHandlerWrapper.java:198)[:1.7.0_04] at com.sun.org.apache.xerces.internal.util.ErrorHandlerWrapper.warning(ErrorHandlerWrapper.java:99)[:1.7.0_04] at com.sun.org.apache.xerces.internal.impl.XMLErrorReporter.reportError(XMLErrorReporter.java:433)[:1.7.0_04]

<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xmlns:kie="http://drools.org/schema/kie-spring"
       xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.0.xsd
                           http://drools.org/schema/kie-spring org/kie/spring/kie-spring-2.0.0.xsd">

On Mon, Jun 10, 2013 at 6:54 PM, Charles Moulliard ch0...@gmail.com wrote: Thx for the trick JB. Works now. 
Here is what I did:

@RunWith(JUnit4TestRunner.class)
public class KieSpringOnKarafTest {

    protected static final transient Logger LOG = LoggerFactory.getLogger(KieSpringOnKarafTest.class);
    protected static final String DroolsVersion = "6.0.0-SNAPSHOT";
    protected OsgiBundleXmlApplicationContext applicationContext;

    @Inject
    protected BundleContext bc;

    @Before
    public void init() {
        applicationContext = createApplicationContext();
        assertNotNull("Should have created a valid spring context", applicationContext);
    }

    protected void refresh() {
        applicationContext.setBundleContext(bc);
        applicationContext.refresh();
    }

    protected OsgiBundleXmlApplicationContext createApplicationContext() {
        return new OsgiBundleXmlApplicationContext(new String[]{"org/kie/spring/kie-beans.xml"});
    }

    @Test
    public void testKContainer() throws Exception {
        refresh();
        KieContainer kieContainer = (KieContainer) applicationContext.getBean("defaultContainer");
        assertNotNull(kieContainer);
        System.out.println("kieContainer.getReleaseId() == " + kieContainer.getReleaseId());
    }
}

On Mon, Jun 10, 2013 at 6:39 PM, Jean-Baptiste Onofré j...@nanthrax.net wrote: Try to put @Before on init and call the method before the test. Regards JB On 06/10/2013 06:30 PM, Charles Moulliard wrote: Been able to figure out the issue. Thx for your help. I now get another exception with this code. The BundleContext object is null when I try to add it to the applicationContext. According to Pax Exam, it should be injected automatically. Is it processed after calling the constructor of the class ? I use pax exam 2.6.0 @RunWith(JUnit4TestRunner
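[Editor's note] The "first free port between 21413 and 21511" selection described above can be sketched in plain Java. This is an illustrative re-implementation, not the actual pax-exam code:

```java
import java.io.IOException;
import java.net.ServerSocket;

public class FreePortFinder {
    // Sketch of how a remote bundle context could pick its RMI port:
    // try each port in the range and take the first one that can be bound.
    static int firstFreePort(int from, int to) {
        for (int port = from; port <= to; port++) {
            try (ServerSocket s = new ServerSocket(port)) {
                return port;            // bound successfully: port is free
            } catch (IOException busy) {
                // port in use, try the next one
            }
        }
        throw new IllegalStateException("no free port in range");
    }

    public static void main(String[] args) {
        int port = firstFreePort(21413, 21511);
        System.out.println(port >= 21413 && port <= 21511);
    }
}
```

This also explains why two test runs on the same machine normally don't collide: each picks the next port that is still free.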
Re: Error during karaf startup
Is your karaf.data directory writable by the user starting Karaf? Kind regards, Andreas On Wed, Jun 19, 2013 at 2:09 PM, Srikanth srikanth.hu...@gmail.com wrote: Hello, I am getting the below error during karaf startup. Karaf can't startup, make sure the log file can be accessed and written by the user starting Karaf : /tmp/srikanth/cumulocity-cep-karaf-1.0.0-SNAPSHOT/system/org/apache/felix/org.apache.felix.framework/3.0.9/org.apache.felix.framework-3.0.9.jar Please let me know what could be the problem. I tried giving 755 permission to the log, but still the same problem. Thanks. Srikanth -- View this message in context: http://karaf.922171.n3.nabble.com/Error-during-karaf-startup-tp4029069.html Sent from the Karaf - User mailing list archive at Nabble.com.
Re: Error during karaf startup
I would need to check the code for this. But one question up front: have you changed anything, or are you simply downloading Karaf? Do you start Karaf using ./bin/karaf, or by cd bin/ and then executing karaf? Is it possible that another karaf instance is on your path? I assume that your user has permission to write to ~ too? Kind regards, Andreas On Wed, Jun 19, 2013 at 2:23 PM, Srikanth srikanth.hu...@gmail.com wrote: Thanks for the quick reply. Yes, the user starting karaf can write in the karaf/data directory. Any idea why the message shows org/apache/felix/org.apache.felix.framework/3.0.9/org.apache.felix.framework-3.0.9.jar, since in my karaf the artifact is org/apache/felix/org.apache.felix.framework/4.0.3/org.apache.felix.framework-4.0.3.jar and I am using the 2.3.1 version of karaf? Srikanth -- View this message in context: http://karaf.922171.n3.nabble.com/Error-during-karaf-startup-tp4029069p4029071.html Sent from the Karaf - User mailing list archive at Nabble.com.
Re: Error during karaf startup
oh... OK, the error message is simply shown on every IO error during the main.launch method. Please check your etc/config.properties and see if it points to the correct Felix version. Kind regards, Andreas On Wed, Jun 19, 2013 at 2:40 PM, Srikanth srikanth.hu...@gmail.com wrote: I tried both ./bin/karaf and cd bin/; same problem in both cases. I am not downloading Karaf, I am building it using maven plugins and features. We have only one karaf instance, and it's not running :-( Yes, the user has permission to write to the directory. -- View this message in context: http://karaf.922171.n3.nabble.com/Error-during-karaf-startup-tp4029069p4029073.html Sent from the Karaf - User mailing list archive at Nabble.com.
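[Editor's note] A sketch of the etc/config.properties entries to check, based on the Karaf 2.x layout; the exact property keys and the framework version shown are assumptions and may differ per Karaf version:

```properties
# etc/config.properties (Karaf 2.x; keys and version are illustrative)
karaf.framework=felix
# must point at a felix framework jar that actually exists under system/
karaf.framework.felix=mvn:org.apache.felix/org.apache.felix.framework/4.0.3
```

If this entry names a version (e.g. 3.0.9) that is not present under system/, the launcher fails with exactly the kind of "make sure the log file can be accessed" error quoted above.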
Re: Log4J v2
Hey, The question would rather be: does pax-logging support Log4j v2? And AFAIK the answer is no. The next question would then be: how difficult can it be to add Log4j v2 support to pax-logging :-) Kind regards, Andreas On Thu, Jun 13, 2013 at 3:50 PM, CLEMENT Jean-Philippe jean-philippe.clem...@fr.thalesgroup.com wrote: Dear Karaf Team, I would like to know if Karaf (v3) supports Log4J v2? Cheers, JP [@@ OPEN @@]
Re: integration test, startup time for karaf 2.3.1
Do you reference the apache karaf artifact in your pom files? This is the only problem I can imagine making plain Karaf tests so slow for you. If you don't declare the Karaf distribution as a dependency in your pom file, the test framework doesn't copy it into your .m2 directory and therefore freshly downloads it for each test. Kind regards, Andreas On Thu, Jun 13, 2013 at 10:36 PM, Marcos Mendez mar...@jitisoft.com wrote: Hi, I'm trying to improve the startup time for our integration tests. I'm using the vanilla maven distribution zip, the exam tooling, and cut down the featuresBoot to config. Is there anything else I can do to cut down the startup time? It's taking around 20-30s to start with no additional configuration. Any ideas? Regards, Marcos
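[Editor's note] A sketch of what declaring the distribution as a dependency could look like; the version is an assumption (the thread concerns Karaf 2.3.1):

```xml
<!-- declare the Karaf distribution so it lands in ~/.m2 instead of
     being re-downloaded for every test run -->
<dependency>
  <groupId>org.apache.karaf</groupId>
  <artifactId>apache-karaf</artifactId>
  <version>2.3.1</version>
  <type>zip</type>
  <scope>test</scope>
</dependency>
```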
Re: Blueprint scope and reference-list
I'm not sure if I understand your question. Yes it is:

<!-- Simply call the factory manually on each invocation -->
<bean id="bindHandler" class="TestClass">
    <argument ref="myFactory"/>
</bean>
<reference-list interface="SomeInterface">
    <reference-listener ref="bindHandler" bind-method="bind"/>
</reference-list>

I hope this helps! Kind regards, Andreas On Mon, Jun 3, 2013 at 11:54 AM, CLEMENT Jean-Philippe jean-philippe.clem...@fr.thalesgroup.com wrote: Hi Andreas, Is there a way to inject the factory (prototype bean) itself? Cheers, JP [@@ OPEN @@] From: Andreas Pieber [mailto:anpie...@gmail.com] Sent: Saturday, June 1, 2013 10:57 To: Apache Karaf Subject: Re: Blueprint scope and reference-list Hey, I'm sorry, but there isn't any blueprint-only method available for this. You'll need to create your object manually during the bind call if you want such behavior. Kind regards, Andreas On Fri, May 31, 2013 at 11:11 AM, CLEMENT Jean-Philippe jean-philippe.clem...@fr.thalesgroup.com wrote: Dear Karaf Team, I would like to build a bean each time a reference is matched. I tried this:

<bean id="KindOfFactory" class="TestClass" scope="prototype"/>
<reference-list interface="SomeInterface">
    <reference-listener ref="KindOfFactory" bind-method="bind"/>
</reference-list>

But prototype instantiation seems to be linked to the reference-list itself rather than the listener call. Anyway, is there a way to instantiate a bean each time a kind of service is mapped in Karaf using only blueprint? Kind regards, Jean-Philippe [@@ OPEN @@]
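[Editor's note] The pattern suggested in the thread (inject the factory into the listener and call it manually on every bind) can be sketched in plain Java. All names are illustrative, not a real blueprint API:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Supplier;

// Hypothetical sketch: the listener holds the injected factory and calls
// it manually for each matched service, instead of relying on blueprint's
// prototype scope (which only applies when blueprint itself injects).
public class BindHandler {
    private final Supplier<Object> factory;   // stands in for "myFactory"
    private final List<Object> created = new ArrayList<>();

    public BindHandler(Supplier<Object> factory) { this.factory = factory; }

    // blueprint would call this for each matched service reference
    public void bind(Object serviceRef) {
        created.add(factory.get());           // one fresh instance per service
    }

    public static void main(String[] args) {
        BindHandler handler = new BindHandler(Object::new);
        handler.bind("serviceA");
        handler.bind("serviceB");
        System.out.println(handler.created.size());
    }
}
```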
Re: Maven error with Karaf Pax-exam
Based on what does spring defines its search path? I don't know what I could possibly done wrong while writing the framework which could invoke such errors. Basically pax exam karaf does almost the same as u do when you start it via the shell. Kind regards, Andreas On Jun 10, 2013 7:04 PM, Charles Moulliard ch0...@gmail.com wrote: How can we this issue where the schema is searched at the root of where pax exam has unpack the project and not inside org.kie/kie-spring bundle ? java.io.FileNotFoundException: /Users/chmoulli/JBoss/Code/droolsjbpm-oss/droolsjbpm-integration/drools-osgi/drools-karaf-itest/target/exam/unpack/69bbc140-6a26-4c31-a694-6e9c0987556f/org/kie/spring/kie-spring-2.0.0.xsd (No such file or directory) at com.sun.org.apache.xerces.internal.util.ErrorHandlerWrapper.createSAXParseException(ErrorHandlerWrapper.java:198)[:1.7.0_04] at com.sun.org.apache.xerces.internal.util.ErrorHandlerWrapper.warning(ErrorHandlerWrapper.java:99)[:1.7.0_04] at com.sun.org.apache.xerces.internal.impl.XMLErrorReporter.reportError(XMLErrorReporter.java:433)[:1.7.0_04] ?xml version=1.0 encoding=UTF-8? beans xmlns=http://www.springframework.org/schema/beans; xmlns:xsi=http://www.w3.org/2001/XMLSchema-instance; xmlns:kie=http://drools.org/schema/kie-spring; xsi:schemaLocation= http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.0.xsd http://drools.org/schema/kie-spring org/kie/spring/kie-spring-2.0.0.xsd On Mon, Jun 10, 2013 at 6:54 PM, Charles Moulliard ch0...@gmail.comwrote: Thx for the trick JB. Work now. 
Here is what I did @RunWith(JUnit4TestRunner.class) public class KieSpringOnKarafTest { protected static final transient Logger LOG = LoggerFactory.getLogger(KieSpringOnKarafTest.class); protected static final String DroolsVersion = 6.0.0-SNAPSHOT; protected OsgiBundleXmlApplicationContext applicationContext; @Inject protected BundleContext bc; @Before public void init() { applicationContext = createApplicationContext(); assertNotNull(Should have created a valid spring context, applicationContext); } protected void refresh() { applicationContext.setBundleContext(bc); applicationContext.refresh(); } protected OsgiBundleXmlApplicationContext createApplicationContext() { return new OsgiBundleXmlApplicationContext(new String[]{org/kie/spring/kie-beans.xml}); } @Test public void testKContainer() throws Exception { refresh(); KieContainer kieContainer = (KieContainer) applicationContext.getBean(defaultContainer); assertNotNull(kieContainer); System.out.println(kieContainer.getReleaseId() == +kieContainer.getReleaseId()); } On Mon, Jun 10, 2013 at 6:39 PM, Jean-Baptiste Onofré j...@nanthrax.netwrote: Try to put @Before to init and call the method before the test. Regards JB On 06/10/2013 06:30 PM, Charles Moulliard wrote: Been able to figure out the issue. Thx for your help. I get now another exception with this code. The BundleContext object is null when I tried to add it to the applicationContext. According to Pax Exam, it should be injected automatically. Is it process after calling the constructor of the class ? 
I use pax exam 2.6.0

@RunWith(JUnit4TestRunner.class)
public class KieSpringOnKarafTest extends KieSpringIntegrationTestSupport {

    protected static final transient Logger LOG = LoggerFactory.getLogger(KieSpringOnKarafTest.class);
    protected OsgiBundleXmlApplicationContext applicationContext;

    @Inject
    protected BundleContext bc;

    public KieSpringOnKarafTest() {
        applicationContext = createApplicationContext();
        assertNotNull("Should have created a valid spring context", applicationContext);
        // applicationContext.setBundleContext(bc); // BundleContext is NULL
        applicationContext.refresh();
    }

    protected OsgiBundleXmlApplicationContext createApplicationContext() {
        return new OsgiBundleXmlApplicationContext(new String[]{"org/kie/spring/kie-beans.xml"});
    }
}

On Sun, Jun 9, 2013 at 9:20 PM, Andreas Pieber anpie...@gmail.com wrote: to use versionAsInProject you need to add

<build>
    <plugins>
        <plugin>
            <groupId>org.apache.servicemix.tooling</groupId>
            <artifactId>depends-maven-plugin</artifactId>
            <version>1.2</version>
            <executions>
                <execution>
                    <id>generate-depends-file</id>
                    <goals>
                        <goal>generate-depends-file</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>

On Sun, Jun 9, 2013 at 3:56 PM, Charles Moulliard ch0...@gmail.com wrote
Re: Maven error with Karaf Pax-exam
Have you extracted the versions using the servicemix plugin as described in the documentation? Is the versions properties file generated correctly? Kind regards, Andreas On 9 Jun 2013 12:46, Charles Moulliard ch0...@gmail.com wrote: Hi, I get this pax exam (maven) error in a unit test which was working previously. Here is also the pom file. What should I do in the pom file definition to avoid this error (https://gist.github.com/cmoulliard/49e4ef4c871d48bba550)? java.lang.RuntimeException: Could not resolve version for groupId:org.apache.karaf artifactId:apache-karaf by reading the dependency information generated by maven. at org.ops4j.pax.exam.MavenUtils.getArtifactVersion(MavenUtils.java:78) at org.ops4j.pax.exam.MavenUtils$1.getVersion(MavenUtils.java:100) at org.ops4j.pax.exam.options.MavenArtifactUrlReference.version(MavenArtifactUrlReference.java:110) at org.ops4j.pax.exam.options.MavenArtifactUrlReference.versionAsInProject(MavenArtifactUrlReference.java:118) Class:

import static org.apache.karaf.tooling.exam.options.KarafDistributionOption.*;
...
@Configuration
public static Option[] configure() {
    return new Option[]{
        karafDistributionConfiguration().frameworkUrl(
            maven().groupId("org.apache.karaf").artifactId("apache-karaf").type("tar.gz").versionAsInProject())
            .karafVersion(MavenUtils.getArtifactVersion("org.apache.karaf", "apache-karaf")).name("Apache Karaf")
            .unpackDirectory(new File("target/exam/unpack/")),
        keepRuntimeFolder(),
        ...

Pom file Regards, -- Charles Moulliard Apache Committer / Architect (RedHat) Twitter : @cmoulliard | Blog : http://cmoulliard.blogspot.com
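[Editor's note] The "servicemix plugin" referred to here is the depends-maven-plugin, whose configuration appears elsewhere in the archive; a cleaned-up sketch (coordinates and version taken from that thread, so treat them as given there, not verified):

```xml
<!-- generates the dependency-version metadata that versionAsInProject() reads -->
<plugin>
  <groupId>org.apache.servicemix.tooling</groupId>
  <artifactId>depends-maven-plugin</artifactId>
  <version>1.2</version>
  <executions>
    <execution>
      <id>generate-depends-file</id>
      <goals>
        <goal>generate-depends-file</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```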
Re: Blueprint scope and reference-list
Hey, I'm sorry, but there isn't any blueprint-only method available for this. You'll need to create your object manually during the bind call if you want such behavior. Kind regards, Andreas On Fri, May 31, 2013 at 11:11 AM, CLEMENT Jean-Philippe jean-philippe.clem...@fr.thalesgroup.com wrote: Dear Karaf Team, I would like to build a bean each time a reference is matched. I tried this:

<bean id="KindOfFactory" class="TestClass" scope="prototype"/>
<reference-list interface="SomeInterface">
    <reference-listener ref="KindOfFactory" bind-method="bind"/>
</reference-list>

But prototype instantiation seems to be linked to the reference-list itself rather than the listener call. Anyway, is there a way to instantiate a bean each time a kind of service is mapped in Karaf using only blueprint? Kind regards, Jean-Philippe [@@ OPEN @@]
Re: EIK not working
+1 from me too. I don't think that tons of features will be added in the near future, so for at least another year people could stay with the old version of EIK, 0.9. Sounds like a plan to me. Kind regards, Andreas On Mon, May 27, 2013 at 3:58 PM, j...@nanthrax.net j...@nanthrax.net wrote: It sounds good to me. Regards JB -- Jean-Baptiste Onofré jbono...@apache.org http://blog.nanthrax.net Talend - http://www.talend.com - Reply message - From: Timo Naroska tnaro...@mac.com To: user@karaf.apache.org Subject: EIK not working Date: Mon, May 27, 2013 3:53 pm The issues for supporting Juno are KARAF-2103 and KARAF-2216. Some formerly internal PDE APIs that EIK is using became public in 3.8. This is a breaking change which makes it hard to build against both pre-3.8 (3.6/3.7) and 3.8/later. Juno was released nearly a year ago and Kepler is scheduled for the end of next month. It would be nice to have EIK support the latest builds of Eclipse rather than two-year-old versions. Why not change the requirements for EIK 0.10.0 to a minimum of 3.8? Regards Timo On 22.05.2013, at 08:48, Jean-Baptiste Onofré j...@nanthrax.net wrote: Short answer: no. EIK is built on Helios, and tested on Helios and Europa. A lot of changes have been introduced in Juno. AFAIR, we have a Jira to upgrade and support Juno, but I didn't have time to work on it for now. Regards JB On 05/22/2013 08:43 AM, CLEMENT Jean-Philippe wrote: Well, is it possible to use EIK on Eclipse Juno? Regards, JP [@@ THALES GROUP INTERNAL @@] - Original message - From: CLEMENT Jean-Philippe [mailto:jean-philippe.clem...@fr.thalesgroup.com] Sent: Monday, May 20, 2013 13:30 To: user@karaf.apache.org Subject: RE: EIK not working Hi Jean-Baptiste, I use Juno Service Release 2 / build id 20130225-0426. 
Regards, JP [@@ OPEN @@] - Original message - From: Jean-Baptiste Onofré [mailto:j...@nanthrax.net] Sent: Monday, May 20, 2013 12:58 To: user@karaf.apache.org Subject: Re: EIK not working Hi Jean-Philippe, What's the Eclipse version that you use ? Regards JB On 05/20/2013 12:03 PM, CLEMENT Jean-Philippe wrote: I'm back on the EIK subject. So I installed a clean Juno Eclipse version on a RHEL 5, downloaded EIK 0.9.0 and uncompressed it into a new directory. Went to Help > Install New Software... > Add... > Local... as expected. Then I selected Eclipse Integration for Apache Karaf > Next and obtained the following error: Cannot complete the install because one or more required items could not be found. Software being installed: Eclipse JMX Integration Feature for Apache Felix Karaf 0.9.0 (org.apache.karaf.eik.jmx.feature.feature.group 0.9.0) Missing requirement: Apache Karaf :: EIK :: Plugins :: UI for Eclipse Integration 0.9.0 (org.apache.karaf.eik.ui 0.9.0) requires 'bundle org.eclipse.pde.ui [3.6.0,4.0.0)' but it could not be found Cannot satisfy dependency: From: Eclipse JMX Integration Feature for Apache Felix Karaf 0.9.0 (org.apache.karaf.eik.jmx.feature.feature.group 0.9.0) To: org.apache.karaf.eik.ui 0.9.0 How can I fix this? Best regards, Jean-Philippe Clément [@@ OPEN @@] -- Jean-Baptiste Onofré jbono...@apache.org http://blog.nanthrax.net Talend - http://www.talend.com
Re: ConfigAdmin services
Karaf uses the Felix configuration admin service :-) On 25 May 2013 23:00, Ryan Moquin fragility...@gmail.com wrote: I noticed Karaf has a config admin service, but so does Felix. What's the difference? Is there a reason to use one over the other? Might be a dumb question, but it seems a little confusing. :)
Re: Feature dependencies across kar files and custom repositories
The kar files are extracted into the repositories? If yes, I don't see any reason why it shouldn't work. Do you encounter any problems, or was this just a rhetorical question? Kind regards, Andreas On May 23, 2013 8:33 PM, Brian_E brianemon...@gmail.com wrote: Let's say I have a proper kar file with a set of bundles and dependencies defined in it, and it deploys properly on its own. For example's sake, let's say it provides Feature A. Next, I create a custom repository to be distributed with karaf (and configured to be read from), and within this repository it defines Feature B that depends on Feature A within the kar file. Should this work? Is this supported? Is there a defined startup order within karaf such that kars get deployed before repositories are read from? Any input is appreciated. Thanks -- View this message in context: http://karaf.922171.n3.nabble.com/Feature-dependencies-across-kar-files-and-custom-repositories-tp4028813.html Sent from the Karaf - User mailing list archive at Nabble.com.
Re: log4j:WARN No appenders could be found for logger
I haven't encountered that problem so far. It could be a problem in your exports/imports or some embedded libraries. Your best chance might be to check the content of your bundle's manifest and compare it to your other bundles. Kind regards, Andreas On Fri, May 10, 2013 at 9:02 PM, Clement Jebakumar jeba.r...@gmail.com wrote: I have a Karaf custom build which has some 12 bundles, and I have configured log4j in some of the bundles. I can see one of the bundles issuing the warning 'log4j:WARN No appenders could be found for logger'. The other bundles work fine and their logs go to the karaf.log file. I'm facing the issue with only one bundle. How do I fix it? As far as I know, I didn't do any special configuration. *Clement Jebakumar,* 111/27 Keelamutharamman Kovil Street, Tenkasi, 627 811 http://www.declum.com/clement.html
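As background, log4j 1.x prints that warning when the logger hierarchy it actually loaded has no appender configured. Under Karaf the common cause is the bundle embedding or wiring to its own copy of log4j instead of resolving org.apache.log4j from pax-logging, which is why checking the manifest is the right first step. For comparison, in a plain (non-OSGi) setup a minimal log4j.properties on the classpath silences the warning; the pattern below is illustrative only:

```properties
# Minimal log4j 1.x configuration; silences "No appenders could be found"
# when present on the classpath as log4j.properties.
log4j.rootLogger=INFO, stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{ISO8601} | %-5p | %c | %m%n
```

If the affected bundle carries such a file plus its own log4j jar, it will log to its private configuration (or warn, if none is found) rather than to karaf.log like the other bundles.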
Re: 3.0.0RC1 - Karaf WebConsole in Equinox Issue
Hey Aritra, I assume you want https://issues.apache.org/jira/browse/KARAF-2240. Kind regards, Andreas On Mon, Apr 1, 2013 at 12:02 PM, Aritra Chatterjee yours.ari...@gmail.comwrote: Thanks! Is there a tracking bug logged for this, so that we get to know when it's fixed? -Aritra On Mon, Apr 1, 2013 at 2:42 PM, Jean-Baptiste Onofré j...@nanthrax.net wrote: Hi Aritra, yes it's a known issue, I'm working on it. It should be fixed in the following days (on 3.0.0-SNAPSHOT). Regards JB On 04/01/2013 10:20 AM, Aritra Chatterjee wrote: Hi, I am trying to run 3.0.0 RC1 karaf with Equinox container (Changed felix to equinox in config.properties). The launch does not cause any issues, however, on issuing the command feature:install webconsole, I get the following stack trace in the log file: 2013-03-28 20:18:22,676 | INFO | Local user karaf | FeaturesServiceImpl | 42 - org.apache.karaf.features.core - 3.0.0.RC1 | Installing feature webconsole 3.0.0.RC1 2013-03-28 20:18:22,676 | INFO | Local user karaf | FeaturesServiceImpl | 42 - org.apache.karaf.features.core - 3.0.0.RC1 | Installing feature http 3.0.0.RC1 2013-03-28 20:18:22,677 | INFO | Local user karaf | FeaturesServiceImpl | 42 - org.apache.karaf.features.core - 3.0.0.RC1 | Installing feature pax-http 3.0.0.M3 2013-03-28 20:18:22,677 | INFO | Local user karaf | FeaturesServiceImpl | 42 - org.apache.karaf.features.core - 3.0.0.RC1 | Installing feature pax-jetty 8.1.9.v20130131 2013-03-28 20:18:22,789 | INFO | Local user karaf | FeaturesServiceImpl | 42 - org.apache.karaf.features.core - 3.0.0.RC1 | Installing feature standard-condition-webconsole_0_0_0 3.0.0.RC1 2013-03-28 20:18:22,824 | INFO | Local user karaf | ShellUtil | 48 - org.apache.karaf.shell.console - 3.0.0.RC1 | Exception caught while executing command java.lang.NoSuchMethodError: org.osgi.framework.Version.toString0()Ljava/lang/String; at org.osgi.framework.VersionRange.toString(VersionRange.java:393)[osgi-3.8.0.v20120529-1548.jar:] at 
java.lang.String.valueOf(String.java:2838)[:1.6.0_27] at java.lang.StringBuffer.append(StringBuffer.java:236)[:1.6.0_27] at org.eclipse.osgi.internal.resolver.ImportPackageSpecificationImpl.toString(ImportPackageSpecificationImpl.java:212) at org.eclipse.osgi.internal.module.ResolverBundle.constraintsConflict(ResolverBundle.java:453) at org.eclipse.osgi.internal.module.ResolverImpl.checkFragmentConstraints(ResolverImpl.java:1419) at org.eclipse.osgi.internal.module.ResolverImpl.resolveBundle(ResolverImpl.java:1389) at org.eclipse.osgi.internal.module.ResolverImpl.resolveBundles0(ResolverImpl.java:783) at org.eclipse.osgi.internal.module.ResolverImpl.resolveBundles(ResolverImpl.java:653) at org.eclipse.osgi.internal.module.ResolverImpl.resolve(ResolverImpl.java:487) at org.eclipse.osgi.internal.resolver.StateImpl.resolve(StateImpl.java:481) at org.eclipse.osgi.internal.resolver.StateImpl.resolve(StateImpl.java:557) at org.eclipse.osgi.framework.internal.core.PackageAdminImpl.doResolveBundles(PackageAdminImpl.java:249) at org.eclipse.osgi.framework.internal.core.PackageAdminImpl.resolveBundles(PackageAdminImpl.java:192) at org.eclipse.osgi.framework.internal.core.BundleHost.startWorker(BundleHost.java:322) at org.eclipse.osgi.framework.internal.core.AbstractBundle.start(AbstractBundle.java:300) at org.eclipse.osgi.framework.internal.core.AbstractBundle.start(AbstractBundle.java:292) at org.apache.karaf.features.internal.FeaturesServiceImpl.startBundle(FeaturesServiceImpl.java:465) at org.apache.karaf.features.internal.FeaturesServiceImpl.installFeatures(FeaturesServiceImpl.java:424) at org.apache.karaf.features.internal.FeaturesServiceImpl.installFeature(FeaturesServiceImpl.java:360) at org.apache.karaf.features.internal.FeaturesServiceImpl.installFeature(FeaturesServiceImpl.java:349) at Proxya9a053f2_4c25_46d2_b119_7f6c6cac394f.installFeature(Unknown Source) at org.apache.karaf.features.command.InstallFeatureCommand.doExecute(InstallFeatureCommand.java:62) at 
org.apache.karaf.features.command.FeaturesCommandSupport.doExecute(FeaturesCommandSupport.java:38) at org.apache.karaf.shell.console.AbstractAction.execute(AbstractAction.java:33) at org.apache.karaf.shell.console.OsgiCommandSupport.execute(OsgiCommandSupport.java:39) at org.apache.karaf.shell.commands.basic.AbstractCommand.execute(AbstractCommand.java:33) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)[:1.6.0_27] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)[:1.6.0_27] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)[:1.6.0_27] at java.lang.reflect.Method.invoke(Method.java:616)[:1.6.0_27] at
Re: Pax Wicket project bundle starts/stops twice if dev:watch set
Hey, On Thu, Mar 14, 2013 at 12:31 AM, ralfKaraf ralf.but...@web.de wrote: Aehm ... yes, it does. That is obviously not Pax Wicket related!! Did not expect this. Sorry for blaming Pax Wicket!! It isn't that bad that you automatically put all blame on it, is it? :-) NP, yes please create an issue. This behavior somehow feels wrong to me! Kind regards, Andreas Now, is that normal behaviour of Karaf? Shouldn't be I guess. Should I file an issue? -- View this message in context: http://karaf.922171.n3.nabble.com/Pax-Wicket-project-bundle-starts-stops-twice-if-dev-watch-set-tp4027937p4028182.html Sent from the Karaf - User mailing list archive at Nabble.com.
Re: Pax Wicket returns application/octet-stream for static web page
Hey Ralf, I've answered directly at the PaxWicket issue. Kind regards, Andreas On Thu, Mar 14, 2013 at 6:47 AM, ralfKaraf ralf.but...@web.de wrote: Would anyone be able to help me out on this. How do you guys refer to static html pages within your bundle (without mounting it)? I filed an issue http://team.ops4j.org/browse/PAXWICKET-427 for this but I don't know if it is an error or rather an enhancement. -- View this message in context: http://karaf.922171.n3.nabble.com/Pax-Wicket-returns-application-octet-stream-for-static-web-page-tp4028044p4028184.html Sent from the Karaf - User mailing list archive at Nabble.com.
Re: Pax Wicket project bundle starts/stops twice if dev:watch set
which version of karaf do you use? Does the same problem (double start for a short period) also happen to other bundles? Kind regards, Andreas On Thu, Feb 28, 2013 at 7:10 AM, ralfKaraf ralf.but...@web.de wrote: Hi there, not entirely sure if this is the right forum to ask the question. I was deploying the plain/simple sample of pax wicket in version 2.1.0 into my Karaf OSGi container. Everything starts smoothly. I can stop or uninstall the bundle and start or install it again. The behaviour is as expected. But when I set dev:watch BUNDLEID to allow the project to be redeployed after an mvn install, the sample is started/stopped twice: karaf@root install mvn:org.ops4j.pax.wicket.samples.plain/org.ops4j.pax.wicket.samples.plain.simple/2.1.0 Bundle ID: 105 *** WARNING: Wicket is running in DEVELOPMENT mode. *** *** ^^^*** *** Do NOT deploy to your live server(s) without changing this. *** *** See Application#getConfigurationType() for more information. *** karaf@root dev:watch 105 Watched URLs/IDs: 105 [A mvn install is conducted on the plain.simple project ...] karaf@root [Watch] Updating watched bundle: org.ops4j.pax.wicket.samples.plain.simple (2.1.0) *** WARNING: Wicket is running in DEVELOPMENT mode. *** *** ^^^*** *** Do NOT deploy to your live server(s) without changing this. *** *** See Application#getConfigurationType() for more information. *** *** WARNING: Wicket is running in DEVELOPMENT mode. *** *** ^^^*** *** Do NOT deploy to your live server(s) without changing this. *** *** See Application#getConfigurationType() for more information. *** Even though it looks like that the WicketApplication is created twice only, it actually is the Activator start and stop methods that get invoked twice? How can I prevent it from doing so? Thanks, Ralf -- View this message in context: http://karaf.922171.n3.nabble.com/Pax-Wicket-project-bundle-starts-stops-twice-if-dev-watch-set-tp4027937.html Sent from the Karaf - User mailing list archive at Nabble.com.
Re: Pax Wicket project bundle starts/stops twice if dev:watch set
My gut says: either we do something wrong in the way bundles are refreshed, or there's some other kind of bug... What do you mean by "it happens for the blueprint as well"? Either way, this definitely sounds like something we do wrong, and it could likely affect more parts than Pax Wicket. Can you please create an issue for Karaf so that we don't lose track of this? Kind regards, Andreas On Fri, Mar 1, 2013 at 3:27 AM, ralfKaraf ralf.but...@web.de wrote: Thanks Andreas, I use Karaf 2.3 and it happens for the blueprint as well. -- View this message in context: http://karaf.922171.n3.nabble.com/Pax-Wicket-project-bundle-starts-stops-twice-if-dev-watch-set-tp4027937p4027972.html Sent from the Karaf - User mailing list archive at Nabble.com.
Re: New Karaf features on github
On Tue, Feb 26, 2013 at 12:55 PM, Tcharl cmorda...@gmail.com wrote: Hi, where to push, may I clone Karaf? Exactly. There's a github mirror at https://github.com/apache/karaf. You can clone this one and provide pull requests as you're used to. We'll get notified about them on the dev list. Some additional comments: a) Why do you have some MANIFEST.MF files in the resource folders of your features? b) Some parts of the features might be interesting to include in the Karaf features (e.g. the Spring extensions). In general I still get a stale feeling in my stomach seeing all those enterprise features. Keeping them directly at Karaf sounds like a bad idea. We already have problems with the other enterprise features and their versions (e.g. Spring). Maybe it's slowly getting time to decide how we want to manage such external feature bundles? I'm adding the dev list in CC to get this question explicitly to the dev community. Kind regards, Andreas I also made a little helloworld (not sure it is working but the spirit is here): https://github.com/Tcharl/Karaf-and-Virgo-osgiliath-reference-helloworld/tree/master/helloworld See you soon! -- View this message in context: http://karaf.922171.n3.nabble.com/New-Karaf-features-on-github-tp4027864p4027884.html Sent from the Karaf - User mailing list archive at Nabble.com.
Re: Missing features in 3.0.0?
try: feature:install eventadmin first Kind regards, Andreas On Mon, Feb 18, 2013 at 9:04 PM, sd sdoyl...@yahoo.com wrote: Does this look familiar? Running feature:install webconsole after features:install sling, I get this: Caused by: java.lang.ClassNotFoundException: org.osgi.service.event.EventAdmin not found by org.ops4j.pax.web.pax-web-runtime [75] at org.apache.felix.framework.BundleWiringImpl.findClassOrResourceByDelegation(BundleWiringImpl.java:1460) at org.apache.felix.framework.BundleWiringImpl.access$400(BundleWiringImpl.java:72) at org.apache.felix.framework.BundleWiringImpl$BundleClassLoader.loadClass(BundleWiringImpl.java:1843) at java.lang.ClassLoader.loadClass(ClassLoader.java:247) -- View this message in context: http://karaf.922171.n3.nabble.com/Missing-features-in-3-0-0-tp4027790p4027797.html Sent from the Karaf - User mailing list archive at Nabble.com.
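To avoid having to install eventadmin by hand first, a feature can declare it as a dependency so the features service pulls it in automatically before the feature's own bundles start. A sketch with made-up feature and bundle names:

```xml
<features name="my-repo" xmlns="http://karaf.apache.org/xmlns/features/v1.0.0">
  <feature name="my-web-app" version="1.0.0">
    <!-- dependent features are installed before the bundles below -->
    <feature>eventadmin</feature>
    <feature>webconsole</feature>
    <bundle>mvn:com.example/my-web-bundle/1.0.0</bundle>
  </feature>
</features>
```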
Re: features missing from features
From my point of view I see no problem with those three points. Solutions for them could definitely help people with writing their features. So, solving those three would definitely have my support. Kind regards, Andreas On Sun, Feb 10, 2013 at 2:34 AM, Andrei Pozolotin andrei.pozolo...@gmail.com wrote: *Hello there.* I miss the following features in features - see link and below https://gist.github.com/carrot-garden/4747691 ### 1. build feature for a bundle ### 2. depend on a feature.xml ### 3. depend on a feature with repository I am curious: is it just me missing this, or do more people need this? :-) Thank you, Andrei ## features missing in features ### 1. build feature for a bundle example: pom-A.xml currently there is no way to build both bundle.jar and feature.xml in one step: karaf-maven-plugin is including all scopes, so the resulting feature.xml is useless. workaround: duplicate efforts and set up a separate project just to produce feature.xml for the same bundle.jar solution: fix bugs in karaf-maven-plugin ### 2. depend on a feature.xml example: pom-A.xml pom-B.xml pom-C.xml currently there is no way to depend on a feature.xml in maven for a bundle.jar; this is because feature.xml is resolved via a peer pom.xml for dependency purposes, with the result being: pom-A.xml classes are available, pom-B.xml classes are missing in the pom-C.xml project workaround: duplicate efforts / introduce hacks and have pom-B.xml reference itself as a dependency solution: add a life cycle mapping for karaf-maven-plugin, so it will automatically add the pom-B.xml artifact module-B.jar to the class path when it finds feature-B.xml ### 3. depend on a feature with repository this is similar to the previous one, except repositories from features are also resolved in maven
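For point 1, the karaf-maven-plugin can generate a features.xml as an attached artifact of the bundle build; the complaint above is about which dependency scopes it includes, not about the mechanism itself. A hedged sketch of the pom fragment (goal and configuration names vary between plugin versions, so verify against the version you use):

```xml
<plugin>
  <groupId>org.apache.karaf.tooling</groupId>
  <artifactId>karaf-maven-plugin</artifactId>
  <executions>
    <execution>
      <id>generate-features-descriptor</id>
      <goals>
        <!-- generates and attaches a features.xml for this module -->
        <goal>features-generate-descriptor</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```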
Re: Karaf on Raspberry Pi
@Robert: the JRE properties have been pushed (thanks JB) @JB: Do you think we can cross compile the wrapper scripts for ARM the same way you've done for Windows 64-bit? Kind regards, Andreas On Thu, Feb 7, 2013 at 9:39 AM, j...@nanthrax.net wrote: Hi Robert, very interesting, thanks for sharing. FYI, I'm upgrading jre.properties in both Karaf 2.3.x and trunk. Regards JB On 2013-02-06 23:07, Robert Murphy wrote: Greetings! I managed to get Karaf running smoothly on the Raspberry Pi against the Sun JDK 8 Preview; I'd love to work with the Karaf community to make this an easier process, as right now there are a couple of modifications that need to be done to several core files before everything runs nicely. If any of the developers are interested in getting these tweaks integrated into Karaf, I'd be thrilled to help out in any way I can. Regards, Robert
Re: Karaf on Raspberry Pi
So the idea should rather be to completely change to commons-daemon instead of the current approach for the wrapper bundle? Kind regards, Andreas On Thu, Feb 7, 2013 at 11:57 AM, j...@nanthrax.net wrote: Hi Andreas, It should be possible to cross compile on ARM for Tanuki JSW. We can create a Jira about that and I can tackle it. AFAIR, commons-daemon provides ARM support by default. Regards JB On 2013-02-07 11:47, Andreas Pieber wrote: @Robert: the JRE properties have been pushed (thanks JB) @JB: Do you think we can cross compile the wrapper scripts for ARM the same way you've done for Windows 64-bit? Kind regards, Andreas On Thu, Feb 7, 2013 at 9:39 AM, j...@nanthrax.net wrote: Hi Robert, very interesting, thanks for sharing. FYI, I'm upgrading jre.properties in both Karaf 2.3.x and trunk. Regards JB On 2013-02-06 23:07, Robert Murphy wrote: Greetings! I managed to get Karaf running smoothly on the Raspberry Pi against the Sun JDK 8 Preview; I'd love to work with the Karaf community to make this an easier process, as right now there are a couple of modifications that need to be done to several core files before everything runs nicely. If any of the developers are interested in getting these tweaks integrated into Karaf, I'd be thrilled to help out in any way I can. Regards, Robert
Re: Karaf on Raspberry Pi
I second Christian here; ARM support would be fun and cool, but I don't see any requirement to add it anywhere other than 3.1. If there's a strong requirement we can backport it, but I don't expect any :-) Kind regards, Andreas On Thu, Feb 7, 2013 at 1:39 PM, Christian Schneider ch...@die-schneider.net wrote: I propose to simply switch to commons-daemon for Karaf 3.1 and make no changes for earlier versions. So this means ARM support is a bit delayed, but it is a lot less work. Christian On 07.02.2013 13:28, j...@nanthrax.net wrote: Agree, my proposal would be: - keep JSW as it is for Karaf 2.3.1 and 3.0.0.RC1 - provide ARM support (by cross compilation) in Karaf 2.3.2 and 3.0.1 - provide commons-daemon support in Karaf 2.4.0 and 3.1.0 Regards JB On 2013-02-07 13:20, Achim Nierbeck wrote: Ok, so we do have a fully functional wrapper based on commons-daemon at hand but don't use it (yet). We should start a vote to switch to it for trunk and maybe do an ARM cross-compile for 2.3. regards, Achim 2013/2/7 j...@nanthrax.net As I said a long time ago ;), I already have the commons-daemon module ready for the trunk. Anyway, my point is that we can do both: cross compile for JSW (at least for Karaf 2.3.x), commons-daemon for trunk (and leave the choice between commons-daemon and JSW). Regards JB On 2013-02-07 12:21, Andreas Pieber wrote: So the idea should rather be to completely change to commons-daemon instead of the current approach for the wrapper bundle? Kind regards, Andreas On Thu, Feb 7, 2013 at 11:57 AM, j...@nanthrax.net wrote: Hi Andreas, It should be possible to cross compile on ARM for Tanuki JSW. We can create a Jira about that and I can tackle it. AFAIR, commons-daemon provides ARM support by default. Regards JB On 2013-02-07 11:47, Andreas Pieber wrote: @Robert: the JRE properties have been pushed (thanks JB) @JB: Do you think we can cross compile the wrapper scripts for ARM the same way you've done for Windows 64-bit?
Kind regards, Andreas On Thu, Feb 7, 2013 at 9:39 AM, j...@nanthrax.net wrote: Hi Robert, very interesting, thanks for sharing. FYI, I'm upgrading jre.properties in both Karaf 2.3.x and trunk. Regards JB On 2013-02-06 23:07, Robert Murphy wrote: Greetings! I managed to get Karaf running smoothly on the Raspberry Pi against the Sun JDK 8 Preview; I'd love to work with the Karaf community to make this an easier process, as right now there are a couple of modifications that need to be done to several core files before everything runs nicely. If any of the developers are interested in getting these tweaks integrated into Karaf, I'd be thrilled to help out in any way I can. Regards, Robert -- Apache Karaf http://karaf.apache.org/ Committer PMC OPS4J Pax Web http://wiki.ops4j.org/display/paxweb/Pax+Web/ Committer Project Lead OPS4J Pax for Vaadin http://team.ops4j.org/wiki/display/PAXVAADIN/Home Committer Project Lead blog http://notizblog.nierbeck.de/ -- Christian Schneider http://www.liquid-reality.de Open Source Architect http://www.talend.com
Re: Karaf on Raspberry Pi
@KARAF-2163: does anybody know if it's just: copy jre7 and rename it to jre8? Kind regards, Andreas On Thu, Feb 7, 2013 at 12:47 AM, Jamie G. jamie.goody...@gmail.com wrote: A JIRA entry has already been filed to add jre-8 to the properties file: https://issues.apache.org/jira/browse/KARAF-2163 Cheers, Jamie On Wed, Feb 6, 2013 at 8:15 PM, Jamie G. jamie.goody...@gmail.com wrote: Nice blog post! Have you tried using Karaf's service wrapper? http://karaf.apache.org/manual/latest-2.3.x/users-guide/wrapper.html Cheers, Jamie On Wed, Feb 6, 2013 at 7:12 PM, Robert Murphy robert.mur...@chirondigital.com wrote: It's on the v2 (512 MB RAM, though CPU seems to be the main constraint at this point). The tweaks were basically just making Karaf aware that it was running under JDK 1.8, plus several VM argument tweaks. I detailed them here: http://blog.murphycr.com/post/42377466237/tutorial-running-karaf-on-a-raspberry-pi The trickiest part of the whole endeavor was getting it to run as a service; for some reason I couldn't get the environment to behave correctly when run with bin/start instead of bin/karaf. I haven't gotten as far as documenting that, though (I'll post it here when I do!). Regards, Robert On Wed, Feb 6, 2013 at 5:35 PM, Jamie G. jamie.goody...@gmail.com wrote: Neat! Which version of the Raspberry Pi did you work with? What kind of modifications did you make? Cheers, Jamie On Wed, Feb 6, 2013 at 6:37 PM, Robert Murphy robert.mur...@chirondigital.com wrote: Greetings! I managed to get Karaf running smoothly on the Raspberry Pi against the Sun JDK 8 Preview; I'd love to work with the Karaf community to make this an easier process, as right now there are a couple of modifications that need to be done to several core files before everything runs nicely. If any of the developers are interested in getting these tweaks integrated into Karaf, I'd be thrilled to help out in any way I can. Regards, Robert
Re: Karaf on Raspberry Pi
I think we would need to recompile the entire wrapper for ARM, as JB did for Windows 64-bit. @JB, do you think this would give us any problems? Kind regards, Andreas On Thu, Feb 7, 2013 at 1:38 AM, Jamie G. jamie.goody...@gmail.com wrote: Hmm, I wonder what options we have for creating an ARM-friendly wrapper? :) Cheers, Jamie On Wed, Feb 6, 2013 at 8:34 PM, Robert Murphy robert.mur...@chirondigital.com wrote: Jamie - I have, but that produces a binary that is incompatible with the ARM architecture. Thanks! On Feb 6, 2013 6:45 PM, Jamie G. jamie.goody...@gmail.com wrote: Nice blog post! Have you tried using Karaf's service wrapper? http://karaf.apache.org/manual/latest-2.3.x/users-guide/wrapper.html Cheers, Jamie On Wed, Feb 6, 2013 at 7:12 PM, Robert Murphy robert.mur...@chirondigital.com wrote: It's on the v2 (512 MB RAM, though CPU seems to be the main constraint at this point). The tweaks were basically just making Karaf aware that it was running under JDK 1.8, plus several VM argument tweaks. I detailed them here: http://blog.murphycr.com/post/42377466237/tutorial-running-karaf-on-a-raspberry-pi The trickiest part of the whole endeavor was getting it to run as a service; for some reason I couldn't get the environment to behave correctly when run with bin/start instead of bin/karaf. I haven't gotten as far as documenting that, though (I'll post it here when I do!). Regards, Robert On Wed, Feb 6, 2013 at 5:35 PM, Jamie G. jamie.goody...@gmail.com wrote: Neat! Which version of the Raspberry Pi did you work with? What kind of modifications did you make? Cheers, Jamie On Wed, Feb 6, 2013 at 6:37 PM, Robert Murphy robert.mur...@chirondigital.com wrote: Greetings! I managed to get Karaf running smoothly on the Raspberry Pi against the Sun JDK 8 Preview; I'd love to work with the Karaf community to make this an easier process, as right now there are a couple of modifications that need to be done to several core files before everything runs nicely.
If any of the developers are interested in getting these tweaks integrated into Karaf, I'd be thrilled to help out in any way I can. Regards, Robert
Re: Karaf on Raspberry Pi
Well, at least it's a starting point; any objections if, for now, I simply copy the jre7 section? Kind regards, Andreas On Thu, Feb 7, 2013 at 5:01 AM, Robert Murphy robert.mur...@chirondigital.com wrote: Andreas - That is effectively what I did; it might not be 100% correct as JDK 8 is still in preview, but as far as functionality is concerned it works just fine. On Wed, Feb 6, 2013 at 10:38 PM, Andreas Pieber anpie...@gmail.com wrote: @KARAF-2163: does anybody know if it's just: copy jre7 and rename it to jre8? Kind regards, Andreas On Thu, Feb 7, 2013 at 12:47 AM, Jamie G. jamie.goody...@gmail.com wrote: A JIRA entry has already been filed to add jre-8 to the properties file: https://issues.apache.org/jira/browse/KARAF-2163 Cheers, Jamie On Wed, Feb 6, 2013 at 8:15 PM, Jamie G. jamie.goody...@gmail.com wrote: Nice blog post! Have you tried using Karaf's service wrapper? http://karaf.apache.org/manual/latest-2.3.x/users-guide/wrapper.html Cheers, Jamie On Wed, Feb 6, 2013 at 7:12 PM, Robert Murphy robert.mur...@chirondigital.com wrote: It's on the v2 (512 MB RAM, though CPU seems to be the main constraint at this point). The tweaks were basically just making Karaf aware that it was running under JDK 1.8, plus several VM argument tweaks. I detailed them here: http://blog.murphycr.com/post/42377466237/tutorial-running-karaf-on-a-raspberry-pi The trickiest part of the whole endeavor was getting it to run as a service; for some reason I couldn't get the environment to behave correctly when run with bin/start instead of bin/karaf. I haven't gotten as far as documenting that, though (I'll post it here when I do!). Regards, Robert On Wed, Feb 6, 2013 at 5:35 PM, Jamie G. jamie.goody...@gmail.com wrote: Neat! Which version of the Raspberry Pi did you work with? What kind of modifications did you make? Cheers, Jamie On Wed, Feb 6, 2013 at 6:37 PM, Robert Murphy robert.mur...@chirondigital.com wrote: Greetings!
I managed to get Karaf running smoothly on the Raspberry Pi against the Sun JDK 8 Preview; I'd love to work with the Karaf community to make this an easier process, as right now there are a couple modifications that need to be done to several core files before everything runs nicely. If any of the developers are interested in getting these tweaks integrated into Karaf, I'd be thrilled to help out in any way I can. Regards, Robert
Re: Karaf 3.x Plans?
In fact we're just waiting for some more Aries releases which should be finished within the next weeks. From a feature point of view we're basically finished. Kind regards, Andreas On Feb 2, 2013 9:33 PM, Gareth gareth.o.coll...@gmail.com wrote: Hi, Are there plans to release Karaf 3.0 in the near future? I ask because I am interested in using a more current Pax Web release with Karaf (the Pax Web 1.1/Jetty 7 release is now a little old). I am curious - is Karaf 3.x just waiting for new releases of dependencies (e.g. Aries, XBean?)...or are there major features that still need to be implemented? thanks in advance, Gareth -- View this message in context: http://karaf.922171.n3.nabble.com/Karaf-3-x-Plans-tp4027587.html Sent from the Karaf - User mailing list archive at Nabble.com.
Re: Karaf RPMs - what options are out there?
What would we need to do to provide an RPM/DEB package? Where/how would we need to distribute it? Kind regards, Andreas On Tue, Jan 15, 2013 at 6:09 PM, aj...@virginia.edu aj...@virginia.edu wrote: I have, for local consumption. I based the build on an ActiveMQ RPM script. It wasn't too difficult and I can send you the example, if you like. There has been discussion before on this list about maintaining a semi-blessed RPM of Karaf directly from the project. I'd be happy to work on that, if I could get guidance and help integrating it into the release process. --- A. Soroka Software Systems Engineering :: Online Library Environment the University of Virginia Library On Jan 15, 2013, at 11:53 AM, Graham Leggett wrote: Hi all, I am currently packaging an application stack as RPMs for deployment to a production system, and am looking for RPM packages for Karaf. JPackage provides a set of RPMs, but these RPMs depend on unstable (from a packaging perspective) packages, and weigh in at an eye-watering 274MB of RPM dependencies. Has anyone other than JPackage packaged Karaf as an RPM? Regards, Graham --
Re: Karaf RPMs - what options are out there?
TBH I'm still not against the idea of having deb/rpm for Karaf. So the release manager would: a) run mvn release:prepare ... b) scripts/rpmPackaging.sh c) deploy the final artifact to an RPM repo? Or simply provide it for download with the rest of the distribution, in the hope someone will pick up the package and distribute it to their favorite repos? Feel free to correct me, but it sounds reasonable to me. BUT the decision needs to be made by our release managers: @Jamie: WDYT? Kind regards, Andreas p.s.: yes, please create an issue with this script anyway. It would make it way easier to find later on. On Tue, Jan 15, 2013 at 6:37 PM, aj...@virginia.edu aj...@virginia.edu wrote: That's simpler than distributing an RPM to the repos, by a long way. With that approach, all we'd need would be the spec (build) script. I append the example at the end of this email. As you can see, it's not very much. I'm sure it could be better written (I'm no RPM maven {grin}) but it's a place to start. I used the Tanuki Java wrapper as part of the dependency, because that software is commonly available for almost any Linux distro. I also included a separate body of source for the package (shown below as wrapper-gear) which is just the files that Karaf creates from its wrapper: commands. These two inclusions allow the RPM to install Karaf as a Linux system service using the Tanuki wrapper. Graham-- I'll send you a tarball of wrapper-gear off-list, just so you can see what I did. Is this worthy of opening an issue? --- A. Soroka Software Systems Engineering :: Online Library Environment the University of Virginia Library On Jan 15, 2013, at 12:26 PM, Graham Leggett wrote: On 15 Jan 2013, at 7:09 PM, aj...@virginia.edu wrote: I have, for local consumption. I based the build on an ActiveMQ RPM script. It wasn't too difficult and I can send you the example, if you like. If you can, I would hugely appreciate it.
There has been discussion before on this list about maintaining a semi-blessed RPM of Karaf directly from the project. I'd be happy to work on that, if I could get guidance and help integrating it into the release process. I maintain the spec file that is shipped with Apache httpd; the idea is that you can download a httpd tarball, run rpmbuild -tb tarball, and it just works. All that is required is that a file called package.spec exists in the root of the tarball. (It might also work if the spec file is elsewhere in the tree, but I've not tried.) I am busy looking at the JPackage version of Karaf, and it seems to have a significantly different structure to that shipped by Apache. Still need to get my head around it. Regards, Graham
--
[ajs6f@esbdev2 SPECS]$ cat apache-karaf.spec
# turn off Red Hat's jar repacking macro
%define __jar_repack %{nil}
#
# Spec file for packaging Apache Karaf
#
Summary: Apache Karaf
Name: apache-karaf
Version: 2.2.7
Release: 1
License: Apache
Group: Networking/Daemons
Source0: http://www.apache.org/dyn/closer.cgi/karaf/2.2.7/apache-karaf-2.2.7.tar.gz
Source1: wrapper-gear
URL: http://karaf.apache.org/
Vendor: Apache Foundation
Packager: A. Soroka aj...@virginia.edu
BuildArch: noarch
Requires: java
Requires: tanukiwrapper

%description
Apache Karaf is a small OSGi runtime which provides a lightweight container into which various components and applications can be deployed.
%define karaf_home /usr/share/karaf
%define karaf_data /var/cache/karaf
%define karaf_config /etc/karaf
%define karaf_deploy /var/local/karaf-deploy
%define karaf_startuplib /usr/local/lib/karaf
%define karaf_systemlib /usr/local/lib/karaf-system
%define karaf_logs /var/log/karaf

%prep
%setup

%build
# No build needed for a pure-Java application like Karaf
/bin/true

%install
# Add the karaf user and group
# the || : (or NOP) on the end allows for the condition that this user and group already exist
/usr/sbin/groupadd --system karaf 2>/dev/null || :
/usr/sbin/useradd --comment "Apache Karaf" -g karaf \
  --shell /bin/bash --system --home-dir %{karaf_home} karaf 2>/dev/null || :

# We put the various pieces of Karaf into FHS-appropriate places, then symlink them
# together in /usr/share/karaf.
# First copy all of the Karaf tree into %{karaf_home} (should be /usr/share/karaf)
mkdir -p $RPM_BUILD_ROOT%{karaf_home}
mv --verbose * $RPM_BUILD_ROOT%{karaf_home}/

# Now we make the FHS locations for Karaf stuff
mkdir -p $RPM_BUILD_ROOT%{karaf_config} $RPM_BUILD_ROOT%{karaf_deploy} \
  $RPM_BUILD_ROOT%{karaf_startuplib} $RPM_BUILD_ROOT%{karaf_systemlib} $RPM_BUILD_ROOT%{karaf_data} \
  $RPM_BUILD_ROOT%{karaf_logs} $RPM_BUILD_ROOT/etc/init.d

# Now we move pieces of the tree out of there into FHS-appropriate spots and re-symlink them
pushd $RPM_BUILD_ROOT%{karaf_home}
mv etc/* $RPM_BUILD_ROOT%{karaf_config}
rmdir etc
ln -s %{karaf_config}
Re: Using variables in org.apache.karaf.features.cfg
well, checking the code I would say there's no difference to the other .cfg files. I'm even not sure if it's a bootstrap error. How exactly can I reproduce the problem? in the features file: featuresRepositories = ${var} and in custom.properties var = mvn:org.apache.karaf.features/standard/3.0.0-SNAPSHOT/xml/features,mvn:org.apache.karaf.features/enterprise/3.0.0-SNAPSHOT/xml/features,mvn:org.apache.karaf.features/spring/3.0.0-SNAPSHOT/xml/features this shouldn't work, but e.g. in log.cfg pattern = ${abc} and in custom.settings abc = %d{ISO8601} | %-5.5p | %-16.16t | %-32.32c{1} | %X{bundle.id} - %X{ bundle.name} - %X{bundle.version} | %m%n works? Kind regards, Andreas On Thu, Dec 13, 2012 at 8:22 AM, Bengt Rodehav be...@rodehav.com wrote: What I've tried to do in org.apache.karaf.features.cfg works with other configuration files. File install does support this. It's the way I handle most of my tailored configurations in my custom server. However, there seem to be something special with org.apache.karaf.features.cfg since the same mechanisms dont work there. That's why I wondered whether file install was used for installing the features feature or if it was done by some other means. Could it be a bootstrap problem? /Bengt 2012/12/12 Andreas Pieber anpie...@gmail.com Hey, I'm afraid this is currently not really possible. The custom.properties is written into the System.setProperty while the fileinstall (but I've only checked the code only shortly) does not access this sort. I think to make this available would require a patch to fileinstall. @Everybody with more knowhow about the fileinstall internals: feel free to correct me :-) Kind regards, Andreas On Tue, Dec 11, 2012 at 10:42 AM, Bengt Rodehav be...@rodehav.comwrote: I have a use case where I want to move the list of boot features (the featuresBoot property) from org.apache.karaf.features.cfg into custom.properties. 
The reason is that our custom server comes with a great number of features but each customer only uses some of them. To allow for easy customisation (and upgrades) I put everything related to a specific installation in a custom.properties file (that I put outside the Karaf home directory). I can then easily see how this installation is customised and I can easily upgrade by simply replacing the entire Karaf installation and keep the customisation (since it is located outside Karaf). However, it seems I cannot use variables defined in custom.properties in org.apache.karaf.features.cfg. In fact, I cannot even define a variable in org.apache.karaf.features.cfg and then use it in the same file. How come? Isn't FileInstall used for parsing org.apache.karaf.features.cfg? How can I use custom variables in org.apache.karaf.features.cfg? I use Karaf 2.3.0 with Java 6 on Windows 7. /Bengt
Re: Using variables in org.apache.karaf.features.cfg
I meant xxx.logging.cfg and custom.properties; was just at a customer with no local karaf to check :-) OK, can you please provide a bug report? I can check on it tomorrow. Kind regards, Andreas On Thu, Dec 13, 2012 at 2:22 PM, Bengt Rodehav be...@rodehav.com wrote: Hello Andreas, Yes, the example with the features file (org.apache.karaf.features.cfg) doesn't work but should. I'm not sure what the log.cfg and the custom.settings files are. However, I do this in my org.ops4j.pax.logging.cfg: log4j.appender.info.file=${logdir}/info.log And I put this in the custom.properties: logdir=data/log I use that mechanism in several configuration files, e.g.: - org.ops4j.pax.web.cfg - org.apache.karaf.management.cfg - org.apache.karaf.shell.cfg The above gives me the possibility to manage all the ports in one place (custom.properties) - which is very convenient. But for some reason this mechanism doesn't work for org.apache.karaf.features.cfg. /Bengt
Re: Using variables in org.apache.karaf.features.cfg
Hey, I'm afraid this is currently not really possible. The custom.properties is written into the System.setProperty while the fileinstall (but I've only checked the code only shortly) does not access this sort. I think to make this available would require a patch to fileinstall. @Everybody with more knowhow about the fileinstall internals: feel free to correct me :-) Kind regards, Andreas On Tue, Dec 11, 2012 at 10:42 AM, Bengt Rodehav be...@rodehav.com wrote: I have a use case where I want to move the list of boot features (the featuresBoot property) from org.apache.karaf.features.cfg into custom.properties. The reason is that our custom server comes with a great number of features but each customer only uses some of them. To allow for easy customisation (and upgrades) I put everything related to a specific installation in a custom.properties file (that I put outside the Karaf home directory). I can then easily see how this installation is customised and I can easily upgrade by simply replacing the entire Karaf installation and keep the customisation (since it is located outside Karaf). However, it seems I cannot use variables defined in custom.properties in org.apache.karaf.features.cfg. In fact, I cannot even define a variable in org.apache.karaf.features.cfg and then use it in the same file. How come? Isn't FileInstall used for parsing org.apache.karaf.features.cfg? How can I use custom variables in org.apache.karaf.features.cfg? I use Karaf 2.3.0 with Java 6 on Windows 7. /Bengt
Re: Configuring OpenJPA to log in Karaf 2.3.0
TBH I've never tried log4j, but I'm curious that slf4j does not work for you. My persistence.xml looks like:

<?xml version="1.0" encoding="UTF-8"?>
<persistence version="2.0" xmlns="http://java.sun.com/xml/ns/persistence">
  <persistence-unit name="rx.physikodata" transaction-type="JTA">
    <provider>org.apache.openjpa.persistence.PersistenceProviderImpl</provider>
    <jta-data-source>osgi:service/javax.sql.DataSource/(osgi.jndi.service.name=jdbc/rx)</jta-data-source>
    LIST_OF_CLASSES_TO_SCAN
    <exclude-unlisted-classes>true</exclude-unlisted-classes>
    <validation-mode>NONE</validation-mode>
    <properties>
      <property name="openjpa.Log" value="slf4j"/>
      <property name="openjpa.RuntimeUnenhancedClasses" value="supported"/>
    </properties>
  </persistence-unit>
</persistence>

and it works like a charm. I've no changes in my logging configuration. Maybe it's something completely different? Logging from your bundles using slf4j works? You haven't changed the startup order, did you? Kind regards, Andreas On Fri, Nov 30, 2012 at 9:25 AM, Alexey Romanov alexey.v.roma...@gmail.com wrote: Same thing was happening with slf4j/commons back in Karaf 2.2.7, but it threw an exception on log4j. -- View this message in context: http://karaf.922171.n3.nabble.com/Configuring-OpenJPA-to-log-in-Karaf-2-3-0-tp4026959p4026967.html Sent from the Karaf - User mailing list archive at Nabble.com.
Re: Configuring OpenJPA to log in Karaf 2.3.0
yep, I'm only using the default org.ops4j.pax.logging.cfg coming with karaf Kind regards, Andreas On Fri, Nov 30, 2012 at 11:12 AM, Alexey Romanov alexey.v.roma...@gmail.com wrote: I've no changes in my logging configuration. Does that mean you have no log4j.category.openjpa.* properties in org.ops4j.pax.logging.cfg? Yours, Alexey Romanov
Re: Undocumented caveats of Pax Url Wrap when used in Karaf Shell
Hey Caspar, thank you very much for pointing this out. I'm with you that adding this into the documentation would be a great enhancement. Since the entire documentation is within the source repository would you mind creating a jira and a patch for this problem? Thank you very much and kind regards, Andreas On Thu, Nov 29, 2012 at 1:26 PM, Caspar MacRae ear...@gmail.com wrote: Ooops, sorry I forgot to escape XML entities (ampersands and quotes) in previous example, it should have read:

<bundle>wrap:mvn:jboss/jbossall-client/${version.jboss}/$Bundle-SymbolicName=jbossall-client&amp;Bundle-Version=${version.jboss}&amp;Export-Package=org.jboss.remoting;version=&quot;${version.jboss}&quot;,!*</bundle>

or

<bundle><![CDATA[
wrap:mvn:jboss/jbossall-client/4.2.3.GA/$Bundle-SymbolicName=jbossall-client&Bundle-Version=4.2.3.GA&Export-Package=org.jboss.remoting;version=4.2.3.GA,!*
]]></bundle>

cheers, Caspar On 29 November 2012 12:02, Caspar MacRae ear...@gmail.com wrote: Hello, Although this is a trivial issue to raise, the ability to dynamically wrap urls both on the commandline and in features.xml is incredibly valuable (not to mention very, very cool) - it's just a pain I do this so infrequently I have to re-discover it each time. There are a couple of caveats to using the wrap protocol on the command line: You must use single quotes around the URL, as the dollar will be interpreted by the Karaf shell. You must use backslash to escape the exclamation mark (for example export-package exclude patterns) Neither of these is required when defined in a features file. The documentation (checked for 2.3.0) for the Wrap URL handler doesn't give any usage examples. I think it'd be great to have one example (command line and equivalent features.xml definition) with a note of the caveats above.
I'm still unclear as to why the exclamation mark needs to be escaped; I don't think this is POSIX compliant - AFAIK the single quotes should prevent any interpretation (inconsistent, as the dollar certainly doesn't need escaping). (I'd gladly file a jira and/or doc patch if required) thanks, Caspar For example:

Commandline:

install -s 'wrap:mvn:jboss/jbossall-client/4.2.3.GA/$Bundle-SymbolicName=jbossall-client&Bundle-Version=4.2.3.GA&Export-Package=org.jboss.remoting;version=4.2.3.GA,\!*'

Features XML:

<bundle>wrap:mvn:jboss/jbossall-client/${version.jboss}/$Bundle-SymbolicName=jbossall-client&amp;Bundle-Version=${version.jboss}&amp;Export-Package=org.jboss.remoting;version=${version.jboss},!*</bundle>
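Caspar's POSIX point is easy to check in a plain sh (this is ordinary POSIX shell behavior, not the Karaf shell): inside single quotes both the dollar and the exclamation mark are already literal, so neither needs escaping there. A quick demonstration:

```shell
# Plain POSIX shell: single quotes make $ and ! literal, so the wrap: URL
# survives unmodified -- the extra backslash before ! is only needed in the
# Karaf shell, which applies its own interpretation on top.
url='wrap:mvn:jboss/jbossall-client/4.2.3.GA/$Bundle-SymbolicName=jbossall-client&Bundle-Version=4.2.3.GA&Export-Package=org.jboss.remoting;version=4.2.3.GA,!*'
printf '%s\n' "$url"
```

(Even interactive bash, which does history expansion on `!`, suppresses it inside single quotes; the escaping requirement is specific to the Karaf console's parser.)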
Re: Intercepting Karaf's CommandSession.close()??
hey Dan, Not right now, but it might be a good idea thinking about a general hook system for karaf (on the dev list) since there are also other requests for other places. Maybe we can use the event admin to implement something more general (just thinking aloud :-)) Kind regards, Andreas On Wed, Nov 28, 2012 at 5:06 PM, Dan Tran dant...@gmail.com wrote: ping, to make sure it gets attention :-) Thanks -D On Sun, Nov 25, 2012 at 2:04 PM, Dan Tran dant...@gmail.com wrote: Hi One of our custom Karaf shell commands stores its resources under the Karaf CommandSession for sharing purposes. When a user logs off the shell, I would like an opportunity to clean up my resources. Is it possible to intercept? Big Thanks -Dan
Re: Configuring OpenJPA to log in Karaf 2.3.0
and when you remove the line: <property name="openjpa.Log" value="slf4j"/> it works again? Kind regards, Andreas On Fri, Nov 30, 2012 at 6:43 AM, Alexey Romanov alexey.v.roma...@gmail.com wrote: How can I get OpenJPA to log SQL requests, etc.? I expected setting

log4j.category.openjpa.Tool=INFO
log4j.category.openjpa.Runtime=INFO
log4j.category.openjpa.Remote=INFO
log4j.category.openjpa.DataCache=INFO
log4j.category.openjpa.MetaData=INFO
log4j.category.openjpa.Enhance=INFO
log4j.category.openjpa.Query=INFO
log4j.category.openjpa.jdbc.SQL=INFO
log4j.category.openjpa.jdbc.SQLDiag=INFO
log4j.category.openjpa.jdbc.JDBC=INFO
log4j.category.openjpa.jdbc.Schema=INFO

in org.ops4j.pax.logging.cfg and <property name="openjpa.Log" value="slf4j"/> in persistence.xml would work (due to http://karaf.922171.n3.nabble.com/log4j-error-with-pax-logging-tt2459109.html#none) but it doesn't. And if I set openjpa.Log to log4j instead, in previous versions I got java.lang.NoClassDefFoundError: org/apache/log4j/LogManager, but with Karaf 2.3.0 my bundle fails to start instead, waiting for a dependency: 2012-11-29 17:40:28,931 | INFO | rint Extender: 1 | BlueprintContainerImpl | container.BlueprintContainerImpl 330 | 7 - org.apache.aries.blueprint.core - 1.0.1 | Bundle ru.focusmedia.odp.server.datastore.jpa is waiting for dependencies [(&(&(!(org.apache.aries.jpa.proxy.factory=*))(osgi.unit.name=ODP_Server))(objectClass=javax.persistence.EntityManagerFactory))] Yours, Alexey Romanov
Re: Thank you Karaf from marioteguh.tumblr.com :-)
This is really great to hear! Thanks for sharing this with us. Kind regards, Andreas On Tue, Nov 20, 2012 at 11:25 AM, Hendy Irawan he...@soluvas.com wrote: Hi Karaf, I'd like to say thank you that Karaf (and Camel) :) has been rock solid on one of my project, syncing between Mario Teguh Facebook page and Tumblr at http://marioteguh.tumblr.com/ : It's been running Karaf 2.2.8 and Camel 2.10 for 3+ months without a hitch (yeah, no time to upgrade to 2.3.0. If it ain't broke, don't fix it hehe...). Very low RAM consumption too :-) Actually it's been running longer than that, but we had occasional server restarts due to package upgrades.

karaf@root> info
Karaf
  Karaf version         2.2.8
  Karaf home            /home/ceefour/karaf
  Karaf base            /home/ceefour/karaf
  OSGi Framework        org.apache.felix.framework - 3.0.9
JVM
  Java Virtual Machine  Java HotSpot(TM) Server VM version 23.1-b03
  Version               1.7.0_05
  Vendor                Oracle Corporation
  Uptime                106 days 16 hours
  Total compile time    34.192 seconds
Threads
  Live threads          37
  Daemon threads        32
  Peak                  40
  Total started         34841
Memory
  Current heap size     19,576 kbytes
  Maximum heap size     27,776 kbytes
  Committed heap size   27,776 kbytes
  Pending objects       0
  Garbage collector     Name = 'Copy', Collections = 2307917, Time = 1 hour 13 minutes
  Garbage collector     Name = 'MarkSweepCompact', Collections = 2564, Time = 4 minutes
Classes
  Current classes loaded   5,977
  Total classes loaded     182,904
  Total classes unloaded   176,927
Operating system
  Name                  Linux version 3.4.2-linode44
  Architecture          i386
  Processors            4
karaf@root>

The server uptime itself:

$ uptime
09:56:27 up 106 days, 18:02, 2 users, load average: 1.06, 0.96, 0.80

So Karaf itself has been running continuously from the last server restart to this day.
:-)

karaf@root> list -s
START LEVEL 100 , List Threshold: 50
   ID   State         Blueprint      Level  Symbolic name
[  64] [Active     ] [            ] [   50] org.apache.camel.camel-core (2.10.0)
[  65] [Active     ] [Created     ] [   50] org.apache.camel.karaf.camel-karaf-commands (2.10.0)
[  66] [Active     ] [            ] [   50] javax.mail (1.4.5)
[  67] [Active     ] [            ] [   50] org.apache.ws.commons.axiom.axiom-impl (1.2.10)
[  68] [Active     ] [            ] [   50] org.apache.ws.commons.axiom.axiom-api (1.2.10)
[  69] [Active     ] [            ] [   50] org.apache.abdera.core (1.1.2)
[  70] [Active     ] [            ] [   50] org.apache.abdera.extensions-main (1.1.2)
[  71] [Active     ] [            ] [   50] org.apache.abdera.i18n (1.1.2)
[  72] [Active     ] [            ] [   50] org.apache.abdera.parser (1.1.2)
[  73] [Active     ] [            ] [   50] org.apache.commons.codec (1.6.0)
[  74] [Active     ] [            ] [   50] org.apache.camel.camel-atom (2.10.0)
[  75] [Active     ] [            ] [   50] org.apache.servicemix.bundles.jdom (1.1.0.4)
[  76] [Active     ] [            ] [   50] org.apache.servicemix.bundles.rome (1.0.0.3)
[  77] [Active     ] [            ] [   50] org.apache.camel.camel-rss (2.10.0)
[  78] [Active     ] [            ] [   50] org.apache.camel.camel-mail (2.10.0)
[  79] [Active     ] [            ] [   50] org.apache.servicemix.bundles.mvel (2.0.18.3)
[  80] [Active     ] [            ] [   50] org.apache.camel.camel-mvel (2.10.0)
[  81] [Active     ] [            ] [   80] org.codehaus.jettison.jettison (1.3.1)
[  87] [Active     ] [            ] [   80] org.apache.commons.io (2.4.0)
[  88] [Active     ] [Created     ] [   80] marioteguh-to-tumblr_v3.blueprint.xml (0.0.0)
[  89] [Active     ] [Created     ] [   50] org.apache.camel.camel-blueprint (2.10.0)
[  90] [Active     ] [            ] [   80] com.daneshzaki.tumblej (2.0.0.SNAPSHOT)
[  91] [Active     ] [            ] [   80] org.soluvas.social.tumblej-camel (1.0.0.SNAPSHOT)
[  93] [Active     ] [            ] [   80] jackson-core-asl (1.9.8)
[  94] [Active     ] [            ] [   80] jackson-jaxrs (1.9.8)
[  95] [Active     ] [            ] [   80] jackson-mapper-asl (1.9.8)
[ 127] [Active     ] [            ] [   80] com.sun.jersey.contribs.jersey-oauth.oauth-client (1.13.0)
[ 128] [Active     ] [            ] [   80] com.sun.jersey.core (1.13.0)
[ 129] [Active     ] [            ] [   80] com.sun.jersey.json (1.13.0)
[ 130] [Active     ] [            ] [   80] com.sun.jersey.client (1.13.0)
[ 131] [Active     ] [            ] [   80] com.sun.jersey.contribs.jersey-oauth.oauth-signature (1.13.0)

Unfortunately, it's time to do yet another server restart due to package and kernel upgrades. So before I reset the
Re: karaf exam: Test with more than one container
not really. The problem is that Pax Exam itself isn't built for a situation like this. And since pax-exam-karaf only uses the capabilities of Exam it's not really possible to do this. IMHO there are only two options: a) use the not-so-great solution via the admin service b) come up with something completely new Kind regards, Andreas On Mon, Nov 19, 2012 at 1:06 PM, Christian Schneider ch...@die-schneider.net wrote: For tests with one Karaf container karaf exam is already really great. For some cases though I would like to create more than one Karaf. For example in CXF DOSGi I would like to have two instances: 1) Client for service and DOSGi infrastructure 2) Service provider, DOSGi infrastructure, Zookeeper server My test would run inside container 1 but it would also need container 2. So what are the best practices to achieve this? I have found similar cases in Karaf Cellar. There the karaf admin (instance) service is used to create another karaf instance. Then the connect command is used to issue shell commands that configure the remote instance. While this works it is very verbose and contains a lot of sleeps to wait for commands to finish. What I would ideally be looking for is to have one set of pax exam options for each container. Is that possible? Christian
Re: webonline help as a paxwick-osgi bundle
Hey Dan, What exactly do you want pax-wicket to do here? Provide static content? Or is all you want to do providing static content? Or do I completely miss the point? Kind regards, Andreas On Thu, Nov 8, 2012 at 12:32 AM, Dan Tran dant...@gmail.com wrote: I have webhelp content in a jar file. I like to covert it to a paxwicket's osgi bundle so that when user clicks on my help button, it will navigate to our bundle's index.html as a external window. Is it possible? Thanks -D
Re: webonline help as a paxwick-osgi bundle
OK, basically all you have to do is to mount static resources [1]. Is this what you're looking for? Kind regards, Andreas [1] http://wicketinaction.com/2011/07/wicket-1-5-mounting-resources/ On Fri, Nov 16, 2012 at 6:07 PM, Dan Tran dant...@gmail.com wrote: Hi Andreas, Exactly, you have summarized what I am try to convey. Basically, I would like pax-wicket to serve my static contents which are in a bundle/jar. is it possible? -Dan On Fri, Nov 16, 2012 at 4:31 AM, Andreas Pieber anpie...@gmail.com wrote: Hey Dan, What exactly do you want pax-wicket to do here? Provide static content? Or is all you want to do providing static content? Or do I completely miss the point? Kind regards, Andreas On Thu, Nov 8, 2012 at 12:32 AM, Dan Tran dant...@gmail.com wrote: I have webhelp content in a jar file. I like to covert it to a paxwicket's osgi bundle so that when user clicks on my help button, it will navigate to our bundle's index.html as a external window. Is it possible? Thanks -D
Re: webonline help as a paxwick-osgi bundle
I think mountResource("/mount/path", new SomeResourceReference()); should do exactly that. BUT if you don't need it at the same path (e.g. wicket is at localhost:8080/mywicketapp and the resources at localhost:8080/myresources) then using the plain http service is definitely the best solution to host static resources. On Fri, Nov 16, 2012 at 6:26 PM, Dan Tran dant...@gmail.com wrote: not sure it helps by looking at the code involved. All I want is to have pax-wicket picking up my initial index.html as the starting point and from there on everything is like a static content -D
Re: dev:watch problems
Since I can't reproduce it locally it's kind of tricky... Looking at the code again, I would say the only reason it could fail while a manual update works is that something in your system is off (timestamps do not match). Would you mind attaching a remote debugger to your system and setting a breakpoint at org.apache.karaf.shell.dev.watch.BundleWatcher line 85? The code there is really simple and you should see the problem within minutes then. Sorry for not being of any more help :-( Kind regards, Andreas On Sat, Nov 10, 2012 at 9:18 AM, Bengt Rodehav be...@rodehav.com wrote: Good morning Andreas. I have a Nexus repository specified in my settings.xml - could that be a problem? However, doing an update 97 works fine showing that the bundle location can be found. In this case the bundle resides in my local maven repo (not in Nexus). /Bengt Den 10 nov 2012 07:31 skrev Andreas Pieber anpie...@gmail.com: Hey Bengt, I've just checked again, but I can confirm that dev:watch basically does what it should do. How do you install your bundles? Have you configured any alternative maven repositories? Any other unusual settings? Kind regards, Andreas On Fri, Nov 9, 2012 at 4:47 PM, Bengt Rodehav be...@rodehav.com wrote: Thanks, /Bengt 2012/11/9 Jean-Baptiste Onofré j...@nanthrax.net OK thanks for the update, I take a look just after your other issue ;) Regards JB On 11/09/2012 04:39 PM, Bengt Rodehav wrote: I get the exact same results using Karaf 2.2.9. /Bengt 2012/11/9 Jean-Baptiste Onofré j...@nanthrax.net It should be done automatically. Could you test the same with Karaf 2.2.9? Regards JB On 11/09/2012 02:58 PM, Bengt Rodehav wrote: I tried dev:watch * but the bundle still doesn't get updated. BTW do I need to execute dev:watch --start or is it being done automatically after I've done dev:watch 97? /Bengt 2012/11/9 Andreas Pieber anpie...@gmail.com good question.
Does a dev:watch * work as expected? Kind regards, Andreas
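The timestamp check Andreas points at (BundleWatcher comparing the deployed bundle's modification time against the artifact in the local repository) boils down to a newer-than test. A rough stand-alone sketch of that idea in plain shell - the file names are made up and this is not the actual Karaf code:

```shell
# Rough sketch of a timestamp-based watch loop: update only when the
# artifact on disk is newer than the state we last recorded.
set -e
dir=$(mktemp -d)
artifact="$dir/bundle-1.0-SNAPSHOT.jar"
stamp="$dir/last-update"

: > "$artifact"   # initial "build" of the artifact
: > "$stamp"      # watcher records the state it last deployed

sleep 1           # ensure a strictly later mtime on the rebuild
: > "$artifact"   # rebuild: mvn install rewrites the artifact

if [ "$artifact" -nt "$stamp" ]; then
  echo "update needed"
  touch "$stamp"  # remember that we deployed this build
else
  echo "up to date"
fi
```

If anything skews the timestamps (clock drift between build machine and Karaf host, a repository manager serving a stale copy), the `-nt` test never fires, which matches the "timestamps do not match" failure mode described above.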
Re: dev:watch problems
good question. Does a dev:watch * work as expected? Kind regards, Andreas On Fri, Nov 9, 2012 at 9:40 AM, Bengt Rodehav be...@rodehav.com wrote: It looks like this in the log:

2012-11-09 09:34:21,416 | DEBUG | Thread-50 | BundleWatcher | af.shell.dev.watch.BundleWatcher 81 | Bundle watcher thread started
2012-11-09 09:34:21,416 | DEBUG | Thread-50 | configadmin | ? ? | getProperties()
2012-11-09 09:34:21,421 | DEBUG | lixDispatchQueue | framework | ? ? | FrameworkEvent PACKAGES REFRESHED - org.apache.felix.framework
2012-11-09 09:34:22,421 | DEBUG | Thread-50 | configadmin | ? ? | getProperties()
2012-11-09 09:34:22,421 | DEBUG | lixDispatchQueue | framework | ? ? | FrameworkEvent PACKAGES REFRESHED - org.apache.felix.framework
2012-11-09 09:34:23,421 | DEBUG | Thread-50 | configadmin | ? ? | getProperties()
2012-11-09 09:34:23,421 | DEBUG | lixDispatchQueue | framework | ? ? | FrameworkEvent PACKAGES REFRESHED - org.apache.felix.framework

Thus, every second the package org.apache.felix.framework seems to be refreshed. Nothing about bundle 97 though. When I manually do an update 97, the bundle is refreshed properly. BTW, I'm running on Windows 7. /Bengt 2012/11/9 j...@nanthrax.net Hi, Do you have something in the log? Regards JB -- Jean-Baptiste Onofré jbono...@apache.org http://blog.nanthrax.net Talend - http://wwx.talend.com - Reply message - From: Bengt Rodehav be...@rodehav.com To: user@karaf.apache.org Subject: dev:watch problems Date: Fri, Nov 9, 2012 8:55 am I'm trying to get the dev:watch command to work but I haven't succeeded yet. If I want to watch the bundle with id 97, I do as follows:

dev:watch -i 1000
dev:watch 97
dev:watch --start

I could probably do all that in one go but the above is for clarity. If I then rebuild (using maven) the bundle with id 97 I expect that bundle to be updated within approximately 1 s. However, it never happens. If I then do an update 97 then it works. The command dev:watch --list shows the following:

karaf@root> dev:watch --list
URL  ID  Bundle Name
97   97  Service-Container :: web-service-plugin

What am I doing wrong? I'm using Karaf 2.3.0. /Bengt
Re: pax-provision and profiles for Karaf 2.3.0
well, it's not surprising since org.ops4j.pax.url.pax-url-mvn/pax-url-mvn/1.3.5 is wrong... it should be org.ops4j.pax.url/pax-url-mvn/1.3.5 Do you think it could be a problem in pax-runner? Do you find the wrongly defined url anywhere? BTW, why do you need pax-runner (provision) for this? Typically we've had much better experiences building a karaf assembly directly instead of using pax-provision. Kind regards, Andreas On Mon, Oct 22, 2012 at 2:42 PM, Kjell Otto otto.kj...@gmail.com wrote: Just wanted to provide more information, now I've enabled the <param>--log=debug</param> flag. Output follows:

- Succesfully downloaded to [runner/bundles/-1911136526.jar]
- Downloading [mvn:org.ops4j.pax.url.pax-url-mvn/pax-url-mvn/1.3.5]
- Creating new file at destination: /Users/kjellski/Projects/java/osgi/controlling/runner/bundles/-2136097610.jar
- Resolving [mvn:org.ops4j.pax.url.pax-url-mvn/pax-url-mvn/1.3.5]...
- Using manager SimpleLocalRepositoryManager with priority 0 for /Users/kjellski/.m2/repository
- Using connector WagonRepositoryConnector with priority 0 for file:/Users/kjellski/.m2/repository/
- Using connector WagonRepositoryConnector with priority 0 for http://repo1.maven.org/maven2/ via localhost:1337
- Using connector WagonRepositoryConnector with priority 0 for http://repository.ops4j.org/maven2/ via localhost:1337
- Using connector WagonRepositoryConnector with priority 0 for http://scm.ops4j.org/repos/ops4j/projects/pax/runner-repository/ via localhost:1337
- Using connector WagonRepositoryConnector with priority 0 for http://osgi.sonatype.org/content/groups/pax-runner/ via localhost:1337
- Using connector WagonRepositoryConnector with priority 0 for http://repo1.maven.org/maven2/ via localhost:1337
- Using connector WagonRepositoryConnector with priority 0 for http://repository.ops4j.org/maven2/ via localhost:1337
- Using connector WagonRepositoryConnector with priority 0 for http://repository.springsource.com/maven/bundles/release/ via localhost:1337
- Using connector WagonRepositoryConnector with priority 0 for http://repository.springsource.com/maven/bundles/external/ via localhost:1337

Oops, there has been a problem!
org.ops4j.pax.runner.platform.PlatformException: [mvn:org.ops4j.pax.url.pax-url-mvn/pax-url-mvn/1.3.5] could not be downloaded

- Exception caught during execution: java.lang.RuntimeException: org.ops4j.pax.runner.platform.PlatformException: [mvn:org.ops4j.pax.url.pax-url-mvn/pax-url-mvn/1.3.5] could not be downloaded
  at org.ops4j.pax.runner.Run.startPlatform(Run.java:668)
  at org.ops4j.pax.runner.Run.start(Run.java:207)
  at org.ops4j.pax.runner.Run.main(Run.java:134)
  at org.ops4j.pax.runner.Run.main(Run.java:102)
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
  at java.lang.reflect.Method.invoke(Method.java:597)
  at org.ops4j.pax.construct.lifecycle.ProvisionMojo.invokePaxRunner(ProvisionMojo.java:837)
  at org.ops4j.pax.construct.lifecycle.ProvisionMojo.deployRunnerNG(ProvisionMojo.java:815)
  at org.ops4j.pax.construct.lifecycle.ProvisionMojo.deployBundles(ProvisionMojo.java:502)
  at org.ops4j.pax.construct.lifecycle.ProvisionMojo.execute(ProvisionMojo.java:287)
  at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:101)
  at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:209)
  at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
  at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
  at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:84)
  at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:59)
  at org.apache.maven.lifecycle.internal.LifecycleStarter.singleThreadedBuild(LifecycleStarter.java:183)
  at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:161)
  at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:319)
  at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:156)
  at org.apache.maven.cli.MavenCli.execute(MavenCli.java:537)
  at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:196)
  at org.apache.maven.cli.MavenCli.main(MavenCli.java:141)
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
  at
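The failure above comes down to the shape of the mvn: URL; a quick side-by-side of the broken and the corrected coordinates (same artifact in both cases):

```
# broken – the artifactId was appended to the groupId:
mvn:org.ops4j.pax.url.pax-url-mvn/pax-url-mvn/1.3.5

# correct – mvn:<groupId>/<artifactId>/<version>:
mvn:org.ops4j.pax.url/pax-url-mvn/1.3.5
```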
Re: Does maven-bundle-plugin merge its 'instructions' settings?
Hey Dan, Do you really mean the maven-plugin-plugin, or the maven-bundle-plugin (as I assume, since you referenced the Felix group)? In case you mean the maven-bundle-plugin, it depends which properties you mean. AFAIK different properties are merged, while identical properties are overridden by the child. E.g. if you define a package-export and bundle-import in the parent, but package-import and package-export in the child, the bundle-import will be taken from the parent, and package-import and package-export from the child. Hope this helps. Kind regards, Andreas On Tue, Oct 30, 2012 at 6:26 AM, Dan Tran dant...@gmail.com wrote: Hi, This question should go to the felix group, but I think we have experts here who may be able to help me with it. Currently I have the maven-plugin-plugin configured in the parent pom outside of my projects. Once in a while I need to override the 'instructions' settings from a sub project. The question here: should I just add/set the 'changed' portion and assume maven-bundle-plugin would merge the 'changed' portion with the one already defined in my parent pom? Thanks -Dan
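Assuming the maven-bundle-plugin is indeed what's meant, Maven's default plugin-configuration inheritance gives roughly the behaviour described above: elements only present in the parent are kept, while same-named elements are overridden by the child. A sketch (package names are made up for illustration):

```xml
<!-- parent pom -->
<plugin>
  <groupId>org.apache.felix</groupId>
  <artifactId>maven-bundle-plugin</artifactId>
  <configuration>
    <instructions>
      <Export-Package>com.example.api</Export-Package>
      <Private-Package>com.example.internal</Private-Package>
    </instructions>
  </configuration>
</plugin>

<!-- child pom: only the 'changed' portion -->
<plugin>
  <groupId>org.apache.felix</groupId>
  <artifactId>maven-bundle-plugin</artifactId>
  <configuration>
    <instructions>
      <!-- overrides the parent's Export-Package -->
      <Export-Package>com.example.impl</Export-Package>
      <!-- new element, simply added -->
      <Import-Package>*</Import-Package>
    </instructions>
  </configuration>
</plugin>

<!-- effective instructions in the child module:
     Export-Package  = com.example.impl        (child wins)
     Import-Package  = *                       (from child)
     Private-Package = com.example.internal    (inherited from parent) -->
```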
Re: same feature name exists in multi feature repo url
Sounds reasonable. Can you please file a feature request in our JIRA so that we don't forget about it? Kind regards, Andreas On Thu, Oct 25, 2012 at 3:30 AM, XiLai Dai xl...@talend.com wrote: Hi, Maybe make an improvement to the features:install command, letting it support the features repo name, e.g. features:install features_a/myfeature/1.0 Thanks. Xilai -Original Message- From: aj...@virginia.edu [mailto:aj...@virginia.edu] Sent: Wednesday, October 24, 2012 7:59 PM To: user@karaf.apache.org Subject: Re: same feature name exists in multi feature repo url That's what I meant by ordering behavior undefined. The order of search is what you seem to be asking about. To my understanding (I welcome correction) it isn't defined and you just shouldn't rely on it (for example, it could change between versions of Karaf). If you want to make sure a certain feature is loaded and no other, you have to distinguish it (by name or version) from every other. Perhaps you might try using your FQDN or the like as a prefix. --- A. Soroka Software Systems Engineering :: Online Library Environment the University of Virginia Library On Oct 23, 2012, at 11:03 PM, XiLai Dai wrote: Hi, My question is not about the version of the feature; there is only one feature with version 1.0 (name='myfeature' version='1.0'), defined in both of the two different features XML files which I provided in the mail. My question is: if a feature (same name and same version) exists in multiple features repo XMLs, and all these feature URLs have been installed into karaf (using features:addurl), then when I run: features:install myfeature/1.0 which one will be installed? The one from the features_a XML, or the one from the features_b XML? From the result of my test, no matter the order in which the features:addurl commands are executed, the last one from features:listurl will be installed into karaf. Is that right? How can a user select his preferred myfeature/1.0 to install? Thanks. 
Xilai -Original Message- From: aj...@virginia.edu [mailto:aj...@virginia.edu] Sent: Tuesday, October 23, 2012 10:01 PM To: user@karaf.apache.org Subject: Re: same feature name exists in multi feature repo url To be clear, if I understand you correctly, you're pointing out that the version attribute in feature XML _defines_ the version of the feature that the XML describes. The version of the feature that you offer when installing via the Maven protocol (say, via command-line features:install) is what Karaf _searches_ for in the available repositories, with ordering behavior undefined (howsoever the feature version in Maven was defined). If you don't give a version in your install operation, it'll search for the latest non-snapshot in the available repositories, with ordering behavior undefined. And just to be complete: if you use a feature version in a feature declared as a _dependency_ in feature XML, it has the second meaning. Is that right? Two different meanings, the first of which (defining the feature version) _only_ applies to the attribute in the XML? --- A. Soroka Software Systems Engineering :: Online Library Environment the University of Virginia Library On Oct 23, 2012, at 9:50 AM, Andreas Pieber wrote: Well, since you're using the mvn (maven) protocol, this needs to be installed at least in your local maven repository. The version of the features you've defined in the xml is of NO relevance here. Only the version you've given it in your maven repo is relevant. For starting out, using the file: protocol, directly pointing to your features in your file system, might be the easier option. You can still switch to mvn later once you're familiar with the concepts. Kind regards, Andreas On Oct 23, 2012 5:36 AM, XiLai Dai xl...@talend.com wrote: Hi, One features xml, named feature-a-1.0:

<?xml version="1.0" encoding="UTF-8"?>
<features name="feature_a" xmlns="http://karaf.apache.org/xmlns/features/v1.0.0">
  <feature name="myfeature" version="1.0">
    <bundle>mvn:org.apache.servicemix.bundles/org.apache.servicemix.bundles.jdom/1.1_4</bundle>
  </feature>
</features>

The other features xml, named feature-b-1.0:

<features name="feature_b" xmlns="http://karaf.apache.org/xmlns/features/v1.0.0">
  <feature name="myfeature" version="1.0">
    <bundle>mvn:commons-dbcp/commons-dbcp/1.4</bundle>
  </feature>
</features>

Both of them have a feature named myfeature/1.0. Then add the feature urls to karaf:

karaf@root> features:addurl mvn:org.test/features_a/1.0/xml
karaf@root> features:addurl mvn:org.test/features_b/1.0/xml

then install myfeature:

karaf@root> features:install myfeature

What is the expected search order for myfeature? I did some tests, but sometimes the myfeature from the first features xml is installed, sometimes the one from the second. Thanks. Xilai
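A compact restatement of the ambiguity discussed in this thread, plus the repo-qualified syntax proposed above (which was a feature request at the time, not something Karaf actually supported):

```
karaf@root> features:addurl mvn:org.test/features_a/1.0/xml
karaf@root> features:addurl mvn:org.test/features_b/1.0/xml
karaf@root> features:install myfeature/1.0
# which repository's myfeature wins is undefined

# proposed improvement (feature request only):
karaf@root> features:install features_a/myfeature/1.0
```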
Re: same feature name exists in multi feature repo url
Yep. Exactly to the point! Would you mind if I include your summary in the user manual? :-) Kind regards, Andreas On Oct 23, 2012 4:02 PM, aj...@virginia.edu aj...@virginia.edu wrote: To be clear, if I understand you correctly, you're pointing out that the version attribute in feature XML _defines_ the version of the feature that the XML describes. The version of the feature that you offer when installing via the Maven protocol (say, via command-line features:install) is what Karaf _searches_ for in the available repositories, with ordering behavior undefined (howsoever the feature version in Maven was defined). If you don't give a version in your install operation, it'll search for the latest non-snapshot in the available repositories, with ordering behavior undefined. And just to be complete: if you use a feature version in a feature declared as a _dependency_ in feature XML, it has the second meaning. Is that right? Two different meanings, the first of which (defining the feature version) _only_ applies to the attribute in the XML? --- A. Soroka Software Systems Engineering :: Online Library Environment the University of Virginia Library On Oct 23, 2012, at 9:50 AM, Andreas Pieber wrote: Well, since you're using the mvn (maven) protocol, this needs to be installed at least in your local maven repository. The version of the features you've defined in the xml is of NO relevance here. Only the version you've given it in your maven repo is relevant. For starting out, using the file: protocol, directly pointing to your features in your file system, might be the easier option. You can still switch to mvn later once you're familiar with the concepts. Kind regards, Andreas On Oct 23, 2012 5:36 AM, XiLai Dai xl...@talend.com wrote: Hi, One features xml, named feature-a-1.0:

<?xml version="1.0" encoding="UTF-8"?>
<features name="feature_a" xmlns="http://karaf.apache.org/xmlns/features/v1.0.0">
  <feature name="myfeature" version="1.0">
    <bundle>mvn:org.apache.servicemix.bundles/org.apache.servicemix.bundles.jdom/1.1_4</bundle>
  </feature>
</features>

The other features xml, named feature-b-1.0:

<features name="feature_b" xmlns="http://karaf.apache.org/xmlns/features/v1.0.0">
  <feature name="myfeature" version="1.0">
    <bundle>mvn:commons-dbcp/commons-dbcp/1.4</bundle>
  </feature>
</features>

Both of them have a feature named myfeature/1.0. Then add the feature urls to karaf:

karaf@root> features:addurl mvn:org.test/features_a/1.0/xml
karaf@root> features:addurl mvn:org.test/features_b/1.0/xml

then install myfeature:

karaf@root> features:install myfeature

What is the expected search order for myfeature? I did some tests, but sometimes the myfeature from the first features xml is installed, sometimes the one from the second. Thanks. Xilai
Re: Possible to point karaf-pax-exam to a existing unpacked karaf installation
Only for you Dan ;-) I've attached a VERY quick patch which might fix your problem. Feel free to apply it and test whether it helps you. Basically all you need to do is: apply the patch to the karaf-2.3.x branch (it is attached to the issue you created), build it yourself, and use the following constructor, where frameworkUrl points to an already unpacked directory: public KarafDistributionBaseConfigurationOption(String frameworkURL, String name, String karafVersion, Boolean frameworkUrlPointsToTargetDirectory) { I hope this helps. Kind regards, Andreas On Mon, Oct 15, 2012 at 8:42 AM, Dan Tran dant...@gmail.com wrote: Hi Andreas, Could you point me to the '2 already supported'? That would give some hint how to implement the 'alreadyUnpack' one. Or you can cook up the new option and I can test it right away :-) Thanks -D On Fri, Oct 5, 2012 at 1:09 PM, Andreas Pieber anpie...@gmail.com wrote: Right now it's not really possible. But it shouldn't be too hard to add a new distribution provider similar to the two already supported. So, while I personally don't need it, I think it's a good idea. Kind regards, Andreas On Oct 5, 2012 9:33 AM, Jean-Baptiste Onofré j...@nanthrax.net wrote: Hi Dan, it's not easily possible (you have to use KarafDistributionBaseConfigurationOption.unpackDirectory manually, etc.), but I think it's a good addition. We could add an option like UseExistingKaraf().setLocation(...) for instance. Regards JB On 10/05/2012 09:14 AM, Dan Tran wrote: Hi, I am able to run a simple pax-exam test with a given karaf archive using maven coordinates. Is there an option to use an existing karaf directory (i.e. already unpacked)? Thanks -D -- Jean-Baptiste Onofré jbono...@apache.org http://blog.nanthrax.net Talend - http://www.talend.com
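A sketch of how the patched constructor might be called from a pax-exam configuration. The directory path is a placeholder and the surrounding setup is omitted; only the constructor signature itself is taken from the patch quoted above, so treat this as an assumption-laden fragment, not released API:

```java
// Assumes the patch from the JIRA issue has been applied to the karaf-2.3.x
// branch; this constructor is not part of a released pax-exam version.
KarafDistributionBaseConfigurationOption karaf =
    new KarafDistributionBaseConfigurationOption(
        "file:/opt/karaf-unpacked",  // frameworkURL: hypothetical path to an already unpacked Karaf
        "apache-karaf",              // name of the distribution
        "2.3.0",                     // karafVersion
        true);                       // frameworkUrlPointsToTargetDirectory
```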
Re: Karaf and DOSGi throwing Null pointer Exception
@raw tag: seems like gmail does not like it. @error: well, without debugging the code I would simply guess it's related to something not correctly shutting down the bundles (since some parts of the framework might not know they run as a bundle). But independently of that: why does the exception bother you? Do you encounter any real errors because of it? Kind regards, Andreas On Thu, Oct 18, 2012 at 11:21 AM, dealbitte anand.boc...@gmail.com wrote: Hi, I believe you are not able to see the text between the tags in the email; I pasted the exception within raw tags. Anyway, here it is again: I installed the DOSGi Single-Bundle Distribution (1.3.1) from maven as follows: install -s mvn:org.apache.cxf.dosgi/cxf-dosgi-ri-singlebundle-distribution/1.3.1 .. But when I exit the karaf instance (by pressing Ctrl+D), I get the following exception. It is not clear to me why this exception occurs. Any help towards this is appreciated. karaf@test-instance ^D ERROR: Bundle cxf-dosgi-ri-singlebundle-distribution [50] EventDispatcher: Error during dispatch. 
(java.lang.NullPointerException) java.lang.NullPointerException at org.apache.cxf.dosgi.topologymanager.TopologyManager.notifyListenersOfRemovalIfAppropriate(TopologyManager.java:372) at org.apache.cxf.dosgi.topologymanager.TopologyManager.notifyListenersOfRemovalIfAppropriate(TopologyManager.java:217) at org.apache.cxf.dosgi.topologymanager.TopologyManager.removeService(TopologyManager.java:201) at org.apache.cxf.dosgi.topologymanager.ServiceListenerImpl.serviceChanged(ServiceListenerImpl.java:68) at org.apache.felix.framework.util.EventDispatcher.invokeServiceListenerCallback(EventDispatcher.java:871) at org.apache.felix.framework.util.EventDispatcher.fireEventImmediately(EventDispatcher.java:733) at org.apache.felix.framework.util.EventDispatcher.fireServiceEvent(EventDispatcher.java:662) at org.apache.felix.framework.Felix.fireServiceEvent(Felix.java:3890) at org.apache.felix.framework.Felix.access$000(Felix.java:79) at org.apache.felix.framework.Felix$2.serviceChanged(Felix.java:728) at org.apache.felix.framework.ServiceRegistry.unregisterService(ServiceRegistry.java:135) at org.apache.felix.framework.ServiceRegistrationImpl.unregister(ServiceRegistrationImpl.java:129) at org.apache.felix.framework.ServiceRegistry.unregisterServices(ServiceRegistry.java:178) at org.apache.felix.framework.Felix.stopBundle(Felix.java:2296) at org.apache.felix.framework.Felix.setActiveStartLevel(Felix.java:1215) at org.apache.felix.framework.StartLevelImpl.run(StartLevelImpl.java:266) at java.lang.Thread.run(Thread.java:662) ERROR: Bundle cxf-dosgi-ri-singlebundle-distribution [50] Error stopping bundle. 
(java.lang.NullPointerException) java.lang.NullPointerException at org.apache.cxf.dosgi.topologymanager.TopologyManager.removeRemoteServiceAdmin(TopologyManager.java:139) at org.apache.cxf.dosgi.topologymanager.RemoteServiceAdminList.removeRemoteServiceAdmin(RemoteServiceAdminList.java:83) at org.apache.cxf.dosgi.topologymanager.RemoteServiceAdminList$1.removedService(RemoteServiceAdminList.java:68) at org.osgi.util.tracker.ServiceTracker$Tracked.customizerRemoved(ServiceTracker.java:922) at org.osgi.util.tracker.AbstractTracked.untrack(AbstractTracked.java:351) at org.osgi.util.tracker.ServiceTracker.close(ServiceTracker.java:403) at org.apache.cxf.dosgi.topologymanager.RemoteServiceAdminList.stop(RemoteServiceAdminList.java:99) at org.apache.cxf.dosgi.topologymanager.Activator.stop(Activator.java:83) at org.apache.cxf.dosgi.singlebundle.AggregatedActivator.stopEmbeddedActivators(AggregatedActivator.java:137) at org.apache.cxf.dosgi.singlebundle.AggregatedActivator.stop(AggregatedActivator.java:51) at org.apache.felix.framework.util.SecureAction.stopActivator(SecureAction.java:651) at org.apache.felix.framework.Felix.stopBundle(Felix.java:2278) at org.apache.felix.framework.Felix.setActiveStartLevel(Felix.java:1215) at org.apache.felix.framework.StartLevelImpl.run(StartLevelImpl.java:266) at java.lang.Thread.run(Thread.java:662) ERROR: Bundle cxf-dosgi-ri-singlebundle-distribution [50] Error stopping mvn:org.apache.cxf.dosgi/cxf-dosgi-ri-singlebundle-distribution/1.3.1 (org.osgi.framework.BundleException: Activator stop error in bundle cxf-dosgi-ri-singlebundle-distribution [50].) java.lang.NullPointerException at org.apache.cxf.dosgi.topologymanager.TopologyManager.removeRemoteServiceAdmin(TopologyManager.java:139) at org.apache.cxf.dosgi.topologymanager.RemoteServiceAdminList.removeRemoteServiceAdmin(RemoteServiceAdminList.java:83) at
Re: Karaf and DOSGi throwing Null pointer Exception
Is it possible that you've forgotten to attach/paste the exception? :-) Kind regards, Andreas On Mon, Oct 15, 2012 at 1:54 PM, dealbitte anand.boc...@gmail.com wrote: Hi, I have created a karaf child instance using the 'admin:create' command and installed the DOSGi Single-Bundle Distribution (1.3.1) from maven as follows: I installed 2 other OSGi bundles (a service bundle and a service implementation bundle) that implement a simple service (printing some text). The service is published under http://localhost:9081/prediction From the browser, http://localhost:9081/prediction?wsdl displays the WSDL correctly. So everything looks fine. But when I exit the karaf instance (by pressing Ctrl+D), I get the following exception: It is not clear to me why this exception occurs. Any help towards this is appreciated. Thanks. -- View this message in context: http://karaf.922171.n3.nabble.com/Karaf-and-DOSGi-throwing-Null-pointer-Exception-tp4026408.html Sent from the Karaf - User mailing list archive at Nabble.com.
Re: 3rd Party Feature Definitions
Well, IIRC we've already discussed this on IRC some time ago. One of the main problems with it was that we would need to release all of those separately, which adds quite some work. But basically I'm with you. It's a PITA with those spring/aries/enterprise feature upgrades and that we have to wait for them. IMHO we should really re-discuss this issue and move anything not required into different features. Thanks to Christian's searchurl feature we could still make it pretty easy for people to add them afterwards if they like. This wouldn't make too much difference to how we're handling it right now anyhow... WDYT? Kind regards, Andreas On Tue, Oct 9, 2012 at 11:19 PM, Scott England-Sullivan sully6...@gmail.com wrote: Hi all, In a recent thread on the development list there was a discussion regarding the release of Karaf 2.3.0 and the possibility of holding it up to accommodate an update to Spring 3.1. It struck me: why is Karaf tied to a 3rd party release at all? Why isn't the modular container itself modular? Why aren't 3rd party support modules such as the Spring deployers externalized and allowed to progress at their own pace? Third party dependent modules should be developed against a given release of Karaf; they shouldn't drive it. There is the new karaf-webconsole project, so the precedent is there. Karaf is a great, light-weight container which puts a nice manageable wrapper on OSGi, with a great CLI, ConfigAdmin, provisioning, etc., and IMHO it should stay focused on just that at its core. The capabilities that simplify 3rd party support are goodness, but not required, and as such shouldn't drive the core's development. Now, maybe you really can't separate one from the other, though I don't see where it is tightly coupled. I also understand it is a greater challenge to manage, because the project becomes fractured, but maybe Karaf is at that point. In reality I am good either way but thought it was worth discussing. 
Best Regards, Scott ES -- -- Scott England-Sullivan Apache Camel Committer Principal Consultant / Sr. Architect | Red Hat, Inc. FuseSource is now part of Red Hat Web: fusesource.com | redhat.com Blog: sully6768.blogspot.com Twitter: sully6768
Re: 3rd Party Feature Definitions
OK, now that I finally found my way through the original thread causing this discussion, I'm an even stronger +1 for this topic than before. Get everything out of the core release which is not started by default in the default apache-karaf distribution. To make things easy for us, we might pack all those other features, commands, and so on into a single release structure which is roughly compatible with karaf core: 2.x.y(.z) for Karaf 2 compatible plugins and 3.x.y(.z) for Karaf 3 compatible extensions. This should make the vote and the release process easy enough for us, AND since we can version the features independently of the full release versions, users can still mix them as they see fit. Just something else to get the discussion about this going :-) Kind regards, Andreas On Thu, Oct 11, 2012 at 6:54 PM, Andreas Pieber anpie...@gmail.com wrote: Well, IIRC we've already discussed this on IRC some time ago. One of the main problems with it was that we would need to release all of those separately, which adds quite some work. But basically I'm with you. It's a PITA with those spring/aries/enterprise feature upgrades and that we have to wait for them. IMHO we should really re-discuss this issue and move anything not required into different features. Thanks to Christian's searchurl feature we could still make it pretty easy for people to add them afterwards if they like. This wouldn't make too much difference to how we're handling it right now anyhow... WDYT? Kind regards, Andreas On Tue, Oct 9, 2012 at 11:19 PM, Scott England-Sullivan sully6...@gmail.com wrote: Hi all, In a recent thread on the development list there was a discussion regarding the release of Karaf 2.3.0 and the possibility of holding it up to accommodate an update to Spring 3.1. It struck me: why is Karaf tied to a 3rd party release at all? Why isn't the modular container itself modular? Why aren't 3rd party support modules such as the Spring deployers externalized and allowed to progress at their own pace? Third party dependent modules should be developed against a given release of Karaf; they shouldn't drive it. There is the new karaf-webconsole project, so the precedent is there. Karaf is a great, light-weight container which puts a nice manageable wrapper on OSGi, with a great CLI, ConfigAdmin, provisioning, etc., and IMHO it should stay focused on just that at its core. The capabilities that simplify 3rd party support are goodness, but not required, and as such shouldn't drive the core's development. Now, maybe you really can't separate one from the other, though I don't see where it is tightly coupled. I also understand it is a greater challenge to manage, because the project becomes fractured, but maybe Karaf is at that point. In reality I am good either way but thought it was worth discussing. Best Regards, Scott ES -- -- Scott England-Sullivan Apache Camel Committer Principal Consultant / Sr. Architect | Red Hat, Inc. FuseSource is now part of Red Hat Web: fusesource.com | redhat.com Blog: sully6768.blogspot.com Twitter: sully6768
Re: pax-wicket for wicket 6.0
Hey Andi, TBH it might work, but there were some troubles with the wicket initializers and I'm not sure if everything works without problems. But it's definitely on my todo list to give this a shot before the 2.0.0 release. Kind regards, Andreas On Wed, Sep 26, 2012 at 1:27 PM, Andreas Kuhtz andreas.ku...@gmail.com wrote: What's the reason that all wicket classes are included in the org.ops4j.pax.wicket.service-2.0.0-SNAPSHOT.jar? I thought the wicket 6 bundles are already OSGi-ready? ... If they were not included you could support a range for the wicket version, which would allow using newer wicket versions without needing a new pax-wicket release. Best regards, Andi 2012/9/25 Andreas Pieber anpie...@gmail.com As said, pax wicket 2.0.0 is already ready; build the master locally; I just want to get some additional feedback before releasing. Kind regards, Andreas On Tue, Sep 25, 2012 at 6:18 AM, Dan Tran dant...@gmail.com wrote: will do, whenever pax wicket 2.0 is ready -D On Mon, Sep 24, 2012 at 9:07 PM, Andreas Pieber anpie...@gmail.com wrote: Well, actually good that you ask. Pax wicket trunk is already wicket 6 ready, and the changes for the wicket web console shouldn't be more than some namespace adaptions. So once I've checked the web console I can cut pax wicket 2.0.0 and we're good to go. Would you like to give it a shot? :-) Kind regards Andreas On Sep 24, 2012 10:40 PM, Dan Tran dant...@gmail.com wrote: Maybe it's way too early to ask, but I am asking anyway. :-) Do we have a timeframe for when the karaf team would like to support wicket 6.0? Thanks -D
Re: pax-wicket for wicket 6.0
As said, pax wicket 2.0.0 is already ready; build the master locally; I just want to get some additional feedback before releasing. Kind regards, Andreas On Tue, Sep 25, 2012 at 6:18 AM, Dan Tran dant...@gmail.com wrote: will do, whenever pax wicket 2.0 is ready -D On Mon, Sep 24, 2012 at 9:07 PM, Andreas Pieber anpie...@gmail.com wrote: Well, actually good that you ask. Pax wicket trunk is already wicket 6 ready, and the changes for the wicket web console shouldn't be more than some namespace adaptions. So once I've checked the web console I can cut pax wicket 2.0.0 and we're good to go. Would you like to give it a shot? :-) Kind regards Andreas On Sep 24, 2012 10:40 PM, Dan Tran dant...@gmail.com wrote: Maybe it's way too early to ask, but I am asking anyway. :-) Do we have a timeframe for when the karaf team would like to support wicket 6.0? Thanks -D
Re: Possible to start Karaf without opening any ports?
Hey Peter, Thank you very much for the update; would you mind including your findings as a patch to the manual? Thank you very much and kind regards, Andreas On Thu, Sep 6, 2012 at 3:51 PM, peterg peter.gardfjall.w...@gmail.com wrote: Okay, for what it's worth I'll respond to my own post: (NOTE: all of these settings apply to Karaf 2.2.9 -- they may not apply to the 3.x branch.) This is how to suppress the ports:
- Disable the shutdown port: set karaf.shutdown.port=-1 in etc/custom.properties or etc/config.properties.
- Disable the ssh port (8101): set the karaf.startRemoteShell system property to false. (Merely removing the ssh feature from featuresBoot doesn't work, since the ssh server is included as a startup bundle.)
- Disable Karaf's JMX management RMI ports (1099, 44444): uninstall the management bundle org.apache.karaf.management.server-2.2.9, either at runtime (for instance, from the console) or by commenting out its entry in etc/startup.properties. (Merely removing the management feature from featuresBoot doesn't work, since the management server is included as a startup bundle.)
- Disable the JMX port: remove -Dcom.sun.management.jmxremote from bin/karaf.
best regards, Peter -- View this message in context: http://karaf.922171.n3.nabble.com/Possible-to-start-Karaf-without-opening-any-ports-tp4025808p4025959.html Sent from the Karaf - User mailing list archive at Nabble.com.
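Peter's checklist condensed into the files it touches; a sketch for Karaf 2.2.x (property names as listed above, file locations per a stock Karaf 2.2.x install):

```properties
# etc/custom.properties (or etc/config.properties): disable the shutdown port
karaf.shutdown.port=-1

# set as a system property (e.g. a -D flag in bin/karaf):
# disable the remote ssh shell on port 8101
karaf.startRemoteShell=false
```

In addition, comment out the org.apache.karaf.management.server entry in etc/startup.properties to avoid the RMI ports, and remove -Dcom.sun.management.jmxremote from bin/karaf to avoid the JMX port.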
Re: karaf in production
Hey, While in theory a diff release would be possible, it's quite hard to do in reality. Someone would need to add a maven plugin checking either the SCM or the binaries for differences. BUT this could be quite a challenge, since some parts of a binary change for each release without any real changes; so a simple md5 calc won't do for .jar files... On the other hand, some scm diffs and pom parsing could do wonders... not sure... It would also (of course) be possible to generate such a list or diff release manually, but I think that is WAY too error-prone to have any real chance. So if you (or someone else) would like to give it a shot, I think we'd be very pleased to add an additional diff-based release to our release cycles; otherwise I'm afraid such a process would simply be too risky :-( Kind regards, Andreas On Mon, Sep 10, 2012 at 10:05 AM, lbu lburgazz...@gmail.com wrote: Hi Jean-Baptiste, thank you for the quick answer. Do you think it would be possible to ship, with each karaf release, a list of the files that have changed? This would simplify the upgrade process for the karaf installations I deploy on. e.g.:
* Karaf 2.2.9: - etc/shell.init.script - lib/karaf.jar
* Karaf 2.2.8: - etc/startup.properties - lib/karaf.jar
-- View this message in context: http://karaf.922171.n3.nabble.com/karaf-in-production-tp4025971p4025973.html Sent from the Karaf - User mailing list archive at Nabble.com.
Re: Equinox Console (Based on Gogo) in Karaf ?
Hey Michael, Just curious, but wouldn't it be easier/better to implement the diag and ss commands in Karaf (IIRC there's already something like diag in karaf) instead of making the equinox console work in Karaf? Just my 0.02€ Kind regards, Andreas On Mon, Sep 10, 2012 at 10:35 AM, Michael Täschner m.taesch...@gmail.com wrote: Hi JB, thanks for the fast response. Too bad this does not work as-is, because the equinox commands ss, diag, etc. would have been quite helpful. Thanks and Regards, Michael 2012/9/10 Jean-Baptiste Onofré j...@nanthrax.net Hi Michael, I never tried this, but I'm afraid that we will have some conflicts around the stream handler, etc. Regards JB On 09/10/2012 10:27 AM, Michael Täschner wrote: Hi, is it possible to get the new Equinox shell running in karaf? I put org.eclipse.equinox.console_1.0.0.v20120522-1841.jar into the deploy folder and the bundle is now stuck in STARTING. All imports are resolved and I would have expected it to start up nicely. Any ideas? Thanks and Regards, Michael -- Jean-Baptiste Onofré jbono...@apache.org http://blog.nanthrax.net Talend - http://www.talend.com
Re: switch to github
@4) no need for that; once you're through the review process, the apply happens within 24 hours (different time/work zones :-)). BTW, your pull request is almost through the pipe. Just waiting for two minor corrections and a new patch at the jira issue to get it in :-) Kind regards, Andreas On Sat, Aug 18, 2012 at 2:33 AM, Andrei Pozolotin andrei.pozolo...@gmail.com wrote: no, I have an improvement on that git.html: 1) do all your work on the github mirror via fork/pull 2) discuss your github pulls on ASF jira 3) when the jira is accepted, do git format-patch origin/trunk and attach patch.txt to the jira with the ASF grant check box 4) now pray to your favorite ASF committer to really accept the patch :-) Original Message Subject: Re: switch to github From: Johan Edstrom seij...@gmail.com To: user@karaf.apache.org Date: Fri 17 Aug 2012 06:52:45 PM CDT It is really clear. http://www.apache.org/dev/git.html On Aug 17, 2012, at 5:42 PM, Brian Topping topp...@codehaus.org wrote: On Aug 18, 2012, at 2:11 AM, Jean-Baptiste Onofré j...@nanthrax.net wrote: When you are not a committer on a project and you contribute a patch, you have to explicitly grant your license to the ASF. To do that, you just mention it by checking Grant ASF when attaching the file to the Jira. Yes, I appreciate that, but I thought we were trying to clarify whether Github pulls were an acceptable means of providing patches. It seems that they are not acceptable for non-committers, so the fact that there are pull requests obscures the fact that those pull requests are unusable and therefore not statistically relevant. Having said that, it would be good to concretely clarify that Github pulls are not acceptable for non-committers, avoiding any interpretation that Github is a means by which non-committers can provide value to the project. 
It's important because it is actually very difficult in my experience to get patches applied, which dissuades people from contributing and makes it appear that nobody is interested when there may in fact be many folks interested in contributing but find it too unproductive to do so. These misinterpretations are very damaging to a project since valuable contributions (however small or unimportant to one group) are never made, and folks of a mindset similar to the person who never contributed do not in turn ever start using the project because these features never made it in. This is very much an anti-pattern in ASF projects, but I've found it pretty common as well, so please don't interpret this as me calling out Karaf in particular. Brian
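The fork/jira/format-patch flow described in the thread above can be sketched as follows. This uses a hypothetical throwaway repository, with a local `trunk` branch standing in for the ASF mirror's `origin/trunk`; file names and commit messages are made up for illustration.

```shell
set -e
# Scratch repo standing in for a fork of the ASF github mirror.
dir=$(mktemp -d)
cd "$dir"
git init -q repo
cd repo
git config user.email you@example.com
git config user.name "You"

# Baseline commit; "trunk" plays the role of origin/trunk here.
echo base > file.txt
git add file.txt
git commit -qm "base"
git branch trunk

# Your contribution on top of trunk (steps 1 and 2 of the workflow).
echo fix >> file.txt
git commit -qam "my fix"

# Step 3: export the commits not in trunk as mailbox-format patch files.
git format-patch trunk   # writes 0001-my-fix.patch
```

The generated `0001-my-fix.patch` is what you would then attach to the JIRA issue, ticking the "Grant license to ASF" checkbox.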
Re: switch to github
The process Andrei described is really the best way to get things going tbh, since it makes the review process way easier for us. Besides, I think the real problem behind the entire github/pull-request discussion is that there's no really good jira/patch-based-review combination available at ASF. But independently of that, this is a point which will take another bunch of years to get corrected if you follow the core lists. The entire git/svn/review/... discussion has been going on there for years now with quite some bashing... curious about the outcome :-) Kind regards, Andreas On Sat, Aug 18, 2012 at 7:25 AM, Andreas Pieber anpie...@gmail.com wrote: @4) no need for that; once you're through the review process the apply happens within 24 hours (different time/work zones :-)). BTW, your pull request is almost through the pipe. Just waiting for two minor corrections and a new patch at the jira issue to get it in :-) Kind regards, Andreas On Sat, Aug 18, 2012 at 2:33 AM, Andrei Pozolotin andrei.pozolo...@gmail.com wrote: no, I have improvement on that git.html: 1) do all your work on github mirror via fork/pull 2) discuss your github pulls on ASF jira 3) when jira is accepted, do the git format-patch origin/trunk and attach patch.txt to the jira with ASF grant check box 4) now pray to your favorite ASF committer to really accept the patch :-) Original Message Subject: Re: switch to github From: Johan Edstrom seij...@gmail.com To: user@karaf.apache.org Date: Fri 17 Aug 2012 06:52:45 PM CDT It is really clear. http://www.apache.org/dev/git.html On Aug 17, 2012, at 5:42 PM, Brian Topping topp...@codehaus.org wrote: On Aug 18, 2012, at 2:11 AM, Jean-Baptiste Onofré j...@nanthrax.net wrote: When you are not committer on a project, and you contribute a patch, you have to explicitly grant your license to ASF. To do that, you just mention it by checking Grant ASF when attaching the file to the Jira. 
Yes, i appreciate that, but I thought we were trying to clarify whether Github pulls were acceptable means of providing patches. It seems that they are not acceptable for non-committers, so the fact that there are pull requests obscures the fact that those pull requests are unusable and therefore not statistically relevant. Having said that, it would be good to concretely clarify that Github pulls are not acceptable for non-committers, avoiding any interpretation that Github is a means by which non-committers can provide value to the project. It's important because it is actually very difficult in my experience to get patches applied, which dissuades people from contributing and makes it appear that nobody is interested when there may in fact be many folks interested in contributing but find it too unproductive to do so. These misinterpretations are very damaging to a project since valuable contributions (however small or unimportant to one group) are never made, and folks of a mindset similar to the person who never contributed do not in turn ever start using the project because these features never made it in. This is very much an anti-pattern in ASF projects, but I've found it pretty common as well, so please don't interpret this as me calling out Karaf in particular. Brian
Re: No info log output when installing a feature
Hey Martin, Sounds like a great idea; can you create a jira for this so we don't lose track of this improvement? (an attached patch is, btw, always welcome too ;-)) Kind regards, Andreas On Tue, Aug 7, 2012 at 3:30 PM, lichtin lich...@yahoo.com wrote: Hi During testing with Pax-Exam, I noticed there's no (info) log output at all during a feature installation. Installing some larger feature can take a long time and it would be nice to see some progress. My suggestion is to change the log level to INFO for the Installing bundle lines: 2012-08-07 14:57:21,760 | DEBUG | Thread-7 | FeaturesServiceImpl | 39 - org.apache.karaf.features.core - 2.2.8 | Installing bundle mvn:org.apache.cxf/cxf-rt-ws-addr/2.6.1 2012-08-07 14:57:21,846 | DEBUG | Thread-7 | FeaturesServiceImpl | 39 - org.apache.karaf.features.core - 2.2.8 | Installing bundle mvn:org.apache.servicemix.bundles/org.apache.servicemix.bundles.ehcache/2.5.1_1 and perhaps also for the Installing feature lines: 2012-08-07 14:57:47,359 | DEBUG | Thread-7 | FeaturesServiceImpl | 39 - org.apache.karaf.features.core - 2.2.8 | Installing feature camel-core 2.10.0 2012-08-07 14:57:47,359 | DEBUG | Thread-7 | FeaturesServiceImpl | 39 - org.apache.karaf.features.core - 2.2.8 | Installing feature spring 3.0.7.RELEASE -- View this message in context: http://karaf.922171.n3.nabble.com/No-info-log-output-when-installing-a-feature-tp4025433.html Sent from the Karaf - User mailing list archive at Nabble.com.
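Until such a code change lands, a configuration-only workaround is possible. This is a sketch, assuming Karaf 2.x where etc/org.ops4j.pax.logging.cfg uses log4j properties syntax and the logger category matches the class name shown in the log lines above; it lowers the threshold for just the features service, so the "Installing bundle" lines appear without enabling DEBUG globally:

```properties
# etc/org.ops4j.pax.logging.cfg -- sketch, not verified against every release.
# Enable DEBUG only for the features service category so that the
# "Installing bundle ..." / "Installing feature ..." lines become visible.
log4j.logger.org.apache.karaf.features.internal.FeaturesServiceImpl = DEBUG
```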
Re: No info log output when installing a feature
but those are written to the console; maybe we should also write some basic output to the log at INFO level during the process? Kind regards, Andreas On Tue, Aug 7, 2012 at 4:19 PM, Jean-Baptiste Onofré j...@nanthrax.net wrote: Hi guys, FYI, in pure Karaf commands, we have the -v (verbose) option to see where we are in the features installation process (to know which bundle is in download, etc). Regards JB On 08/07/2012 03:34 PM, Andreas Pieber wrote: Hey Martin, Sounds like a great idea; can you create an jira for this that we don't loose track of this improvement? (an attached patch is btw, always welcomed too ;-)) Kind regards, Andreas On Tue, Aug 7, 2012 at 3:30 PM, lichtin lich...@yahoo.com wrote: Hi During testing with Pax-Exam, I noticed there's no (info) log output at all during a feature installation. Installing some larger feature can take a long time and it would be nice to see some progress. My suggestion is to change the log level to INFO for the Installing bundle lines: 2012-08-07 14:57:21,760 | DEBUG | Thread-7 | FeaturesServiceImpl | 39 - org.apache.karaf.features.core - 2.2.8 | Installing bundle mvn:org.apache.cxf/cxf-rt-ws-addr/2.6.1 2012-08-07 14:57:21,846 | DEBUG | Thread-7 | FeaturesServiceImpl | 39 - org.apache.karaf.features.core - 2.2.8 | Installing bundle mvn:org.apache.servicemix.bundles/org.apache.servicemix.bundles.ehcache/2.5.1_1 and perhaps also for the Installing feature lines: 2012-08-07 14:57:47,359 | DEBUG | Thread-7 | FeaturesServiceImpl | 39 - org.apache.karaf.features.core - 2.2.8 | Installing feature camel-core 2.10.0 2012-08-07 14:57:47,359 | DEBUG | Thread-7 | FeaturesServiceImpl | 39 - org.apache.karaf.features.core - 2.2.8 | Installing feature spring 3.0.7.RELEASE -- View this message in context: http://karaf.922171.n3.nabble.com/No-info-log-output-when-installing-a-feature-tp4025433.html Sent from the Karaf - User mailing list archive at Nabble.com. 
-- Jean-Baptiste Onofré jbono...@apache.org http://blog.nanthrax.net Talend - http://www.talend.com
Re: No info log output when installing a feature
IMHO the verbose feature is interesting for everything on the command line; the log should always be written, at least to some extent, independently of the -v param; shouldn't it? Kind regards, Andreas On Tue, Aug 7, 2012 at 5:35 PM, Jean-Baptiste Onofré j...@nanthrax.net wrote: We already did this for the instance (add a verbose flag to write in the log file), so it makes sense to do the same for features. Regards JB On 08/07/2012 05:15 PM, Andreas Pieber wrote: but those are written to the console; maybe we should have some basic log to the log during the process too in info mode? Kind regards, Andreas On Tue, Aug 7, 2012 at 4:19 PM, Jean-Baptiste Onofré j...@nanthrax.net wrote: Hi guys, FYI, in pure Karaf commands, we have the -v (verbose) option to see where we are in the features installation process (to know which bundle is in download, etc). Regards JB On 08/07/2012 03:34 PM, Andreas Pieber wrote: Hey Martin, Sounds like a great idea; can you create an jira for this that we don't loose track of this improvement? (an attached patch is btw, always welcomed too ;-)) Kind regards, Andreas On Tue, Aug 7, 2012 at 3:30 PM, lichtin lich...@yahoo.com wrote: Hi During testing with Pax-Exam, I noticed there's no (info) log output at all during a feature installation. Installing some larger feature can take a long time and it would be nice to see some progress. 
My suggestion is to change the log level to INFO for the Installing bundle lines: 2012-08-07 14:57:21,760 | DEBUG | Thread-7 | FeaturesServiceImpl | 39 - org.apache.karaf.features.core - 2.2.8 | Installing bundle mvn:org.apache.cxf/cxf-rt-ws-addr/2.6.1 2012-08-07 14:57:21,846 | DEBUG | Thread-7 | FeaturesServiceImpl | 39 - org.apache.karaf.features.core - 2.2.8 | Installing bundle mvn:org.apache.servicemix.bundles/org.apache.servicemix.bundles.ehcache/2.5.1_1 and perhaps also for the Installing feature lines: 2012-08-07 14:57:47,359 | DEBUG | Thread-7 | FeaturesServiceImpl | 39 - org.apache.karaf.features.core - 2.2.8 | Installing feature camel-core 2.10.0 2012-08-07 14:57:47,359 | DEBUG | Thread-7 | FeaturesServiceImpl | 39 - org.apache.karaf.features.core - 2.2.8 | Installing feature spring 3.0.7.RELEASE -- View this message in context: http://karaf.922171.n3.nabble.com/No-info-log-output-when-installing-a-feature-tp4025433.html Sent from the Karaf - User mailing list archive at Nabble.com. -- Jean-Baptiste Onofré jbono...@apache.org http://blog.nanthrax.net Talend - http://www.talend.com -- Jean-Baptiste Onofré jbono...@apache.org http://blog.nanthrax.net Talend - http://www.talend.com
Re: Why is JDK required for running Karaf?
btw, another option to work around this problem is to provide a custom jre; we typically deliver a JDK with our container, BUT for windows we copy the bin directory from the JDK over the JRE; that is all that's required to fix the problem. Just to offer an additional option. Kind regards, Andreas On Tue, Jul 31, 2012 at 12:44 PM, Hervé BARRAULT herve.barra...@gmail.com wrote: Hi, the java environment can be defined as server or client (http://www.oracle.com/technetwork/java/hotspotfaq-138619.html#compiler_types). Using a JRE under linux provides both server and client options. Using a JRE under windows provides only a client. (the server part is only provided with JDK) When using karaf, the option is set to server, so when running under windows, you need the JDK. As users, we noticed that for previous versions. I don't know if current versions need more than this option. Regards Hervé On Tue, Jul 31, 2012 at 12:15 PM, Christoph Gritschenberger christoph.gritschenber...@gmail.com wrote: Hi, I ran into an Error when I tried to run karaf on a Windows XP with Oracle JRE 7u5 installed. Error: missing `server' JVM at `C:\Programme\Java\jre7\bin\server\jvm.dll'. Please install or use the JRE or JDK that contains these missing components. I then discovered that the karaf-documentation states that a JDK is required to run karaf. I installed JDK 7u5 and everything went fine. But I have some difficulties explaining to certain customers that they need to install a Development Kit to run this thing. So my question is, why exactly is it required to install JDK? Is there no other way? Thanks. kind regards, christoph
Re: Why is JDK required for running Karaf?
argl.. sorry for the mistake; we provide a JRE and copy over the bin dir from the JDK :-) so, now it's correct. Kind regards, Andreas On Tue, Jul 31, 2012 at 12:47 PM, Andreas Pieber anpie...@gmail.com wrote: btw, another option to workaround this problem is to provide a custom jre; we typically deliver a JDK with our container, BUT for windows we copy over the bin directory from the JDK over the JRE, this is all required to fix that problem. Just to offer an additional option. Kind regards, Andreas On Tue, Jul 31, 2012 at 12:44 PM, Hervé BARRAULT herve.barra...@gmail.com wrote: Hi, the java environment can be defined as server or client (http://www.oracle.com/technetwork/java/hotspotfaq-138619.html#compiler_types). Using a JRE under linux provide both server and client option. Using a JRE under windows provides only a client. (the server part is only provided with JDK) When using karaf, the option is set to server so when running under windows, you need the JDK. As users, we noticed that for previous versions. I don't know if current versions need more than this option. Regards Hervé On Tue, Jul 31, 2012 at 12:15 PM, Christoph Gritschenberger christoph.gritschenber...@gmail.com wrote: Hi, I ran into an Error when I tried to run karaf on a Windows XP with Oracle JRE 7u5 installed. Error: missing `server' JVM at `C:\Programme\Java\jre7\bin\server\jvm.dll'. Please install or use the JRE or JDK that contains these missing components. I then discovered that the karaf-documentation states that a JDK is required to run karaf. I installed JDK 7u5 and everything went fine. But I have some difficulties explaining to certain customers that they need to install a Development Kit to run this thing. So my question is, why exactly is it required to install JDK? Is there no other way? Thanks. kind regards, christoph
Re: Why is JDK required for running Karaf?
well, looking at [1] I don't think that there is any difference. Since karaf could run as a server BUT also as a client environment, maybe we should a) make it possible to switch between those with a param and b) add a fallback solution for windows. WDYT? kind regards, Andreas On Tue, Jul 31, 2012 at 1:02 PM, Christoph Gritschenberger christoph.gritschenber...@gmail.com wrote: Hi, With this, I get some syntax-error -Dcom.sun.management.jmxremote ist syntaktisch an dieser Stelle nicht verarbeitbar [roughly: "was unexpected at this point"]. (Sorry for the german, but that's my only WinXP I got lying around.) I tried with skipping the quotes and jmxremote-part, then everything seemed to work. set JAVA_OPTS=-Xmx512M bin\karaf.bat Maybe the karaf.bat-script should detect whether -server is supported. Or is it required for some feature(s)? kind regards, christoph On 31/07/12 12:35, Guillaume Nodet wrote: Could you try running without the -server flags maybe ? set JAVA_OPTS=-Xmx512M -Dcom.sun.management.jmxremote bin\karaf.bat On Tue, Jul 31, 2012 at 12:15 PM, Christoph Gritschenberger christoph.gritschenber...@gmail.com wrote: Hi, I ran into an Error when I tried to run karaf on a Windows XP with Oracle JRE 7u5 installed. Error: missing `server' JVM at `C:\Programme\Java\jre7\bin\server\jvm.dll'. Please install or use the JRE or JDK that contains these missing components. I then discovered that the karaf-documentation states that a JDK is required to run karaf. I installed JDK 7u5 and everything went fine. But I have some difficulties explaining to certain customers that they need to install a Development Kit to run this thing. So my question is, why exactly is it required to install JDK? Is there no other way? Thanks. kind regards, christoph
Re: Why is JDK required for running Karaf?
btw ;-) [1] http://stackoverflow.com/questions/198577/real-differences-between-java-server-and-java-client Kind regards, Andreas On Tue, Jul 31, 2012 at 1:14 PM, Andreas Pieber anpie...@gmail.com wrote: well, looking at [1] I don't think that there is any difference. Since karaf could run as a server BUT also as a client environment we maybe should a) make it possible to switch between those with a param and b) add a fallback solution for windows WDYT? kind regards, Andreas On Tue, Jul 31, 2012 at 1:02 PM, Christoph Gritschenberger christoph.gritschenber...@gmail.com wrote: Hi, With this, I get some syntax-error -Dcom.sun.management.jmxremote ist syntaktisch an dieser Stelle nicht verarbeitbar. (Sorry for the german, but that's my only WinXP I got lying around.) I tried with skipping the quotes and jmxremote-part, then everything seemed to work. set JAVA_OPTS=-Xmx512M bin\karaf.bat Maybe the karaf.bat-script should detect whether -server is supported. Or is it required for some feature(s)? kind regards, christoph On 31/07/12 12:35, Guillaume Nodet wrote: Could you try running without the -server flags maybe ? set JAVA_OPTS=-Xmx512M -Dcom.sun.management.jmxremote bin\karaf.bat On Tue, Jul 31, 2012 at 12:15 PM, Christoph Gritschenberger christoph.gritschenber...@gmail.com wrote: Hi, I ran into an Error when I tried to run karaf on a Windows XP with Oracle JRE 7u5 installed. Error: missing `server' JVM at `C:\Programme\Java\jre7\bin\server\jvm.dll'. Please install or use the JRE or JDK that contains these missing components. I then discovered that the karaf-documentation states that a JDK is required to run karaf. I installed JDK 7u5 and everything went fine. But I have some difficulties explaining to certain customers that they need to install a Development Kit to run this thing. So my question is, why exactly is it required to install JDK? Is there no other way? Thanks. kind regards, christoph
Re: Why is JDK required for running Karaf?
@Christoph: can you create a JIRA for the problem so we don't lose track of it? Kind regards, Andreas On Tue, Jul 31, 2012 at 1:57 PM, Guillaume Nodet gno...@gmail.com wrote: Yes, I think if the default windows JRE does not support server, we should not include this option by default on windows. Unless we can actually find if it's present or not, which would be even better. On Tue, Jul 31, 2012 at 1:02 PM, Christoph Gritschenberger christoph.gritschenber...@gmail.com wrote: Hi, With this, I get some syntax-error -Dcom.sun.management.jmxremote ist syntaktisch an dieser Stelle nicht verarbeitbar. (Sorry for the german, but that's my only WinXP I got lying around.) I tried with skipping the quotes and jmxremote-part, then everything seemed to work. set JAVA_OPTS=-Xmx512M bin\karaf.bat Maybe the karaf.bat-script should detect whether -server is supported. Or is it required for some feature(s)? kind regards, christoph On 31/07/12 12:35, Guillaume Nodet wrote: Could you try running without the -server flags maybe ? set JAVA_OPTS=-Xmx512M -Dcom.sun.management.jmxremote bin\karaf.bat On Tue, Jul 31, 2012 at 12:15 PM, Christoph Gritschenberger christoph.gritschenber...@gmail.com wrote: Hi, I ran into an Error when I tried to run karaf on a Windows XP with Oracle JRE 7u5 installed. Error: missing `server' JVM at `C:\Programme\Java\jre7\bin\server\jvm.dll'. Please install or use the JRE or JDK that contains these missing components. I then discovered that the karaf-documentation states that a JDK is required to run karaf. I installed JDK 7u5 and everything went fine. But I have some difficulties explaining to certain customers that they need to install a Development Kit to run this thing. So my question is, why exactly is it required to install JDK? Is there no other way? Thanks. kind regards, christoph -- Guillaume Nodet Blog: http://gnodet.blogspot.com/ FuseSource, Integration everywhere http://fusesource.com
Re: make project import work with m2e
hey Andrei, The patch looks fine to me; can you please attach the patch to the corresponding jira issue [1] and grant it for inclusion? I'll take care of getting it included. Thanks and kind regards, Andreas [1] https://issues.apache.org/jira/browse/KARAF-1048 On Sat, Jul 28, 2012 at 6:46 PM, Jean-Baptiste Onofré j...@nanthrax.net wrote: Hi Andrei, sorry to have missed the previous request. Let me take a look ;) Regards JB On 07/28/2012 05:42 PM, Andrei Pozolotin wrote: *Jean-Baptiste*, *hello;* I asked for this before; I hope I have more luck this time :-) make project import work with m2e https://github.com/apache/karaf/pull/5 Thank you, Andrei. -- Jean-Baptiste Onofré jbono...@apache.org http://blog.nanthrax.net Talend - http://www.talend.com
Re: etc files for new instance
not right now, but since there were quite a number of requests by now it might be a good idea to create a new feature request jira. Kind regards, Andreas On Mon, Jul 30, 2012 at 9:49 PM, helander leh...@gmail.com wrote: I found out that the /etc files for a newly created instance were not copied from the KARAF-HOME/etc directory, but from resources in one of the admin bundles. For a custom distro where the /etc files are typically customized, this becomes a problem, since the created instances will not contain the customized /etc files. Is there any way that you can control what /etc files get copied to the new instance? /Lars -- View this message in context: http://karaf.922171.n3.nabble.com/etc-files-for-new-instance-tp4025370.html Sent from the Karaf - User mailing list archive at Nabble.com.
Re: Unpacked KAR deployment
Hey Lars, I think it would be better if you create the issue yourself. It hasn't been created yet, but that way you will get notifications about the current state of the issue, and it will be easier for us to ask further questions about the feature if required :-) Thanks and kind regards, Andreas On Thu, Jul 26, 2012 at 5:57 PM, helander leh...@gmail.com wrote: Ok, thanks. /Lars -- View this message in context: http://karaf.922171.n3.nabble.com/Unpacked-KAR-deployment-tp4025335p4025344.html Sent from the Karaf - User mailing list archive at Nabble.com.
Re: Felix File install and Weaving Hooks
TBH I'm not sure if we want this refresh per default, since it could be quite costly for bundles with lots of work to do. Nevertheless, an optional parameter at the feature requesting a reload of hot-deployed bundles might be interesting; something like feature name=xxx refresh-hot-deployed=true... Independently of that, first of all it might be interesting to figure out if this works at all as expected. Basically I would say yes, but there's always some difference between theory and practice ;-) would you mind trying this out with a quick prototype to see if such a refresh would do the trick at all? Kind regards, Andreas On Wed, Jul 25, 2012 at 2:22 PM, Christoph Gritschenberger christoph.gritschenber...@gmail.com wrote: Hi, I just had a look at Christian Schneider's code that delays startup of the console. It just periodically checks the list of installed bundles and stops when its startup is finished. I don't think it can be reused for this purpose. I think the real problem is that bundles with start-level 80 (default for bundles in deploy) are started before other bundles specified in features-core with lower start-level (in our case 40). A specific solution would be, when starting the bundle providing the weaving hook, to check all installed bundles whether they would require weaving. If a bundle does, refresh it, so that the weaving hook is applied. Maybe this can be fixed by adding an option to refresh all hot-deployed bundles when features-core is finished installing the features from features.cfg WDYT? kind regards, christoph On 18/07/12 19:07, Andreas Pieber wrote: maybe there is some other way delay deploy folder loading till all bundles are at least started? Since Christian is after something here anyhow (he'll need something similar for his startup logic anyhow) maybe we can reuse parts of this logic? Or is there something completely different possibly here? 
Kind regards, Andreas On Wed, Jul 18, 2012 at 6:56 PM, Guillaume Nodet gno...@gmail.com wrote: Yes, that would be a problem, because fileinstall is the one that grabs all the configuration from the etc/ directory and that needs to be done early in the process. On Wed, Jul 18, 2012 at 6:37 PM, Christoph Gritschenberger christoph.gritschenber...@gmail.com wrote: Hi, We recently started using Weaving Hooks in our Project (running in karaf-3.0.0-SNAPSHOT) and ran into a Problem: When deploying a bundle using the deploy-directory the bundle might get installed before the weaving hook is active. This causes the bundle to remain unweaved. We described the weaving-hook as part of a feature in a features.xml and added it to featuresBoot in org.apache.karaf.features.cfg. (We use start-level 40 for it as start-levels lower than 30 caused issues with aries because it obviously does not like bundle that are started before itself.) Now the Felix File installer is started with start-level 17, so it might pick up the bundle in the deploy-folder way before Karaf loads the feature-core that would install the feature of the weaver. My question is, does it even make sense to start the file installer earlier than the features-core-bundle? As far as I can tell it would make more sense to start the Feature-core before the file-installer, or does that cause other problems? WDYT? kind regards, christoph -- Guillaume Nodet Blog: http://gnodet.blogspot.com/ FuseSource, Integration everywhere http://fusesource.com
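The option floated in the reply above could look like this in a features file. To be clear, refresh-hot-deployed is purely hypothetical syntax for the proposal (it is not an existing Karaf attribute), and the feature/bundle names are made up:

```xml
<!-- Hypothetical sketch of the proposed option; "refresh-hot-deployed"
     does not exist in Karaf today. It would ask Karaf to refresh all
     hot-deployed bundles once this feature (e.g. one providing a
     weaving hook) has finished installing. -->
<feature name="my-weaver" version="1.0.0" refresh-hot-deployed="true">
  <bundle start-level="40">mvn:org.example/weaving-hook/1.0.0</bundle>
</feature>
```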
Re: Strange issue with maven repositories and org.apache.karaf.webconsole.features
no, that should be a configuration in pax-url-aether. I'm not sure right now if this is possible at all for pax-url or if we simply configured it wrong. The documentation, and therefore the google entries, are quite short on this topic :-) Best to create a Karaf Jira so we don't forget about it; in other words, it's definitely a bug (even if only a configuration one for missing options). Kind regards, Andreas On Mon, Jul 23, 2012 at 1:20 PM, Sammy Ramareddy sammy.ramare...@solarmax.com wrote: Hello, I'm using karaf 2.2.8 with equinox running on debian with java 1.7.0_03. I'm having a strange issue where karaf needs a lot of time (around 5-10 minutes) to generate for the first time the features tab in the webconsole or give some feedback to a features:list command in the ssh console. I currently have the following urls configured in $KARAF_HOME/etc/org.ops4j.pax.url.mvn.cfg: org.ops4j.pax.url.mvn.repositories= \ http://192.168.2.224:8080/artifactory/repo@snapshots, \ http://i0019231.subdomain.domain.tld:8080/artifactory/repo@snapshots The second url is actually pointing to the same artifactory repository as the first one, just running with another IP (within our VPN subnet). Once in production, karaf will run outside our network and will need to reach our artifactory server through a VPN connection, hence the need for two urls. The problem is the following: when the VPN connection is *not* running, and thus i0019231.subdomain.domain.tld is not reachable (its IP is 172.21.x.x), karaf needs 5-10 minutes to be able to start the features-plugin. When this is happening, the following entry is in the Karaf log: failed to open bundleresource://78.fwk10937487/ where bundle 78 is org.apache.karaf.webconsole.features This issue happens only when the VPN client is not running and thus the second repository is not reachable. When the second url is reachable, then everything runs fine. Why does the features-plugin need so long to time out? Am I doing something wrong? 
Is it a bug? Thanks a lot for any help ! Best Regards Sam
Re: Felix File install and Weaving Hooks
maybe there is some other way to delay deploy folder loading till all bundles are at least started? Since Christian is after something here anyhow (he'll need something similar for his startup logic anyhow) maybe we can reuse parts of this logic? Or is there something completely different possible here? Kind regards, Andreas On Wed, Jul 18, 2012 at 6:56 PM, Guillaume Nodet gno...@gmail.com wrote: Yes, that would be a problem, because fileinstall is the one that grabs all the configuration from the etc/ directory and that needs to be done early in the process. On Wed, Jul 18, 2012 at 6:37 PM, Christoph Gritschenberger christoph.gritschenber...@gmail.com wrote: Hi, We recently started using Weaving Hooks in our Project (running in karaf-3.0.0-SNAPSHOT) and ran into a Problem: When deploying a bundle using the deploy-directory the bundle might get installed before the weaving hook is active. This causes the bundle to remain unweaved. We described the weaving-hook as part of a feature in a features.xml and added it to featuresBoot in org.apache.karaf.features.cfg. (We use start-level 40 for it as start-levels lower than 30 caused issues with aries because it obviously does not like bundles that are started before itself.) Now the Felix File installer is started with start-level 17, so it might pick up the bundle in the deploy-folder way before Karaf loads the feature-core that would install the feature of the weaver. My question is, does it even make sense to start the file installer earlier than the features-core-bundle? As far as I can tell it would make more sense to start the Feature-core before the file-installer, or does that cause other problems? WDYT? kind regards, christoph -- Guillaume Nodet Blog: http://gnodet.blogspot.com/ FuseSource, Integration everywhere http://fusesource.com
Re: ActiveMQ plug-in using activemq-blueprint
of course it's possible. See [1] e.g. for an example. Kind regards, Andreas [1] https://github.com/openengsb/openengsb-framework/tree/v2.5.0/infrastructure/jms On Wed, Jun 20, 2012 at 9:08 PM, Rajbir Saini rajbsa...@yahoo.com wrote: Hi, Is it possible to configure the ActiveMQ plug-in using activemq-blueprint? I searched around but could not find a way to configure it. Thanks, Raj
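For reference, a minimal broker definition via activemq-blueprint might look like the sketch below. The broker name, port, and connector are assumptions for illustration; it presumes the activemq-blueprint feature is installed so that a blueprint XML file like this can be dropped into Karaf's deploy folder:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<blueprint xmlns="http://www.osgi.org/xmlns/blueprint/v1.0.0">
  <!-- Embedded broker configured through the ActiveMQ XML namespace;
       names and port are illustrative, not from the original thread. -->
  <broker xmlns="http://activemq.apache.org/schema/core"
          brokerName="exampleBroker" useJmx="true">
    <transportConnectors>
      <transportConnector uri="tcp://0.0.0.0:61616"/>
    </transportConnectors>
  </broker>
</blueprint>
```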
Re: Odd behaviors with custom assembly
Yep, the framework... as I expected. Simply remove karaf-framework from your featuresBoot tag and you're good to go... I've had no time yet to investigate the problem in detail, but it seems that it triggers some core lib to start twice, producing this strange result... Kind regards, Andreas On Fri, Apr 20, 2012 at 03:11, Chris Geer ch...@cxtsoftware.com wrote: Andreas, I finally got some time today to figure out a simple way to recreate it, and it is simple. Start with the lastest SMX trunk Build apache-servicemix-minimal assembly. Deploy and run to verify it starts normally without any oddness Modify the src/main/filtered-resources/etc/org.apache.karaf.features.cfg file and change the featuresBoot to the following: featuresBoot=karaf-framework,config,transaction,eventadmin,management,activemq-blueprint,activeq-web-console,webconsole,cxf,camel,camel-blueprint,camel-activemq Rebuild the assembly Deploy and run On that last run I will see double branding and errors about not being able to persist features. I am running on Mac OS X Lion and using the tar.gz deployment file if it matters. I've done this lots of different ways with both SMX 4.4.1 and SMX 4.5.0 starting points. I haven't started with just Karaf since I need all the features SMX integrates. After writing and testing all of that I just realized you can reproduce the problem by simply downloading the normal servicemix-minimal distro from the website and changing the featuresBoot as described above. No compiling required. Chris On Thu, Apr 19, 2012 at 12:20 AM, Andreas Pieber anpie...@gmail.com wrote: I know those problems; they've occurred for me when I manipulated the features.cfg in a way that the framework was started twice. Can you write down an exact todo list to reproduce this issue? 
thx and kind regards, Andreas On Wed, Apr 18, 2012 at 17:48, Chris Geer ch...@cxtsoftware.com wrote: As I've been getting more comfortable with Karaf (ServiceMix) I thought I was ready to be a little more adventurous and go ahead and build my own custom assembly with some upgraded packages I needed but I'm getting some odd behavior and I can't pinpoint what exactly caused the issue. Since I'm essentially using ServiceMix I started with their assemblies as an example to build mine and really only changed the package versions. - Camel 2.9.2-SNAPSHOT - CXF 2.5.2 When I start my distro though I get two odd things 1) The initial branding (logo and such) shows up twice 2) I get the following exception which shows up on the console, not log org.apache.karaf.features.core[org.apache.karaf.features.internal.FeaturesServiceImpl] : Error persisting FeaturesService state java.lang.IllegalStateException: Invalid BundleContext. at org.apache.felix.framework.BundleContextImpl.checkValidity(BundleContextImpl.java:365) at org.apache.felix.framework.BundleContextImpl.getDataFile(BundleContextImpl.java:347) at org.apache.karaf.features.internal.FeaturesServiceImpl.saveState(FeaturesServiceImpl.java:1049) at org.apache.karaf.features.internal.FeaturesServiceImpl.installFeatures(FeaturesServiceImpl.java:407) at org.apache.karaf.features.internal.FeaturesServiceImpl$1.run(FeaturesServiceImpl.java:975) org.apache.karaf.features.core[org.apache.karaf.features.internal.FeaturesServiceImpl] : Error persisting FeaturesService state java.lang.IllegalStateException: Invalid BundleContext. 
    at org.apache.felix.framework.BundleContextImpl.checkValidity(BundleContextImpl.java:365)
    at org.apache.felix.framework.BundleContextImpl.getDataFile(BundleContextImpl.java:347)
    at org.apache.karaf.features.internal.FeaturesServiceImpl.saveState(FeaturesServiceImpl.java:1049)
    at org.apache.karaf.features.internal.FeaturesServiceImpl$1.run(FeaturesServiceImpl.java:980)
Everything seems to run just fine, but those seemed odd. What makes it even weirder is that neither of those things happens on future launches of the container. Any thoughts on what might cause these, and should I be concerned? Thanks, Chris
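Per Andreas' fix, the corrected featuresBoot simply drops karaf-framework (the framework is booted by the startup process itself, so listing it starts it a second time). A sketch of the resulting etc/org.apache.karaf.features.cfg entry, assuming the rest of the boot list stays exactly as Chris posted it:

```properties
# etc/org.apache.karaf.features.cfg -- karaf-framework removed from the boot list
featuresBoot=config,transaction,eventadmin,management,activemq-blueprint,\
  activemq-web-console,webconsole,cxf,camel,camel-blueprint,camel-activemq
```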
Re: deploying a database source as service in Karaf 2.2.5
Try a \ in front of the ! Kind regards Andreas Send from my mobile. Please excuse the brevity and/or possible auto correction errors. On Mar 16, 2012 9:42 PM, James Gartner james.gart...@moodys.com wrote: Thanks Christian -- I did get it to work with some help from Matt Madhavan -- it was a configuration item I needed to add in the org.ops4j.pax.url.mvn.cfg file for updating the repositories for NEXUS. I still do have some issues though when trying to run certain commands, specifically: osgi:install wrap:mvn:http://download.java.net/maven/2!net.java.dev.jna/jna/3.1.0 I get the following error: Error executing command: !net.java.dev.jna/jna/3.1.0: event not found This command is directly from the Karaf dev guide on colorizing the command line, but I can't seem to get this to work. Any ideas folks? -- View this message in context: http://karaf.922171.n3.nabble.com/deploying-a-database-source-as-service-in-Karaf-2-2-5-tp3814097p3833115.html Sent from the Karaf - User mailing list archive at Nabble.com.
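The "event not found" message is the shell's history expansion choking on the bare ! in the URL. With Andreas' backslash, the command from the mail above would look like this:

```
osgi:install wrap:mvn:http://download.java.net/maven/2\!net.java.dev.jna/jna/3.1.0
```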
Re: Scala protocol for Slang
with absolutely no idea about slang... have you installed the slang-scala feature or just the deployer bundle? This sounds like all bundles need to be available for slang to correctly install url handlers, compilers, ... Kind regards, Andreas [1] http://fusesource.com/forge/git/slang.git/?p=slang.git;a=blob;f=features/src/main/resources/features.xml;h=f244b8ac42963c7560e08ea142765f31a9a1c427;hb=HEAD On Thu, Mar 8, 2012 at 11:38, Guillaume Yziquel guillaume.yziq...@crossing-tech.com wrote: Hi. (Cross-posting to user@karaf.apache.org, as I suspect the slang-...@fusesource.org mailing list to have low activity). I've just tried out the Scala Slang Deployer for Karaf http://fusesource.com/forge/git/slang.git/ but scala files do not get deployed.

class ScalaDeploymentListener extends ArtifactUrlTransformer {
  val LOG = LogFactory.getLog(classOf[ScalaDeploymentListener])
  def canHandle(artifact: File) =
    artifact.isFile() && artifact.getName().endsWith(".scala")
  def transform(artifact: URL): URL = {
    try {
      new URL("scala", null, artifact.toString())
    } catch {
      case e: Exception => {
        LOG.error("Unable to build scala bundle", e)
        return null
      }
    }
  }
}

It fails in 'new URL("scala", null, artifact.toString())' as there is no 'scala' protocol:
10:07:42,262 | ERROR | Framework/deploy | ScalaDeploymentListener | ? ? | 258 - org.fusesource.slang.scala.deployer - 1.0.0.SNAPSHOT | Unable to build scala bundle
java.net.MalformedURLException: unknown protocol: scala
    at java.net.URL.<init>(URL.java:395)[:1.6.0_29]
    at java.net.URL.<init>(URL.java:283)[:1.6.0_29]
    at java.net.URL.<init>(URL.java:306)[:1.6.0_29]
    at org.fusesource.slang.scala.deployer.ScalaDeploymentListener.transform(ScalaDeploymentListener.scala:38)[258:org.fusesource.slang.scala.deployer:1.0.0.SNAPSHOT]
    at org.apache.felix.fileinstall.internal.DirectoryWatcher.transformArtifact(DirectoryWatcher.java:501)[6:org.apache.felix.fileinstall:3.1.10]
    at org.apache.felix.fileinstall.internal.DirectoryWatcher.process(DirectoryWatcher.java:430)[6:org.apache.felix.fileinstall:3.1.10]
    at org.apache.felix.fileinstall.internal.DirectoryWatcher.run(DirectoryWatcher.java:263)[6:org.apache.felix.fileinstall:3.1.10]
Anybody recently tried out the Slang deployer? -- Guillaume Yziquel Crossing-Tech Parc Scientifique EPFL
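The MalformedURLException means no URLStreamHandler is registered for the scala scheme; in OSGi that is normally done by publishing a URLStreamHandlerService, which is why the full feature (not just the deployer bundle) matters, as Andreas suggests. Outside OSGi the same effect can be sketched with a plain-JVM URLStreamHandlerFactory. Everything below (the class name, the do-nothing connection) is illustrative and not slang's actual handler:

```java
import java.net.URL;
import java.net.URLConnection;
import java.net.URLStreamHandler;
import java.net.URLStreamHandlerFactory;

public class ScalaProtocolDemo {
    private static boolean registered = false;

    /** Register a (do-nothing) handler for the "scala" scheme, once per JVM.
     *  In OSGi this role is played by a URLStreamHandlerService instead. */
    static synchronized void registerScalaHandler() {
        if (registered) return;
        URL.setURLStreamHandlerFactory(new URLStreamHandlerFactory() {
            public URLStreamHandler createURLStreamHandler(String protocol) {
                if (!"scala".equals(protocol)) return null; // fall back to JVM defaults
                return new URLStreamHandler() {
                    protected URLConnection openConnection(URL u) {
                        throw new UnsupportedOperationException("demo handler only");
                    }
                };
            }
        });
        registered = true;
    }

    public static void main(String[] args) throws Exception {
        registerScalaHandler();
        // Without the handler, this line throws MalformedURLException,
        // exactly as in the stack trace above.
        URL url = new URL("scala", null, "file:/deploy/Sample.scala");
        System.out.println(url); // scala:file:/deploy/Sample.scala
    }
}
```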
Re: Scala protocol for Slang
Hey, OK, this will take some more work to get running with 2.2.5... first of all I corrected the versions of karaf and slf4j in the root pom.xml and features.xml; in addition I needed to remove the -SNAPSHOT attribute in the features/src/main/features.xml. Then I was able to build the entire thing and deploy it on karaf. Though, after I tried to deploy the actor example from [1] I get a warning in the log from the scala plugin:
2012-03-08 14:48:35,164 | WARN | raf-2.2.5/deploy | ScalaCompiler | r.compiler.ScalaCompiler$$anon$2 60 | 51 - org.fusesource.slang.scala.deployer - 1.0.0.SNAPSHOT | NoPosition:[Classpath = /home/pieber/.m2/repository/org/apache/karaf/apache-karaf/2.2.5/apache-karaf-2.2.5/system/org/apache/felix/org.apache.felix.framework/3.0.9/org.apache.felix.framework-3.0.9.jar:/home/pieber/.m2/repository/org/apache/karaf/apache-karaf/2.2.5/apache-karaf-2.2.5/lib/endorsed/org.apache.karaf.exception-2.2.5.jar:/opt/java/jre/lib/resources.jar:/opt/java/jre/lib/rt.jar:/opt/java/jre/lib/jsse.jar:/opt/java/jre/lib/jce.jar:/opt/java/jre/lib/charsets.jar:/opt/java/jre/lib/ext/sunec.jar:/opt/java/jre/lib/ext/sunjce_provider.jar:/opt/java/jre/lib/ext/zipfs.jar:/opt/java/jre/lib/ext/sunpkcs11.jar:/opt/java/jre/lib/ext/localedata.jar:/opt/java/jre/lib/ext/dnsns.jar:/opt/java/jre/lib/ext/bcprov-jdk16-1.45.jar:/home/pieber/.m2/repository/org/apache/karaf/apache-karaf/2.2.5/apache-karaf-2.2.5/lib/karaf-jaas-boot.jar:/home/pieber/.m2/repository/org/apache/karaf/apache-karaf/2.2.5/apache-karaf-2.2.5/lib/karaf.jar:.]
2012-03-08 14:48:35,737 | WARN | raf-2.2.5/deploy | ScalaCompiler | r.compiler.ScalaCompiler$$anon$2 60 | 51 - org.fusesource.slang.scala.deployer - 1.0.0.SNAPSHOT | NoPosition:[loaded package loader org.apache.karaf.exception-2.2.5.jar in 527ms]
At least this is definitely nothing in karaf but rather in the usage of the scala compiler: a version conflict between the JRE version and the Scala version, or something else...
Kind regards, Andreas [1] http://slang.fusesource.org/documentation/scala/index.html On Thu, Mar 8, 2012 at 13:33, Guillaume Yziquel guillaume.yziq...@crossing-tech.com wrote: Le Thursday 08 Mar 2012 à 12:45:38 (+0100), Andreas Pieber a écrit : with absolutely no idea about slang... have you installed the slang-scala feature or just the deployer bundle? This sounds like all bundles need to be available for slang to correctly install url handlers, compilers, ... Kind regards, Andreas [1] http://fusesource.com/forge/git/slang.git/?p=slang.git;a=blob;f=features/src/main/resources/features.xml;h=f244b8ac42963c7560e08ea142765f31a9a1c427;hb=HEAD I have not installed the feature because the maven target didn't work, so I installed the bundles manually. But I did install all the bundles mentioned in the link above. -- Guillaume Yziquel Crossing-Tech Parc Scientifique EPFL
Re: Embedding Karaf in a Java application
OK, I've missed that example. But I'm not quite sure it's that useful for paxexam-karaf. Typically your options section is pretty minimal (compared to other exam tests), therefore I'm not quite sure about the use. In addition, pax-exam always starts the process in a new VM, which might not be what you want. If this is what you desire to do, you might want to give the various runners here [1] a shot. Kind regards, Andreas [1] https://svn.apache.org/repos/asf/karaf/trunk/tooling/exam/container/src/main/java/org/apache/karaf/tooling/exam/container/internal/runner/ 2012/3/6 Peter Gardfjäll peter.gardfjall.w...@gmail.com: Thanks JB, I actually already have my own Karaf distribution (sorry for being unclear on that point) so I was mostly interested in the wrapping part. Do you know if there are any nice examples of using the org.apache.karaf.main.Main class (maybe Main.main() is the best one)? Regarding PaxExam Karaf, I am already using it for integration tests. I'm just wondering if that project supports running Karaf outside of a JUnit context, similar to the following PaxExam example: https://github.com/tonit/Learn-PaxExam/blob/master/lesson-servermode/src/main/java/org/ops4j/pax/exam/servermode/SimpleServer.java best regards, Peter On Tue, Mar 6, 2012 at 2:37 PM, Jean-Baptiste Onofré j...@nanthrax.net wrote: Hi Peter, I would create my own distribution of Karaf. If you want to hide/wrap it in your application, Main could be a way to achieve that. Pax-Exam-Karaf is designed more for the itests. Regards JB On 03/06/2012 02:27 PM, peterg wrote: Hi all, what would be the easiest way to embed a Karaf instance within an existing Java application? - using/deriving from org.apache.karaf.main.Main? - using PaxExam Karaf? - something else...? Ideally, I would like to just start an instance of my own Karaf distribution (available as a Maven artifact in my own repository) and interact with it (accessing/invoking bundles and services, for example).
best regards, Peter -- View this message in context: http://karaf.922171.n3.nabble.com/Embedding-Karaf-in-a-Java-application-tp3803560p3803560.html Sent from the Karaf - User mailing list archive at Nabble.com. -- Jean-Baptiste Onofré jbono...@apache.org http://blog.nanthrax.net Talend - http://www.talend.com
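Since the thread asks for an example of org.apache.karaf.main.Main: a rough, untested sketch of the wrapping approach JB mentions might look like the following. The method names (launch/destroy) and the karaf.home property are from my recollection of the Karaf 2.x Main class, and the path is a placeholder; double-check against the Main source before relying on this.

```java
// Sketch only: needs the Karaf 2.x main jar on the classpath and an
// unpacked distribution on disk (the path below is a placeholder).
System.setProperty("karaf.home", "/path/to/my-karaf-distribution");
org.apache.karaf.main.Main main = new org.apache.karaf.main.Main(new String[0]);
main.launch();   // boots the OSGi framework and the configured bundles
// ... interact with the running container here ...
main.destroy();  // shuts it down again
```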
Re: SCR and Karaf.
Hey, Well... I don't see your environment, but basically all you have to do now is to check the imports/exports of the bundles and export/import the correct classes. How exactly this works depends on your tool set. The full documentation for the maven-bundle-plugin is available here, e.g. [1]. Kind regards, Andreas [1] https://felix.apache.org/site/apache-felix-maven-bundle-plugin-bnd.html On Tue, Feb 28, 2012 at 09:32, Guillaume Yziquel guillaume.yziq...@crossing-tech.com wrote: Le Tuesday 28 Feb 2012 à 06:15:50 (+0100), Andreas Pieber a écrit : Hey Guillaume, Hi, Andreas. Can you check your logs (simply write display in your karaf console or $KARAF_HOME/data/log/karaf.log) after installing the second bundle. In addition, can you run a la command showing all your installed bundles. If your second bundle isn't started, try a start BUNDLE_ID, which might yield a better error message on the console. Kind regards, Andreas Thanks. That really helps:
java.lang.ClassNotFoundException: guggla.test.Sample
    at org.eclipse.osgi.internal.loader.BundleLoader.findClassInternal(BundleLoader.java:506)[osgi-3.6.0.v20100517.jar:]
    at org.eclipse.osgi.internal.loader.BundleLoader.findClass(BundleLoader.java:422)[osgi-3.6.0.v20100517.jar:]
    at org.eclipse.osgi.internal.loader.BundleLoader.findClass(BundleLoader.java:410)[osgi-3.6.0.v20100517.jar:]
    at org.eclipse.osgi.internal.baseadaptor.DefaultClassLoader.loadClass(DefaultClassLoader.java:107)[osgi-3.6.0.v20100517.jar:]
    at java.lang.ClassLoader.loadClass(ClassLoader.java:248)[:1.6.0_29]
    at org.eclipse.osgi.internal.loader.BundleLoader.loadClass(BundleLoader.java:338)[osgi-3.6.0.v20100517.jar:]
    at org.eclipse.osgi.framework.internal.core.BundleHost.loadClass(BundleHost.java:232)[osgi-3.6.0.v20100517.jar:]
    at org.eclipse.osgi.framework.internal.core.AbstractBundle.loadClass(AbstractBundle.java:1197)[osgi-3.6.0.v20100517.jar:]
    at org.apache.felix.scr.impl.manager.ImmediateComponentManager.createImplementationObject(ImmediateComponentManager.java:178)[301:org.apache.felix.scr:1.6.0]
    at org.apache.felix.scr.impl.manager.ImmediateComponentManager.createComponent(ImmediateComponentManager.java:118)[301:org.apache.felix.scr:1.6.0]
    at org.apache.felix.scr.impl.manager.AbstractComponentManager$Unsatisfied.activate(AbstractComponentManager.java:997)[301:org.apache.felix.scr:1.6.0]
    at org.apache.felix.scr.impl.manager.AbstractComponentManager.activateInternal(AbstractComponentManager.java:333)[301:org.apache.felix.scr:1.6.0]
    at org.apache.felix.scr.impl.manager.AbstractComponentManager.enable(AbstractComponentManager.java:157)[301:org.apache.felix.scr:1.6.0]
    at org.apache.felix.scr.impl.config.ConfiguredComponentHolder.enableComponents(ConfiguredComponentHolder.java:256)[301:org.apache.felix.scr:1.6.0]
    at org.apache.felix.scr.impl.BundleComponentActivator.loadDescriptor(BundleComponentActivator.java:253)[301:org.apache.felix.scr:1.6.0]
    at org.apache.felix.scr.impl.BundleComponentActivator.initialize(BundleComponentActivator.java:147)[301:org.apache.felix.scr:1.6.0]
    at org.apache.felix.scr.impl.BundleComponentActivator.init(BundleComponentActivator.java:111)[301:org.apache.felix.scr:1.6.0]
    at org.apache.felix.scr.impl.Activator.loadComponents(Activator.java:285)[301:org.apache.felix.scr:1.6.0]
    at org.apache.felix.scr.impl.Activator.bundleChanged(Activator.java:203)[301:org.apache.felix.scr:1.6.0]
    at org.eclipse.osgi.framework.internal.core.BundleContextImpl.dispatchEvent(BundleContextImpl.java:919)[osgi-3.6.0.v20100517.jar:]
    at org.eclipse.osgi.framework.eventmgr.EventManager.dispatchEvent(EventManager.java:227)[osgi-3.6.0.v20100517.jar:]
    at org.eclipse.osgi.framework.eventmgr.ListenerQueue.dispatchEventSynchronous(ListenerQueue.java:149)[osgi-3.6.0.v20100517.jar:]
    at org.eclipse.osgi.framework.internal.core.Framework.publishBundleEventPrivileged(Framework.java:1349)[osgi-3.6.0.v20100517.jar:]
    at org.eclipse.osgi.framework.internal.core.Framework.publishBundleEvent(Framework.java:1300)[osgi-3.6.0.v20100517.jar:]
    at org.eclipse.osgi.framework.internal.core.BundleHost.startWorker(BundleHost.java:380)[osgi-3.6.0.v20100517.jar:]
    at org.eclipse.osgi.framework.internal.core.AbstractBundle.start(AbstractBundle.java:284)[osgi-3.6.0.v20100517.jar:]
    at org.eclipse.osgi.framework.internal.core.AbstractBundle.start(AbstractBundle.java:276)[osgi-3.6.0.v20100517.jar:]
    at org.apache.karaf.shell.osgi.StartBundle.doExecute(StartBundle.java:29)[13:org.apache.karaf.shell.osgi:2.1.4.fuse-00-15]
    at org.apache.karaf.shell.osgi.BundlesCommand.doExecute(BundlesCommand.java:49)[13:org.apache.karaf.shell.osgi:2.1.4.fuse-00-15]
    at org.apache.karaf.shell.console.OsgiCommandSupport.execute
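As a concrete illustration of Andreas' advice: with the maven-bundle-plugin, the bundle that provides the class would export its package, and bnd then generates matching Import-Package headers for bundles that use it. The package name below is taken from the stack trace; the plugin coordinates are the usual boilerplate, not something stated in the thread:

```xml
<!-- in the pom of the bundle that provides guggla.test.Sample -->
<plugin>
  <groupId>org.apache.felix</groupId>
  <artifactId>maven-bundle-plugin</artifactId>
  <extensions>true</extensions>
  <configuration>
    <instructions>
      <Export-Package>guggla.test</Export-Package>
    </instructions>
  </configuration>
</plugin>
```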
Re: SCR and Karaf.
Hey Guillaume, Can you check your logs (simply write display in your karaf console or $KARAF_HOME/data/log/karaf.log) after installing the second bundle. In addition, can you run a la command showing all your installed bundles. If your second bundle isn't started, try a start BUNDLE_ID, which might yield a better error message on the console. Kind regards, Andreas On Mon, Feb 27, 2012 at 17:26, Guillaume Yziquel guillaume.yziq...@crossing-tech.com wrote: Hi. I've been trying to integrate a Scala scripting engine as an OSGi bundle inside Karaf. However, this engine uses SCR to create components and make them available. The following links have been of use: http://karaf.922171.n3.nabble.com/Followup-on-getting-felix-scr-commands-to-show-up-in-karaf-td2700804.html http://dz.prosyst.com/pdoc/mbs_prof_6.1/um/framework/bundles/osgi/scr/scr.html So I now am able to do the following: karaf@root scr list Id State Name [ 0] [registered ] guggla.ScalaScriptEngineFactory karaf@root and I do have my scripting engine available as an SCR component. I've however made another OSGi bundle with an SCR component that attempts to access the guggla.ScalaScriptEngineFactory. However, I do not see it listed when I type 'scr list', so I assume that it has not been able to start up correctly. But where should I look for the errors that prevent this new component from being instantiated properly? All apologies if I did not post on the relevant mailing list, as I simply wasn't really sure where to ask. -- Guillaume Yziquel Crossing-Tech
Re: Problems with Karaf 2.2.5 integration tests
Hey Lennart, Just to make sure: if I read the previous thread correctly, you're using labs-paxexam-karaf for your integration tests on karaf-2.x, right? On Tue, Feb 28, 2012 at 03:43, lennart.jore...@teliasonera.com wrote: /SKIP It doesn't seem to matter if I provision the bundle holding the KarafIntegrationTest class, or if I add it to the bootDelegationPackage. OK, the bootDelegationPackage thing will definitely not work because the bundle is simply nowhere available in the karaf environment; nevertheless, provisioning the bundle holding the KarafIntegrationTest class should work. Can you pls post a snippet of what you're doing? What is the pattern for doing this? Basically you've two options here: either merge your sources into your test package using maven, or provision the bundle with the general classes. Kind regards, Andreas I.e. extracting the main algorithms to another maven project, which is then imported into test scope of the actual integration test project. -- // Bästa hälsningar, // [sw. Best regards,] // // Lennart Jörelid, Systems Architect // email: lennart.jore...@teliasonera.com // cell: +46 708 507 603 // skype: jgurueurope
Re: About karaf upgrade
Hey Xilai, I'm really sorry to say this, but I don't think there will be an easy way to do a delta-update without an explicit Karaf feature doing this (hey, btw, wouldn't this be an idea for GSoC?). Such a feature could extract the current state of the system, download the new version, start an external script, replace/delete the old files, apply the old state, and start up again. This might even be kind of OK if we can always ask a user what/how he wants to do some merges. For now you don't have many options... Basically you should always start with a new karaf instance, only porting the state of the old one over to the new one... For your point 3: use a tool like kompare or winmerge to find out how the folders have changed and apply the correct parts. For your point 2: simply copy the deploy folder. For your point 1: the only solution which jumps to my mind here is: convince your customers to write karaf scripts rather than direct karaf commands. That way you can simply patch-apply the install/start logic on a new server from those scripts. Not perfect, but until the -p option for *:install I don't really see another option. Kind regards, Andreas On Tue, Feb 21, 2012 at 09:44, Jean-Baptiste Onofré j...@nanthrax.net wrote: Hi Xilai, we planned to add a -p (persistent) option to bundle:install, feature:install, kar:install, etc., in order to store the feature/bundle in the system folder in addition to the data folder. It's planned for Karaf 3.0. Regards JB On 02/21/2012 09:24 AM, XiLai Dai wrote: Hi, Andreas, Thanks for your reply! Actually, we want to cover all the artifact installation behavior from the customer: 1. features/bundles which were installed from console commands; 2. kar (or bundle) files which were deployed into deploy/; 3. configuration files (newly added or modified) in etc/. We want to do a delta update to upgrade the old karaf container, keeping all of the already installed features/bundles/kar/config files there and ready to work.
Now the main problem is: how to recognize/preserve the customer-installed artifacts in data/? Which means only clearing the karaf system level artifacts from data/. Thanks. Xilai *From:* Andreas Pieber [mailto:anpie...@gmail.com] *Sent:* Thursday, February 16, 2012 5:03 PM *To:* user@karaf.apache.org *Subject:* Re: About karaf upgrade OK, currently there is nothing available. And it's also quite dangerous to do so. I think the better approach would be to create your own distribution, upgrade there and produce a completely new rollout package. Still, of course you can do as you like and it heavily depends on what you do. E.g. how do you deploy your artifacts? Do you have a kar package? Do you install those files one after the other? What you can do depends heavily on your setup. Basically, assuming you change nothing in the original karaf files, the following procedure should do: a) stop karaf b) extract the new karaf version c) merge the old system deploy dir into the new one d) remove the old karaf dir e) move the new one to the place where the old had been. Though, please keep in mind that all bundles you've started with start ID won't start up automatically again (because you've deleted the data dir). Only bundles in the deploy dir will be started again. But tbh I don't know what exactly you would win by this approach. Maybe you can describe how you're using Karaf (and what you modify in the distribution for your setup) in a little bit more detail? Kind regards, Andreas On 02/16/2012 08:39 AM, XiLai Dai wrote: Hello, Just a question from a user/production point of view: what is a better way to upgrade a karaf container to a new version with all of the features/bundles/config files existing in the old version karaf container? E.g. from karaf 2.2.2 to karaf 2.2.5. It's really boring to redeploy these user artifacts again manually; are there some approaches or tools/scripts or something else that can make this process more "automatic"?
or is it possible to have a script to upgrade an existing karaf container which replaces the system artifacts within the lib/ and system/ directories with the new version artifacts? Thanks! Regards Xilai -- Jean-Baptiste Onofré jbono...@apache.org http://blog.nanthrax.net Talend - http://www.talend.com
Re: About karaf upgrade
OK, currently there is nothing available. And it's also quite dangerous to do so. I think the better approach would be to create your own distribution, upgrade there and produce a completely new rollout package. Still, of course you can do as you like and it heavily depends on what you do. E.g. how do you deploy your artifacts? Do you have a kar package? Do you install those files one after the other? What you can do depends heavily on your setup. Basically, assuming you change nothing in the original karaf files, the following procedure should do: a) stop karaf b) extract the new karaf version c) merge the old system deploy dir into the new one d) remove the old karaf dir e) move the new one to the place where the old had been. Though, please keep in mind that all bundles you've started with start ID won't start up automatically again (because you've deleted the data dir). Only bundles in the deploy dir will be started again. But tbh I don't know what exactly you would win by this approach. Maybe you can describe how you're using Karaf (and what you modify in the distribution for your setup) in a little bit more detail? Kind regards, Andreas On 02/16/2012 08:39 AM, XiLai Dai wrote: Hello, Just a question from a user/production point of view: what is a better way to upgrade a karaf container to a new version with all of the features/bundles/config files existing in the old version karaf container? E.g. from karaf 2.2.2 to karaf 2.2.5. It's really boring to redeploy these user artifacts again manually; are there some approaches or tools/scripts or something else that can make this process more automatic? or is it possible to have a script to upgrade an existing karaf container which replaces the system artifacts within the lib/ and system/ directories with the new version artifacts? Thanks! Regards Xilai
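The a) to e) procedure above can be sketched as a script. To keep it self-contained (and harmless to run), this version simulates the old and new installs in a temp directory; the directory names and the my-bundle.jar artifact are made up, and the real stop/extract steps are only indicated in comments:

```shell
#!/bin/sh
set -e
# Self-contained simulation of steps a)-e): builds a fake "old" and "new"
# Karaf layout in a temp dir, then applies the swap. All paths are made up.
WORK=$(mktemp -d)
OLD="$WORK/apache-karaf-2.2.2"
NEW="$WORK/apache-karaf-2.2.5"
mkdir -p "$OLD/deploy" "$NEW/deploy"
touch "$OLD/deploy/my-bundle.jar"     # a user artifact to carry over

# a) stop karaf -- in a real setup: "$OLD/bin/stop" (skipped in this demo)
# b) extract the new karaf version (simulated by the mkdir above)
# c) merge the old deploy dir into the new one
cp -r "$OLD/deploy/." "$NEW/deploy/"
# d) remove the old karaf dir
rm -rf "$OLD"
# e) move the new one to where the old had been
mv "$NEW" "$OLD"

ls "$OLD/deploy"    # the user artifact survived the swap
```

As the mail warns, bundles installed only into data/ will not come back after this; only artifacts in deploy/ are redeployed.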
Re: karaf test framework: multiple test classes files in the same karaf instance
Hey Giacomo, Well, paxexam-karaf completely relies on pax exam's features in this context. Since there is no such feature in pax exam right now (at least I don't know a workaround) it's also not possible in paxexam-karaf. BUT maybe Toni or Harald know some internal workaround/hack to make this possible. Best to ask on the ops4j general list [1] for this feature. Kind regards, Andreas [1] http://lists.ops4j.org/mailman/listinfo/general On 02/08/2012 05:30 PM, Giacomo Coletta wrote: Hi, I've been using the karaf integration test framework for some time and it works nicely. I usually run many tests in the same karaf instance. Up to now I was only able to properly configure, start karaf and run the tests in the same test class. Now the number of tests is growing and I would like to put tests relating to different aspects in different classes, just to find them more easily, but I would like not to have to configure and restart karaf several times, to save time. Does any of you know how this could be done? Thanks Giacomo
Re: featuresRepositories is overriden by defined feature repositories?
ok, first of all one issue: your features repository URL is not completely correct: mvn:org.apache.karaf.assemblies.features/standard/2.2.5/xml/features... Besides this, you're facing a completely different problem. You're using karaf 2.2.4 and expect it to start 2.2.5 bundles simply because you change the features.xml repository. This won't work that way because of (e.g., only one reason) etc/startup.properties. It's NOT a good idea to use a different karaf features.xml than the one coming with your karaf distribution (but feel free to change/add any other repository url). I've tested what you've described with karaf 2.2.5 and everything works as expected. Can you please check if using Karaf 2.2.5 fixes your problems too? Kind regards, Andreas On Thu, Jan 26, 2012 at 09:04, christoforos.vasilatos christoforos.vasila...@gmail.com wrote: So, your suggestion is to create a features.xml and explicitly add the features that I want with their version? Like someone would do in order to add his own bundles, but use it for e.g. camel, cxf etc.? But isn't it enough to define them in the featuresBoot property of the org.apache.karaf.features.cfg configuration file (for example ssh;version=2.2.5,camel;version=2.9.0)? I tried the above and although camel and some extra bundles were added correctly at the defined version, the core packages of karaf were not (for example, even though I defined ssh;version=2.2.5 the bundle installed in the end was at version 2.2.4). Best Regards, Christoforos -- View this message in context: http://karaf.922171.n3.nabble.com/featuresRepositories-is-overriden-by-defined-feature-repositories-tp3688482p3689882.html Sent from the Karaf - User mailing list archive at Nabble.com.
Re: featuresRepositories is overriden by defined feature repositories?
OK, now I get it. The problem is https://search.maven.org/remotecontent?filepath=org/apache/servicemix/apache-servicemix/4.4.0/apache-servicemix-4.4.0-features.xml which references features directly using the repository tag. Well, currently there is no way to blackout some repositories, but an additional blackedoutFeaturesRepository tag could definitely be of use in such cases. Can you please create a feature request for this in the JIRA? Thanks and kind regards, Andreas On Thu, Jan 26, 2012 at 10:00, christoforos.vasilatos christoforos.vasila...@gmail.com wrote: No, I am definitely using karaf 2.2.5. The downloaded archive with maven dependency is definitely 2.2.5. Here is part of the output of the list -t 0 command:
[ 38] [Active ] [Created ] [ ] [ 30] Apache Karaf :: Features :: Management (2.2.5)
[ 39] [Active ] [Created ] [ ] [ 30] Apache Karaf :: Shell :: Log Commands (2.2.5)
[ 40] [Active ] [Created ] [ ] [ 30] Apache Karaf :: JAAS :: Modules (2.2.5)
[ 41] [Active ] [Created ] [ ] [ 30] Apache Karaf :: Shell :: ConfigAdmin Commands (2.2.4)
[ 42] [Active ] [Created ] [ ] [ 30] Apache Karaf :: Shell :: SSH (2.2.4)
[ 43] [Active ] [Created ] [ ] [ 30] Apache Karaf :: Management (2.2.4)
[ 44] [Active ] [Created ] [ ] [ 30] Apache Karaf :: Management :: MBeans :: System (2.2.4)
which indicates that some bundles are 2.2.5 but others are 2.2.4, while if I do not include my custom org.apache.karaf.features.cfg configuration file all the above bundles have version 2.2.5. This is the prefiltered configuration file: http://karaf.922171.n3.nabble.com/file/n3689989/org.apache.karaf.features.cfg org.apache.karaf.features.cfg and this is the deployed configuration file after the maven filter http://karaf.922171.n3.nabble.com/file/n3689989/org.apache.karaf.features.cfg org.apache.karaf.features.cfg Best Regards, Christoforos -- View this message in context: http://karaf.922171.n3.nabble.com/featuresRepositories-is-overriden-by-defined-feature-repositories-tp3688482p3689989.html Sent from the
Karaf - User mailing list archive at Nabble.com.
Re: featuresRepositories is overriden by defined feature repositories?
I'm with you. KARAF-971 will fix a lot of problems. Implementing blacklists in two steps: first one with a * param (for all, as proposed by Gert), and then one which allows a more detailed configuration. BTW, my comment was no attack against SMX :-) I know that you guys would definitely apply it; it's just in case someone would like to use a version where this had not been fixed/done correctly, or some other software where this is simply done wrong. Kind regards, Andreas On Thu, Jan 26, 2012 at 13:02, Jean-Baptiste Onofré j...@nanthrax.net wrote: Hi guys, yes, KARAF-971 is in my TODO. It just requires an enhancement on Pax-URL (to have a URL resolution service transforming an abstract version range into a concrete URL). I think disabling transitive features could address the problem. It's really easy to do, but it's not the same as KARAF-971 IMHO. A mix of both could address all the questions. Regards JB On 01/26/2012 12:12 PM, Gert Vanthienen wrote: Andreas, For our ServiceMix features descriptors, we'll gladly migrate to using version ranges, but we'll need to get https://issues.apache.org/jira/browse/KARAF-971 fixed in order for those to work when we move from one version of Camel/CXF/... to the next one. That should fix most of the issues Christoforos is seeing as well, I guess. How about adding a control flag to completely disable transitive feature repositories? It's a bit more coarse-grained than the solution you're suggesting, but it would allow people to completely control the list of repository URLs being used within their own assembly or distribution without having to go through the features descriptor and figuring out which ones to blacklist. E.g. for ServiceMix, we already have all the URLs listed for CXF, Camel, ActiveMQ, ... so this might even be a good alternative to KARAF-971 for us as well.
Regards, Gert Vanthienen FuseSource Web: http://fusesource.com Blog: http://gertvanthienen.blogspot.com/ On Thu, Jan 26, 2012 at 10:29 AM, Andreas Pieber anpie...@gmail.com wrote: OK, now I get it. The problem is https://search.maven.org/remotecontent?filepath=org/apache/servicemix/apache-servicemix/4.4.0/apache-servicemix-4.4.0-features.xml which references features directly using the repository tag. Well, currently there is no way to blackout some repositories, but an additional blackedoutFeaturesRepository tag could definitely be of use in such cases. Can you please create a feature request for this in the JIRA? Thanks and kind regards, Andreas On Thu, Jan 26, 2012 at 10:00, christoforos.vasilatos christoforos.vasila...@gmail.com wrote: No, I am definitely using karaf 2.2.5.
The downloaded archive with maven dependency is definitely 2.2.5. Here is part of the output of the list -t 0 command:
[ 38] [Active ] [Created ] [ ] [ 30] Apache Karaf :: Features :: Management (2.2.5)
[ 39] [Active ] [Created ] [ ] [ 30] Apache Karaf :: Shell :: Log Commands (2.2.5)
[ 40] [Active ] [Created ] [ ] [ 30] Apache Karaf :: JAAS :: Modules (2.2.5)
[ 41] [Active ] [Created ] [ ] [ 30] Apache Karaf :: Shell :: ConfigAdmin Commands (2.2.4)
[ 42] [Active ] [Created ] [ ] [ 30] Apache Karaf :: Shell :: SSH (2.2.4)
[ 43] [Active ] [Created ] [ ] [ 30] Apache Karaf :: Management (2.2.4)
[ 44] [Active ] [Created ] [ ] [ 30] Apache Karaf :: Management :: MBeans :: System (2.2.4)
which indicates that some bundles are 2.2.5 but others are 2.2.4, while if I do not include my custom org.apache.karaf.features.cfg configuration file all the above bundles have version 2.2.5. This is the prefiltered configuration file: http://karaf.922171.n3.nabble.com/file/n3689989/org.apache.karaf.features.cfg org.apache.karaf.features.cfg and this is the deployed configuration file after the maven filter http://karaf.922171.n3.nabble.com/file/n3689989/org.apache.karaf.features.cfg org.apache.karaf.features.cfg Best Regards, Christoforos -- View this message in context: http://karaf.922171.n3
Re: featuresRepositories is overriden by defined feature repositories?
Well, not really replacing them; what you can do, though, is define the version of the features you wish to start. In the features.xml you can define it via the version attribute (<feature version="2.2.5">theFeatureToUse</feature>), and you can also use the name/version notation, e.g.: myfeature/2.2.5. I hope this helps. Kind regards, Andreas

On Wed, Jan 25, 2012 at 19:37, christoforos.vasilatos christoforos.vasila...@gmail.com wrote: Maybe I didn't describe the problem correctly. I followed the instructions. After creating the distribution, the file at ./etc/org.apache.karaf.features.cfg is the one that I created and featuresRepositories has the value that I defined, but without manually adding other URLs, they are added automatically at the first boot of Karaf (keep in mind that I execute features:listurl right after the first Karaf boot following extraction of the custom archive). My investigation led me to the conclusion that they are automatically added from the features of the defined repositories (i.e. apache-servicemix, http://repo1.maven.org/maven2/org/apache/servicemix/apache-servicemix/4.4.0/apache-servicemix-4.4.0-features.xml). Thanks again for your time and immediate response. /Christoforos

-- View this message in context: http://karaf.922171.n3.nabble.com/featuresRepositories-is-overriden-by-defined-feature-repositories-tp3688482p3688541.html Sent from the Karaf - User mailing list archive at Nabble.com.
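To illustrate the version pinning Andreas describes, here is a minimal features.xml sketch; the feature name "my-app" and the bundle coordinates are hypothetical, only the nested feature/version element follows the advice above:

```xml
<features name="my-features">
    <!-- hypothetical application feature -->
    <feature name="my-app" version="1.0.0">
        <!-- pulls in exactly version 2.2.5 of the dependent feature -->
        <feature version="2.2.5">theFeatureToUse</feature>
        <bundle>mvn:com.example/my-bundle/1.0.0</bundle>
    </feature>
</features>
```

The same pin can be written in the name/version shorthand, e.g. theFeatureToUse/2.2.5, where a plain feature name is accepted.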
Re: paxexam labs question
Instead of using mvn:... you can also use file:... and point it at the one in your target directory. Kind regards, Andreas

On Tue, Jan 17, 2012 at 22:12, Vestal, Rick r...@vestalclan.org wrote: Hi all, finally back to some OSGi work, and I was very happy to see ServiceMix 4.4.0 out so that I can move to it and not lose the Pax Exam Karaf test framework. I've got a question on how other users are testing their modules. Ideally, I'd like to have a set of integration tests live in the OSGi bundle module that use the Pax Exam labs work. The problem I'm running into is that the 'latest' version of my OSGi bundle isn't what is provisioned, as the provision option picks up the last one installed into my local Maven repository. Is there an option to make the provision call use the 'just packaged' bundle? Thanks, -- Rick
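As a small sketch of Andreas's suggestion, a file: URL for the freshly packaged jar can be built from the target directory with plain java.io.File; the jar name below is hypothetical:

```java
import java.io.File;

class BundleFileUrl {

    // Build a file: URL for a jar in the build output directory. The
    // result can be used in place of a mvn: URL so that the just-packaged
    // bundle is provisioned instead of the copy in the local repository.
    static String bundleUrl(String targetDir, String jarName) {
        return new File(targetDir, jarName).toURI().toString();
    }

    public static void main(String[] args) {
        // hypothetical artifact name
        System.out.println(bundleUrl("target", "my-bundle-1.0-SNAPSHOT.jar"));
    }
}
```

The resulting absolute file: URL is what you would hand to the Pax Exam provision option instead of the mvn: coordinates.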
Re: Starting/Stopping Programmatically Bundles
Hey,

On Mon, Jan 16, 2012 at 18:23, Hervé BARRAULT herve.barra...@gmail.com wrote: I haven't registered the BundleListener, as I thought exposing a BundleListener as an OSGi service does something like this.

Exposing a BundleListener typically means registering it in the OSGi registry, and this is done as JB had shown.

Is this bundleContext linked to the current bundle or to all bundles?

Well, bundleContext.getBundle() will return the current bundle, but you can also access other bundles using bundleContext.getBundle(id) or one of the other overloads.

My bundle checker shall listen to other bundles and manipulate them. I will try the BundleActivator to get the BundleContext.

If you really want to check other bundles once they come up, the BundleListener is the perfect way to go. You can access the bundle objects of the started bundles through the event, so you can also manipulate them there. Btw, don't you want to upgrade to a newer Karaf version (although it's not required for your specific use case)? There were many bugs fixed between 2.0.0 and 2.2.5. Kind regards, Andreas

Regards Hervé

On 1/16/12, Jean-Baptiste Onofré j...@nanthrax.net wrote: Hi Hervé, did you register your listener in the bundle context, with something like: getBundleContext().addBundleListener(myBundleListener);? To start/stop a bundle, you can do: getBundleContext().getBundle(id).stop()... Regards JB

On 01/16/2012 06:07 PM, Hervé BARRAULT wrote: Hi, I'm using Karaf 2.0.0 and I'm trying to find a way to start/stop a bundle from Java code. I'm having difficulty finding the right API to use (and which service to import). I have tried to create a BundleListener and expose it as an OSGi service (in order to be notified of bundle starts/stops) but I'm not notified. I thought about getting data about bundles and being able to manipulate them. What are the correct steps to do this? Regards Hervé

-- Jean-Baptiste Onofré jbono...@apache.org http://blog.nanthrax.net Talend - http://www.talend.com
Re: Starting/Stopping Programmatically Bundles
Happy to hear that it works! Btw, wouldn't it be easier to retrieve all bundles using bundleContext.getBundles() and find the ones you want to start/stop there? Kind regards, Andreas

On Mon, Jan 16, 2012 at 18:46, Hervé BARRAULT herve.barra...@gmail.com wrote: Hi, thanks for the quick answers. By registering via the activator it is working well for notification :)

public class Test implements BundleListener, BundleActivator {
    public void start(BundleContext arg0) throws Exception {
        arg0.addBundleListener(this);
    }
    public void stop(BundleContext arg0) throws Exception {
        arg0.removeBundleListener(this);
    }
    public void bundleChanged(BundleEvent arg0) {
        final Bundle bundle = arg0.getBundle();
        System.out.println("SIGNAL " + bundle.getSymbolicName() + " - " + arg0.getType());
    }
}

Now I think I can start/stop bundles by keeping references. As I can't determine the number of the bundle beforehand, I should use the symbolicName (not perfect if we have to use different versions). Regards Hervé
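A minimal sketch of the lookup Andreas suggests: iterate over the array returned by bundleContext.getBundles() and match on symbolic name. The Bundle interface below is a stripped-down stand-in for org.osgi.framework.Bundle so the logic can be shown self-contained; in Karaf you would use the real OSGi types instead.

```java
// Stand-in for org.osgi.framework.Bundle, reduced to the members used here.
interface Bundle {
    String getSymbolicName();
    void start() throws Exception;
    void stop() throws Exception;
}

class BundleFinder {
    // Find a bundle by symbolic name in the array that, in a real OSGi
    // framework, bundleContext.getBundles() would return.
    static Bundle findBySymbolicName(Bundle[] bundles, String symbolicName) {
        for (Bundle b : bundles) {
            if (symbolicName.equals(b.getSymbolicName())) {
                return b;
            }
        }
        return null; // no such bundle installed
    }
}
```

With the real API this becomes findBySymbolicName(bundleContext.getBundles(), "com.example.mybundle") followed by start() or stop() on the result (the symbolic name here is hypothetical). As Hervé notes, matching only the symbolic name is ambiguous when several versions of the bundle are installed.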
Re: karaf-3.0.0-SNAPSHOT: NullPointerException
Yep, that's what I meant. I've seen strange behavior during my tests: it looks as if Karaf is downloading various artifacts at runtime, which might cause the kind of strange behavior you've encountered. Kind regards, Andreas

On Thu, Jan 12, 2012 at 12:36, Jamie G. jamie.goody...@gmail.com wrote: I believe he means testing Karaf without an internet connection. Doing so would prevent Maven from being able to download artifacts. Cheers, Jamie

On Thu, Jan 12, 2012 at 4:37 AM, Guofeng Zhang guof...@radvision.com wrote: No. From the log, the bundles are resolved from the system repository. What does Karaf offline mean?

From: Andreas Pieber [mailto:anpie...@gmail.com] Sent: Thursday, January 12, 2012 3:21 PM To: user@karaf.apache.org Subject: Re: karaf-3.0.0-SNAPSHOT: NullPointerException

Hey, are you behind a proxy, or testing Karaf offline? Kind regards, Andreas

On Thu, Jan 12, 2012 at 03:23, Guofeng Zhang guof...@radvision.com wrote: Hi, recently I upgraded to the latest trunk of Karaf. After building it, I unzipped it and launched it using bin\karaf.bat without any customization, and I see the following exception. There was no such exception before.

java.lang.NullPointerException
    at org.apache.karaf.features.command.FeatureFinder.updated(FeatureFinder.java:46)[50:org.apache.karaf.features.command:3.0.0.SNAPSHOT]
    at Proxy57827e67_aace_4683_924e_c2eb69b9ba83.updated(Unknown Source)[:]
    at org.apache.felix.cm.impl.ConfigurationManager$ManagedServiceUpdate.run(ConfigurationManager.java:1160)[5:org.apache.felix.configadmin:1.2.8]
    at org.apache.felix.cm.impl.UpdateThread.run(UpdateThread.java:104)[5:org.apache.felix.configadmin:1.2.8]
    at java.lang.Thread.run(Thread.java:662)[:1.6.0_24]

Thanks. Guofeng
Re: karaf-3.0.0-SNAPSHOT: NullPointerException
Hey, are you behind a proxy, or testing Karaf offline? Kind regards, Andreas

On Thu, Jan 12, 2012 at 03:23, Guofeng Zhang guof...@radvision.com wrote: Hi, recently I upgraded to the latest trunk of Karaf. After building it, I unzipped it and launched it using bin\karaf.bat without any customization, and I see the following exception. There was no such exception before.

java.lang.NullPointerException
    at org.apache.karaf.features.command.FeatureFinder.updated(FeatureFinder.java:46)[50:org.apache.karaf.features.command:3.0.0.SNAPSHOT]
    at Proxy57827e67_aace_4683_924e_c2eb69b9ba83.updated(Unknown Source)[:]
    at org.apache.felix.cm.impl.ConfigurationManager$ManagedServiceUpdate.run(ConfigurationManager.java:1160)[5:org.apache.felix.configadmin:1.2.8]
    at org.apache.felix.cm.impl.UpdateThread.run(UpdateThread.java:104)[5:org.apache.felix.configadmin:1.2.8]
    at java.lang.Thread.run(Thread.java:662)[:1.6.0_24]

Thanks. Guofeng