Hello Jbonfre.
No more exceptions. I am using Java 1.8; I do not know if that has
anything to do with it.
At first I was getting the following error:
29/10/2015 17:06:51.419 | WARN | pool-2-thread-1 | AbstractLifeCycle | 122
- org.eclipse.jetty.util - 9.2.10.v20150310 | FAILED HttpServiceContext
Thanks for the help.
It's a bit weird because I'm using pax-jsf-support that includes
pax-cdi-web-webbeans.
I modified the feature pax-jsf-support to use pax-web-weld and get the same
error. Now with the class of Weld:
29/10/2015 16:49:15.982 | WARN | 33-thread-pool-1 |
Hello.
I'm trying to use karaf 4.0.1 + pax-cdi in a web application with JSF. When
I run Karaf I get the following error:
(I gather I am missing a bundle but do not know which one.)
2015-10-29 15:48:50,549 | INFO | pool-33-thread-1 |
CdiServletContainerInitializer | 161 - org.ops4j.pax.cdi.web -
Hello.
I'm trying to start karaf without Internet access. I changed my file
org.ops4j.pax.url.mvn.cfg to contain:
org.ops4j.pax.url.mvn.repositories = \
file:${karaf.home}/${karaf.default.repository}@id=system.repository, \
http://repo1.maven.org/maven2@id=central, \
It appears in the standard feature as:
mvn:org.apache.karaf.diagnostic/org.apache.karaf.diagnostic.boot/4.0.1
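For a machine without Internet access, the central repository entry can be dropped entirely. A minimal sketch of etc/org.ops4j.pax.url.mvn.cfg, assuming the embedded system repository already contains every required artifact (the localRepository line is an extra precaution, not from the original post):

```properties
# Sketch: resolve everything from the embedded system repository only
org.ops4j.pax.url.mvn.repositories = \
    file:${karaf.home}/${karaf.default.repository}@id=system.repository

# Keep resolution away from any stray ~/.m2 cache
org.ops4j.pax.url.mvn.localRepository = ${karaf.home}/${karaf.default.repository}
```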
--
View this message in context:
http://karaf.922171.n3.nabble.com/Exception-starting-Karaf-4-0-1-without-internet-tp4043430p4043448.html
Sent from the Karaf - User mailing list archive at Nabble.com.
These jars are only in the lib directory. Both Windows and Solaris.
This is:
nov 11, 2015 1:15:34 PM org.apache.karaf.main.Main launch
INFO: Installing and starting initial bundles
nov 11, 2015 1:15:34 PM org.apache.karaf.main.Main launch
INFO: All initial bundles installed and set to start
nov 11, 2015 1:15:34 PM
Hi JB.
I've gotten it to work by copying, from the Windows machine, the contents
of .m2/repository/org/apache/karaf/jaas and
.m2/repository/org/apache/karaf/diagnostic
to $HOME/.m2/repository/org/apache/karaf/
I am not an administrator on Solaris (I am working on a virtual machine).
Is this information enough?
Hi,
I've looked at the karaf.home/lib/boot directory:
Windows:
org.apache.karaf.diagnostic.boot-4.0.1.jar (31,426 bytes)
org.apache.karaf.jaas.boot-4.0.1.jar (16,560 bytes)
Solaris:
31426 November 11 12:57 org.apache.karaf.diagnostic.boot-4.0.1.jar
16560 November 11 12:57
Hi, thanks for the help.
My mistake.
The error I copied occurred because the distribution had been generated
incorrectly.
However, once that was repaired, I have the following situation:
- On Windows 7 it runs without Internet access.
- On Solaris 10 it does not run.
- I have not tried to run it on Solaris 10 with
Good Morning.
I performed the following tests:
1. I downloaded versions 4.0.1,4.0.2 and 4.0.3
2. In Solaris, I have ensured that there is no $USER_HOME/.m2
3. I unpacked and executed 4.0.1
3.1 Karaf does not start. It hangs and displays the error "Could not
transfer artifact"
Hello.
I have several problems with version 4.0.3, possibly caused by my
inexperience with Karaf
1. I created a feature and a distribution using the archetypes of karaf
4.0.3.
When generating the feature I get the following item in the pom.xml:
(OBR)
And when starting the distribution that includes that
Hi, thanks for the help.
1. So it's an error in the archetype?
2. Sorry. Rookie mistake. I forgot to include my minimum distribution
features.
What is the best way to do it: in my own feature or in the distribution?
If I include this in my pom:
validate
process-resources
features-validate-descriptor
target/features.xml
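The stripped snippet above presumably came from a plugin execution whose XML tags were lost in extraction. A hedged reconstruction (element names are assumed from karaf-maven-plugin conventions of that era, not taken from the original post):

```xml
<plugin>
  <groupId>org.apache.karaf.tooling</groupId>
  <artifactId>karaf-maven-plugin</artifactId>
  <executions>
    <execution>
      <id>validate</id>
      <phase>process-resources</phase>
      <goals>
        <goal>features-validate-descriptor</goal>
      </goals>
      <configuration>
        <file>target/features.xml</file>
      </configuration>
    </execution>
  </executions>
</plugin>
```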
I have an example at this address:
https://github.com/LuisLozano/karafExamples
When using version 2.2.0, the org.apache.aries.jpa.blueprint service is not
displayed. However, using version 2.1.0 the service is displayed perfectly.
Hello. Thank you for the explanation.
Do you mean that if I want to use version 2.2.0 I have to do the following?
1. A blueprint bundle containing the persistence unit and DAO objects.
2. A blueprint bundle setting out the service that uses the DAO.
The DAO objects are using the entityManager and
When I use pax-cdi in Karaf 4.0.3, I get the following exception:
javax.enterprise.inject.CreationException: java.lang.ClassNotFoundException:
org.apache.webbeans.proxy.OwbNormalScopeProxy not found by
org.ops4j.pax.cdi.extension [147]
You can see the code:
Hi,
any ideas?
I've been trying several things, but I have not obtained any positive
results.
Oh, sorry. Copy and paste error.
I'm using Karaf 4.0.1 and pax-cdi for injection, and I get the following
exception when I start and when I stop Karaf:
karaf@root()> ERROR: Bundle communications.impl [9] EventDispatcher: Error
during dispatch. (java.lang.NullPointerException)
Hi,
I am using Karaf 4.0.1 and pax-cdi for injection, and I get the following
exception when I start and stop Karaf:
karaf@root()> ERROR: Bundle communications.impl [9] EventDispatcher: Error
during dispatch. (java.lang.NullPointerException)
java.lang.NullPointerException
I tried to make a simple example, but I cannot reproduce it.
It is possible that I need to add more complex services.
Perhaps the problem is here:
"org.hibernate.osgi.HibernateBundleActivator.start"
and I may need to add JPA to reproduce it.
Hello.
If it's any help, I found a workaround.
If my bundle starts after the hibernate-osgi bundle, the exception does
not occur.
Hi Christian:
karaf@root()> service:list EntityManagerFactory
[javax.persistence.EntityManagerFactory]
osgi.unit.name = salazarJPA
osgi.unit.provider = org.hibernate.jpa.HibernatePersistenceProvider
osgi.unit.version = 0.0.1
service.bundleid = 168
Hello Christian. You guessed right!
I have added:
Persistence API:
mvn:org.hibernate.javax.persistence/hibernate-jpa-2.1-api/1.0.0.Final
And in my pom I put:
<dependency>
  <groupId>org.hibernate.javax.persistence</groupId>
  <artifactId>hibernate-jpa-2.1-api</artifactId>
  <version>1.0.0.Final</version>
</dependency>
With this it works fine.
Thanks
Hi, thanks for the help.
These are my boot features:
org.apache.karaf.tooling
karaf-maven-plugin
true
Hi Achim. Thanks for the reply.
I think the problem is in the "transaction-api" feature, which includes
javax.el/javax.el-api/3.0.0.
Do you know any way to make a distribution that does not include a specific
bundle? I need a distribution with the features indicated in my previous
post.
Hello Christian. Thanks for your help.
*My DataSource is in the deploy folder and has the following contents:*
xmlns="http://www.osgi.org/xmlns/blueprint/v1.0.0"
xmlns:cm="http://aries.apache.org/blueprint/xmlns/blueprint-cm/v1.1.0">
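For context, a deploy-folder DataSource blueprint along those lines typically combines a cm:property-placeholder with a service registration. A sketch, where the PID, property names, and JNDI name are illustrative (not taken from the original post):

```xml
<blueprint xmlns="http://www.osgi.org/xmlns/blueprint/v1.0.0"
           xmlns:cm="http://aries.apache.org/blueprint/xmlns/blueprint-cm/v1.1.0">

  <!-- values resolved from etc/my.datasource.cfg; the PID is illustrative -->
  <cm:property-placeholder persistent-id="my.datasource" update-strategy="reload">
    <cm:default-properties>
      <cm:property name="url" value="jdbc:h2:mem:test"/>
    </cm:default-properties>
  </cm:property-placeholder>

  <bean id="dataSource" class="org.h2.jdbcx.JdbcDataSource">
    <property name="URL" value="${url}"/>
  </bean>

  <!-- expose the DataSource so JNDI/JPA can find it by name -->
  <service ref="dataSource" interface="javax.sql.DataSource">
    <service-properties>
      <entry key="osgi.jndi.service.name" value="jdbc/myDS"/>
    </service-properties>
  </service>
</blueprint>
```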
Hello.
I have an application that runs on Karaf 3.0.3 and would like to migrate to
Karaf 4.0.1.
The main problem I have is with the migration of the data service: I
cannot get Blueprint to inject the EntityManager.
*This is my pom:*
dataService
0.0.1
bundle
DataBaseService
Hello. Thanks for the help.
I think so. I installed the JPA, transaction and Hibernate 4.3.6.Final
features.
And this is my persistence.xml:
xsi:schemaLocation="http://java.sun.com/xml/ns/persistence
http://java.sun.com/xml/ns/persistence/persistence_2_0.xsd"
xmlns="http://java.sun.com/xml/ns/persistence"
xmlns:xsi
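A minimal complete persistence.xml along those lines might look like the sketch below; the unit name, JNDI lookup, and dialect property are illustrative, not taken from the original post:

```xml
<persistence xmlns="http://java.sun.com/xml/ns/persistence"
             xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
             xsi:schemaLocation="http://java.sun.com/xml/ns/persistence
                                 http://java.sun.com/xml/ns/persistence/persistence_2_0.xsd"
             version="2.0">
  <persistence-unit name="myUnit" transaction-type="JTA">
    <provider>org.hibernate.jpa.HibernatePersistenceProvider</provider>
    <!-- looks up a DataSource registered with osgi.jndi.service.name=jdbc/myDS -->
    <jta-data-source>osgi:service/javax.sql.DataSource/(osgi.jndi.service.name=jdbc/myDS)</jta-data-source>
    <properties>
      <property name="hibernate.dialect" value="org.hibernate.dialect.H2Dialect"/>
    </properties>
  </persistence-unit>
</persistence>
```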
Hello.
I have a problem with a simple service. This service is created using
blueprint with the following file:
xmlns="http://www.osgi.org/xmlns/blueprint/v1.0.0"
xmlns:jpa="http://aries.apache.org/xmlns/jpa/v1.0.0"
xmlns:tx="http://aries.apache.org/xmlns/transactions/v1.2.0"
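A sketch of how the rest of such a blueprint usually looks with Aries JPA 1.x, where the bean class, unit name, and interface are illustrative placeholders (not from the original post):

```xml
<blueprint xmlns="http://www.osgi.org/xmlns/blueprint/v1.0.0"
           xmlns:jpa="http://aries.apache.org/xmlns/jpa/v1.0.0"
           xmlns:tx="http://aries.apache.org/xmlns/transactions/v1.2.0">

  <bean id="myDao" class="com.example.MyDaoImpl">
    <!-- Aries injects a container-managed EntityManager for the named unit -->
    <jpa:context unitname="myUnit" property="entityManager"/>
    <!-- wrap every method in a Required transaction -->
    <tx:transaction method="*" value="Required"/>
  </bean>

  <service ref="myDao" interface="com.example.MyDao"/>
</blueprint>
```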
Yes. No more bundles. The API only contains interfaces.
The DAO object uses the EntityManager. The service calls the DAO.
Yes I do.
One bundle for the API, and one bundle for the implementation.
wrap
obr
aries-blueprint
shell
shell-compat
feature
jaas
ssh
management
bundle
config
deployer
diagnostic
feature
instance
kar
log
package
Hello, JB, Christian.
This solution is perfect.
Thank you very much to you both for the help.
Karaf 4.0.1
I try to install these features:
features:install war pax-jsf-support jpa jndi webconsole hibernate
hibernate-validator transaction
Apparently it installs without problems, but when I restart Karaf I get
the following error and the myfaces bundle is not started.
karaf@root()>
What is the best way to export and import instances?
To export:
I guess building a zip of "instance/my_instance_name".
To import:
Unzip the exported file, modify the instance.properties file, and edit the
files in the etc directory.
Hello. I have a question.
Is there any way to make the persistence.xml parameters dynamic?
For example, hibernate.show_sql or hibernate.dialect.
I have an installation of Karaf 4.0.5.
I created the file etc/org.ops4j.datasource-myDS with the following
information:
url = jdbc:h2:file:./db/myDatabase
dataSourceName = myDS
osgi.jdbc.driver.name = H2-pool-xa
The result is that two javax.sql.DataSource services are created with the
same name =
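For comparison, a complete pax-jdbc-config file built from the values quoted above might look like this sketch; the user and password entries are illustrative additions, not from the original post:

```properties
# pax-jdbc-config datasource configuration
# H2-pool-xa = the H2 driver wrapped in a pooling, XA-aware factory
osgi.jdbc.driver.name = H2-pool-xa
dataSourceName = myDS
url = jdbc:h2:file:./db/myDatabase
user = sa
password =
```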
Thanks jbonofre.
That's exactly what I was doing (using )
I will wait for version 4.0.6.
I have solved the problem as follows:
1. I am using my own assembly.
2. It seems the problem I described occurs if I include the datasource
configuration file in my feature.
3. I removed the file from the feature and put it directly in the /etc of
my distribution.
Everything works.
Good afternoon.
I'm trying to use pax-cdi and get the exception below.
Can you help please?
You can see the code: https://github.com/LuisLozano/karafExamples.git
I think the problem is in pax-jsf-support: this feature uses
pax-cdi-openwebbeans, which includes the openwebbeans-*/1.2.6 bundles, when
It works perfectly. Thank you.
Hi. Thanks.
Any workaround?
I've also noticed that the bind method is called twice on node1 (perhaps to
register the service on the node and to register it in the cluster?)
Is it possible to send notifications when a service disconnects? I have the
following scenario:
Node1: dosgi service
Node2: client bundle
I would like node2 to be notified when the service on node1 disconnects
(e.g. when node1 goes down, or when the bundles that implement the service
are updated).
Hi, I'm using version 4.0.0 of cellar.
What I really want is to know when the service is not available on the
client side. I'm using a reference-listener, but the unbind methods are not
called on the client side when node1 removes the service, or even when
node1 is stopped.
Indeed, if I run cluster:
Good Morning. I have the following scenario:
Node 1 (master): feature F
Node 2: feature F
Suppose I create a new version of F and I want to install on the cluster
nodes so that all use the new bundles. The way I do it is:
1. cluster:feature-repo-add group mvn:path_to_F/0.2/xml/features
2.
Thank you.
I know how to implement new commands; my question is how I can invoke
other commands.
Good Morning.
When I run a cxf-dosgi service on a remote instance I get the following
output:
java.net.ConnectException: Invoking ConnectException http://localhost:9000
...
I guess Karaf uses localhost and port 9000 by default.
How can I specify the host and port where the services are?
Sorry, but I do not understand.
I am using blueprint, and I added the property
in the service, and the cxf-dosgi features in Karaf.
I have also defined the org.apache.cxf.dosgi.discovery.zookeeper.server.cfg
and org.apache.cxf.dosgi.discovery.zookeeper.cfg files.
Karaf has automatically
Now I have the following exception:
javax.xml.stream.FactoryConfigurationError: Provider for class
javax.xml.stream.XMLOutputFactory cannot be created
...
... 48 more
Caused by: java.util.ServiceConfigurationError:
javax.xml.stream.XMLOutputFactory: Provider
com.ctc.wstx.stax.WstxOutputFactory
Thanks Christian.
Have you tried the link?
The problem I have is when accessing the link /cxf/ILblService?wsdl
If I delete it I can see the WSDL, but on port 9000, and the client looks
for localhost.
So I wanted to know how to configure the default host and port.
It's very strange. Can it be due
Try this:
https://github.com/LuisLozano/dosgi
In the following scenario:
Node1: master
Node2: receives bundles b1 and b2. Bundle b2 cannot start if bundle b1 is
not installed.
Is there any way to tell the node to install b1 first and b2 afterwards?
I'm getting errors because b2 cannot start without b1, and I must restart
the node
Thanks Achim.
I'm using Cave as a repository. How can I upload a feature repository to
Cave?
Hi, Christian.
You were right. I added the lib folder to my distribution and everything
works properly.
However I still need to configure the interface on the client. The scenario
that I have now is the following:
Node 1 (master) with 2 interfaces: 192.168.21.11 and 172.19.30.158. The log
tells
Hi, Christian.
With the file it does not work:
karaf@root()> config:list | grep httpB
httpBase = http://192.168.21.11:8181
But the address in the client is the 172 one.
I have to run the command:
config:property-set -p cxf-DSW httpBase http://192.168.21.11:8181
Now:
karaf@root()> config:list |
Sorry, the command is
config:property-set -p cxf-dsw httpBase http://192.168.21.11:8181
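The same value can also be made persistent in a configuration file instead of being set by hand on each start. A sketch of etc/cxf-dsw.cfg, assuming the PID matches the one used in the command above:

```properties
# Publish CXF-DOSGi endpoints on a specific interface instead of localhost
httpBase = http://192.168.21.11:8181
```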
Good afternoon.
In case this interests someone:
The problem is in the assembly. If the appropriate libraries are added to
the pom.xml, everything works fine.
The libraries to be added can be viewed at:
http://karaf.922171.n3.nabble.com/Karaf-assembly-td4045816.html
Hello
When I use the maven archetype, the generated distribution does not include
the jar files that you can find in the lib directory of a downloaded Karaf
(especially the lib/endorsed jar files).
Is there any way to include the libraries from the karaf.home/lib folder in
a distribution that is built using maven?
Thanks.
I think I have the answer. The following blueprint file can create a
service that allows you to add redirections. It can serve as a working
basis.
xmlns="http://www.osgi.org/xmlns/blueprint/v1.0.0">
Good Morning.
I wonder if redirections can be added dynamically without restarting jetty.
I know that with pax-web handlers can be added as services, but I have not
found an appropriate example.
The idea is that, by modifying a configuration file (or a similar
mechanism), I can add or remove redirects on the
Hi. Good Morning
Is there a manual for jdbc commands?
We do not know the statements to execute to obtain certain data.
For example:
jdbc:query myDs select count (*) from messages where id < 100
The prompt is left showing '>' and you have to press CTRL+C to
continue.
Or
jdbc:query
The output:
karaf@root>jdbc:query sacomarDS select count(*) from Mensajes
COUNT(*)
6476
karaf@root>jdbc:query sacomarDS "select count(*) from Mensajes"
Error executing command jdbc:query: too many arguments specified
karaf@root>jdbc:query sacomarDS "select count (*) from Mensajes"
karaf@root>version
4.0.5
karaf@root>jdbc:query sacomarDS "select count \(*\) from Mensajes"
Error executing command jdbc:query: too many arguments specified
karaf@root>jdbc:query sacomarDS "select count\(*\) from Mensajes"
Error executing command jdbc:query: too many arguments specified
Good afternoon.
When I get the LogService using:
logSrv = context.getServiceReference(org.apache.karaf.log.core.LogService);
and I use:
logSrv.getEvents();
I get only ROOT events. How can I get the events of other loggers?
If I execute log:list I have:
karaf@root>log:list
Logger
Hi JB,
Which parameter should I change?
I tried the following command:
ssh:ssh -p 22 user@server
using karaf console and I have the next message in the log:
(I can connect to the same server using putty)
java.security.InvalidAlgorithmParameterException: Prime size must be
multiple of 64, and can only range from 512 to 2048 (inclusive)
Thank you Zoran. I will wait for version 4.1.0.
Now it works!
Many thanks!
I have entries in the log file.
I've configured my log:
# YAROKI file appender
log4j.appender.yaroki=org.apache.log4j.RollingFileAppender
log4j.appender.yaroki.threshold=DEBUG
log4j.appender.yaroki.layout=org.apache.log4j.PatternLayout
log4j.appender.yaroki.layout.ConversionPattern=%d{ISO8601} |
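For the es.yaroki events to reach that appender, the logger itself also has to be declared and pointed at it. A sketch of the remaining lines (the logger name is taken from the later log:list output; the file path and additivity choice are assumptions):

```properties
# Route es.yaroki events to the yaroki appender
log4j.logger.es.yaroki = DEBUG, yaroki
# false = do not also duplicate these events into the rootLogger appenders
log4j.additivity.es.yaroki = false
log4j.appender.yaroki.file = ${karaf.data}/log/yaroki.log
```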
I have solved using LogReaderService instead of LogService.
Hi.
The command "log:tail" only shows the rootLogger messages. Is there any way
to show the messages of other loggers?
I tried "log:display ALL" but no output was displayed.
Hi JB, thanks, but:
karaf@root>log:list ALL
Logger | Level
---
ROOT| INFO
es.yaroki | ALL
org.apache.karaf.jaas.modules.audit | INFO
But when I run "log:tail es.yaroki" there is no output.
Hi.
My karaf version is 4.0.7.
It certainly seems like an authentication problem. We have removed the
karaf realm and replaced it with another one.
Can this be related?
This is our realm:
xmlns="http://www.osgi.org/xmlns/blueprint/v1.0.0"
Hi,
I used option 1.
It's working now!
Thanks
Is it possible to use ha-jdbc and hibernate in karaf?
Can you use it by setting up a datasource (org.ops4j.datasource-xxx.cfg)?
Has anyone tried?
Hi.
It seems a good idea to add it to the official documentation, and even
better to reduce the necessary configuration.
I am currently studying the implications of using this configuration with
JPA and two different databases. I think there may be problems with
dialects.
Hi. Thanks for answering.
I have the same problem. When installing elasticsearch:
java.lang.ClassNotFoundException: com.sun.jna.Native not found by
org.apache.servicemix.bundles.elasticsearch
___
And likewise when installing kibana:
java.lang.IllegalStateException: No LoginService for
Hi.
I cannot run the kibana web console (http://localhost:8181/kibana).
I deleted the karaf realm and replaced it with one of my own.
When I install the following features, kibana does not work:
feature:repo-add decanter 1.1.0
feature:install elasticsearch
feature:install kibana
Hi JB. You're right.
I think I've got it. I leave this little guide in case anyone is interested.
How to configure an ha-jdbc datasource in Karaf:
1.- We assume that we have installed the features: jndi, jdbc,
pax-jdbc-config, pax-jdbc-pool-dbcp2, pax-jdbc-h2 and any pax-jdbc-ddbb
2.- The
Hi.
Is there any way to audit ssh connections?
I would like to get logs that display the IP and the user who tried the
connection.
I would also like to make statistics (using decanter probably) in which they
will show the IPs that are most connected, the users that connect the most,
the average
Hi.
But can you get the IP that tried the connection?
Thank you.
Hi, I've made some progress.
For the elasticsearch JNA ClassNotFoundException:
It's solved by running: bundle:install -s mvn:net.java.dev.jna/jna/4.1.0
For Kibana security issue: I have modified
org.apache.karaf.decanter.kibana-4.x-1.2.0.jar/WEB-INF/web.xml to use my
realm (instead of karaf realm) and it
Hi.
Yes, it has.
My blueprint contains:
xmlns="http://www.osgi.org/xmlns/blueprint/v1.0.0"
xmlns:jpa="http://aries.apache.org/xmlns/jpa/v1.0.0"
xmlns:tx="http://aries.apache.org/xmlns/transactions/v1.2.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
...
I guess I'll have to
When I try:
jdbc:query sacomarDS select msjid from mensajes where "(asto like '%TO:
prueba')"
I get:
Error executing command: Syntax error in SQL statement "SELECT MSJID
FROM MENSAJES WHERE (ASTO LIKE %[*]TO: PRUEBA) "; expected "SELECT, FROM"
Syntax error in SQL statement "SELECT MSJID
Hi.
Yes, it has.
Thanks.
I have an application running in version 4.0.8 of karaf and I would like to
test it in version 4.1.2 but I get the following message:
2017-08-09T08:19:32,877 | INFO | features-1-thread-1 |
BlueprintContainerImpl | 12 - org.apache.aries.blueprint.core -
1.8.2 | Bundle