Hi Yogesh,
It's a simple file component route. Also, I am using a ConsumerTemplate to
receive the file.
I will send the GitHub project link.
Thanks,
Rakesh
--
View this message in context:
http://camel.465427.n5.nabble.com/ClassNotFoundException-ObjectFactory-tp5782443p5782846.html
Sent from the Camel - Users mailing list archive at Nabble.com.
Hi,
I need help with the problem below:
I am trying to connect to a database using the Camel MyBatis component. I am
using the Spring Boot starters for both Camel and MyBatis.
<dependency>
  <groupId>org.apache.camel</groupId>
  <artifactId>camel-spring-boot-starter</artifactId>
  <version>2.17.1</version>
</dependency>
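For anyone hitting the same setup, a minimal sketch of the wiring with the Spring Boot starters (all values and the statement name below are placeholders; the property names follow the mybatis-spring-boot-starter conventions):

```properties
# application.properties (values are placeholders)
spring.datasource.url=jdbc:mysql://localhost:3306/mydb
spring.datasource.username=user
spring.datasource.password=secret
mybatis.mapper-locations=classpath:mappers/*.xml
```

A route can then call e.g. mybatis:selectAccounts?statementType=SelectList, where selectAccounts is a statement id defined in one of the mapper XML files.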
Hi,
I am using the camel-mina2 endpoint with sync=true to receive requests from a UI
(client) over a socket and respond back to the client. There is a
limitation on the client side regarding the buffer size. In some cases, when the
response is huge (for some reports), I am seeing partial data loss in
Hello,
I want to decide in the route whether the file should be deleted after processing
or not.
My approach of sending it to file:${file:path} does not work.
Is anyone aware of a solution for this requirement?
Camel version I use: 2.15.1
Best regards
Johannes
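One way to approach this (a sketch, not a definitive answer: the CamelFileAbsolutePath header is set by the file consumer, but the flag wiring below is an assumption to adapt): consume with noop=true so the consumer itself does nothing, and delete the file from a processor when the route decides it should go.

```java
import java.io.File;

// Helper a Processor could call: delete the consumed file only when the
// route decided it should be deleted. The path comes from the
// "CamelFileAbsolutePath" header set by the file consumer.
public class ConditionalDelete {

    // Returns true only if deletion was requested and actually succeeded.
    static boolean deleteIfFlagged(String absolutePath, boolean shouldDelete) {
        if (!shouldDelete || absolutePath == null) {
            return false;
        }
        return new File(absolutePath).delete();
    }
}
```

In the route this could be invoked from a processor, e.g. `.process(e -> ConditionalDelete.deleteIfFlagged(e.getIn().getHeader("CamelFileAbsolutePath", String.class), Boolean.TRUE.equals(e.getProperty("deleteIt"))))`, where the `deleteIt` exchange property is whatever flag your route sets earlier.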
I never figured out how to do it via CXF, but I once used a workaround by
just bypassing SOAP altogether and posting the SOAP XML straight over HTTP
instead.
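For the archive, the workaround looks roughly like this (a sketch: the endpoint URL, SOAPAction value, and payload are placeholders; only plain JDK HTTP is used, no CXF):

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class RawSoapPost {

    // Wrap an arbitrary XML payload in a SOAP 1.1 envelope.
    static String buildEnvelope(String payload) {
        return "<soap:Envelope xmlns:soap=\"http://schemas.xmlsoap.org/soap/envelope/\">"
                + "<soap:Body>" + payload + "</soap:Body></soap:Envelope>";
    }

    // POST the envelope with the headers a SOAP 1.1 service expects.
    static int post(String endpoint, String soapAction, String envelope) throws Exception {
        HttpURLConnection con = (HttpURLConnection) new URL(endpoint).openConnection();
        con.setRequestMethod("POST");
        con.setRequestProperty("Content-Type", "text/xml; charset=utf-8");
        con.setRequestProperty("SOAPAction", soapAction);
        con.setDoOutput(true);
        try (OutputStream os = con.getOutputStream()) {
            os.write(envelope.getBytes(StandardCharsets.UTF_8));
        }
        return con.getResponseCode(); // the raw SOAP response is on con.getInputStream()
    }
}
```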
On 19 May 2016 at 13:31, kazvis wrote:
> Hi ,
>
> I want to call an external SOAP service from a camel route. I have
Converting body to String did the job. Thank you!
You can add a
convertBodyTo(String.class) before the bean so it's not stream based.
It smells like the bean component needs to reset the stream before
evaluation in case you have multiple parameters bound from the body, which
you are doing, but which is otherwise rarely used.
Hi Claus,
Thank you for the very quick response. Unfortunately, stream caching didn't
help - it just changed the exception type.
I've enabled stream caching at the CamelContext level and set the spool threshold
to 1 (just to see what would happen). The message body is cached and spooled
to disk:
2016.05.20
Hi,
Using a direct endpoint rather than a quartz schedule, just for testing, is your best option.
You could however use this to kick off a route that uses a quartz schedule:
ftp://bla?cron.scheduler=quartz2
I could however not configure a triggerId on a quartz2:// route (I did also not
look too thoroughly, I must
I have a Spring Boot application and I need to connect to a WebSphere MQ
activation specification via JNDI using Apache Camel.
Can anyone suggest how I can do it and what the best approach is?
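Not a full answer, but the usual starting point is to obtain the ConnectionFactory from WebSphere's JNDI and hand it to Camel's JMS component. A sketch of the JNDI side (host, port, and the JNDI name are placeholders; the factory class is the standard WebSphere one):

```properties
# jndi.properties (values are placeholders)
java.naming.factory.initial=com.ibm.websphere.naming.WsnInitialContextFactory
java.naming.provider.url=corbaloc:iiop:mqhost:2809
```

With that context you can look up the ConnectionFactory (e.g. new InitialContext().lookup("jms/MyCF"), where "jms/MyCF" is a placeholder JNDI name) and pass it to a JmsComponent bean in your Spring Boot configuration.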
Maybe you need stream caching
http://camel.apache.org/why-is-my-message-body-empty.html
On Fri, May 20, 2016 at 1:12 PM, grzechol
wrote:
> Hi guys,
>
> I have a problem injecting multiple @JsonPath arguments using bean binding.
>
> Here is my sample json:
>
Hi guys,
I have a problem injecting multiple @JsonPath arguments using bean binding.
Here is my sample json:
{"key1":"val1","key2":"val2"}
My bean method processing that JSON:
@Handler
public void process(@JsonPath("$.key1") String value1,
                    @JsonPath("$.key2") String value2) {
    // ...
}
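The stream problem behind this can be reproduced with plain java.io, no Camel needed: once a stream-based body has been read to evaluate the first @JsonPath parameter, a second evaluation sees an exhausted stream. That is why a convertBodyTo(String.class) (or stream caching) before the bean helps.

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

public class StreamReadOnce {

    // Drain an InputStream into a String (roughly what an expression
    // evaluation does with a stream-based body).
    static String readAll(InputStream in) throws Exception {
        StringBuilder sb = new StringBuilder();
        int c;
        while ((c = in.read()) != -1) {
            sb.append((char) c);
        }
        return sb.toString();
    }

    public static void main(String[] args) throws Exception {
        InputStream body = new ByteArrayInputStream(
                "{\"key1\":\"val1\",\"key2\":\"val2\"}".getBytes(StandardCharsets.UTF_8));
        System.out.println(readAll(body).length()); // 29: the full JSON
        System.out.println(readAll(body).length()); // 0: the stream is exhausted
    }
}
```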
Hi Fongys,
Thanks.
I raised a JIRA for this
https://issues.apache.org/jira/browse/CAMEL-9978
Cheers
--
Andrea Cosentino
--
Apache Camel PMC Member
Apache Karaf Committer
Apache Servicemix Committer
Email: ancosen1...@yahoo.com
Twitter: @oscerd2
Github: oscerd
When using camel-kafka, if the processor or a synchronous to() (like direct, etc.)
takes too long (>30s) to process, it will cause a Kafka rebalance.
The issue is caused by the Kafka client sending the heartbeat to the broker in the
poll() function. After receiving the ConsumerRecords from poll(), there is no
The Kafka producer property 'acks' can have the values 0, 1, and all.
Ref: http://kafka.apache.org/documentation.html#producerconfigs
However, the camel-kafka 2.17.1 code declares the corresponding field,
requestRequiredAcks, as an Integer, which cannot support 'all'.
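A possible interim workaround (hedged; based on the Kafka documentation, not on the camel-kafka code): in the Kafka protocol acks=-1 is equivalent to acks=all, and -1 does fit in an Integer, so an endpoint along these lines may give you the 'all' semantics until the field type is fixed (URI shape from memory for the reworked 2.17 component; check the camel-kafka docs):

```
kafka:myTopic?brokers=localhost:9092&requestRequiredAcks=-1
```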
Hi Claus,
First of all, thanks for the response.
I'm afraid the advice option won't work for me, since the original from
part is within the scope of my test. And to clarify, it is not really a
unit test, but more of an integration test.
The property placeholder however works for me and
Hi Thomas,
ad 1) That route and the use case behind it are not performance critical at
all. Right now I am already doing a hybrid of what you propose. I split
the zip, transform the data from something crude and proprietary to
JSON, and then aggregate to determine completeness and split and
Hi Mirco,
ad 1) If it saves you from duplicating code & you have no issues
performance-wise: I would do it. It seems more logical (in my opinion).
If you need to check for completeness *before* doing any processing, I think
that's the only way to go. If you can allow for parallelism: split