Re: Query on ignite-kafka artifact (ignite-kafka-ext)

2020-12-18 Thread akurbanov
Hello,

Please refer to the ticket in the Ignite JIRA dedicated to moving the Kafka
integration into the ignite-extensions repository:
https://issues.apache.org/jira/browse/IGNITE-13394

The details for the change and the discussion mail thread are linked to the
ticket.

> To which version of Ignite is this compatible? Can this be used with the
> Ignite 2.10.0 master branch?

Currently, the extensions are considered compatible with the latest Apache
Ignite release. On each Ignite release the extensions should be verified and
updated if any changes are required; after that, a release of the extension
component will be performed. For more details please refer to:
http://apache-ignite-developers.2346864.n4.nabble.com/DISCUSS-dependencies-and-release-process-for-Ignite-Extensions-td44478.html

Best regards,
Anton



--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Query on ignite-kafka artifact (ignite-kafka-ext)

2020-12-17 Thread vbm
Hi,

In the 2.9.0 release of Ignite, the ignite-kafka module was part of the Ignite
git repository. Now when I check the master branch, the Kafka module is not
present; it has been moved to https://github.com/apache/ignite-extensions

Also in the Maven repository I see there is a new artifact corresponding to
ignite-extensions for Kafka:

<dependency>
    <groupId>org.apache.ignite</groupId>
    <artifactId>ignite-kafka-ext</artifactId>
    <version>1.0.0</version>
</dependency>


To which version of Ignite is this compatible? Can this be used with the
Ignite 2.10.0 master branch?
When will the next release of ignite-kafka-ext be done? Will it be done
along with an Ignite release?


Regards,
Vishwas



--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Re: Kafka to Ignite

2019-12-02 Thread Evgenii Zhuravlev
Hi,

You can probably use the Kafka streamer:
https://apacheignite-mix.readme.io/docs/kafka-streamer
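
A minimal sketch of wiring the streamer up, assuming the ignite-kafka
dependency is on the classpath; the cache name, topic, and consumer
properties below are placeholders, and depending on the Ignite version
setTopic() accepts either a single topic name or a list of topics:

import java.util.AbstractMap;
import java.util.Collections;
import java.util.Properties;

import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteDataStreamer;
import org.apache.ignite.Ignition;
import org.apache.ignite.stream.kafka.KafkaStreamer;

public class KafkaToIgniteExample {
    public static void main(String[] args) {
        Ignite ignite = Ignition.start();

        // Placeholder cache; create it if it does not exist yet.
        ignite.getOrCreateCache("myCache");

        try (IgniteDataStreamer<String, String> streamer = ignite.dataStreamer("myCache")) {
            streamer.allowOverwrite(true);

            // Placeholder Kafka consumer settings.
            Properties consumerCfg = new Properties();
            consumerCfg.put("bootstrap.servers", "localhost:9092");
            consumerCfg.put("group.id", "ignite-streamer");
            consumerCfg.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
            consumerCfg.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

            KafkaStreamer<String, String> kafkaStreamer = new KafkaStreamer<>();
            kafkaStreamer.setIgnite(ignite);
            kafkaStreamer.setStreamer(streamer);
            kafkaStreamer.setTopic(Collections.singletonList("myTopic"));
            kafkaStreamer.setThreads(4);
            kafkaStreamer.setConsumerConfig(consumerCfg);

            // Use the Kafka record key/value as the Ignite key/value.
            kafkaStreamer.setSingleTupleExtractor(
                rec -> new AbstractMap.SimpleEntry<>(String.valueOf(rec.key()), String.valueOf(rec.value())));

            kafkaStreamer.start();

            // ... let it consume, then shut down:
            kafkaStreamer.stop();
        }
    }
}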

Evgenii

On Mon, Dec 2, 2019 at 05:29, ashishb888 wrote:

> What are better ways to stream data from Kafka to Ignite cache?
>
>
>
> --
> Sent from: http://apache-ignite-users.70518.x6.nabble.com/
>


Kafka to Ignite

2019-12-02 Thread ashishb888
What are better ways to stream data from Kafka to Ignite cache?



--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/


Re: POC with DataStreamer (Kafka or Ignite - security question)

2018-04-09 Thread Denis Magda
Built-in encryption facilities are being discussed on the dev list. Check it
out:
http://apache-ignite-developers.2346864.n4.nabble.com/IEP-18-Transparent-Data-Encryption-td29001.html

For now, use operating-system-level encryption mechanisms for your files.

--
Denis

On Fri, Apr 6, 2018 at 5:39 PM, Gaurav Bajaj <gauravhba...@gmail.com> wrote:

> Also, I don't think Ignite provides any kind of encryption for these DB files.
>
> Best Regards,
> Gaurav
>
> On 06-Apr-2018 8:23 PM, "David Harvey" <dhar...@jobcase.com> wrote:
>
>> Assuming Ignite Persistence, you can create a cache in a specific Data
>> Region, but I'm unclear whether these properties can be set per region.
>> We are setting them in
>> org.apache.ignite.configuration.DataStorageConfiguration. What you
>> seem to be asking for is to set these per Data Region.
>>
>> <property name="storagePath" value="/IgnitePersistenceStorage/store"/>
>>
>> <property name="walArchivePath" value="/IgnitePersistenceStorage/wal/archive"/>
>>
>> On Fri, Apr 6, 2018 at 1:58 PM, Wilhelm <wilhelm.tho...@anaplan.com>
>> wrote:
>>
>>> Hello,
>>>
>>> I'm building a POC. Right now I have Kafka feeding Ignite with the
>>> constraint of having 1 topic per tenant for security reasons (historical
>>> data is persisted to files per topic by Kafka, and each "file" container
>>> is encrypted differently per customer).
>>>
>>> If I decide to use only Ignite with the DataStreamer (instead of Kafka),
>>> how can I make sure the customer data will be separated on disk (like a
>>> separate file or db per customer)? And how can I encrypt this cache file
>>> per customer?
>>>
>>> I hope that makes sense. So I guess it comes down to: can I have the
>>> Ignite cache persisted in a different file/db per some rule (like a
>>> customer id)? Or does it need to be in different Ignite memory caches?
>>> Or is it not possible?
>>>
>>> Thanks for your help
>>>
>>> w
>>>
>>>
>>>
>>> --
>>> Sent from: http://apache-ignite-users.70518.x6.nabble.com/
>>>
>>
>>
>>
>


Re: POC with DataStreamer (Kafka or Ignite - security question)

2018-04-06 Thread Gaurav Bajaj
Also, I don't think Ignite provides any kind of encryption for these DB files.

Best Regards,
Gaurav

On 06-Apr-2018 8:23 PM, "David Harvey" <dhar...@jobcase.com> wrote:

> Assuming Ignite Persistence, you can create a cache in a specific Data
> Region, but I'm unclear whether these properties can be set per region.
> We are setting them in
> org.apache.ignite.configuration.DataStorageConfiguration. What you seem
> to be asking for is to set these per Data Region.
>
> <property name="storagePath" value="/IgnitePersistenceStorage/store"/>
>
> <property name="walArchivePath" value="/IgnitePersistenceStorage/wal/archive"/>
>
> On Fri, Apr 6, 2018 at 1:58 PM, Wilhelm <wilhelm.tho...@anaplan.com>
> wrote:
>
>> Hello,
>>
>> I'm building a POC. Right now I have Kafka feeding Ignite with the
>> constraint of having 1 topic per tenant for security reasons (historical
>> data is persisted to files per topic by Kafka, and each "file" container
>> is encrypted differently per customer).
>>
>> If I decide to use only Ignite with the DataStreamer (instead of Kafka),
>> how can I make sure the customer data will be separated on disk (like a
>> separate file or db per customer)? And how can I encrypt this cache file
>> per customer?
>>
>> I hope that makes sense. So I guess it comes down to: can I have the
>> Ignite cache persisted in a different file/db per some rule (like a
>> customer id)? Or does it need to be in different Ignite memory caches?
>> Or is it not possible?
>>
>> Thanks for your help
>>
>> w
>>
>>
>>
>> --
>> Sent from: http://apache-ignite-users.70518.x6.nabble.com/
>>
>
>
>
>


Re: POC with DataStreamer (Kafka or Ignite - security question)

2018-04-06 Thread David Harvey
Assuming Ignite Persistence, you can create a cache in a specific Data
Region, but I'm unclear whether these properties can be set per region. We
are setting them in
org.apache.ignite.configuration.DataStorageConfiguration. What you seem
to be asking for is to set these per Data Region.

<property name="storagePath" value="/IgnitePersistenceStorage/store"/>

<property name="walArchivePath" value="/IgnitePersistenceStorage/wal/archive"/>
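
A minimal Java sketch of the equivalent programmatic configuration, assuming
the same paths as in the XML above; note that storagePath and walArchivePath
are node-wide settings on DataStorageConfiguration rather than per-Data-Region
ones:

import org.apache.ignite.Ignition;
import org.apache.ignite.configuration.DataStorageConfiguration;
import org.apache.ignite.configuration.IgniteConfiguration;

public class PersistencePathsExample {
    public static void main(String[] args) {
        DataStorageConfiguration storageCfg = new DataStorageConfiguration();

        // Enable Ignite Persistence for the default data region.
        storageCfg.getDefaultDataRegionConfiguration().setPersistenceEnabled(true);

        // Node-wide storage and WAL archive locations.
        storageCfg.setStoragePath("/IgnitePersistenceStorage/store");
        storageCfg.setWalArchivePath("/IgnitePersistenceStorage/wal/archive");

        IgniteConfiguration cfg = new IgniteConfiguration();
        cfg.setDataStorageConfiguration(storageCfg);

        Ignition.start(cfg);
    }
}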

On Fri, Apr 6, 2018 at 1:58 PM, Wilhelm <wilhelm.tho...@anaplan.com> wrote:

> Hello,
>
> I'm building a POC. Right now I have Kafka feeding Ignite with the
> constraint of having 1 topic per tenant for security reasons (historical
> data is persisted to files per topic by Kafka, and each "file" container
> is encrypted differently per customer).
>
> If I decide to use only Ignite with the DataStreamer (instead of Kafka),
> how can I make sure the customer data will be separated on disk (like a
> separate file or db per customer)? And how can I encrypt this cache file
> per customer?
>
> I hope that makes sense. So I guess it comes down to: can I have the
> Ignite cache persisted in a different file/db per some rule (like a
> customer id)? Or does it need to be in different Ignite memory caches?
> Or is it not possible?
>
> Thanks for your help
>
> w
>
>
>
> --
> Sent from: http://apache-ignite-users.70518.x6.nabble.com/
>



Re: How to implement StreamSingleTupleExtractor in Kafka connect Ignite sink?

2017-05-29 Thread Humphrey

See answers in the following thread. 

http://apache-ignite-users.70518.x6.nabble.com/Kindly-tell-me-where-to-find-these-jar-files-td12649.html
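
For reference, a minimal sketch of such an extractor, assuming the Kafka
value is a JSON string whose BUS_ID field should become the Ignite key; the
class name and the use of Jackson for parsing are illustrative only:

import java.util.AbstractMap;
import java.util.Map;

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.ignite.IgniteException;
import org.apache.ignite.stream.StreamSingleTupleExtractor;
import org.apache.kafka.connect.sink.SinkRecord;

// Hypothetical extractor: pulls BUS_ID out of the JSON value and uses it as the cache key.
public class BusIdTupleExtractor implements StreamSingleTupleExtractor<SinkRecord, String, String> {
    private static final ObjectMapper MAPPER = new ObjectMapper();

    @Override
    public Map.Entry<String, String> extract(SinkRecord record) {
        try {
            String json = String.valueOf(record.value());
            JsonNode node = MAPPER.readTree(json);

            // BUS_ID becomes the Ignite key; the whole JSON document is the value.
            return new AbstractMap.SimpleEntry<>(node.get("BUS_ID").asText(), json);
        }
        catch (Exception e) {
            throw new IgniteException("Failed to extract tuple from Kafka record", e);
        }
    }
}

The class would then be referenced via the 'singleTupleExtractorCls'
property in the sink connector configuration, as the documentation quote in
the original question describes.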



--
View this message in context: 
http://apache-ignite-users.70518.x6.nabble.com/How-to-implement-StreamSingleTupleExtractor-in-Kafka-connect-Ignite-sink-tp13166p13199.html
Sent from the Apache Ignite Users mailing list archive at Nabble.com.


How to implement StreamSingleTupleExtractor in Kafka connect Ignite sink?

2017-05-26 Thread ignitedFox
Hi,

I am using the Kafka connect Ignite sink. I have JSON records in Kafka like:



I would like to make a tuple extractor, so that the **BUS_ID** will act as
the key in Ignite. From the documentation, I have noted that "If you need to
create an Ignite key from a Kafka value, implement
StreamSingleTupleExtractor and specify it as 'singleTupleExtractorCls'". I
want to create an extractor and implement it like this. It would be really
helpful if somebody could kindly tell me how to create one and how to
implement it.

Thanks in advance.



--
View this message in context: 
http://apache-ignite-users.70518.x6.nabble.com/How-to-implement-StreamSingleTupleExtractor-in-Kafka-connect-Ignite-sink-tp13166.html
Sent from the Apache Ignite Users mailing list archive at Nabble.com.