I have a Spark 2.2.0 app that writes an RDD to Ignite 2.6.0. It *works* when Spark runs in local mode, accessing a remote Ignite 2.6.0 cluster.

In my ignite.xml I am specifying AWS S3-based discovery, since my Ignite
cluster runs in AWS.
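
The relevant discovery fragment of that file looks roughly like the sketch below; the credentials-provider bean and the bucket name are placeholders here, not my exact values:

    <bean id="ignite.cfg" class="org.apache.ignite.configuration.IgniteConfiguration">
      <property name="discoverySpi">
        <bean class="org.apache.ignite.spi.discovery.tcp.TcpDiscoverySpi">
          <property name="ipFinder">
            <bean class="org.apache.ignite.spi.discovery.tcp.ipfinder.s3.TcpDiscoveryS3IpFinder">
              <!-- placeholder: any AWSCredentialsProvider bean; the default chain is shown for illustration -->
              <property name="awsCredentialsProvider">
                <bean class="com.amazonaws.auth.DefaultAWSCredentialsProviderChain"/>
              </property>
              <!-- placeholder bucket name -->
              <property name="bucketName" value="my-discovery-bucket"/>
            </bean>
          </property>
        </bean>
      </property>
    </bean>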

When I deploy this same jar (the one that works in local mode) to a
Spark 2.2.0 cluster, I get the following error:

support.GenericApplicationContext: Exception encountered during context initialization - cancelling refresh attempt:
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'ignite.cfg' defined in URL [file:/home/ubuntu/tmp/ignite.xml]: Cannot create inner bean 'org.apache.ignite.spi.discovery.tcp.TcpDiscoverySpi#63cf9de0' of type [org.apache.ignite.spi.discovery.tcp.TcpDiscoverySpi] while setting bean property 'discoverySpi'; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'org.apache.ignite.spi.discovery.tcp.TcpDiscoverySpi#63cf9de0' defined in URL [file:/home/ubuntu/tmp/ignite.xml]: Cannot create inner bean 'org.apache.ignite.spi.discovery.tcp.ipfinder.s3.TcpDiscoveryS3IpFinder#3c6c4689' of type [org.apache.ignite.spi.discovery.tcp.ipfinder.s3.TcpDiscoveryS3IpFinder] while setting bean property 'ipFinder'; nested exception is
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'org.apache.ignite.spi.discovery.tcp.ipfinder.s3.TcpDiscoveryS3IpFinder#3c6c4689' defined in URL [file:/home/ubuntu/tmp/ignite.xml]: Error setting property values; nested exception is
org.springframework.beans.NotWritablePropertyException: Invalid property 'awsCredentialsProvider' of bean class [org.apache.ignite.spi.discovery.tcp.ipfinder.s3.TcpDiscoveryS3IpFinder]: *Bean property 'awsCredentialsProvider' is not writable or has an invalid setter method. Does the parameter type of the setter match the return type of the getter?*


This property *is* valid and, as noted above, the same configuration works in
local mode; it only fails in Spark cluster mode. It looks like the necessary
Ignite libraries are somehow not being loaded on the cluster, even though I've
already deployed the Ignite 2.6.0 jars to the Spark cluster (which I had to do
to get past earlier errors).

Is there a workaround for this? Please help.

Mex
