Hi,
Can you please subscribe to the mailing list so that we receive email
notifications for your posts? You can refer to these instructions:
http://apache-ignite-users.70518.x6.nabble.com/mailing_list/MailingListOptions.jtp?forum=1
Stolidedog wrote
> I get an OutOfMemoryError when I have multiple threads writing small files
> to IGFS with Hadoop configured as a backing file system.
How much data do you have? Most likely you just need to increase the amount
of heap memory (use the -Xmx JVM option). You can also limit the amount of
memory allocated to the file system:
<bean class="org.apache.ignite.configuration.FileSystemConfiguration">
    <property name="maxSpaceSize" value="#{4 * 1024 * 1024 * 1024}"/>
</bean>
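To raise the heap as suggested above, one way (assuming the standard Ignite distribution, whose bin/ignite.sh honors the JVM_OPTS environment variable) is:

```shell
# Give the JVM a larger heap before starting the node.
# (Assumption: standard Ignite distribution, where bin/ignite.sh picks up JVM_OPTS.)
export JVM_OPTS="-Xmx8g"

# Then start the node as usual, e.g.:
# ./bin/ignite.sh config/default-config.xml
```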
Note that the limit should be around 1-2 GB less than the maximum heap size,
because the node itself needs some memory to function.
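As a rough sizing sketch (assuming the 4 GiB maxSpaceSize from the config above and about 2 GiB of headroom for the node, which is an illustrative figure, not a hard rule):

```shell
# Rough heap sizing: file system limit plus headroom for the node itself.
MAX_SPACE_GB=4   # maxSpaceSize from the Spring config above
HEADROOM_GB=2    # assumed headroom for the node, per the note above
XMX_GB=$((MAX_SPACE_GB + HEADROOM_GB))
echo "-Xmx${XMX_GB}g"   # the -Xmx value to pass to the JVM
```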
Let us know if it helps.
-Val
--
View this message in context:
http://apache-ignite-users.70518.x6.nabble.com/OutOfMemoryError-with-Hadoop-backing-filesystem-tp1854p1857.html
Sent from the Apache Ignite Users mailing list archive at Nabble.com.