Hi,
Thanks to your suggestions I managed to configure Ignite properly. Everything
works, but I found another issue: the computation is 20 to 30 times slower than
the same computation without putting the words in the cache. If I put just one
word in the cache at the beginning of the map function, the computation time is
the same. What could this depend on?
This is the server code:

        public static void main(String args[]) throws Exception {
                Ignition.setClientMode(false);

                try (Ignite ignite = Ignition.start("/home/hduser/apache-ignite-2.0.0-src/examples/config/example-cache1.xml")) {

                        // Load the cache configuration from the same Spring XML file.
                        CacheConfiguration<String, Integer> cfg2 = Ignition.loadSpringBean(
                                "/home/hduser/apache-ignite-2.0.0-src/examples/config/example-cache1.xml",
                                "cacheconf");

                        IgniteCache<String, Integer> cache = ignite.getOrCreateCache(cfg2);
                        //cache.put("uno", 1);
                        //cache.put("due", 2);
                        //System.out.println(cache.get("uno"));
                        //System.out.println(cache.get("due"));

                        // Keep the server node alive.
                        while (true) {
                        }
                }
        }
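(A note on the mapper side below: it starts its nodes from example-cache2.xml, which I haven't pasted here. I assume it mainly differs from example-cache1.xml by marking the node as a client, i.e.:)

```xml
<!-- My assumption about example-cache2.xml -- the file itself is not shown here. -->
<property name="clientMode" value="true"/>
```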

This is the WordCount code:

public class WordCountIgnite extends Configured implements Tool {

        public static void main(String args[]) throws Exception {
                int res = ToolRunner.run(new WordCountIgnite(), args);
                System.exit(res);
        }

        public int run(String[] args) throws Exception {
                
                Path inputPath = new Path(args[0]);
                Path outputPath = new Path(args[1]);

                Configuration conf = getConf();
                Job job = Job.getInstance(conf, "word count");

                FileInputFormat.setInputPaths(job, inputPath);
                FileOutputFormat.setOutputPath(job, outputPath);

                job.setJarByClass(this.getClass());
                job.setInputFormatClass(TextInputFormat.class);
                job.setOutputFormatClass(TextOutputFormat.class);
                job.setMapOutputKeyClass(Text.class);
                job.setMapOutputValueClass(IntWritable.class);
                job.setOutputKeyClass(Text.class);
                job.setOutputValueClass(IntWritable.class);

                job.setMapperClass(Map.class);
                job.setCombinerClass(Reduce.class);
                job.setReducerClass(Reduce.class);
                
                return job.waitForCompletion(true) ? 0 : 1;
        }

        public static class Map extends Mapper<LongWritable, Text, Text, IntWritable> {

                private final static IntWritable one = new IntWritable(1);
                private Text word = new Text();
                Ignite ignite;
                IgniteCache<String, Integer> cache;

                @Override
                protected void setup(Context context) throws IOException, InterruptedException {
                        super.setup(context);

                        ignite = Ignition.start("/home/hduser/apache-ignite-2.0.0-src/examples/config/example-cache2.xml");

                        CacheConfiguration<String, Integer> cfg2 = Ignition.loadSpringBean(
                                "/home/hduser/apache-ignite-2.0.0-src/examples/config/example-cache1.xml",
                                "cacheconf");

                        cache = ignite.getOrCreateCache(cfg2);
                        //cache.put("test", 1993);
                }
                
                @Override
                public void map(LongWritable key, Text value, Context context) throws IOException, InterruptedException {
                        // Original pure-Hadoop version:
                        //String line = value.toString();
                        //StringTokenizer tokenizer = new StringTokenizer(line);
                        //while (tokenizer.hasMoreTokens()) {
                        //        word.set(tokenizer.nextToken());
                        //        context.write(word, one);
                        //}

                        String[] lines = tokenize(value.toString());

                        // Stream entries.
                        try (IgniteDataStreamer<String, Integer> stmr = ignite.dataStreamer("Cache")) {
                                for (String token : lines) {
                                        stmr.addData(token, 1);
                                }
                        }

                        //cache.put("interno", 666);
                }
                
                @Override
                protected void cleanup(Context context) throws IOException, InterruptedException {
                        super.cleanup(context);

                        ignite.close();
                }

                private String[] tokenize(String text) {
                        text = text.toLowerCase();
                        text = text.replace("'", "");
                        text = text.replaceAll("[\\s\\W]+", " ").trim();
                        return text.split(" ");
                }
        }

        public static class Reduce extends Reducer<Text, IntWritable, Text, IntWritable> {

                @Override
                public void reduce(Text key, Iterable<IntWritable> values, Context context) throws IOException, InterruptedException {
                        int sum = 0;
                        for (IntWritable value : values) {
                                sum += value.get();
                        }

                        context.write(key, new IntWritable(sum));
                }
        }
}
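As a sanity check on the input side, this is what the tokenize() helper in the Map class produces on a sample line (a standalone snippet, runnable without Hadoop or Ignite):

```java
public class TokenizeDemo {

    // Same logic as the tokenize() helper in the Map class above.
    static String[] tokenize(String text) {
        text = text.toLowerCase();
        text = text.replace("'", "");
        text = text.replaceAll("[\\s\\W]+", " ").trim();
        return text.split(" ");
    }

    public static void main(String[] args) {
        // Punctuation and apostrophes are stripped, words are lower-cased.
        String[] tokens = tokenize("Hello, World! It's Ignite.");
        System.out.println(String.join(" ", tokens)); // prints: hello world its ignite
    }
}
```

Each of these tokens then becomes a key passed to stmr.addData(token, 1) in the map function.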

This is the Ignite configuration:

<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="
        http://www.springframework.org/schema/beans
        http://www.springframework.org/schema/beans/spring-beans.xsd">

    <bean id="ignite.cfg" class="org.apache.ignite.configuration.IgniteConfiguration">

        <property name="discoverySpi">
            <bean class="org.apache.ignite.spi.discovery.tcp.TcpDiscoverySpi">
                <property name="ipFinder">
                    <bean class="org.apache.ignite.spi.discovery.tcp.ipfinder.multicast.TcpDiscoveryMulticastIpFinder">
                        <property name="addresses">
                            <list>
                                <value>127.0.0.1:47500..47509</value>
                                <value>192.168.30.5:47500..47509</value>
                                <value>192.168.30.22:47500..47509</value>
                                <value>192.168.30.99:47500..47509</value>
                            </list>
                        </property>
                    </bean>
                </property>
            </bean>
        </property>
    </bean>

    <bean id="cacheconf" class="org.apache.ignite.configuration.CacheConfiguration">
        <property name="name" value="default"/>
        <property name="atomicityMode" value="ATOMIC"/>
        <property name="backups" value="0"/>
        <property name="cacheMode" value="PARTITIONED"/>
    </bean>
</beans>


According to the Ignite documentation
(https://apacheignite.readme.io/docs/performance-tips), I think it may be a
problem with the initial cache size, but I can't set it. I'm using Ignite 2.0.
When I add <property name="startSize" value="10"/> to the configuration, it
throws this exception:

Caused by: org.springframework.beans.NotWritablePropertyException: Invalid
property 'startSize' of bean class
[org.apache.ignite.configuration.CacheConfiguration]: Bean property
'startSize' is not writable or has an invalid setter method. Does the
parameter type of the setter match the return type of the getter?
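If I understand the 2.0 changes correctly, startSize no longer exists on CacheConfiguration because the memory model was reworked, and initial sizing now goes through the new page-memory beans. Is the replacement something like this? (The bean and property names here are my guess, untested:)

```xml
<!-- My guess at the Ignite 2.0 equivalent of startSize -- not verified. -->
<property name="memoryConfiguration">
    <bean class="org.apache.ignite.configuration.MemoryConfiguration">
        <property name="memoryPolicies">
            <list>
                <bean class="org.apache.ignite.configuration.MemoryPolicyConfiguration">
                    <property name="name" value="wordcount_policy"/>
                    <!-- Initial size in bytes. -->
                    <property name="initialSize" value="#{100L * 1024 * 1024}"/>
                </bean>
            </list>
        </property>
    </bean>
</property>
```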

I also tried with the cache.put() method, but it is even slower. Are these
computation times normal?

I think the times should be similar to those of a normal Hadoop computation.
The test runs on 2 slaves that are also Ignite server nodes; the Hadoop
computation creates 2 client nodes on those same slaves, which put the data
in the cache.

Thanks
Mimmo


