Hi all,

I'm having a very silly problem that is making me pull my hair out. I wrote a program and packaged it as a jar. The program ran fine for a long time (months); I updated it regularly and everything went smoothly. Just yesterday, something very weird started happening: I update my program and rebuild the jar, but when I run the new jar, the old class is the one that actually executes. I'm sure the jar is built correctly, because when I run it directly with "java -jar", the correct, updated class runs. The problem only happens when I run it through Hadoop, so I suspect Hadoop is caching the old class somewhere.

I tried restarting my machine to make sure all temp files were cleared, but the problem persists. I even went as far as opening the jar and deleting the main class; Hadoop still ran the old class. Finally, I removed every class from the jar, leaving only the META-INF files, and it still executed the old class.
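For concreteness, this is roughly what I've been doing to verify the jar (myprogram.jar and com.example.Main below are placeholders for my actual jar and main class):

    # confirm the updated class really is inside the jar
    jar tf myprogram.jar | grep Main

    # fingerprint the class file inside the jar to compare builds
    unzip -p myprogram.jar com/example/Main.class | md5sum

    # this runs the NEW class, as expected
    java -jar myprogram.jar

    # this somehow runs the OLD class
    hadoop jar myprogram.jar com.example.Main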
I must have done something wrong in the last few days to cause this, but I really can't figure out what. The only thing I remember changing is that I set up a second Hadoop installation, running on different ports, so I could try different things on two clusters. However, I have since removed all traces of that other binary distribution, and I'm currently running the jar with no Hadoop processes running at all, so there is no actual cluster involved here.
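In case it matters, these are the checks I did to make sure the second installation left nothing behind (just the places I thought to look; the tmp path is the default hadoop.tmp.dir location, if I remember it right):

    # which hadoop binary is actually being picked up
    which hadoop
    hadoop version

    # any environment variables still pointing at the old install
    env | grep -i hadoop
    echo "$HADOOP_HOME"
    echo "$HADOOP_CLASSPATH"

    # anything lingering in Hadoop's default tmp directory
    ls /tmp/hadoop-$USER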
Please help me figure this out. Thanks!

Best regards,
Ahmed Eldawy