Hello, I need some advice on configuring the jmeter shell script to run better on Linux (Debian/Ubuntu in my case).
Uncommenting these parameters in the jmeter shell script has adverse effects:

# SURVIVOR="-XX:SurvivorRatio=8 -XX:TargetSurvivorRatio=20%"
# EVACUATION="-XX:MaxLiveObjectEvacuationRatio=20%"

The JVM doesn't want to start with them on Debian, although I ran jmeter.bat with them (at the default values or with improved targets) on Windows XP, Vista and 7 stations. I didn't find an answer in previous threads, though this was discussed some years ago.

JMeter still works without them, BUT there is a noticeable performance penalty: with the changes I make in jmeter.bat I can start more than 4000 threads on a Win XP desktop with a simple script; on the same hardware running Ubuntu, or on other more powerful machines running Debian, I can't start more than 2000 threads with the same script (GUI or non-GUI). Considering that XP has a max-threads limit of about 10,000 and that the server has 16 GB of RAM, this upsets me; it should work better on Linux.

The test script doesn't depend on any external application; it just uses a lot of CPU and memory. I used these options to tune how JMeter runs, but I have encountered problems with the JVM in tests that ran for longer periods of time.

Any suggestions?

----
Just in case, here is what is in the jmeter shell script (the rest is unchanged):

HEAP="-Xms512m -Xmx1512m"
NEW="-XX:NewSize=64m -XX:MaxNewSize=128m"
#SURVIVOR="-XX:SurvivorRatio=8 -XX:TargetSurvivorRatio=20%"
TENURING="-XX:MaxTenuringThreshold=2"
#EVACUATION="-XX:MaxLiveObjectEvacuationRatio=20%"
RMIGC="-Dsun.rmi.dgc.client.gcInterval=600000 -Dsun.rmi.dgc.server.gcInterval=600000"
PERM="-XX:PermSize=64m -XX:MaxPermSize=64m"
DEBUG="-verbose:gc -XX:+PrintTenuringDistribution"
DUMP="-XX:+HeapDumpOnOutOfMemoryError"

Thanks & regards,
Adrian S
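A possible cause worth trying (an assumption, not confirmed by the thread): HotSpot parses -XX:TargetSurvivorRatio as a plain integer, so the trailing "%" may be what the Linux JVM rejects, and -XX:MaxLiveObjectEvacuationRatio was dropped from later HotSpot releases and may be unrecognized entirely. A sketch of the SURVIVOR line without the percent sign:

```shell
# Sketch: same survivor tuning, but with TargetSurvivorRatio as a bare integer
# (assumption: the "%" suffix is what makes the JVM refuse to start on Debian)
SURVIVOR="-XX:SurvivorRatio=8 -XX:TargetSurvivorRatio=20"
# EVACUATION stays commented out: MaxLiveObjectEvacuationRatio is not accepted
# by newer HotSpot JVMs and would abort startup with "Unrecognized VM option"
#EVACUATION="-XX:MaxLiveObjectEvacuationRatio=20%"
```

Running `java -XX:SurvivorRatio=8 -XX:TargetSurvivorRatio=20 -version` first would confirm whether the local JVM accepts the flags before editing the jmeter script.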
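On the thread-count gap: the ~2000-thread ceiling on Linux often comes from OS limits or per-thread stack reservations rather than from JMeter itself. A sketch of things to check (assuming bash on Debian/Ubuntu; the /proc paths are standard Linux):

```shell
#!/bin/bash
# Sketch: Linux limits that commonly cap Java thread counts
ulimit -u                                     # max user processes; each Java thread counts against this
cat /proc/sys/kernel/threads-max 2>/dev/null || true   # system-wide thread cap
# Each thread also reserves stack memory, so a smaller stack lets more threads fit,
# e.g. adding -Xss256k to the JVM options in the jmeter script (value is illustrative).
```

Raising `ulimit -u` (or the matching limits in /etc/security/limits.conf) before starting JMeter may be enough to get past 2000 threads.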

