On 24 December 2012 13:20, Philippe Mouawad <[email protected]> wrote:
> Hello,
>
> I am kind of annoyed by reading articles and blogs saying that JMeter
> cannot perform high load tests, consumes a lot of memory, generates
> OutOfMemory errors, and so on.
>
> This has become a kind of "Urban Legend", partly due to:
> - issues that have been fixed for a while now
> - and partly, in my opinion, to some default configuration parameters
> that lead to these issues
>
> In my opinion, we should:
>
> 1) Change these defaults so that newcomers and beginners do not fall
> into these traps, and so that others can check they are using JMeter
> correctly:
>
>    - Save Service using XML output => change to CSV

OK

>    - Distributed mode using the Standard sample sender, which is far
>    from being the best performing one => change to Batch or StrippedBatch

OK

> 2) Add warnings on the GUIs of all elements that are better suited to
> scripting than to load testing:
>
>    - View Results Tree (I keep seeing people use this element during
>    high load tests!)
>    - View Results in Table
>    - Graph Results
>    - ...

Adding this information to the individual manual pages should be sufficient.

> 3) Add a popup warning when Start and Remote Start are clicked from the
> GUI, to encourage use of non-GUI mode (we could add a "Remind me later"
> checkbox which could be unchecked to avoid showing it again, but at
> least the user would know about it).

That could quickly become very annoying.

> 4) Finally, use some kind of visual indicator (red background) on
> options that have a high impact on performance:
>
>    - JavaScript as a scripting language
>    - Body (unescaped) in the Regular Expression Extractor (*this one is
>    a real performance killer!*)
>    - Encourage JSR223 samplers + Groovy + caching instead of Beanshell
>    - ...

I don't think that is a good idea, as the relative performances may well
change.

However, we can extend the "Lean and mean" documentation, so long as we
clarify to which versions the caveats apply.

> Maybe we should post this mail on the user mailing list to see what
> users think about it.
>
> --
> Regards
> Philippe M.
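
For reference, point 1) above mostly comes down to two jmeter.properties
entries; the values below are a sketch of the proposed defaults, not what
currently ships:

    # Write results as CSV rather than XML (smaller files, cheaper to produce)
    jmeter.save.saveservice.output_format=csv

    # Sample sender used in remote (distributed) testing; StrippedBatch
    # batches results and strips response data before returning them
    mode=StrippedBatch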

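And for point 3), the non-GUI run being encouraged is just the standard
command-line invocation, something like (file names here are placeholders):

    jmeter -n -t plan.jmx -l results.csv

with -r, or -R host1,host2, added when the test is driven from remote
engines.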