On 30/10/2007, Richard Hubbell <[EMAIL PROTECTED]> wrote:
>
> --- sebb <[EMAIL PROTECTED]> wrote:
>
> > On 27/10/2007, Richard Hubbell <[EMAIL PROTECTED]> wrote:
> > > This is the jmx. So you don't have to go back over the
> > > thread on this I was having problems where jmeter
> > > would send requests to the server like http://.*/.*
> > >
> > > I'd really like to figure out how to have it not do
> > > that since it creates a lot of noise in the server
> > > logs and in the jmeter logs, etc.
> > >
> > > If this is hard to read I can re-send as an
> > > attachment.
> >
> > It *is* hard to read - and makes the mail hard for
> > others to read. Anyway it is not usable as it stands.
> >
> > However, please don't send attachments to the list
> > either.
> >
> > Either store the JMX file on a public server, or
> > create a Bugzilla issue and attach the file to that.
>
> I guess bugzilla would be best in this case, I'll do
> that. In a sense it's unfortunate that the jmx files
> can't be discussed right here since this is a user
> group and users seem to have many questions about the
> jmx since the jmx is everything.
By all means discuss JMeter test plans, but posting anything more
than a very short extract is counter-productive IMO. The only easy
way to "read" the files is to load them into JMeter, and that is not
at all easy to do from a mailgroup posting. Even extracts from
jmeter log files are difficult to read when posted in an e-mail
because of the line-wrapping that occurs.

> Jmeter is the jmx from a user perspective.
>
> > Please make sure that any such test cases are as
> > small as possible (no extraneous stuff) and all
> > necessary supporting files are present.
> > Also please attach a copy of the jmeter log file
> > from running the test case.
> >
> > In the case of your JMX, I did manage to get the
> > file to load (eventually), but it is unusable as
> > there are several missing files.
> >
> > It's not clear that the While Controller is
> > guaranteed to exit, and the Link Parser is being
> > applied to the previous sample in the loop, so it's
> > not surprising that it sometimes does not find a
> > match - are all the previous pages guaranteed to
> > contain a link or a form?
>
> It is possible, but what's the alternative for
> recursing over links on pages? I tried using the
> feature in the HTTP Request sampler to get only
> "Embedded URLs must match" regex but that was too
> limiting. Can you describe in greater detail how
> complex a regex can go into that field?

As complex as you like, but of course that may increase the
resources needed to process it.

> Would something like this work?
> href="([^"]+)"|img="([^\s]+)"|imgurl="([^\s]+)"

What are you trying to achieve?

> I think you get the idea, there is more than one type
> of embedded url I'm interested in traversing further.
>
> I also tried http://.+/.+ but that didn't work, but I
> would have thought that would have solved the issue.
> Maybe it's a missing feature/bug.

The Link Parser only produces useful output when it finds a link.
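[As a side note for the archive: the behaviour of an alternation like the one quoted above can be sanity-checked outside JMeter. This is a minimal sketch using plain java.util.regex - JMeter itself uses the Jakarta ORO Perl5 engine, so treat exact compatibility as an assumption; the class name, sample HTML, and the img=/imgurl= attributes are invented for illustration.]

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class AltRegexDemo {
    // Alternation similar to the one quoted above: each branch
    // captures a different kind of embedded-URL attribute.
    static final Pattern EMBEDDED = Pattern.compile(
        "href=\"([^\"]+)\"|img=\"([^\\s]+)\"|imgurl=\"([^\\s]+)\"");

    static List<String> extract(String html) {
        List<String> urls = new ArrayList<>();
        Matcher m = EMBEDDED.matcher(html);
        while (m.find()) {
            // Only the branch that matched has a non-null group,
            // so scan all three and keep whichever one fired.
            for (int g = 1; g <= m.groupCount(); g++) {
                if (m.group(g) != null) {
                    urls.add(m.group(g));
                }
            }
        }
        return urls;
    }

    public static void main(String[] args) {
        String html = "<a href=\"/page.html\">x</a> <x img=\"/pic.png\">";
        System.out.println(extract(html));
    }
}
```

[Note that "Embedded URLs must match" is applied to the URLs the parser has already extracted, not to the raw HTML, so branches that try to match attribute names may never see anything to match.]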
If there is no link in the previous page, it will not update the
current sampler.

> In regexdom it's a bad idea to use .*, it should be
> used sparingly.

It's only a problem where there is trailing context, as that causes
back-tracking. .* with nothing after it is OK, but .*?; would be
better as [^;]*;

Same for .+.

> > Another issue is using 1000 threads with 1 loop just
> > does not make sense.
>
> Does not make sense in general or just to you? It
> makes sense to me. I would have used 10,000 but the
> jvm is a bit hungry with memory. There may be some
> tuning still needed. Stack size, etc.
>
> Imagine that each loop does more than one thing.

But given the ramp-up time, the threads don't run in parallel. Even
with a very short ramp-up time it's likely that the earlier threads
will have finished before the later ones start.

Better to run a few threads (or one thread) multiple times. A single
thread can represent multiple users; multiple threads are normally
used to represent multiple concurrent users.
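[The back-tracking point above can be illustrated outside JMeter. A minimal sketch in plain java.util.regex - the semantics should match JMeter's ORO engine for patterns this simple, but that is an assumption, and the sample string is invented:]

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class BacktrackDemo {
    // Returns the first capture group of the first match, or null.
    static String firstField(String s, String regex) {
        Matcher m = Pattern.compile(regex).matcher(s);
        return m.find() ? m.group(1) : null;
    }

    public static void main(String[] args) {
        String line = "name=value; path=/; HttpOnly";
        // (.*?);  - reluctant quantifier: the engine grows the match one
        //           character at a time, re-testing for ';' after each step.
        System.out.println(firstField(line, "(.*?);"));
        // ([^;]*); - negated class: each character is examined exactly once,
        //            so there is no back-tracking at all.
        System.out.println(firstField(line, "([^;]*);"));
    }
}
```

[Both patterns extract the same text, "name=value"; the difference is only in how much work the engine does to get there, which matters once the regex runs against every response in a load test.]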

