Stuart MacDonald wrote:
From: On Behalf Of Guy Harris
On Jan 25, 2007, at 8:23 PM, Stuart MacDonald wrote:
That can't do arbitrary display filtering, but truly *arbitrary*
display filtering has problems with reassembly (i.e., a filter that
matches something in the reassembled portion of
From: On Behalf Of ARAMBULO, Norman R.
So you have captured a large amount of data, 16 GB; is it from a
large network?
No, just a saturated link between two machines, over the course of 24
hours, give or take.
What is the average, xx Mb/sec?
No idea. A rough guess puts it at about 50
Hi Stu,
So you have captured a large amount of data, 16 GB; is it from a large network?
What is the average, xx Mb/sec? I am also using tcpdump and tshark to capture
large files. Our network has an average traffic of 500 Mb/sec, so what specs
are you using to capture such large files? Thanks
From: On Behalf Of Jeff Morriss
What about:
- split the files into 1000 smaller files
- use a (decent) shell to process those files with tshark
The latter could be achieved in a Korn-style shell with
something like:
(for f in *.eth
do
tshark -r $f -w - -R
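A filled-out version of Jeff's loop might look like the sketch below. The display filter, the `.eth` extension, and the mergecap step are assumptions for illustration, not from the thread; note that redirecting every `tshark -w -` into one file would concatenate multiple pcap file headers, so writing per-file results and merging them with mergecap is safer.

```shell
#!/bin/sh
# Sketch: apply a display filter to each split capture, then merge.
# The filter expression below is a made-up example.
FILTER='tcp.port == 80'

for f in *.eth
do
    # -r: read a saved capture; -R: apply a display filter;
    # -w: write the matching packets to a new file
    tshark -r "$f" -R "$FILTER" -w "filtered_$f"
done

# mergecap (shipped with Wireshark) combines the pieces into one
# valid capture file
mergecap -w filtered_all.cap filtered_*.eth
```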
From: Stuart MacDonald [mailto:[EMAIL PROTECTED]
I don't think the documentation mentions '-' is supported for -w.
Cancel that, I just missed it last night. It was late.
..Stu
___
Wireshark-users mailing list
Wireshark-users@wireshark.org
From: On Behalf Of Guy Harris
On Jan 25, 2007, at 8:23 PM, Stuart MacDonald wrote:
I've read the man pages on the tools that come with Wireshark. I was
hoping to find a tool that opens a capture, applies a filter and
outputs matching packets to a new file. Here's a sample run of the
What about 'grep'?
I used it a lot in my DOS days. I'm sure there are
Windows versions. It's quite powerful, with many
wildcard characters and search patterns. It will do a
lot of filtering for you.
You may have to run it several times for the
different search parameters.
John
--- Guy
I wonder if ngrep would work for you:
http://ngrep.sourceforge.net/
There are binaries for most platforms including Linux and Windows.
Perhaps you could do something like this:
ngrep -I input.cap -O output.cap regex
I tried and it seems to work, although I only used a 20MB capture file.
--Jim
From: On Behalf Of Small, James
I wonder if ngrep would work for you:
http://ngrep.sourceforge.net/
Nifty! I bet it would, but the tcpdump solution earlier has worked for
me. Thanks though!
..Stu
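The "tcpdump solution earlier" that Stu mentions is not shown in these excerpts, but reading a saved file back through tcpdump with a capture filter presumably looks something like this (the file names and the filter expression are illustrative guesses, not Stu's actual command):

```shell
# -r reads the saved capture, -w writes only the matching packets;
# the BPF capture filter here is a made-up example.
tcpdump -r big.cap -w filtered.cap 'tcp port 5001'
```

BPF capture filters are less expressive than Wireshark display filters, but on a 16 GB file they are typically much faster, since no dissection is needed.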
Can it be exported as text?
--- Stuart MacDonald [EMAIL PROTECTED] wrote:
From: On Behalf Of Seymour Dupa
What about 'grep'?
The capture is in libpcap format. grep would need to
understand network packets to be at all effective. This
is not a simple line-from-a-text-file situation.
On 1/26/07, Seymour Dupa [EMAIL PROTECTED] wrote:
Can it be exported as text?
Yes, you could, but either you lose most of the information by putting
each packet on a single line, or you get the whole tree and the data
pane spanning several lines, where grep is no longer much good.
BTW to have it
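Luis's trade-off can be seen with tshark's two text output modes (the file names and the grep pattern here are placeholders):

```shell
# One line per packet: easy to grep, but most detail is lost.
tshark -r capture.cap > summary.txt
grep 'HTTP' summary.txt

# Full decoded tree plus data pane (-V): complete, but each packet
# spans many lines, so a line-oriented grep no longer isolates
# whole packets.
tshark -r capture.cap -V > verbose.txt
```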
Wish I had a job where I'd get paid to learn and use
Wireshark.
John
--- Luis Ontanon [EMAIL PROTECTED] wrote:
On 1/26/07, Seymour Dupa [EMAIL PROTECTED]
wrote:
Can it be exported as text?
Yes, you could, but either you lose most of the
information by putting each
packet on a single line, or
I have a very large capture file from tcpdump, 16 GB. Wireshark
crashes trying to open it, a known issue.
For some of my investigation I used editcap to split it into smaller
captures, and that worked okay, but there were 1000 of them and each
is still slow to load/filter/etc; the size ranges
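The editcap split Stu describes can be done with its -c option, which writes a numbered series of output files; the chunk size below is just an illustrative value, not the one Stu used.

```shell
# Split a huge capture into pieces of at most 100000 packets each.
# editcap derives the numbered output names from the second argument;
# the packet count is an arbitrary example.
editcap -c 100000 huge.cap chunk.cap
```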