
How can I capture concurrently and save the traffic as multiple files, where each file has its own distinct capture filter?

asked 2019-08-18 16:18:40 +0000 by P Rao

updated 2019-08-18 17:15:59 +0000 by grahamb

Hello,

I am capturing all traffic from an Ethernet interface. I want to capture concurrently and save the traffic as multiple files, where each file has its own distinct capture filter; for example, one pcap file per source IP address. What are the performance and memory implications? My throughput is 1 Gb/s, and in some cases it can be up to 5 Gb/s.

I am looking for a tshark command; this will be an unattended (unmanned) operation.


Comments

What are you actually trying to do here? If you want a record of all traffic to look at later and have sufficient storage space, you can take one large capture. If you are looking for a specific problem, then you use the relevant capture filter to decrease the file size.

Ross Jacobs (2019-08-18 17:19:58 +0000)
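As a rough sketch of the capture-once-then-filter approach described above (not from the thread itself; the interface name and IP addresses are placeholders):

```shell
# Capture everything once; dumpcap is Wireshark's capture engine.
dumpcap -i eth0 -w all.pcap

# Later, extract one file per source IP of interest with a read/display filter.
for ip in 10.0.0.1 10.0.0.2; do
    tshark -r all.pcap -Y "ip.src == ${ip}" -w "src_${ip}.pcap"
done
```

Note that each tshark pass re-reads the whole capture, so for a large number of addresses a dedicated splitter tool is faster.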

You would have to run a tshark process per capture filter. Not sure how feasible that would be.

Anders (2019-08-19 04:21:05 +0000)

@Anders - I agree that this is the general solution, but it sounds like he wants a pcap created dynamically for each IP address. That is not something you can specify in a capture filter. I feel that if P Rao would explain his use case, we will be able to help him better (correct me if I'm missing something).

Ross Jacobs (2019-08-19 04:29:56 +0000)

2 Answers


answered 2019-08-19 18:42:31 +0000 by ErikH

My understanding is that you want to:

  1. Capture with minimal risk of dropped packets
  2. Create one pcap file per unique IP address
  3. Do this really fast without using much CPU or memory

My recommendation would be this:

  1. Sniff with something fast, like netsniff-ng, and put all packets in one big pcap file. tcpdump or dumpcap are okay too.
  2. Use SplitCap to split the big pcap file based on IP addresses like this: SplitCap.exe -r dump.pcap -s host

You will now have a bunch of PCAP files, one for each observed unique IP address.
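The two steps above could be sketched as shell commands (the interface and file names are assumptions; on Linux, SplitCap runs under Mono):

```shell
IFACE=eth0          # capture interface (assumption)
CAPFILE=dump.pcap   # single big capture file

# 1. Capture everything to one file with a fast sniffer.
netsniff-ng --in "$IFACE" --out "$CAPFILE"
# (tcpdump -i "$IFACE" -w "$CAPFILE"  or  dumpcap -i "$IFACE" -w "$CAPFILE" work too)

# 2. Split the capture into one pcap per observed host (IP address).
SplitCap.exe -r "$CAPFILE" -s host
```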


answered 2019-08-19 09:14:29 +0000 by SYN-bit

Please note that disk I/O is the main limiting factor in successfully capturing and saving all packets. If you capture the same packets to multiple files, you increase the IOPS needed to save all packets without any discards. The performance implication is that you need more striped disks to be able to save the data multiple times.

You are better off using dumpcap directly (dumpcap is what tshark uses to do the capture-to-disk). As dumpcap currently has no way to dynamically determine which file (or files) to save a packet to, you will need to run dumpcap multiple times. This increases the memory footprint to some degree, but as dumpcap keeps no state, it is just the memory needed to run multiple instances of dumpcap.
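A minimal sketch of the multiple-dumpcap approach might look like this (the interface name and IP list are assumptions; each instance gets its own capture filter and output file):

```shell
# One dumpcap instance per capture filter, each writing its own file.
# "eth0" and the IP addresses are placeholders -- adjust to your setup.
for ip in 10.0.0.1 10.0.0.2 10.0.0.3; do
    dumpcap -i eth0 -f "host ${ip}" -w "host_${ip}.pcap" &
done
wait    # block until all capture processes exit
```

Note that the capture filter "host" matches the address as source or destination; use "src host ${ip}" to match source addresses only, as in the question.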

CPU-wise, multiple captures, each with a different capture filter, will also add load, as each packet has to be evaluated against every filter.

What is the use case for this setup, compared to capturing everything to disk once and then filtering later?
