I was trying to do a batch input of a bunch of CSVs with a universal forwarder, a really simple setup:
inputs.conf:
[batch://]
move_policy=sinkhole
index=myindex
sourcetype=mysourcetype
props.conf:
[mysourcetype]
INDEXED_EXTRACTIONS = CSV
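For completeness, this is roughly how I'd expect the full stanzas to look (the monitored path below is a hypothetical placeholder, since a batch stanza needs a real directory):

```
# inputs.conf on the universal forwarder
# NOTE: /data/incoming is a hypothetical example path
[batch:///data/incoming]
move_policy = sinkhole
index = myindex
sourcetype = mysourcetype

# props.conf on the universal forwarder
[mysourcetype]
INDEXED_EXTRACTIONS = CSV
```

With `move_policy = sinkhole`, the forwarder deletes each file after reading it, so the original CSVs are gone once ingestion finishes.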
The file gets indexed, but when I run searches against my data, some lines from the file are missing.
I've waited several hours (and even checked a day later), but they're still missing, so I don't think it's a forwarding delay. Also, newer files are already indexed.
The problematic file I used for testing contains 190K lines. I don't see any errors in the internal logs, only a warning about too many events (100K+) having the same timestamp. My universal forwarder version is 6.6.1.
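One way I quantify the gap is to compare the on-disk line count against the indexed event count (the path below is just an example, not my real one):

```shell
# Count lines in the CSV before it is sinkholed; subtract 1 for the
# header row to get the expected event count. Example path only.
wc -l /data/incoming/problem_file.csv
```

Then I compare that number against `index=myindex sourcetype=mysourcetype source="*problem_file.csv" | stats count` on the search head.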
I've moved the same files and the same configs to a heavy forwarder and tested there: it works perfectly.
I see the same warning about the timestamp, but the whole content of the file is properly indexed.
Is there a limitation in universal forwarders for this kind of input? Or where should I look to solve this?