json log duplicates #14
Can you share your log output, please?
All of the below is using v0.11.0.
Using the logfile logger: to run this test, I did two 'dig @mydns-server peanut.com' queries, and I logged 16 queries and 14 responses (filename: logger-json.txt).
Using the syslog logger: I ran the dig twice, about 2-3 seconds apart, then grepped for 'peanut.com' in my logfile. Although I only have 8 lines or so, they are spectacularly long and contain RR data of other domains (filename: syslog-json.txt).
I'm running the collector on a typical x86_64 Ubuntu 20.04 machine, and dnsdist is on a rPi running Raspbian.
I'm seeing better results using the dnstap_receiver, but for some reason it doesn't log the returned RR data; it logs only the client, the query, and the response. I'm stuck!
I was perplexed, so as a test I installed a Fedora Server 35 machine to run the collector, and I am seeing the same behaviour: a single DNS query generates duplicate CQ and CR entries for the same query. I ran Wireshark while doing a dig on ibm.com, and in the packet capture I see the CQ and CR packets with one instance of ibm.com in them, yet with go-dnscollector my JSON file output includes 3 CLIENT_QUERY and 4 CLIENT_RESPONSE entries for that exchange.
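The mismatch above is easy to quantify from the JSON-lines output. A minimal sketch, assuming each log line is a JSON object carrying the message type in an "operation" field (that field name is an assumption; adjust it to the schema of your collector version):

```python
import json
from collections import Counter

def count_message_types(log_lines, type_field="operation"):
    """Tally dnstap message types (CLIENT_QUERY, CLIENT_RESPONSE, ...)
    from JSON-lines log output. The field name is an assumption, not
    the collector's documented schema."""
    counts = Counter()
    for line in log_lines:
        line = line.strip()
        if not line:
            continue
        try:
            entry = json.loads(line)
        except json.JSONDecodeError:
            continue  # skip malformed or truncated lines
        counts[entry.get(type_field, "UNKNOWN")] += 1
    return counts

# Hypothetical sample: a healthy exchange is one query and one response;
# the duplicate bug shows up as CLIENT_RESPONSE > CLIENT_QUERY.
sample = [
    '{"operation": "CLIENT_QUERY", "qname": "ibm.com"}',
    '{"operation": "CLIENT_RESPONSE", "qname": "ibm.com"}',
    '{"operation": "CLIENT_RESPONSE", "qname": "ibm.com"}',
]
print(count_message_types(sample))
```

Running this against a full logfile gives a per-type count, so the "3 queries, 4 responses" symptom can be confirmed without manual grepping.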
Do you see the same behavior with the text format?
Bug confirmed on my side. I pushed a fix.
My pleasure. How can I test 0.12.0? :)
v0.12.0 is available; waiting for feedback before closing this issue.
Perhaps not related specifically to this fix, but I am getting a crash after a few minutes of running. This is on Fedora; I'm going to try on vanilla Ubuntu 20.04.
Can you open a new issue regarding this crash, please? I will investigate it.
No more duplicate logs with the beta2?
Hello @dmachard, I checked the logs and I have a difference of only ±2 between queries and responses in each logfile, and no duplicates. Seems quite good; thank you kindly for this fix.
Using the v0.10.0 or v0.11.0 go-collector binary with this config:
trace:
  verbose: false
collectors:
  dnstap:
    enable: true
    listen-ip: 0.0.0.0
    listen-port: 6000
    tls-support: false
loggers:
  logfile:
    enable: true
    file-path: "/var/log/dnstap.log"
    max-size: 100
    max-files: 20
    mode: json
I am getting massive amounts of duplicates. I also tried:
syslog:
  enable: true
  severity: INFO
  facility: DAEMON
  transport: local
  # Remote address host:port
  # remote-address: ""
  # output text format; please refer to the default text format to see all available directives
  # use this parameter if you want a specific format
  # text-format: ""
  # output format: text|json
  mode: json
and curiously I am getting duplicates as well; however, they are in a strange staircase format where each duplicate entry gets appended to the end of the previous one, 2-5 times over.
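A quick way to inspect that staircase output is to split each log line back into its individual JSON objects. A minimal sketch using only the standard library, under the assumption that the glued-together entries are each syntactically valid JSON:

```python
import json

def split_concatenated_json(blob):
    """Split a string containing several JSON objects written
    back-to-back (the 'staircase' duplicates described above)
    into a list of parsed objects."""
    decoder = json.JSONDecoder()
    objects, idx = [], 0
    blob = blob.strip()
    while idx < len(blob):
        obj, end = decoder.raw_decode(blob, idx)
        objects.append(obj)
        # skip any whitespace separating the glued objects
        while end < len(blob) and blob[end].isspace():
            end += 1
        idx = end
    return objects

# Hypothetical staircase line: two entries glued together
line = '{"qname": "peanut.com"}{"qname": "peanut.com"}'
print(len(split_concatenated_json(line)))  # → 2
```

Counting how many objects come out of each syslog line shows directly how many times the entry was duplicated.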
Any tips on how to troubleshoot this?
My source is dnsdist v1.5.1, using this as config:
-- from https://dmachard.github.io/posts/0001-dnstap-testing/#tcp-stream
fsul = newFrameStreamTcpLogger("xxx.xxx.xxx.xxx:6000")
addAction(AllRule(), DnstapLogAction("dnsdist", fsul))
addResponseAction(AllRule(), DnstapLogResponseAction("dnsdist", fsul))
-- Cache Hits
addCacheHitResponseAction(AllRule(), DnstapLogResponseAction("dnsdist", fsul))
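Unrelated to the duplicates themselves, but when debugging a setup like the one above it can help to first confirm that the dnsdist box can reach the collector's listen-port at all. A minimal sketch (the address is a placeholder; this only checks TCP reachability, not the Frame Streams handshake itself):

```python
import socket

def can_connect(host, port, timeout=3.0):
    """Return True if a TCP connection to the dnstap collector's
    listen-port succeeds. This verifies reachability only; it does
    not speak the Frame Streams protocol."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Placeholder address; replace with the collector host and listen-port.
print(can_connect("127.0.0.1", 6000))
```

If this returns False from the dnsdist machine, the problem is network-level (firewall, wrong listen-ip) rather than anything in the logger configuration.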