Thanos, Prometheus and Golang version used:
thanos, version 0.11.0
go1.13.6
prometheus, version 2.6.0
Object Storage Provider:
S3
What happened:
I started the compact service on a VM on which the store service is already running.
It ran fine for about two hours, then the compact module suddenly stopped.
Now the compact service just won't keep running anymore; the working directory is empty, and from the logs everything looks fine until http.go reports an internal server shutdown:
level=info ts=2020-03-18T13:48:12.910721661Z caller=http.go:81 service=http/server component=compact msg="internal server shutdown" err=null
What you expected to happen:
Compact should not stop running
How to reproduce it (as minimally and precisely as possible):
Run the compact and store services in the same environment (operating system, users) against the same S3 object storage provider
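A minimal sketch of that kind of setup, assuming placeholder paths and a shared bucket configuration file (none of these values are taken from this report):

  # store gateway already running on the VM
  thanos store --data-dir /var/thanos/store --objstore.config-file /etc/thanos/bucket.yml

  # compactor started later on the same VM, against the same S3 bucket
  thanos compact --data-dir /var/thanos/compact --objstore.config-file /etc/thanos/bucket.yml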
Full logs to relevant components:
Logs from the service at the point where it stopped working (started Mar 18 08:54:14, stopped Mar 18 10:25:28):
Mar 18 08:54:14 thanos-store.kayrros thanos[31070]: level=info ts=2020-03-18T08:54:14.256484336Z caller=main.go:152 msg="Tracing will be disabled"
Mar 18 08:54:14 thanos-store.kayrros thanos[31070]: level=info ts=2020-03-18T08:54:14.256603659Z caller=factory.go:46 msg="loading bucket configuration"
Mar 18 08:54:14 thanos-store.kayrros thanos[31070]: level=info ts=2020-03-18T08:54:14.25713095Z caller=compact.go:386 msg="starting compact node"
Mar 18 08:54:14 thanos-store.kayrros thanos[31070]: level=info ts=2020-03-18T08:54:14.257157028Z caller=intrumentation.go:52 msg="changing probe status" status=ready
Mar 18 08:54:14 thanos-store.kayrros thanos[31070]: level=info ts=2020-03-18T08:54:14.257425848Z caller=compact.go:858 msg="start sync of metas"
Mar 18 08:54:14 thanos-store.kayrros thanos[31070]: level=info ts=2020-03-18T08:54:14.258312265Z caller=intrumentation.go:64 msg="changing probe status" status=healthy
Mar 18 08:54:14 thanos-store.kayrros thanos[31070]: level=info ts=2020-03-18T08:54:14.258507844Z caller=http.go:56 service=http/server component=compact msg="listening for r
Mar 18 08:54:15 thanos-store.kayrros thanos[31070]: level=info ts=2020-03-18T08:54:15.447950177Z caller=fetcher.go:368 component=block.MetaFetcher msg="successfully fetched
Mar 18 08:54:15 thanos-store.kayrros thanos[31070]: level=info ts=2020-03-18T08:54:15.448009153Z caller=compact.go:864 msg="start of GC"
Mar 18 08:54:15 thanos-store.kayrros thanos[31070]: level=info ts=2020-03-18T08:54:15.448020436Z caller=compact.go:872 msg="start of compaction"
Mar 18 08:54:17 thanos-store.kayrros thanos[31070]: level=info ts=2020-03-18T08:54:17.528335715Z caller=compact.go:441 msg="compact blocks" count=3 mint=1583834400000 maxt=1
Mar 18 08:54:19 thanos-store.kayrros thanos[31070]: level=info ts=2020-03-18T08:54:19.470655716Z caller=compact.go:763 compactionGroup=0@14539196864129613008 msg="deleting c
Mar 18 08:54:20 thanos-store.kayrros thanos[31070]: level=info ts=2020-03-18T08:54:20.416941362Z caller=compact.go:763 compactionGroup=0@14539196864129613008 msg="deleting c
Mar 18 08:54:21 thanos-store.kayrros thanos[31070]: level=info ts=2020-03-18T08:54:21.246485865Z caller=compact.go:763 compactionGroup=0@14539196864129613008 msg="deleting c
Mar 18 08:54:41 thanos-store.kayrros thanos[31070]: level=info ts=2020-03-18T08:54:41.005218399Z caller=compact.go:441 msg="compact blocks" count=3 mint=1583839044595 maxt=1
Mar 18 08:54:47 thanos-store.kayrros thanos[31070]: level=info ts=2020-03-18T08:54:47.221800803Z caller=compact.go:763 compactionGroup=0@15016429373970399229 msg="deleting c
Mar 18 08:54:48 thanos-store.kayrros thanos[31070]: level=info ts=2020-03-18T08:54:48.131118347Z caller=compact.go:763 compactionGroup=0@15016429373970399229 msg="deleting c
Mar 18 08:54:49 thanos-store.kayrros thanos[31070]: level=info ts=2020-03-18T08:54:49.278267016Z caller=compact.go:763 compactionGroup=0@15016429373970399229 msg="deleting c
Mar 18 08:55:13 thanos-store.kayrros thanos[31070]: level=info ts=2020-03-18T08:55:13.408279073Z caller=compact.go:441 msg="compact blocks" count=4 mint=1583831785464 maxt=1
Mar 18 08:55:19 thanos-store.kayrros thanos[31070]: level=info ts=2020-03-18T08:55:19.397433332Z caller=compact.go:763 compactionGroup=0@3010882595722085081 msg="deleting co
Mar 18 08:55:20 thanos-store.kayrros thanos[31070]: level=info ts=2020-03-18T08:55:20.695568834Z caller=compact.go:763 compactionGroup=0@3010882595722085081 msg="deleting co
Mar 18 08:55:21 thanos-store.kayrros thanos[31070]: level=info ts=2020-03-18T08:55:21.632591564Z caller=compact.go:763 compactionGroup=0@3010882595722085081 msg="deleting co
Mar 18 08:55:22 thanos-store.kayrros thanos[31070]: level=info ts=2020-03-18T08:55:22.741652735Z caller=compact.go:763 compactionGroup=0@3010882595722085081 msg="deleting co
Mar 18 08:55:23 thanos-store.kayrros thanos[31070]: level=info ts=2020-03-18T08:55:23.755750835Z caller=compact.go:858 msg="start sync of metas"
Mar 18 08:55:24 thanos-store.kayrros thanos[31070]: level=info ts=2020-03-18T08:55:24.314209977Z caller=fetcher.go:368 component=block.MetaFetcher msg="successfully fetched
Mar 18 08:55:24 thanos-store.kayrros thanos[31070]: level=info ts=2020-03-18T08:55:24.314259974Z caller=compact.go:864 msg="start of GC"
Mar 18 08:55:24 thanos-store.kayrros thanos[31070]: level=info ts=2020-03-18T08:55:24.31427172Z caller=compact.go:872 msg="start of compaction"
Mar 18 08:55:25 thanos-store.kayrros thanos[31070]: level=info ts=2020-03-18T08:55:25.631922873Z caller=compact.go:441 msg="compact blocks" count=2 mint=1583856000000 maxt=1
Mar 18 08:55:27 thanos-store.kayrros thanos[31070]: level=info ts=2020-03-18T08:55:27.347431223Z caller=compact.go:763 compactionGroup=0@14539196864129613008 msg="deleting c
Mar 18 08:55:28 thanos-store.kayrros thanos[31070]: level=info ts=2020-03-18T08:55:28.362868279Z caller=compact.go:763 compactionGroup=0@14539196864129613008 msg="deleting c
Mar 18 08:56:11 thanos-store.kayrros thanos[31070]: level=info ts=2020-03-18T08:56:11.686405402Z caller=compact.go:441 msg="compact blocks" count=4 mint=1583856000000 maxt=1
Mar 18 08:56:22 thanos-store.kayrros thanos[31070]: level=info ts=2020-03-18T08:56:22.169351657Z caller=compact.go:763 compactionGroup=0@15016429373970399229 msg="deleting c
Mar 18 08:56:23 thanos-store.kayrros thanos[31070]: level=info ts=2020-03-18T08:56:23.272697635Z caller=compact.go:763 compactionGroup=0@15016429373970399229 msg="deleting c
Mar 18 08:56:24 thanos-store.kayrros thanos[31070]: level=info ts=2020-03-18T08:56:24.589822719Z caller=compact.go:763 compactionGroup=0@15016429373970399229 msg="deleting c
Mar 18 08:56:25 thanos-store.kayrros thanos[31070]: level=info ts=2020-03-18T08:56:25.814070766Z caller=compact.go:763 compactionGroup=0@15016429373970399229 msg="deleting c
Mar 18 08:57:07 thanos-store.kayrros thanos[31070]: level=info ts=2020-03-18T08:57:07.892248327Z caller=compact.go:441 msg="compact blocks" count=4 mint=1583856000000 maxt=1
....
....
....
Mar 18 10:20:52 thanos-store.kayrros thanos[31070]: level=warn ts=2020-03-18T10:20:52.08481167Z caller=streamed_block_writer.go:115 msg="empty chunks happened, skip series"
Mar 18 10:20:52 thanos-store.kayrros thanos[31070]: level=warn ts=2020-03-18T10:20:52.085684529Z caller=streamed_block_writer.go:115 msg="empty chunks happened, skip series"
Mar 18 10:20:52 thanos-store.kayrros thanos[31070]: level=warn ts=2020-03-18T10:20:52.09119412Z caller=streamed_block_writer.go:115 msg="empty chunks happened, skip series"
Mar 18 10:20:52 thanos-store.kayrros thanos[31070]: level=warn ts=2020-03-18T10:20:52.099343175Z caller=streamed_block_writer.go:115 msg="empty chunks happened, skip series"
Mar 18 10:20:52 thanos-store.kayrros thanos[31070]: level=warn ts=2020-03-18T10:20:52.10028563Z caller=streamed_block_writer.go:115 msg="empty chunks happened, skip series"
Mar 18 10:20:52 thanos-store.kayrros thanos[31070]: level=warn ts=2020-03-18T10:20:52.106005072Z caller=streamed_block_writer.go:115 msg="empty chunks happened, skip series"
Mar 18 10:20:52 thanos-store.kayrros thanos[31070]: level=warn ts=2020-03-18T10:20:52.114047956Z caller=streamed_block_writer.go:115 msg="empty chunks happened, skip series"
Mar 18 10:20:52 thanos-store.kayrros thanos[31070]: level=warn ts=2020-03-18T10:20:52.115060124Z caller=streamed_block_writer.go:115 msg="empty chunks happened, skip series"
Mar 18 10:20:52 thanos-store.kayrros thanos[31070]: level=warn ts=2020-03-18T10:20:52.120673275Z caller=streamed_block_writer.go:115 msg="empty chunks happened, skip series"
Mar 18 10:20:52 thanos-store.kayrros thanos[31070]: level=warn ts=2020-03-18T10:20:52.128648416Z caller=streamed_block_writer.go:115 msg="empty chunks happened, skip series"
Mar 18 10:20:52 thanos-store.kayrros thanos[31070]: level=warn ts=2020-03-18T10:20:52.129532343Z caller=streamed_block_writer.go:115 msg="empty chunks happened, skip series"
Mar 18 10:20:52 thanos-store.kayrros thanos[31070]: level=warn ts=2020-03-18T10:20:52.135284875Z caller=streamed_block_writer.go:115 msg="empty chunks happened, skip series"
Mar 18 10:20:52 thanos-store.kayrros thanos[31070]: level=warn ts=2020-03-18T10:20:52.143215741Z caller=streamed_block_writer.go:115 msg="empty chunks happened, skip series"
Mar 18 10:20:52 thanos-store.kayrros thanos[31070]: level=warn ts=2020-03-18T10:20:52.144413163Z caller=streamed_block_writer.go:115 msg="empty chunks happened, skip series"
Mar 18 10:20:52 thanos-store.kayrros thanos[31070]: level=warn ts=2020-03-18T10:20:52.150230934Z caller=streamed_block_writer.go:115 msg="empty chunks happened, skip series"
Mar 18 10:20:52 thanos-store.kayrros thanos[31070]: level=warn ts=2020-03-18T10:20:52.162036188Z caller=streamed_block_writer.go:115 msg="empty chunks happened, skip series"
Mar 18 10:20:52 thanos-store.kayrros thanos[31070]: level=warn ts=2020-03-18T10:20:52.162904811Z caller=streamed_block_writer.go:115 msg="empty chunks happened, skip series"
Mar 18 10:20:52 thanos-store.kayrros thanos[31070]: level=warn ts=2020-03-18T10:20:52.169096613Z caller=streamed_block_writer.go:115 msg="empty chunks happened, skip series"
Mar 18 10:24:59 thanos-store.kayrros thanos[31070]: level=info ts=2020-03-18T10:24:59.781066236Z caller=streamed_block_writer.go:185 msg="finalized downsampled block" mint=1
Mar 18 10:24:59 thanos-store.kayrros thanos[31070]: level=info ts=2020-03-18T10:24:59.781147938Z caller=downsample.go:295 msg="downsampled block" from=01E3PGX1R5GA71BBDMM0MV
Mar 18 10:25:26 thanos-store.kayrros thanos[31070]: level=info ts=2020-03-18T10:25:26.257086509Z caller=downsample.go:309 msg="uploaded block" id=01E3PJAMYS2PS7QPTW4Q1EA90V
Mar 18 10:25:26 thanos-store.kayrros thanos[31070]: level=info ts=2020-03-18T10:25:26.892881082Z caller=compact.go:319 msg="start second pass of downsampling"
Mar 18 10:25:27 thanos-store.kayrros thanos[31070]: level=info ts=2020-03-18T10:25:27.235465806Z caller=fetcher.go:368 component=block.MetaFetcher msg="successfully fetched
Mar 18 10:25:27 thanos-store.kayrros thanos[31070]: level=info ts=2020-03-18T10:25:27.235889306Z caller=compact.go:324 msg="downsampling iterations done"
Mar 18 10:25:27 thanos-store.kayrros thanos[31070]: level=info ts=2020-03-18T10:25:27.236032207Z caller=retention.go:20 msg="start optional retention"
Mar 18 10:25:27 thanos-store.kayrros thanos[31070]: level=info ts=2020-03-18T10:25:27.509474806Z caller=fetcher.go:368 component=block.MetaFetcher msg="successfully fetched
Mar 18 10:25:27 thanos-store.kayrros thanos[31070]: level=info ts=2020-03-18T10:25:27.51000045Z caller=retention.go:41 msg="optional retention apply done"
Mar 18 10:25:27 thanos-store.kayrros thanos[31070]: level=info ts=2020-03-18T10:25:27.510108489Z caller=clean.go:25 msg="started cleaning of aborted partial uploads"
Mar 18 10:25:27 thanos-store.kayrros thanos[31070]: level=info ts=2020-03-18T10:25:27.897153216Z caller=fetcher.go:368 component=block.MetaFetcher msg="successfully fetched
Mar 18 10:25:27 thanos-store.kayrros thanos[31070]: level=info ts=2020-03-18T10:25:27.909853174Z caller=clean.go:50 msg="cleaning of aborted partial uploads done"
Mar 18 10:25:28 thanos-store.kayrros thanos[31070]: level=warn ts=2020-03-18T10:25:27.957940476Z caller=intrumentation.go:58 msg="changing probe status" status=not-ready rea
Mar 18 10:25:28 thanos-store.kayrros thanos[31070]: level=info ts=2020-03-18T10:25:28.023761834Z caller=http.go:81 service=http/server component=compact msg="internal server
Mar 18 10:25:28 thanos-store.kayrros thanos[31070]: level=info ts=2020-03-18T10:25:28.024002433Z caller=intrumentation.go:70 msg="changing probe status" status=not-healthy r
Mar 18 10:25:28 thanos-store.kayrros thanos[31070]: level=info ts=2020-03-18T10:25:28.061635194Z caller=main.go:213 msg=exiting
Logs from starting the service manually with debug log.level:
Anything else we need to know:
I am not sure whether this could be related, and I would like to understand the implications of running both services (compact and store) in the same environment.
Gnoale changed the title from "compact: shutdown unexpectidly" to "compact: shutdown unexpectedly" on Mar 18, 2020.
It has done what you asked it to do: it ran one iteration and exited :P You can make it run continuously by passing --wait. Let me know if that helps.
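For example, something along these lines keeps the compactor running and periodically re-checking the bucket instead of exiting after a single iteration (data dir and bucket config path are placeholders, not taken from this report):

  # --wait keeps the compactor alive between compaction iterations
  thanos compact --wait --data-dir /var/thanos/compact --objstore.config-file /etc/thanos/bucket.yml

Without --wait, the exit after the downsampling, retention, and cleanup steps seen in the logs above is the expected one-shot behavior.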