
JSON_v2 parser error when parsing json array #10646

Open
Yahya222 opened this issue Feb 14, 2022 · 6 comments
Labels
area/json (json and json_v2 parser/serialiser related), bug (unexpected problem or unintended behavior)

Comments

Yahya222 commented Feb 14, 2022

Relevant telegraf.conf

[agent]
  interval = "100s"
  debug = true

[[inputs.file]]
    files = ["20211209.json"]
    data_format = "json_v2"

    [[inputs.file.json_v2]]
        measurement_name = "metric"

        [[inputs.file.json_v2.object]]
            path = "{time:result.0.data.0.timestamps,metric1:result.0.data.0.values,metric2:result.0.data.1.values}"
            tags = ['time','metric1','metric2']
            timestamp_format = "unix_ms"

[[outputs.file]]
  ## Files to write to, "stdout" is a specially handled file.
  files = ["stdout", "metrics4.out"]

Logs from Telegraf

2022-02-14T12:58:41Z I! Starting Telegraf 1.21.3
2022-02-14T12:58:41Z I! Loaded inputs: file
2022-02-14T12:58:41Z I! Loaded aggregators:
2022-02-14T12:58:41Z I! Loaded processors:
2022-02-14T12:58:41Z I! Loaded outputs: file
2022-02-14T12:58:41Z I! Tags enabled: host=BE1CT887
2022-02-14T12:58:41Z I! [agent] Config: Interval:1m40s, Quiet:false, Hostname:"BE1CT887", Flush Interval:10s
2022-02-14T12:58:41Z D! [agent] Initializing plugins
2022-02-14T12:58:41Z D! [agent] Connecting outputs
2022-02-14T12:58:41Z D! [agent] Attempting connection to [outputs.file]
2022-02-14T12:58:41Z D! [agent] Successfully connected to outputs.file
2022-02-14T12:58:41Z D! [agent] Starting service inputs
2022-02-14T12:58:51Z D! [outputs.file] Buffer fullness: 0 / 10000 metrics
2022-02-14T12:59:01Z D! [outputs.file] Buffer fullness: 0 / 10000 metrics
2022-02-14T12:59:11Z D! [outputs.file] Buffer fullness: 0 / 10000 metrics
2022-02-14T12:59:21Z D! [outputs.file] Buffer fullness: 0 / 10000 metrics
2022-02-14T12:59:31Z D! [outputs.file] Buffer fullness: 0 / 10000 metrics
2022-02-14T12:59:41Z D! [outputs.file] Buffer fullness: 0 / 10000 metrics
2022-02-14T12:59:51Z D! [outputs.file] Buffer fullness: 0 / 10000 metrics
2022-02-14T13:00:01Z D! [outputs.file] Buffer fullness: 0 / 10000 metrics
2022-02-14T13:00:11Z D! [outputs.file] Buffer fullness: 0 / 10000 metrics
2022-02-14T13:00:21Z D! [outputs.file] Buffer fullness: 0 / 10000 metrics
2022-02-14T13:00:31Z D! [outputs.file] Buffer fullness: 0 / 10000 metrics
2022-02-14T13:00:41Z D! [outputs.file] Buffer fullness: 0 / 10000 metrics
2022-02-14T13:00:51Z D! [outputs.file] Buffer fullness: 0 / 10000 metrics
2022-02-14T13:01:01Z D! [outputs.file] Buffer fullness: 0 / 10000 metrics
2022-02-14T13:01:11Z D! [outputs.file] Buffer fullness: 0 / 10000 metrics
2022-02-14T13:01:21Z D! [outputs.file] Buffer fullness: 0 / 10000 metrics
2022-02-14T13:01:31Z D! [outputs.file] Buffer fullness: 0 / 10000 metrics
2022-02-14T13:01:41Z D! [outputs.file] Buffer fullness: 0 / 10000 metrics
2022-02-14T13:01:51Z D! [outputs.file] Buffer fullness: 0 / 10000 metrics
2022-02-14T13:02:01Z D! [outputs.file] Buffer fullness: 0 / 10000 metrics
2022-02-14T13:02:11Z D! [outputs.file] Buffer fullness: 0 / 10000 metrics
2022-02-14T13:02:21Z D! [outputs.file] Buffer fullness: 0 / 10000 metrics
2022-02-14T13:02:31Z D! [outputs.file] Buffer fullness: 0 / 10000 metrics
2022-02-14T13:02:41Z D! [outputs.file] Buffer fullness: 0 / 10000 metrics
2022-02-14T13:02:51Z D! [outputs.file] Buffer fullness: 0 / 10000 metrics
2022-02-14T13:03:01Z D! [outputs.file] Buffer fullness: 0 / 10000 metrics
2022-02-14T13:03:11Z D! [outputs.file] Buffer fullness: 0 / 10000 metrics
2022-02-14T13:03:21Z D! [outputs.file] Buffer fullness: 0 / 10000 metrics

System info

telegraf-1.21.3

Docker

No response

Steps to reproduce

When I run the query I get this JSON:
{"time":[
1638486300000,
1638486600000,
1638486900000,
1638487200000,
1638487500000,
1638487800000,
1638488100000,
1638488400000,
1638488700000,
1638489000000,
1638489300000,
1638489600000,
1638489900000,
1638490200000,
1638490500000,
1638490800000,
1638491100000,
1638491400000,
1638491700000,
1638492000000,
1638492300000,
1638492600000,
....
],"metric1":[
2140,
1718,
1532,
2003,
2486,
1845,
1807,
1405,
1870,
1040,
1404,
886,
2599,
1092,
1349,
1088,
1244,
867,
1332,
1328,
1677,
1776,
1782,
1216,
1957,
2432,
...
],"metric2":[
1306,
1340,
898,
1227,
1707,
1553,
1319,
1060,
1122,
778,
749,
701,
2394,
810,
901,
930,
904,
794,
983,
1235,
942,
...
]}
Running with the default timestamp works, but as soon as I try to set the timestamp I get this error.

Expected behavior

metric metric1=.. metric2=...
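
For illustration only (these values are not part of the original report): taking the first element of each sample array above and converting the unix_ms timestamp 1638486300000 to nanoseconds, the first metric in InfluxDB line protocol would look roughly like

metric metric1=2140,metric2=1306 1638486300000000000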

Actual behavior

I get nothing, just:

2022-02-14T13:00:51Z D! [outputs.file] Buffer fullness: 0 / 10000 metrics
2022-02-14T13:01:01Z D! [outputs.file] Buffer fullness: 0 / 10000 metrics
2022-02-14T13:01:11Z D! [outputs.file] Buffer fullness: 0 / 10000 metrics
2022-02-14T13:01:21Z D! [outputs.file] Buffer fullness: 0 / 10000 metrics
2022-02-14T13:01:31Z D! [outputs.file] Buffer fullness: 0 / 10000 metrics

Additional info

I tried different variants, including using the timestamp key. I also tried the data converter and Starlark.

Yahya222 added the bug (unexpected problem or unintended behavior) label on Feb 14, 2022
sspaink (Contributor) commented Feb 14, 2022

@Yahya222 you might be facing the same issue as #10606; selecting the timestamp does seem broken in v1.21.3. There is already a PR that I think resolves it: #10618

Yahya222 (Author) commented Feb 14, 2022

  [[inputs.file]]
      files = ["20211209.json"]
      data_format = "json_v2"
  
      [[inputs.file.json_v2]]
          measurement_name = "metric"
  
          [[inputs.file.json_v2.object]]
  
              path = "{stamp:result.0.data.0.timestamps,metric1:result.0.data.0.values,metric2:result.0.data.1.values}"
              #timestamp_key='stamp'
              #timestamp_format="unix_ms"
              [[inputs.file.json_v2.object.field]]
                path="metric1"
                type= "int"
              [[inputs.file.json_v2.object.field]]
                path="metric2"
                type= "int"
              [[inputs.file.json_v2.object.field]]
                path="stamp"
                type= "int"
              disable_prepend_keys=true
   
  [[outputs.file]]
    files = ["stdout", "metrics4.out"]

I retried with 1.22 using this input. As soon as I uncomment timestamp_key it breaks, and I don't even get an error message. The error is still present.

Yahya222 (Author) commented

I figured out a workaround using the Starlark processor. In my opinion the issue is in handling unix_ms with an integer; I suspect it is an error while converting int to string. So I first converted the value to int in the json_v2 part, and then used the Starlark processor to convert it and assign it to metric.time.

kgrimsby (Contributor) commented

Hi @Yahya222,

Can you provide code for what you did? I have a similar problem.

Hipska added the area/json (json and json_v2 parser/serialiser related) and area/json_v2 labels and removed the area/starlark and area/json_v2 labels on Feb 15, 2022
Yahya222 (Author) commented Feb 15, 2022

    [[inputs.http.json_v2]]
        measurement_name = "metric3"

        [[inputs.http.json_v2.object]]

            path = "{stamp:result.0.data.0.timestamps,metric1:result.0.data.0.values,metric2:result.0.data.1.values}"

            [[inputs.http.json_v2.object.field]]
              path="metric1"
              type= "int"
            [[inputs.http.json_v2.object.field]]
              path="metric2"
              type= "int"
            [[inputs.http.json_v2.object.field]]
              path="stamp"
              type= "int"
[[processors.starlark]]
 source = '''
load("logging.star", "log")  
def apply(metric):
  log.info(metric)
  metric.time=metric.fields['stamp']*1000000

  return metric
'''
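
For reference, metric.time in the Starlark processor is a Unix timestamp in nanoseconds, so the * 1000000 above converts the unix_ms value in the stamp field to nanoseconds. A minimal sketch of the same idea that also drops the helper field once it has been promoted to the timestamp, assuming metric.fields supports the usual dict-style membership test and pop (the field name stamp comes from the json_v2 config above):

def apply(metric):
    # 'stamp' holds milliseconds since the epoch (unix_ms);
    # metric.time expects nanoseconds, hence * 1000000
    if "stamp" in metric.fields:
        metric.time = metric.fields["stamp"] * 1000000
        metric.fields.pop("stamp")  # don't also write the timestamp as a regular field
    return metric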

Yahya222 (Author) commented

@sspaink @kgrimsby, on a completely unrelated question: do you have any idea how to change the _start and _stop times when writing a metric into InfluxDB?
