Package jx implements encoding and decoding of json [RFC 7159]. Lightweight fork of jsoniter.
go get github.com/ogen-go/jx
- Directly encode and decode json values
- No reflect or interface{}
- Pools and direct buffer access for fewer (or zero) allocations
- Multi-pass decoding
- Validation
See usage for examples. Suitable for fast, low-level json manipulation with a high degree of control. Used in the ogen project to generate json (un)marshaling code based on json and OpenAPI schemas.
Most of jsoniter's issues are caused by the necessity of being a drop-in replacement for the standard encoding/json. Removing that constraint greatly simplified the implementation and reduced scope, allowing focus on json stream processing.
- Reduced scope
- No reflection
- No encoding/json adapter
- 4x less code (8.5K to 2K SLOC)
- Fuzzing, improved test coverage
- Drastically refactored and simplified
- Explicit error returns
- No Config or API
Use jx.Decoder. The zero value is valid, but constructors are available for convenience:
- jx.Decode(reader io.Reader, bufSize int) for io.Reader
- jx.DecodeBytes([]byte) for byte slices
- jx.DecodeStr(string) for strings
To reuse decoders and their buffers, use jx.GetDecoder and jx.PutDecoder alongside the reset functions:
- jx.Decoder.Reset(io.Reader) to reset to a new io.Reader
- jx.Decoder.ResetBytes([]byte) to decode another byte slice
The decoder is reset on PutDecoder.
d := jx.DecodeStr(`{"values":[4,8,15,16,23,42]}`)
// Save all integers from "values" array to slice.
var values []int
// Iterate over each object field.
if err := d.Obj(func(d *jx.Decoder, key string) error {
	switch key {
	case "values":
		// Iterate over each array element.
		return d.Arr(func(d *jx.Decoder) error {
			v, err := d.Int()
			if err != nil {
				return err
			}
			values = append(values, v)
			return nil
		})
	default:
		// Skip unknown fields if any.
		return d.Skip()
	}
}); err != nil {
	panic(err)
}
fmt.Println(values)
// Output: [4 8 15 16 23 42]
Use jx.Encoder. The zero value is valid; reuse with jx.GetEncoder, jx.PutEncoder and jx.Encoder.Reset(). The encoder is reset on PutEncoder.
var e jx.Encoder
e.ObjStart()         // {
e.ObjField("values") // "values":
e.ArrStart()         // [
for i, v := range []int{4, 8, 15, 16, 23, 42} {
	if i != 0 {
		e.More() // ,
	}
	e.Int(v)
}
e.ArrEnd() // ]
e.ObjEnd() // }
fmt.Println(e)
fmt.Println("Buffer len:", len(e.Bytes()))
// Output: {"values":[4,8,15,16,23,42]}
// Buffer len: 28
Use jx.Decoder.Raw to read raw json values, similar to json.RawMessage.
d := jx.DecodeStr(`{"foo": [1, 2, 3]}`)
var raw jx.Raw
if err := d.Obj(func(d *jx.Decoder, key string) error {
	v, err := d.Raw()
	if err != nil {
		return err
	}
	raw = v
	return nil
}); err != nil {
	panic(err)
}
fmt.Println(raw.Type(), raw)
// Output:
// array [1, 2, 3]
Use jx.Decoder.Num to read numbers, similar to json.Number. It also supports number strings, like "12345", a common compatibility format for representing uint64.
d := jx.DecodeStr(`{"foo": "10531.0"}`)
var n jx.Num
if err := d.Obj(func(d *jx.Decoder, key string) error {
	v, err := d.Num()
	if err != nil {
		return err
	}
	n = v
	return nil
}); err != nil {
	panic(err)
}
fmt.Println(n)
fmt.Println("positive:", n.Positive())
// Can decode floats with zero fractional part as integers:
v, err := n.Int64()
if err != nil {
	panic(err)
}
fmt.Println("int64:", v)
// Output:
// "10531.0"
// positive: true
// int64: 10531
Check that byte slice is valid json with jx.Valid:
fmt.Println(jx.Valid([]byte(`{"field": "value"}`))) // true
fmt.Println(jx.Valid([]byte(`"Hello, world!"`))) // true
fmt.Println(jx.Valid([]byte(`["foo"}`))) // false
The jx.Decoder.Capture method unreads everything that was read in the callback. Useful for multi-pass parsing:
d := jx.DecodeStr(`["foo", "bar", "baz"]`)
var elems int
// NB: Currently Capture does not support io.Reader, only buffers.
if err := d.Capture(func(d *jx.Decoder) error {
	// Everything decoded in this callback will be rolled back.
	return d.Arr(func(d *jx.Decoder) error {
		elems++
		return d.Skip()
	})
}); err != nil {
	panic(err)
}
// Decoder is rolled back to state before "Capture" call.
fmt.Println("Read", elems, "elements on first pass")
fmt.Println("Next element is", d.Next(), "again")
// Output:
// Read 3 elements on first pass
// Next element is array again
The jx.Decoder.ObjBytes method tries not to allocate memory for keys, reusing the existing buffer.
d := jx.DecodeStr(`{"id":1,"randomNumber":10}`)
if err := d.ObjBytes(func(d *jx.Decoder, key []byte) error {
	switch string(key) {
	case "id":
	case "randomNumber":
	}
	return d.Skip()
}); err != nil {
	panic(err)
}
Planned:
- Support Raw decoding
- Rework json.Number
- Rework Any
- Support Raw for io.Reader
- Support Capture for io.Reader
- Decide what to do with base64
Non-goals:
- Code generation for decoding or encoding
- Replacement for encoding/json
- Reflection or interface{} based encoding or decoding
- Support for json path or similar
This package should be kept as simple as possible and be used as low-level foundation for high-level projects like code generator.
MIT, same as jsoniter.