jx (experimental)

Package jx implements encoding and decoding of json [RFC 7159]. Lightweight fork of jsoniter.

go get github.com/go-faster/jx

Features

  • Directly encode and decode json values
  • No reflect or interface{}
  • Pools and direct buffer access for fewer (or zero) allocations
  • Multi-pass decoding
  • Validation

See usage for examples. Best suited for fast, low-level json manipulation with a high degree of control. Used in the ogen project for json (un)marshaling code generation based on JSON Schema and OpenAPI schemas.

Also, jx is pretty fast. Take this OpenTelemetry log entry:

{
  "Timestamp": "1586960586000000000",
  "Attributes": {
    "http.status_code": 500,
    "http.url": "http://example.com",
    "my.custom.application.tag": "hello"
  },
  "Resource": {
    "service.name": "donut_shop",
    "service.version": "2.0.0",
    "k8s.pod.uid": "1138528c-c36e-11e9-a1a7-42010a800198"
  },
  "TraceId": "13e2a0921288b3ff80df0a0482d4fc46",
  "SpanId": "43222c2d51a7abe3",
  "SeverityText": "INFO",
  "SeverityNumber": 9,
  "Body": "20200415T072306-0700 INFO I like donuts"
}

Decoding and encoding it gives the following benchmark results:

$ go test -bench OTEL
goos: linux
goarch: amd64
pkg: github.com/go-faster/jx
cpu: AMD Ryzen 9 5950X 16-Core Processor
BenchmarkOTEL_Decode-32  674.1 ns/op   741.71 MB/s  0 B/op  0 allocs/op
BenchmarkOTEL_Write-32   231.0 ns/op  1835.49 MB/s  0 B/op  0 allocs/op
PASS

The flexibility of jx enables highly efficient, semantic-aware encoding and decoding, e.g. using [16]byte for TraceId with zero-allocation hex encoding in json.

See otel_test.go for an example.

Why

Most jsoniter issues are caused by the need to be a drop-in replacement for the standard encoding/json package. Removing that constraint greatly simplified the implementation and reduced the scope, allowing the project to focus on json stream processing.

  • Commas are handled automatically while encoding
  • Raw json, Number and Base64 support
  • Reduced scope
    • No reflection
    • No encoding/json adapter
    • 3.5x less code (8.5K to 2.4K SLOC)
  • Fuzzing, improved test coverage
  • Drastically refactored and simplified
    • Explicit error returns
    • No Config or API

Usage

Decode

Use jx.Decoder. The zero value is valid; constructors like jx.DecodeStr and jx.DecodeBytes are available for convenience.

To reuse decoders and their buffers, use jx.GetDecoder and jx.PutDecoder together with the reset methods. The decoder is reset on PutDecoder.

d := jx.DecodeStr(`{"values":[4,8,15,16,23,42]}`)

// Save all integers from "values" array to slice.
var values []int

// Iterate over each object field.
if err := d.Obj(func(d *jx.Decoder, key string) error {
    switch key {
    case "values":
        // Iterate over each array element.
        return d.Arr(func(d *jx.Decoder) error {
            v, err := d.Int()
            if err != nil {
                return err
            }
            values = append(values, v)
            return nil
        })
    default:
        // Skip unknown fields if any.
        return d.Skip()
    }
}); err != nil {
    panic(err)
}

fmt.Println(values)
// Output: [4 8 15 16 23 42]
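
To reuse a decoder and its buffer from the pool, a minimal sketch (jx.GetDecoder and jx.PutDecoder are described above; Decoder.ResetBytes is assumed here as the method that points the pooled decoder at a new buffer):

d := jx.GetDecoder()
defer jx.PutDecoder(d) // decoder is reset on PutDecoder

// ResetBytes (assumed) rebinds the pooled decoder to a fresh buffer.
d.ResetBytes([]byte(`[4, 8, 15]`))

sum := 0
if err := d.Arr(func(d *jx.Decoder) error {
    v, err := d.Int()
    if err != nil {
        return err
    }
    sum += v
    return nil
}); err != nil {
    panic(err)
}
fmt.Println("sum:", sum)
// Output: sum: 27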

Encode

Use jx.Encoder. The zero value is valid; reuse encoders with jx.GetEncoder, jx.PutEncoder and jx.Encoder.Reset(). The encoder is reset on PutEncoder.

var e jx.Encoder
e.ObjStart()           // {
e.FieldStart("values") // "values":
e.ArrStart()           // [
for _, v := range []int{4, 8, 15, 16, 23, 42} {
    e.Int(v)
}
e.ArrEnd() // ]
e.ObjEnd() // }
fmt.Println(e)
fmt.Println("Buffer len:", len(e.Bytes()))
// Output: {"values":[4,8,15,16,23,42]}
// Buffer len: 28
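
A minimal sketch of reusing a pooled encoder; jx.GetEncoder, jx.PutEncoder, Reset and Bytes are all named above, and the field values are arbitrary:

e := jx.GetEncoder()
defer jx.PutEncoder(e) // encoder is reset on PutEncoder

e.ObjStart()
e.FieldStart("answer")
e.Int(42)
e.ObjEnd()
fmt.Println(string(e.Bytes())) // {"answer":42}

// Reset clears the buffer so the same encoder can be reused in place.
e.Reset()
e.ArrStart()
e.Int(1)
e.Int(2)
e.ArrEnd()
fmt.Println(string(e.Bytes())) // [1,2]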

Raw

Use jx.Decoder.Raw to read raw json values, similar to json.RawMessage.

d := jx.DecodeStr(`{"foo": [1, 2, 3]}`)

var raw jx.Raw
if err := d.Obj(func(d *jx.Decoder, key string) error {
    v, err := d.Raw()
    if err != nil {
        return err
    }
    raw = v
    return nil
}); err != nil {
    panic(err)
}

fmt.Println(raw.Type(), raw)
// Output:
// array [1, 2, 3]

Number

Use jx.Decoder.Num to read numbers, similar to json.Number. It also supports numbers encoded as strings, like "12345", which is a common interoperable way to represent uint64.

d := jx.DecodeStr(`{"foo": "10531.0"}`)

var n jx.Num
if err := d.Obj(func(d *jx.Decoder, key string) error {
    v, err := d.Num()
    if err != nil {
        return err
    }
    n = v
    return nil
}); err != nil {
    panic(err)
}

fmt.Println(n)
fmt.Println("positive:", n.Positive())

// Can decode floats with zero fractional part as integers:
v, err := n.Int64()
if err != nil {
    panic(err)
}
fmt.Println("int64:", v)
// Output:
// "10531.0"
// positive: true
// int64: 10531

Base64

Use jx.Encoder.Base64 and jx.Decoder.Base64 or jx.Decoder.Base64Append.

The encoding is the same as in encoding/json: base64.StdEncoding ([RFC 4648]).

var e jx.Encoder
e.Base64([]byte("Hello"))
fmt.Println(e)

data, _ := jx.DecodeBytes(e.Bytes()).Base64()
fmt.Printf("%s", data)
// Output:
// "SGVsbG8="
// Hello
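
To append decoded bytes to an existing buffer, a sketch using jx.Decoder.Base64Append; the signature is assumed here to take a byte slice and return the slice with the decoded data appended:

buf := make([]byte, 0, 64)
d := jx.DecodeStr(`"SGVsbG8="`)
// Base64Append (assumed) appends the decoded bytes to buf and returns it.
buf, err := d.Base64Append(buf)
if err != nil {
    panic(err)
}
fmt.Printf("%s\n", buf)
// Output: Hello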

Validate

Check that a byte slice is valid json with jx.Valid:

fmt.Println(jx.Valid([]byte(`{"field": "value"}`))) // true
fmt.Println(jx.Valid([]byte(`"Hello, world!"`)))    // true
fmt.Println(jx.Valid([]byte(`["foo"}`)))            // false

Capture

The jx.Decoder.Capture method unreads everything that is read in the callback. Useful for multi-pass parsing:

d := jx.DecodeStr(`["foo", "bar", "baz"]`)
var elems int
// NB: Currently Capture does not support io.Reader, only buffers.
if err := d.Capture(func(d *jx.Decoder) error {
	// Everything decoded in this callback will be rolled back.
	return d.Arr(func(d *jx.Decoder) error {
		elems++
		return d.Skip()
	})
}); err != nil {
	panic(err)
}
// Decoder is rolled back to state before "Capture" call.
fmt.Println("Read", elems, "elements on first pass")
fmt.Println("Next element is", d.Next(), "again")

// Output:
// Read 3 elements on first pass
// Next element is array again

ObjBytes

The Decoder.ObjBytes method tries not to allocate memory for keys, reusing the existing buffer instead.

d := jx.DecodeStr(`{"id":1,"randomNumber":10}`)
if err := d.ObjBytes(func(d *jx.Decoder, key []byte) error {
    switch string(key) {
    case "id":
    case "randomNumber":
    }
    return d.Skip()
}); err != nil {
    panic(err)
}

Roadmap

  • Rework and export Any
  • Support Raw for io.Reader
  • Support Capture for io.Reader
  • Improve Num
    • Better validation on decoding
    • Support BigFloat and BigInt
    • Support equivalence check, like eq(1.0, 1) == true
  • Add non-callback decoding of objects

Non-goals

  • Code generation for decoding or encoding
  • Replacement for encoding/json
  • Reflection or interface{} based encoding or decoding
  • Support for json path or similar

This package should be kept as simple as possible and used as a low-level foundation for higher-level projects like code generators.

License

MIT, same as jsoniter