Hang/Crash with large strings #477
1300 bytes is not nearly large enough for this to be a problem with json-c's handling of "large" strings. Things might get noticeably slow after a few MB, and there might be some kind of bug after maybe 2 GB, but at 1300 bytes your problem is something else.
If you can provide a minimal compilable example that demonstrates the problem, that would help a lot.
Sorry it took so long to reply; I didn't get back to this as soon as I should have. After I asked the question, I also noticed that the signature string had newline characters embedded within it. The string came from a Java Base64-encoded string. We have since removed those characters to get something similar to what is in the snippet below. The failure is the same. The signature has been rewritten for security. Code is below:
During the parse, it consistently fails/hangs on the ending quotation mark of the very long signature string.
hmm ... it's a realloc() from 500 to 1000 bytes ... I wonder if there's something weird about realloc() on Windows. json-c does have an optional wrapper for it (lines 201 and 211 at commit 3df1f98).
Secondary question, and what ended up being the root cause of the original crash/hang I was seeing, was what I mentioned in my last comment: the newline character. I see that there is special code within the tokener (shown below) that exits once it hits an escape character. Why is that? There are escape characters which are valid for JSON strings, correct?
Sorry, I guess I forgot to mention that this is running on an NXP K70 running MQX RTOS. I'll look at whether the optional wrapper has any better luck. Maybe a secondary question to this: it would be grossly inefficient, but is there a way to set a larger default size for the buffer, i.e., from 500 to 1000?
It switches from state
That realloc wrapper should not make any difference for this case; it's for handling NULL and zero in case libc realloc() is not fully POSIX. It's probably something in that RTOS's libc realloc() not liking these "big" re-allocations.
If that is the case, is there a way for me to just set the allocations larger by default so there isn't a need to have it reallocate?
Seems like the K70 has about 128 KiB SRAM, so it makes sense that heap management is very constrained. If you set allocations larger by default, they'll apply to a bunch of smaller things, and you may run out of memory, or into a memory fragmentation issue (lack of big-enough free chunks in the heap), sooner. You may need to switch to a stream-oriented json library that does no allocations, just a sequence of callbacks with offsets.
We have and are mapped to an external SRAM with a size of 1 MB, I think, possibly 2 MB. I'll have to do some reading to see why MQX is having an issue with realloc like that. I did bump up the size of the initial printbuf allocation from 500 to 1000. That solved the hang issue and I was able to proceed. Are there downsides to doing that, besides "wasting" space that won't be used for the majority of the transfers?
The big downside here is that you're just masking the problem, without truly fixing anything.
Closing due to lack of response.
I am using v0.12.1. I am seeing that very large strings are causing the tokener code to hang. I'm not exactly sure what the size cutoff is that is causing the issue because I don't control the code sending the messages. I have seen packets 300-500 bytes/characters long. The string that I'm working on now, though, is 1298 bytes/characters.
Is there a max length that is supported by json-c?
Here is a snippet of my code and where things hang up:
Initially I had this, which worked for the smaller strings.
jsonObj = json_tokener_parse( i_strToParse );
I even tried calling json_tokener_new_ex() and passing a specific depth thinking that possibly it was allocating too much space. That made no difference.
I have to be doing something wrong, but I'm not sure what. The above code works for all of the other strings that I am trying to parse. Any help/guidance is appreciated.