
Solved the CryptoStream errors causing the node version incompatibili… #14

Merged: 1 commit, Feb 3, 2017

Conversation

PseudoSky
Contributor

Primarily in reference to spark-protocol's use in spark-server.

There seems to have been an issue with the CryptoStream library that led to node version incompatibilities.

(Previously this required node v0.10.x; with this fix the module also works on v0.12.5.)
Tested with both node v0.10.41 and v0.12.5.

The CryptoStream error was coming from an additional null chunk arriving before the end of the stream.
Debugging the stream:

<Buffer a0 4a 2e 8e 2d ce de 12 15 03 7a 42 44 ca 84 88 72 64 77 61 72 65 2f 6f 74 61 5f 63 68 75 6e 6b>
<Buffer 5f 73 69 7a 65 ff 35 31 32>
null
CryptoStream transform error TypeError: Cannot read property 'length' of null
Coap Error: Error: Invalid CoAP version. Expected 1, got: 3
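For reference, here is one way (an assumption about how the dump was captured, not code from the repo) to produce output like the above: log each chunk returned by cipher.read() inside the 'readable' handler. On the affected node versions, 'readable' fires one final time at end-of-stream, where read() returns null.

cipher.on('readable', function () {
  // read() returns a Buffer while data is queued, and null once the
  // cipher stream has ended; that null is the last line of the dump above.
  var chunk = cipher.read();
  console.log(chunk);
});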

Wrapping the ciphertext chunk concatenation statement with a check for null chunks seems to solve (I believe) all of the node version dependency issues:

// Skip the spurious null chunk; concatenate real chunks as before.
if (chunk) {
  if (!ciphertext) {
    ciphertext = chunk;
  } else {
    ciphertext = Buffer.concat([ciphertext, chunk], ciphertext.length + chunk.length);
  }
}
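For context, here is a self-contained sketch of where that guard sits. This is a hypothetical, simplified CryptoStream (the real one manages keys and IVs per Particle's protocol; the aes-128-cbc choice here is an assumption):

var crypto = require('crypto');
var stream = require('stream');
var util = require('util');

function CryptoStream(key, iv) {
  stream.Transform.call(this);
  this.key = key;
  this.iv = iv;
}
util.inherits(CryptoStream, stream.Transform);

CryptoStream.prototype._transform = function (chunk, encoding, callback) {
  var self = this;
  var ciphertext = null;
  var cipher = crypto.createCipheriv('aes-128-cbc', this.key, this.iv);

  cipher.on('readable', function () {
    var block = cipher.read();
    // The guard: on some node versions 'readable' fires once more at
    // end-of-stream, and read() returns null there.
    if (block) {
      ciphertext = ciphertext
        ? Buffer.concat([ciphertext, block], ciphertext.length + block.length)
        : block;
    }
  });

  cipher.on('end', function () {
    if (ciphertext) {
      self.push(ciphertext);
    }
    callback();
  });

  cipher.write(chunk);
  cipher.end();
};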

@dmiddlecamp
Contributor

Awesome, thank you for the PR! If you haven't already, could you sign our CLA?

https://docs.google.com/a/spark.io/forms/d/1_2P-vRKGUFg5bmpcKLHO_qNZWGi5HKYnfrrkd-sbZoA/viewform

@schwiet

schwiet commented Jan 18, 2016

After re-doing npm install, this change worked for me on v0.12.5. Thanks!

@PseudoSky
Contributor Author

@schwiet Awesome, thanks for checking.
@dmiddlecamp Signed.

@schwiet

schwiet commented Jan 19, 2016

FWIW, I also just had a chance to test this with node v5.0. Works!

straccio added a commit to straccio/spark-protocol that referenced this pull request Jan 25, 2016
@chuank

chuank commented Jan 29, 2016

@PseudoSky a million thanks for this; I can confirm that the fix also works with node v4.2.6 LTS; I did an in-place upgrade from 0.10.41.

EDIT: unfortunately, using node v4.2.6 LTS still presents issues re: the NULL CoAP message, as described in particle-iot/spark-server#69. Devices are able to connect, but disconnect shortly after publishing an event, causing a 1-second lag as the connection to the local cloud is restored.

https://community.particle.io/t/node-js-version-and-spark-server/10387/6

@efuturetoday

Thank you! It worked for me!

@schwiet

schwiet commented Feb 3, 2017

Not sure if this library has just been abandoned, but I would really appreciate it if this fix could get pulled in. I just ran into this again and had to search for this patch again. I love using my Particle devices locally; please don't let this die!

brycekahle merged commit 2c745e7 into particle-iot:master Feb 3, 2017
@chenchen-hci

chenchen-hci commented Mar 12, 2017

Issues

@PseudoSky @schwiet I am wondering whether you were able to get spark-server working reliably on node v4. I am currently working on a project that uses the Particle local cloud on node v4. After following the previous discussions, most issues could be solved, except for the CryptoStream transform error, where the output says:

CryptoStream transform error Error: error:06065064:digital envelope routines:EVP_DecryptFinal_ex:bad decrypt

This error does not pop up continuously, so it may be OK to ignore for low-resolution work. However, it can be fatal for high-resolution data acquisition because of the broken pipe it causes. Any suggestions would be appreciated!

some updates

After looking into the code, I believe the exception occurs in the following snippet in CryptoStream.js:

try {
    var cipher = this.getCipher(callback);
    cipher.write(chunk);
    cipher.end();  // the exception occurs here
    cipher = null;
    if (!this.encrypt) {
        // grab a new IV for the next packet from the first 16 bytes
        this.iv = new Buffer(16);
        chunk.copy(this.iv, 0, 0, 16);
    }
} catch (ex) {
    // error occurred! - see above for error information!
}
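One possible mitigation (a sketch, not the library's actual handling; processChunk is a hypothetical stand-in for the cipher work above) is to hand the exception to the _transform callback, so the stream emits 'error' and the consumer can tear down and resynchronize instead of feeding garbage to the CoAP parser:

CryptoStream.prototype._transform = function (chunk, encoding, callback) {
  try {
    this.processChunk(chunk); // hypothetical helper for the cipher work above
    callback();
  } catch (ex) {
    // Passing the error to the callback makes the stream emit 'error'.
    callback(ex);
  }
};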

@schwiet

schwiet commented Mar 16, 2017

@chenchen-hci FWIW, I am using spark-protocol with node v4 on a Synology box and not seeing the error with this patch applied.

@chenchen-hci

chenchen-hci commented Mar 16, 2017

@schwiet Yes, actually it did work, but only for small data at a low frequency.
For my project with high frequency (10 to 20 interactions between device and server per second) and packet sizes around 2~3 KB, this error occurs. Please let me know if you have any idea what is happening! Cheers!

@jlkalberer

@chenchen-hci - did you ever end up figuring this out? I'm pretty sure this bug still exists in Brewskey/spark-protocol.

@chenchen-hci

@jlkalberer - hey, it turns out the issue does not originate on the server side; it is on the firmware side, and we can only mask the error on the server side. This means that the packets received at the server are corrupted, so it makes sense that they cannot be decrypted.

On the firmware side, if you dig into the system HAL, you will find a system call named wiced_tcp_send_buffer(). The interesting thing is that if packets are transmitted at a high rate, the socket stream becomes full. In that case this function will still send whatever chunk of data was written into the stream, which is only part of the packet. The result is that the server receives corrupted packets, which of course cannot be decrypted.
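The firmware is C, but for comparison, the flow-control pattern that avoids this in node (a sketch; socket is assumed to be a net.Socket) is to respect the return value of write() and wait for 'drain' instead of pushing more data into a full buffer:

function sendPacket(socket, packet, done) {
  // write() queues the whole packet; a false return means the buffer is
  // full, and we should wait for 'drain' before sending the next packet.
  var flushed = socket.write(packet);
  if (flushed) {
    process.nextTick(done);
  } else {
    socket.once('drain', done);
  }
}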

So, as a short summary, I would say the fundamental issue is in the Broadcom WICED library.

@schwiet I tried the same code on node 4 and node 6 as well. I would say the problem will eventually occur if you increase the packet rate to the point where the socket stream is full. You will find that sometimes only part of the corrupted data is transmitted to the server, which causes these errors (bad decrypt / invalid CoAP, etc.).
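For what it's worth, the bad-decrypt failure mode is easy to reproduce in isolation with node's crypto module (a sketch; the aes-128-cbc cipher, key, and IV here are assumptions, not spark-protocol's actual parameters):

var crypto = require('crypto');

var key = crypto.randomBytes(16);
var iv = crypto.randomBytes(16);

var cipher = crypto.createCipheriv('aes-128-cbc', key, iv);
var packet = Buffer.concat([cipher.update('event payload'), cipher.final()]);

// Corrupt one byte, standing in for a packet damaged in transit.
packet[packet.length - 1] ^= 0xff;

var decipher = crypto.createDecipheriv('aes-128-cbc', key, iv);
try {
  decipher.update(packet);
  decipher.final(); // almost always throws EVP_DecryptFinal_ex:bad decrypt
} catch (ex) {
  console.error('CryptoStream transform error', ex.message);
}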

straccio pushed a commit to straccio/spark-protocol that referenced this pull request Jun 13, 2017
fix wrong returning value in fromBinary().