Bug [atmosphere.js]: long-polling callbacks always get the whole responseText on readyState == 3 #86

Closed
haed opened this issue Nov 28, 2011 · 10 comments

Comments

@haed
Contributor

haed commented Nov 28, 2011

If the server chunks the response (on long-polling), the client API (atmosphere.js) invokes the callback with the same responseBody, which keeps growing.

For example, first invocation: "message1", second invocation: "message1message2" instead of just "message2" for response.responseBody.

In the streaming/websocket scenario it works (I think), because the stream gets cut (see atmosphere.js:229). This functionality should be offered in the long-polling scenario as well. Callbacks should work transport-independently and should not get different results for different transports. Also, a check for readyState == 4 would not work in streaming scenarios.

I implemented a small workaround which cuts the identical leading text from the given responseBody (see haed.notification.js:97); a sketch of the idea follows below.

Environment: Mac, Chrome, jQuery 1.7.2, Jetty 7.4.2, Jersey, long-polling

This can be tested with haed.notification (send 100 messages, add a breakpoint at haed.notification.js:97 and un-comment the debug output).
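
For illustration only, here is a minimal sketch of that workaround idea (hypothetical names, not the actual atmosphere.js or haed.notification.js code): remember how much of responseText has already been delivered and hand only the new suffix to the callback.

// Minimal sketch of the prefix-trimming workaround (hypothetical names,
// not the actual atmosphere.js or haed.notification.js code).
var previousLength = 0;

function handleReadyStateChange(xhr, onMessage) {
  // readyState 3: the response is still arriving and responseText keeps growing.
  // readyState 4: the request is done (long-polling then reconnects).
  if (xhr.readyState === 3 || xhr.readyState === 4) {
    var full = xhr.responseText;
    // Deliver only what was appended since the last invocation.
    var delta = full.substring(previousLength);
    previousLength = full.length;
    if (delta.length > 0) {
      onMessage(delta);
    }
  }
  if (xhr.readyState === 4) {
    // The next long-poll request starts with an empty responseText.
    previousLength = 0;
  }
}

With something like this, the same callback behaves identically for long-polling and streaming, since it only ever sees the newly appended data.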

@jfarcand
Member

Closing as a duplicate of issue #87.

@chilicat

I'm seeing the same issue. We send JSON strings down to the client, and sometimes the client receives messages like:

{ "a": "b" }{ "c":"d"}

Is this issue really fixed by #87? Issue #87 is about too many connections.

A client-side fix is not an option because the server must also be used by other applications (iPad, etc.) besides our web client, and I really don't want to have the hack in every application.
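
For reference, such a client-side hack would look roughly like the following sketch (hypothetical code, not part of Atmosphere; it assumes the payload is a sequence of top-level JSON objects, as in the example above):

// Hypothetical splitter for payloads like '{ "a": "b" }{ "c":"d"}'.
// This is the kind of per-client hack we would rather avoid.
function splitConcatenatedJson(payload) {
  var messages = [];
  var depth = 0, start = 0, inString = false, escaped = false;
  for (var i = 0; i < payload.length; i++) {
    var c = payload.charAt(i);
    if (inString) {
      // Ignore braces inside string values, honouring escape sequences.
      if (escaped) { escaped = false; }
      else if (c === '\\') { escaped = true; }
      else if (c === '"') { inString = false; }
    } else if (c === '"') {
      inString = true;
    } else if (c === '{') {
      depth++;
    } else if (c === '}') {
      depth--;
      if (depth === 0) {
        // A complete top-level object ends here.
        messages.push(JSON.parse(payload.substring(start, i + 1)));
        start = i + 1;
      }
    }
  }
  return messages;
}

Calling splitConcatenatedJson('{ "a": "b" }{ "c":"d"}') would then yield the two objects separately.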

Just a side note: we are still on an older version (8.6), but we have a bigger refactoring planned where I would also consider upgrading to the latest Atmosphere version - if the bug is fixed!

@jfarcand
Member

@chilicat Can you share a test case? I think this is fixed in 0.9.4, but I would like to work on it. Thanks!

@chilicat

chilicat commented Jun 1, 2012

@jfarcand please see pull request (4e82f54).

@jfarcand
Member

jfarcand commented Jun 1, 2012

@chilicat Looking. I need to test on all browsers, which is time-consuming :-)

@chilicat

chilicat commented Jun 1, 2012

@jfarcand It is a server-only test. You actually just have to execute the JsonMessagesResourceTest class.

jfarcand reopened this Jun 1, 2012
@jfarcand
Member

jfarcand commented Jun 1, 2012

Oh! I've mixed up the issues. Thanks a lot for the test... looking :-)

@jfarcand
Member

jfarcand commented Jun 1, 2012

@chilicat Just to be sure, the expected behavior for you would be to never receive intermixed messages, right? That would mean the underlying web server's buffer never gets full and flushed automatically. I can add a property to support that, but I don't want to turn it on by default because it will cause a lot of I/O operations per Broadcast. Thanks!

@chilicat

chilicat commented Jun 1, 2012

@jfarcand Yes, I guess that would be helpful; in our case we do message accumulation anyway to reduce I/O.

@jfarcand
Member

jfarcand commented Jun 1, 2012

OK, closing this one; will use the new one.

jfarcand closed this as completed Jun 1, 2012