
Elasticsearch errors #17

Closed
moooji opened this issue Dec 19, 2014 · 6 comments · Fixed by #18

@moooji

moooji commented Dec 19, 2014

First of all, this project is amazing! :)

I am currently trying to insert documents from a mongo collection into elasticsearch. It works for most of the documents, but I also see a lot of errors like the one below. The first part I understand (because there is no API endpoint configured), but I do not understand what the Elasticsearch error (%!s(<nil>)) means.

I am using this transform function. Does the object sent to elasticsearch need to have a certain format?

module.exports = function(doc) {
  var saveDoc = {
    _id: JSON.stringify(doc._id),
    text: doc.text,
    meta: doc.meta
  };
  return saveDoc;
};
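For reference, a document coming out of that function looks roughly like this (the values are made up; note that when doc._id is already a string, JSON.stringify wraps it in an extra pair of literal quotes):

{
  "_id": "\"548f1d3f9b1e8a0012345678\"",
  "text": "some example text",
  "meta": { "source": "import-test" }
}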

transporter: EventEmitter Error: http error code, expected 200 or 201, got 405, ({"ts":1419023160,"name":"error","path":"","record":{"_id":"316254105985245184"},"message":"ERROR: Elasticsearch error (%!s(\u003cnil\u003e))"})

@nstott
Contributor

nstott commented Dec 19, 2014

The errors I've seen with elasticsearch all have to do with the mapping.
When elasticsearch first sees a field, it guesses at the type, and then expects all subsequent documents to have the same type in that field.
For example, if the first document is something like:

{
  _id: 1,
  value: "2"
}

and then you hit a document that looks like this:

{
  _id: 1,
  value: 3
}

then elasticsearch wouldn't like that, because value changed from being a string to a number.
This blog post explains some of this and some workarounds:
https://blog.compose.io/transporter-and-elasticsearch-mapping/
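One workaround along those lines is to create the index with an explicit mapping up front, so elasticsearch never has to guess the types. A rough sketch, assuming an index called myindex with a type called mydoc and the fields from your transformer (the names and types here are just placeholders):

{
  "mappings": {
    "mydoc": {
      "properties": {
        "text": { "type": "string" },
        "meta": { "type": "object" }
      }
    }
  }
}

You would send that as the body of the index creation request (PUT /myindex) before kicking off transporter.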

The error includes the document id, ... "record":{"_id":"316254105985245184" ..., so I would have a look at that document and see if it's different in any way. Is it failing because either doc.text or doc.meta is of a different type?
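To look that record up, something like this in the mongo shell should work (assuming the collection is called foo and the _id is stored as a string; adjust to your actual namespace):

db.foo.findOne({ _id: "316254105985245184" })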
Your elasticsearch cluster might have some helpful debug information as well.

I have to say though that the error really isn't helpful for debugging. I made a change (today) that lets the api be optional, but if you don't include the api, you won't get any messages :) (which also isn't very helpful in this case). I'll be adding some logging options and trying to fix that in the next bit.

Thanks!

@moooji
Author

moooji commented Dec 19, 2014

Thanks a lot for the explanation!
I have tried again with a different test collection that has a very simple structure, like { _id: "zzn4672Xpg5MKTnwD" } and nothing more. The _id is definitely a string in all cases, and surprisingly transporter inserted ALL documents into elasticsearch, but at the same time threw a ton of errors like the one below:

transporter: EventEmitter Error: http error code, expected 200 or 201, got 405, ({"ts":1419027073,"name":"error","path":"","record":{"id":"zzn4672Xpg5MKTnwD"},"message":"ERROR: Elasticsearch error (%!s(\u003cnil\u003e))"})

It feels like maybe there is simply a glitch in the parsing of the elasticsearch response? Maybe elasticsearch actually does not return any error at all (there is nothing in its logs either), which could also explain the "nil", since %!s(<nil>) looks like a nil value being formatted into the message?

@nstott nstott added the bug label Dec 19, 2014
@nstott
Contributor

nstott commented Dec 19, 2014

Ok, that's odd. I'll dive into this tonight.
Was there a transformer involved in your last run, or was it a straight copy from mongo to elasticsearch?

@moooji
Author

moooji commented Dec 19, 2014

I tried both and get the same errors with and without transformer.

@nstott
Contributor

nstott commented Dec 20, 2014

This should help :) It was just noise that shouldn't have been emitted; the documents were inserted properly.
The api fields in the config are optional, and I'll be pushing a patch later that will log legitimate errors to stdout whether or not the api is configured.
Thanks!

@moooji
Author

moooji commented Dec 20, 2014

Awesome! Just tried it out and it works perfectly now.
