Feature Request: Processing callback #32
Can't you do something like this?
Yes, however you are processing the rows twice. Most of the time that's fine, but if you are streaming a really large CSV, it would be more efficient to edit each row inline, as opposed to processing each chunk twice.
I'm considering this as a feature for 2.0
Well, you could do this:
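Presumably this means adding the key in a single pass inside the block. A runnable sketch of that idea (with `process_in_chunks` as a plain-Ruby stand-in for `SmarterCSV.process`, and `get_spec_id` / `send_rows_in_bulk_somewhere` as hypothetical helpers matching the names in this thread):

```ruby
require "csv"

# Hypothetical helpers standing in for the ones used in this thread.
def get_spec_id(row)
  "spec-#{row[:name]}"
end

SENT = []
def send_rows_in_bulk_somewhere(rows)
  SENT.concat(rows)
end

# Stand-in for SmarterCSV.process: parse the CSV, symbolize the headers,
# and yield the rows in chunks (like SmarterCSV's :chunk_size option).
def process_in_chunks(csv_text, chunk_size: 2)
  rows = CSV.parse(csv_text, headers: true)
            .map { |r| r.to_h.transform_keys(&:to_sym) }
  rows.each_slice(chunk_size) { |chunk| yield chunk }
end

csv = "name,qty\napple,3\npear,5\nplum,2\n"

# Single pass: mutate each row in place inside the block, then ship the chunk.
process_in_chunks(csv) do |rows|
  rows.each { |row| row[:_special_id] = get_spec_id(row) }
  send_rows_in_bulk_somewhere(rows)
end
```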
One thing I'm starting to see a need for (at least for my stuff) is a callback function for each CSV row.
For example, say you are reading chunks of a large CSV and need to insert an additional key/value into each row, like this:
```ruby
SmarterCSV.process(file, options) do |rows|
  rows = rows.map { |row| row.merge(_special_id: spec_id) }
  send_rows_in_bulk_somewhere(rows)
end
```
This processes the rows twice, as opposed to something like this:

```ruby
SmarterCSV.process(file, options.merge(callback: "my_callback")) do |rows|
  send_rows_in_bulk_somewhere(rows)
end

def my_callback(row)
  row[:_special_id] = get_spec_id(row)
end
```
Maybe this is just as fast if you processed each row and then cached them up before sending them out; I haven't done any comparisons.
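A rough way to run that comparison (a sketch only: plain arrays of hashes stand in for parsed CSV chunks, and the "send" step is a no-op, so the numbers only compare map-then-send against a single in-place pass):

```ruby
require "benchmark"

# Plain arrays of hashes stand in for parsed CSV chunks.
chunks = Array.new(100) do
  Array.new(500) { |i| { id: i, qty: i % 10 } }
end

n = 50
Benchmark.bm(12) do |x|
  # Two passes per chunk: map to add the key, then iterate again to "send".
  x.report("map twice:") do
    n.times do
      chunks.each do |rows|
        tagged = rows.map { |row| row.merge(_special_id: row[:id] * 2) }
        tagged.each { |row| row } # stand-in for send_rows_in_bulk_somewhere
      end
    end
  end

  # One pass per chunk: mutate each row in place while "sending" it.
  x.report("inline:") do
    n.times do
      chunks.each do |rows|
        rows.each do |row|
          row[:_special_id] = row[:id] * 2
          row # stand-in for send_rows_in_bulk_somewhere
        end
      end
    end
  end
end
```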