SQL efficient bulk insert/upsert landed in Rails codebase

Rails PR [#35077](https://github.com/rails/rails/pull/35077) was finally merged. You can find more details in my [blog post](https://medium.com/@retrorubies/upcoming-rails-6-bulk-insert-upsert-feature-2d642419557d).

Now you can do this (with the PostgreSQL adapter):

```ruby
# Timestamps by hand: insert_all! skips validations and callbacks.
now = Time.now
bulk_data = 1000.times.map do |t|
  { slug: "slug-#{t}", post: "text #{t}", created_at: now, updated_at: now }
end
Post.insert_all!(bulk_data)
```

Boom, done in a single query (under 0.4s locally).
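Worth noting: these methods write straight to the database in one statement, so model validations and callbacks are skipped and timestamps are not filled in for you (hence the explicit `created_at`/`updated_at` above). The bang and non-bang variants also differ on duplicates. A minimal sketch, assuming the `Post` table above already holds the 1000 rows and has a unique index on `slug`:

```ruby
dup_rows = [{ slug: "slug-0", post: "dup", created_at: now, updated_at: now }]

begin
  # The bang variant refuses to silently skip conflicting rows:
  Post.insert_all!(dup_rows)
rescue ActiveRecord::RecordNotUnique
  # raised because slug-0 already exists
end

# insert_all (no bang) skips duplicates instead (ON CONFLICT DO NOTHING):
Post.insert_all(dup_rows)
```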

You can even upsert if you have a unique index on the slug column:

```ruby
now = Time.now
bulk_data = 1000.times.map do |t|
  { slug: "slug-#{t}", post: "text #{t}", created_at: now, updated_at: now }
end
# unique_by names the unique index that resolves conflicts.
Post.upsert_all(bulk_data, unique_by: { columns: [:slug] })
```
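And because the conflict target is `slug`, re-running the upsert with changed values updates the existing rows in place (PostgreSQL's `ON CONFLICT ... DO UPDATE`) instead of raising. A quick sketch reusing the setup above:

```ruby
revised = 1000.times.map do |t|
  { slug: "slug-#{t}", post: "revised text #{t}", created_at: now, updated_at: Time.now }
end

# Every slug already exists, so all 1000 rows are updated, not inserted:
Post.upsert_all(revised, unique_by: { columns: [:slug] })
Post.find_by(slug: "slug-0").post # => "revised text 0"
```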

Comments on “SQL efficient bulk insert/upsert landed in Rails codebase”

  1. That’s awesome. I’ve been using a gem called activerecord-import to do this for a while; cool to see it’s been baked in.

  2. Let’s say I want to pull data from a slow RETS API that returns a big result set. Will this method cause huge memory use while building the rows before Rails saves them?
