I want to bulk insert into MySQL with the Sequelize ORM and Node.js (NestJS). I'm curious whether there is a best way to handle very large datasets. I thought of chunking, the way Laravel does it. In Laravel we can do it like this:

$chunks = $insert_data->chunk(500);

foreach ($chunks as $chunk) {
   \DB::table('items_details')->insert($chunk->toArray());
}

Has anyone implemented this with Sequelize and Node.js? Thanks.

1 Answer

I suppose it would be almost the same (using lodash to split the data into chunks):

const _ = require('lodash');

// Split the rows into batches of 500 so each INSERT stays a reasonable size
const chunks = _.chunk(insert_data, 500);

// (must run inside an async function, since we await each batch)
for (const chunk of chunks) {
  // bulkCreate issues one multi-row INSERT per batch
  await ItemDetails.bulkCreate(chunk);
}

P.S. Depending on what you want to achieve, you can either wrap everything in a transaction (so that all chunks are inserted, or none of them if an error occurs while inserting any chunk), or use try/catch to skip a failed chunk and continue inserting the remaining ones.
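The two strategies from the P.S. can be sketched like this. Note this is only a sketch: the model name `ItemDetails` and the `sequelize` instance are assumptions standing in for your own setup, and the `chunk` helper just replicates lodash's `_.chunk` so the example is self-contained:

```javascript
// A tiny chunk helper, equivalent to lodash's _.chunk:
function chunk(rows, size) {
  const out = [];
  for (let i = 0; i < rows.length; i += size) out.push(rows.slice(i, i + size));
  return out;
}

// Strategy 1: all-or-nothing — run every bulkCreate inside one transaction.
// If any batch fails, the whole transaction rolls back and nothing is kept.
async function insertAllOrNothing(sequelize, ItemDetails, rows, size = 500) {
  await sequelize.transaction(async (t) => {
    for (const batch of chunk(rows, size)) {
      await ItemDetails.bulkCreate(batch, { transaction: t });
    }
  });
}

// Strategy 2: best-effort — catch the error for a failed batch and keep
// inserting the remaining batches; return the failures for inspection.
async function insertBestEffort(ItemDetails, rows, size = 500) {
  const failed = [];
  for (const batch of chunk(rows, size)) {
    try {
      await ItemDetails.bulkCreate(batch);
    } catch (err) {
      failed.push({ batch, err }); // record and continue with the next batch
    }
  }
  return failed;
}
```

Which one you pick depends on whether a partially inserted dataset is acceptable; with very large imports the best-effort version also lets you retry only the failed batches later.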
