
Besides streaming a CSV file yourself and painstakingly executing an INSERT for each line of data, is it possible to use the Google Cloud SDK to import an entire CSV file in bulk from inside a Cloud Function? I know that in the GCP console you can go to the Import tab, select a file from storage, and just import it. But how can I emulate this programmatically?

1 Answer


In general, one has to parse the .csv and generate SQL from it; one line of the .csv is represented by one INSERT statement. First you would have to upload the file, or pull it into the Cloud Function's temporary storage, e.g. with gcs.bucket.file(filePath).download.
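The "one CSV line → one INSERT statement" idea can be sketched in plain Node.js. The table name, column list, and the naive comma split (no quoted fields) are all simplifying assumptions here, not part of any library API:

```javascript
// Sketch: turn one CSV line into one INSERT statement.
// Assumes fields contain no embedded commas or newlines; the table
// name and column list are hypothetical placeholders.
function csvLineToInsert(line, table, columns) {
  const values = line
    .split(',')
    .map((v) => `'${v.trim().replace(/'/g, "''")}'`); // escape single quotes
  return `INSERT INTO ${table} (${columns.join(', ')}) VALUES (${values.join(', ')});`;
}

const sql = csvLineToInsert('42,Alice,alice@example.com', 'users', ['id', 'name', 'email']);
console.log(sql);
// INSERT INTO users (id, name, email) VALUES ('42', 'Alice', 'alice@example.com');
```

In a real function you would of course use parameterized queries rather than string concatenation; the sketch only illustrates the line-to-statement mapping.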

The easiest approach might then be to use a library, e.g. csv2sql-lite, with the big downside that you don't have full control over the import, while e.g. csv-parse would give you a little more control (e.g. checking for possible duplicates, skipping some columns, importing into different tables, whatever).
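The kind of control meant here (duplicate checking, column skipping) can be sketched as a plain row-processing loop. Plain string splitting is used so the example stays self-contained; with csv-parse you would get the same row arrays from its stream/callback API instead:

```javascript
// Sketch: filter parsed CSV rows before generating SQL.
// keyIndex marks the column used for duplicate detection;
// skipIndexes lists columns to drop. Both are hypothetical knobs.
function importRows(csvText, { keyIndex, skipIndexes = [] }) {
  const seen = new Set();
  const rows = [];
  for (const line of csvText.trim().split('\n')) {
    const cols = line.split(',').map((c) => c.trim());
    const key = cols[keyIndex];
    if (seen.has(key)) continue; // skip possible duplicates
    seen.add(key);
    rows.push(cols.filter((_, i) => !skipIndexes.includes(i))); // skip columns
  }
  return rows;
}

const rows = importRows('1,Alice,x\n2,Bob,y\n1,Alice,z', { keyIndex: 0, skipIndexes: [2] });
console.log(rows);
// [ [ '1', 'Alice' ], [ '2', 'Bob' ] ]
```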

... and in order to connect to Cloud SQL, see Connecting from Cloud Functions.
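From a Cloud Function, Cloud SQL is typically reached over a Unix domain socket under /cloudsql/ rather than TCP. A minimal sketch of the connection options, where the instance connection name and the DB_* environment variables are placeholders you would supply, and the resulting object would be handed to something like mysql.createPool (not required here):

```javascript
// Sketch: build Cloud SQL connection options for a Cloud Function.
// The instance connection name and env vars are placeholders.
function cloudSqlOptions(instanceConnectionName) {
  return {
    // Cloud Functions expose Cloud SQL via a Unix socket, not host/port.
    socketPath: `/cloudsql/${instanceConnectionName}`,
    user: process.env.DB_USER,
    password: process.env.DB_PASS,
    database: process.env.DB_NAME,
  };
}

console.log(cloudSqlOptions('my-project:europe-west1:my-instance').socketPath);
// /cloudsql/my-project:europe-west1:my-instance
```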


2 Comments

Thanks so much for confirming what is possible and what's not! I'll follow this approach then :)
@yen One could also input CSV with an HTTP trigger; ETL (extract, transform, load) is always the same procedure, no matter the environment or language used, and no matter where the data comes from or where it is written to.
