--

Hi @c.bugra.alkan,

You would want to create a read stream from your large data source, split that big file into chunks, then output the chunks to S3 (or Google Cloud Storage) and ingest them into BigQuery from there.

Examples of how to do so:

1. https://towardsdatascience.com/mysql-data-connector-for-your-data-warehouse-solution-db0d338b782d

2. https://medium.com/towards-data-science/how-to-handle-data-loading-in-bigquery-with-serverless-ingest-manager-and-node-js-4f99fba92436

--


Written by 💡Mike Shakhomirov

Data Engineer, Data Strategy and Decision Advisor, Keynote Speaker | linktr.ee/mshakhomirov | @MShakhomirov