We are using SQL as a source and Blob as a destination. Our dev SQL source has only a limited number of records, and we want to run performance tests to understand how Airbyte will perform with 1M, 10M, or 100M records in a sync.
Is there a way to run performance tests on Airbyte using PyAirbyte with a fake-data library, or any other way?
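One way to approach this (a sketch, not an official recipe) is to drive the scale tests from PyAirbyte's `source-faker` connector, which generates synthetic records; the `count` config key and the exact read-result attributes are assumptions you should verify against the PyAirbyte docs for your version:

```python
# Sketch (assumption): generate synthetic records at each target scale with
# PyAirbyte's source-faker connector instead of the small dev database.

SCALES = {"1M": 1_000_000, "10M": 10_000_000, "100M": 100_000_000}


def run_faker_sync(record_count: int):
    """Read `record_count` synthetic records through PyAirbyte."""
    # Imported inside the function so this sketch can be loaded even
    # where PyAirbyte is not installed.
    import airbyte as ab

    source = ab.get_source(
        "source-faker",
        config={"count": record_count},  # assumption: source-faker's record-count knob
        install_if_missing=True,
    )
    source.check()
    source.select_all_streams()
    # Reads into PyAirbyte's default local cache; time this call (or use
    # the printed performance log) to compare throughput across scales.
    return source.read()
```

You would then call `run_faker_sync(count)` once per entry in `SCALES` and compare the reported throughput between runs.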
@ramandatascientist - Thanks for raising this. The benchmark CLI command may be of help to you. I'm about to merge a PR to our auto-generated docs that adds a new docs page for the cli module.
Other things that should be helpful:
After each sync, the path to a performance log is printed. That file contains a JSONL line for every sync run, including records/second, bytes/second, and many other useful performance stats.
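For comparing runs, the JSONL log can be summarized with a few lines of stdlib Python; note the field names below (`records_per_second`, `bytes_per_second`) are assumptions, so check them against the keys actually present in your log file:

```python
# Sketch: summarize PyAirbyte's per-sync JSONL performance log.
# Field names are assumptions -- verify against your actual log entries.
import json


def summarize_perf_log(lines):
    """Yield (records/sec, bytes/sec) pairs from JSONL log lines."""
    for line in lines:
        entry = json.loads(line)
        yield entry.get("records_per_second"), entry.get("bytes_per_second")


# Usage with a made-up sample line (in practice, iterate over the log file):
sample = ['{"records_per_second": 1200.5, "bytes_per_second": 98304}']
stats = list(summarize_perf_log(sample))
print(stats)
```

In practice you would open the printed log path and pass the file object to `summarize_perf_log` directly.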
Hi @aaronsteers, thank you for your response. Is there a way to mock the datasets against Azure SQL? We run Airbyte against dev Azure SQL databases that have only a limited number of records, so I'm wondering if there is a way to fake/mock the datasets and run performance tests against them.