General Feedback #1
Hey! Since you have created some issue(s) in the Snowflake.Data project, I thought you might be interested in this project.
Hi, I like the idea of your project and am currently looking into it for a customer, as we agree with your view on the limitations of the default connector. However, I am having some trouble getting started: when I run the Snowflake.Client.Tests.IntegrationTests, they all fail due to an SSL connection error. The same happens when I try to use your client in a custom project (basically what is suggested in the README). Is this something Snowflake has started enforcing recently that the client doesn't handle yet? Or do I have to configure SSL support myself? I am running the code in Visual Studio 2019.
Hi, @FSohns93! Thanks for your interest in this project! There were some issues with SSL in previous versions, but they should all be fixed now (see #7). Can you please create a new issue here with some details? I'll need the exact error message, your Snowflake URL, and the region you use. To make the integration tests work, you should populate testconfig.json with your connection info.
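For anyone else setting up the integration tests: a hypothetical sketch of what testconfig.json might contain. The field names here are assumptions mirroring the connection parameters mentioned above, not the file's confirmed schema — check the copy shipped in the test project. The comment is illustrative only and must be removed in real JSON:

```jsonc
// Assumed layout — field names guessed from the AuthInfo parameters; strip this comment in the real file.
{
  "user": "username",
  "password": "password",
  "account": "account",
  "region": "region"
}
```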
Thanks, @fixer-m, for coming back so fast. I will check this further and comment here.
@FSohns93, got it. Can you please create a new issue with this info?
Works great -- running into Snowflake wanting to chunk even small data sets (~1,000 rows), which has created some issues. But in general, super easy to use.
Can you add support for key files instead of username/password?
@Ekus Sorry it took me so long to answer. That's possible, yes, but unfortunately right now I don't have much time for this project. Currently I'm working on a chunks downloader, which will allow the client to consume big responses from SF.
@joshbartelsudig Hey! I've finally released the chunks downloader, i.e. the client can now handle big data responses. Check out the new version on NuGet.
Feel free to delete this if it's the wrong thread to ask, but is it possible to use this library as a provider for Entity Framework?
@lawrence-vo, no. If you want to use Snowflake with EF, you actually need the other package: Snowflake.Data, the ADO.NET connector provided by Snowflake. However, I think that using EF with Snowflake or any other OLAP warehouse might be overkill.
Hey mate, great library, and we are using it. May I request adding support for passing the warehouse name during client initialization? We have a default warehouse that is used when none is passed, and our IT department would like us to use our specifically assigned warehouse to run any queries. Thanks!
@sourabhtewari Hi! This should already be working:

```csharp
var snowflakeClient = new SnowflakeClient(
    new AuthInfo
    {
        User = "username",
        Password = "password",
        Account = "account",
        Region = "region"
    },
    new SessionInfo
    {
        Warehouse = "warehouse" // <= set your warehouse name
    });
```
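As a follow-up, a minimal usage sketch once the client above is constructed. QueryAsync&lt;T&gt; is the query method this library's README describes; the Employee type and table name are placeholder assumptions about your schema:

```csharp
// Runs on the warehouse configured in SessionInfo above.
var employees = await snowflakeClient.QueryAsync<Employee>(
    "SELECT * FROM MASTER.PUBLIC.EMPLOYEES;");

// Placeholder model — property names are assumptions about your schema.
public class Employee
{
    public int Id { get; set; }
    public string Title { get; set; }
}
```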
EDIT: I just realized that what I was reading is the v2 Snowflake API, and this library seems to use the v1 API. I have no idea if the below still applies... please let me know if it doesn't.

EDIT 2: I've since resolved my issue. At the end of the day, it was caused by Dapper's default behavior, which buffers the entire response before returning. Changing that single setting resulted in negligible memory utilization when working with the same queries.

@fixer-m I'm curious to know how exactly you handle partitioned results in this library. We have a scenario here where we are reading a 50-million-record set, and this is allocating insane amounts of memory using the standard connector + Dapper (over 60 GB of RAM). I'm currently investigating ways to reduce that load to "reasonable" levels and was looking at potentially switching to your library.

At the same time, I was taking a look at the Snowflake REST API documentation and saw what they do with larger sets using partitions, where you make multiple calls to fetch the different partitions. This "partitioned fetching" would work perfectly for our use case, as we would allocate just enough memory for each partition, write it to the target file, and then fetch the next partition. Sure, this would mean a ton of requests, but at least it would be a more scalable solution that doesn't create insane memory spikes for us.

I don't want to introduce manual paging to our procedure if I can leverage this built-in partitioning mechanism, but apparently the standard connector doesn't surface this in any way, so we end up loading the entire dataset in memory before we start writing it into our .csv file. At the same time, their REST API is fairly convoluted, so I'd also like to avoid consuming it directly... How does your library deal with this? An ideal implementation in my mind would be to expose an …
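For reference, a minimal sketch of the Dapper setting EDIT 2 most likely refers to, assuming the official Snowflake.Data ADO.NET connector. The connection string, table, and row type are illustrative placeholders, not values from this thread:

```csharp
using System.IO;
using Dapper;
using Snowflake.Data.Client;

using var connection = new SnowflakeDbConnection();
connection.ConnectionString = "account=myaccount;user=myuser;password=mypassword"; // illustrative values
connection.Open();

// buffered: false makes Dapper yield rows as the reader advances,
// instead of materializing the whole result set in memory first.
var rows = connection.Query<CsvRow>(
    "SELECT ID, NAME FROM BIG_TABLE",
    buffered: false);

using var writer = new StreamWriter("output.csv");
foreach (var row in rows) // rows stream through; memory use stays roughly flat
    writer.WriteLine($"{row.Id},{row.Name}");

// Placeholder row shape — substitute your actual columns.
public class CsvRow
{
    public long Id { get; set; }
    public string Name { get; set; }
}
```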
Is there a way to pass and validate the session token with the SnowflakeClient, and to renew the session when it expires?
Hi everyone!

In this thread you can leave your feedback on this project.

Since this is a new project, I'm looking for feedback on whether it's useful or not, considering that there is an official .NET connector. What features do you think this client should have to actually compete with the official connector? How is Snowflake generally used in .NET applications?

I would also appreciate bug reports and feature requests; please create a new, separate issue for those.

Thank you!