Inability to cast int to string when appending data to table using load_table_from_json #906
Comments
@Nathan-Nesbitt What's the table schema? Specifically, what's the type of the "id" column? Is the error also reproducible if sending row = {'id': 781263812376123, ...} # note: no quotes |
@plamut the ID column has type integer, which makes this even stranger. I will set up a test to see if passing the ID in as an int causes issues! |
I just tested this column by converting all of the input into ints; unfortunately, there is no difference in the error. |
@Nathan-Nesbitt OK, thanks for checking that, I will have a closer look after we successfully rename the default branch. |
@Nathan-Nesbitt Unfortunately, I was not able to reproduce the issue with Python 3.9. BTW, is the code example accurate? Asking because the code as posted is not runnable. Does the error also occur with the latest BigQuery version? (2.25.1 at the time of writing) |
Sorry about going silent, I was redirected onto another portion of the project for the last week. I had to trim the code down to a minimal chunk and accidentally deleted the kwarg name. This is what I have in my function right now:

```python
row = {'id': '781263812376123', ...}  # This is a more complex object that I can't share unfortunately, but it fits the table schema
table = client.get_table(table)
job_config = LoadJobConfig(schema=table.schema, write_disposition=WriteDisposition.WRITE_APPEND)
job = client.load_table_from_json([row], destination=table, job_config=job_config)
```

I will try out the new version and try to narrow down the issue to be sure there isn't anything else that could be going on. Thanks again! |
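One possible workaround, in case the load job is indeed failing to cast string values into an INTEGER column: coerce each row's values to match the schema types before handing the rows to `load_table_from_json`. The sketch below is hypothetical and not from the thread; it represents the schema as simple `(name, field_type)` pairs, whereas with the real client you would read the names and types from `table.schema` (`SchemaField.name` / `SchemaField.field_type`).

```python
def coerce_row(row, schema):
    """Cast row values to the Python types implied by the BigQuery schema.

    `schema` is a list of (field_name, field_type) pairs, e.g.
    [("id", "INTEGER"), ("name", "STRING")].
    """
    casts = {
        "INTEGER": int, "INT64": int,
        "FLOAT": float, "FLOAT64": float,
        "STRING": str,
    }
    coerced = {}
    for name, field_type in schema:
        value = row.get(name)
        cast = casts.get(field_type)
        # Leave None (NULL) and unmapped types untouched.
        coerced[name] = cast(value) if cast is not None and value is not None else value
    return coerced

schema = [("id", "INTEGER"), ("name", "STRING")]
row = {"id": "781263812376123", "name": "example"}
print(coerce_row(row, schema))  # {'id': 781263812376123, 'name': 'example'}
```

With the real client you would then call `client.load_table_from_json([coerce_row(r, pairs) for r in rows], destination=table, job_config=job_config)`.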
@Nathan-Nesbitt Any luck with finding the cause of the problem, or at least with narrowing down the pre-conditions to consistently reproduce the problem locally? Thanks! |
I'm closing this due to inactivity, but if there's any new information and the issue has not been resolved by upgrading, feel free to re-open it. |
Issue
I can append this data using the stream functionality, but with a job it fails to convert properly. Is this an error on my part or a bug?
Environment details
google-cloud-bigquery version: 1.24.0
Steps to reproduce
Code example
Stack trace
Which leads to