incorrect JSON generated for arrays passed in for columns of type json #374
Hey Seth. Unfortunately no, we don't have the input column data types available, and keeping them available is beyond the scope of node-postgres. I think an ORM or some DAL would be a good place to put that info. There's really no way to fix inbound type conversion other than manually converting your data to simple types in userland code. I've tried to keep the inbound parameter conversions as simple as possible; the more smarts I've added to outbound parameter conversions, the more problems it's caused. So the best 'fix' here is: if you're inserting non-primitive data into Postgres, you'll need to turn it into some form Postgres will understand when it receives it as a parameter.

That being said, there could be a bug in this particular circumstance, and I'm not 100% sure from your description what the issue is. Could you include a small code sample I could look at, to make sure this is something to be pushed out into user code and not an actual bug in the implementation?
Sure, here is a code sample:
This results in this output:
The key thing here is that the JavaScript value passed in happens to be a JavaScript array, not a plain object. This is legal per the JSON spec and works fine at the Postgres level. The issue is that pg assumes JSON data at the top level is an object. One workaround (which I've taken in my code) is to wrap the array in a top-level object.
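For illustration, a minimal sketch of the kind of repro being discussed, assuming a table named `test` with a single `json` column named `data` (the table, connection string, and error text are assumptions, not the original sample):

```js
var pg = require('pg');

// Connection string is a placeholder.
var client = new pg.Client('postgres://localhost/test_db');

client.connect(function (err) {
  if (err) throw err;

  // Passing a JS array as the parameter for a json column:
  // pg serializes it as a Postgres array literal, {"foo","bar","baz"},
  // instead of the JSON array ["foo","bar","baz"].
  client.query('INSERT INTO test (data) VALUES ($1)', [['foo', 'bar', 'baz']], function (err) {
    if (err) console.error(err); // e.g. "invalid input syntax for type json"
    client.end();
  });
});
```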
Yeah, I see the issue now. It's because of the inbound parameter conversion here: https://github.com/brianc/node-postgres/blob/master/lib/utils.js#L55. Arrays are checked and turned into "postgres arrays" before the default object conversion (JSON.stringify) ever runs.

Few options... one thing is to punch over that function like so:

```js
var pg = require('pg');

pg.utils.prepareValue = function(val) {
  if(val instanceof Date) {
    // dateToString is internal to lib/utils.js; substitute your own
    // date formatting (e.g. val.toISOString()) if you copy this out.
    return dateToString(val);
  }
  if(typeof val === 'undefined') {
    return null;
  }
  if(!val || typeof val !== 'object') {
    return val === null ? null : val.toString();
  }
  // No array branch: arrays now fall through to JSON.stringify like any other object.
  return JSON.stringify(val);
};
```

Basically, remove the array converter. Another option would be to do something like this in your code:

```js
client.query('INSERT INTO test (data) VALUES ($1::json)', [JSON.stringify(myArray)], function(err) {
  // NOTE this is where things go sideways
});
```

There's really no way to be beautifully clean about it, since we don't know the inbound parameter types; we just make very broad assumptions about them.
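(For illustration: with the prepareValue override above in place, the array can be passed directly and falls through to JSON.stringify; a short sketch reusing the hypothetical `test` table from earlier.)

```js
// With the override, the json column receives ["foo","bar","baz"] as intended.
client.query('INSERT INTO test (data) VALUES ($1)', [['foo', 'bar', 'baz']], function (err) {
  if (err) console.error(err);
});
```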
Agreed, no great solution here. Your manual stringification of the inbound parameter is probably the cleanest workaround. Thanks for looking at this.
I guess we'll close this.
So, is a datatype of an array of json not supported? I can't find the answer to this in the documentation.
If you can show me a bit of what exactly you're trying to achieve, I should be able to answer better. Some inbound parameter conversion stuff got quite a bit better in a recent release.
To repro:
For example, if my input array is:
["foo", "bar", "baz"]
The generated json string from pg is:
{"foo", "bar", "baz"}
In other words, the array gets wrapped with curly braces not square brackets.
You can see where this is happening in lib/utils.js. The issue is that we are trying to treat the array as a postgres array input, even though this should be treated as a json input because the database column is of type json.
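(Roughly speaking, the array branch there builds a Postgres array literal out of the elements. A simplified sketch of that behavior, not the actual lib/utils.js code, which also handles nesting and escaping:)

```js
// Simplified illustration of why ["foo","bar","baz"] comes out as {"foo","bar","baz"}:
// the elements are quoted and joined, then wrapped in curly braces
// (Postgres array literal syntax) rather than square brackets (JSON).
function toPostgresArrayLiteral(arr) {
  return '{' + arr.map(function (el) {
    return JSON.stringify(el);
  }).join(',') + '}';
}

console.log(toPostgresArrayLiteral(['foo', 'bar', 'baz'])); // {"foo","bar","baz"}
```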
I'd be happy to do a PR to fix this, but I'm not sure of the right strategy here: it seems like to fix this problem we need to know the column datatype, not just the input datatype? Do we have that context available?
Thanks,
Seth