Introduce converters to consume Postgres Json and convert these to String/byte[] #453
Comments
OK, digging into this a bit more, it appears that the issue is that the `Json` value holds a reference-counted Netty buffer. Slightly weird behaviour, I guess, because if you don't read the value then it will leak. Maybe this is intentional?
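To make that concrete, a minimal hedged sketch (mine, not from the original report) of how the driver's `Json` value behaves:

```kotlin
import io.r2dbc.postgresql.codec.Json

// Hedged sketch: the pooled buffer behind a Json value is only released once
// the value is actually consumed, e.g. via asString() or asArray().
fun readOnce(json: Json): String = json.asString()

// If a Json value is fetched from a row but never consumed, the buffer is
// never released and Netty's leak detector reports it.
```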
One thing I should point out is that even if I change my model to:
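Presumably something like this hedged sketch (hypothetical names, mirroring the entity from the original report with the `Json` field swapped for a `String`):

```kotlin
import org.springframework.data.annotation.Id
import org.springframework.data.relational.core.mapping.Table

// Hypothetical reconstruction: the driver's Json field replaced by a String.
@Table("session")
data class Session(
    @Id val id: Long? = null,
    val data: String // previously io.r2dbc.postgresql.codec.Json
)
```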
... the problem still occurs, presumably because the R2DBC driver is still doing the conversion to `Json` under the hood.
Thanks for bringing up that issue. As per the Javadoc of `Json`, the value must be consumed to release the buffer backing it. Looking at how we map the `jsonb` data type, I see two things to do, of which we can address one directly: ship converters here so that `Json` can be consumed as `String`/`byte[]`, and ask the driver for a way to avoid handing out reference-counted values in the first place.
As a temporary workaround (in case this helps others), I made a new type with custom converters which always read the data exactly once into a GC-collected string:
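A hedged reconstruction of what such a workaround could look like (the wrapper and converter names here are illustrative, not the commenter's actual code):

```kotlin
import io.r2dbc.postgresql.codec.Json
import org.springframework.core.convert.converter.Converter
import org.springframework.data.convert.ReadingConverter
import org.springframework.data.convert.WritingConverter

// GC-managed wrapper so the converters apply only to JSON properties,
// not to every String in the model.
data class JsonString(val value: String)

@ReadingConverter
object JsonToJsonStringConverter : Converter<Json, JsonString> {
    // asString() drains the driver's Json exactly once, releasing the
    // underlying buffer and leaving a plain heap String behind.
    override fun convert(source: Json): JsonString = JsonString(source.asString())
}

@WritingConverter
object JsonStringToJsonConverter : Converter<JsonString, Json> {
    override fun convert(source: JsonString): Json = Json.of(source.value)
}
```

Assuming a Java/Kotlin config class, these would be registered by overriding `getCustomConverters()` on `AbstractR2dbcConfiguration`.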
Thanks for your reply @mp911de. For the record, the super confusing thing is that even if I use a `String` field in my model, the leak still occurs because a `Json` value is materialised under the hood.
Can you file a ticket in https://github.com/pgjdbc/r2dbc-postgresql asking for an optimization switch for JSON? Probably the same would make sense for …
Introduce converters to consume Postgres Json and convert these to String/byte[]. We now ship converters for Postgres' Json data type to map JSON to String and to byte[] for easier and safe consumption.
Spring Data converters are now in place.
@mp911de Sorry, was this targeted at me? Do you still need this?
I just created pgjdbc/r2dbc-postgresql#330. So everything is done here.
@mp911de sorry for commenting on a closed issue, but I am trying to get Postgres `jsonb` columns to work with plain `String` properties on the write side. This fails on insertion with an error.
Looking at commit 4506fb6, it seems that the converters are only implemented on the reading side, not on the writing side, so this is kind of expected. Is my analysis correct? Should I open another issue on this topic?
Your analysis is correct. The main concern is that writing JSON requires JSON-awareness, as Postgres must know that it's a JSON value being written. While under the hood JSON is represented as a UTF-8 string, the value OID (wire value type) is set to JSON. Alternatively, Postgres casting (e.g. `$1::jsonb` in the statement) can coerce a text parameter into JSON. Converting every string to JSON isn't an option. So the only thing you can do for now is use the driver's `Json` type directly, wrapping values via `Json.of(…)`.
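Concretely, a hedged sketch of that write-side usage (entity and field names are illustrative):

```kotlin
import io.r2dbc.postgresql.codec.Json
import org.springframework.data.annotation.Id
import org.springframework.data.relational.core.mapping.Table

// Keep the property typed as the driver's Json so the JSON wire type (OID)
// is sent, which Postgres accepts for a jsonb column.
@Table("session")
data class JsonSession(
    @Id val id: Long? = null,
    val data: Json
)

// Json.of(...) wraps a JSON document for writing.
val row = JsonSession(data = Json.of("""{"theme":"dark"}"""))
```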
@mp911de thanks for the quick reply. Indeed, I tried to add a String -> Json converter on the writing side and promptly broke every other `String` column.
Another alternative would be to introduce a dedicated wrapper type for JSON values. There's a Spring Data ticket to introduce property-based converters – right now, we have only type-based converters. That could be a future hook for these customizations. Taking a step back: for now, forcing a String to be written as JSON is highly dialect-specific. I'm not even sure other databases have similar behavior. So introducing an annotation or the like leaves us with the question of what problem we want to solve and how we want to go about it.
@mp911de I see what you're getting at...
I couldn't find the issue for property-based converters – did you mean a generic Spring Data ticket or a spring-data-r2dbc one? Anyway, fair point about forcing all Strings to be written to a one-size-fits-all / db-specific column type. I guess I was imagining some kind of hint alongside the property declaration. Anyway, in short I now have: …
Thanks again for your thoughts.
I think I have found a potential memory leak using Spring Data + R2DBC Postgres.
I have a model and repository like this (Kotlin code):
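Presumably along these lines (a hedged sketch; names and fields are illustrative, not the reporter's actual code):

```kotlin
import io.r2dbc.postgresql.codec.Json
import org.springframework.data.annotation.Id
import org.springframework.data.relational.core.mapping.Table
import org.springframework.data.repository.kotlin.CoroutineCrudRepository

@Table("session")
data class Session(
    @Id val id: Long? = null,
    val data: Json // backed by a jsonb column
)

interface SessionRepository : CoroutineCrudRepository<Session, Long>
```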
The SQL to create this table is:
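Presumably something like (a guess matching the sketch above):

```sql
CREATE TABLE session (
    id   BIGSERIAL PRIMARY KEY,
    data JSONB NOT NULL
);
```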
If I do a query like this:
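For instance, a hedged sketch of such a query against the hypothetical repository above:

```kotlin
import kotlinx.coroutines.flow.toList

// Each returned row materialises a Json value; if its buffer is never
// consumed, Netty's leak detector eventually complains.
suspend fun loadAll(repository: SessionRepository): List<Session> =
    repository.findAll().toList()
```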
... then I get a bunch of Netty memory leak errors (https://netty.io/wiki/reference-counted-objects.html). The culprit seems to be this one:
Apologies if this is actually an issue in the core R2DBC library. I'm happy to re-raise this there if that is the case.
Thanks!