ERROR: index row size 1648 exceeds maximum 1336 for index "..." #9
New problem... Trying to create the rum index:

```sql
create index some_index on some_table using rum (some_column rum_tsvector_ops);
```

Here is the text of the error:

```
ERROR: index row size 1648 exceeds maximum 1336 for index "..."
```
An index key can't be larger than 1336 bytes; this is a limitation of the current version. The same is true for GIN (though it may have a different limit). It seems there is some big string in your table. You can check it with this query:

```sql
select word, char_length(word)
from ts_stat('select some_column from some_table')
order by char_length(word) desc
limit 10;
```
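Since the limit is measured in bytes rather than characters, a variant of the same query using octet_length() can be more telling for multibyte text (this variant is an addition for illustration, not part of the original reply):

```sql
-- Lexeme size in bytes, which is what the index key limit actually counts.
select word, octet_length(word)
from ts_stat('select some_column from some_table')
order by octet_length(word) desc
limit 10;
```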
Yes, I have HUGE strings in this table (and not only this one), a column with their tsvector, and a GIN index on it.
We are working on this issue. As far as I know, there are huge URLs in your strings. Do you search by these URLs? Are they important to you? If not, the fix is easy: just don't store them in the index (I can explain how). If they are important, we can fix RUM to cut the URLs off. I think we need that anyway, since other users may hit a similar issue.
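As an illustration of the "don't store it in the index" option (a sketch under assumptions, not the maintainers' exact recipe: drop_long_lexemes is a hypothetical helper, and array_to_tsvector discards positions and weights), over-long lexemes can be filtered out of the tsvector before it reaches the index:

```sql
-- Hypothetical helper: keep only lexemes below a byte-length cutoff.
-- unnest(tsvector) and array_to_tsvector exist since PostgreSQL 9.6.
create function drop_long_lexemes(v tsvector, max_len int)
returns tsvector
language sql immutable as $$
  select coalesce(array_to_tsvector(array_agg(lexeme)), ''::tsvector)
  from unnest(v)
  where octet_length(lexeme) < max_len
$$;

-- Index the filtered expression; queries must use the same expression
-- for the planner to pick this index.
create index some_index on some_table
  using rum (drop_long_lexemes(some_column, 1336) rum_tsvector_ops);
```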
Glad to hear that you are working on this issue. We are not using these URLs at the moment, but they may be used in the near future. So it would be great if RUM were fixed with that in mind.
We found a solution: we can add a new OPERATOR CLASS that stores a hash of the lexemes, so you can store huge strings. But with this OPERATOR CLASS we can't use prefix (or partial) matching. Do you use prefix matching?
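For reference, later RUM releases ship an operator class along these lines, rum_tsvector_hash_ops, which indexes lexeme hashes and therefore cannot support prefix search. Assuming that operator class, usage would look like:

```sql
-- Hash-based keys are fixed-size, so huge lexemes fit in the index,
-- at the cost of losing prefix (partial) matching.
create index some_index on some_table
  using rum (some_column rum_tsvector_hash_ops);
```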
Can you trim the URL and raise some notice like:
I think we can trim the URL as well. We will fix the limits for posting trees, and they will become the same as in GIN.
Sounds great, thank you.
@To4e, can you please check the issue_9_max_item_size branch?
@select-artur, still having the same problem:
Please try commit 58fee28.
Well, I installed it, the tests are OK, I rebuilt the extension and started creating the index, and... after about 3 hours of total suffering, the machine hosting the database ended with a self-reboot:
Could you share some more details about the dataset and the machine? Table size, row count, PostgreSQL settings, amount of RAM, processor, disk type and space, etc.
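(Most of these details can be collected from within PostgreSQL itself; a sketch, using the thread's placeholder table name some_table:)

```sql
-- Table size and row count.
select pg_size_pretty(pg_total_relation_size('some_table')) as total_size,
       (select count(*) from some_table)                    as row_count;

-- Settings most relevant to an index build.
select name, setting, unit
from pg_settings
where name in ('shared_buffers', 'work_mem', 'maintenance_work_mem');
```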
Here you are:
And this is the test server.
Is there any chance we could access the test server? Or could you share the dataset, or some kind of anonymized dataset where the issue still occurs?
Sorry, but no, and no. This information is somewhat confidential.
But could you try to generate some random data that reproduces the same issue?
That's not a good option either, because random data would produce different sizes for the vector, the tokens, and so on.
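That said, the original key-size error (though not necessarily the later crash) can be reproduced without realistic data, since a single over-long lexeme is enough. A minimal sketch with made-up table and index names:

```sql
-- One lexeme longer than the 1336-byte key limit (but under to_tsvector's
-- own ~2047-character word cutoff) should fail the index build.
create table repro (doc tsvector);
insert into repro
  values (to_tsvector('simple', 'short words ' || repeat('x', 2000)));
create index repro_idx on repro using rum (doc rum_tsvector_ops);
-- expected: ERROR: index row size ... exceeds maximum ... for index "repro_idx"
```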