Incorrect handling of DATE_ADD function while transpiling to Spark #4939
Hi @sidhant-sundial, the section you linked is irrelevant to this issue. This transpilation issue happens because the parser does not fully capture Presto's …
@sidhant-sundial since this requires (a) type information to fix completely and (b) quite a bit of work to implement and ensure we don't break anything, combined with the fact that it's low priority for the core team, I'm going to close this as not planned for now. We welcome well-tested PRs if you want to take a stab at it.
Can you point me to a code entry point I can start from? I can take a stab after that. Just the file name/function name will help. Thanks!
I would start at:

```python
"DATE_ADD": lambda args: exp.DateAdd(
    this=seq_get(args, 2), expression=seq_get(args, 1), unit=seq_get(args, 0)
),
```

The issue is that …
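As a sketch of what that mapping does (using a minimal stand-in for sqlglot's `seq_get` helper, which lives in `sqlglot.helper`; the sample argument values here are made up for illustration):

```python
def seq_get(seq, index):
    """Minimal stand-in for sqlglot.helper.seq_get: indexed access
    that returns None instead of raising IndexError."""
    try:
        return seq[index]
    except IndexError:
        return None

# Presto's three-argument form: DATE_ADD(unit, value, expr)
args = ["'hour'", "2", "ts_col"]

# The lambda above reorders the positional args into named slots:
mapped = {
    "this": seq_get(args, 2),        # the date/timestamp expression
    "expression": seq_get(args, 1),  # the amount to add
    "unit": seq_get(args, 0),        # the unit string
}
# mapped == {"this": "ts_col", "expression": "2", "unit": "'hour'"}
```

Note that the unit arrives as a plain positional argument, so no type information about the third argument (DATE vs. TIMESTAMP) survives into the `exp.DateAdd` node.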
Read Dialect:
presto
Write Dialect:
Spark
This should return a result of type DATE. However, when running the above date_add with three arguments, Spark uses the timestampadd function internally and outputs the result as a TIMESTAMP. I checked the sqlglot code, and it seems an explicit condition has been added to handle only the DAY unit:

sqlglot/sqlglot/dialects/spark.py, line 70 at commit 171eccb

This looks incorrect. Can you please explain why this check was added? It could be that we are missing something; otherwise, we should remove this condition from sqlglot.
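To illustrate the type mismatch being described, here is a plain-Python sketch (not sqlglot code; the function names are hypothetical): Presto's date_add preserves the input type, so adding days to a DATE yields a DATE, whereas a timestampadd-style rewrite promotes the input to a timestamp first and therefore always yields a timestamp.

```python
from datetime import date, datetime, timedelta

def presto_style_date_add(unit, value, d):
    """Sketch of Presto semantics: the result has the same type as d."""
    if unit == "day":
        return d + timedelta(days=value)
    raise NotImplementedError(unit)

def timestampadd_style(unit, value, d):
    """Sketch of a timestampadd-style rewrite: the input is promoted
    to a timestamp, so the result is always a timestamp."""
    ts = d if isinstance(d, datetime) else datetime(d.year, d.month, d.day)
    if unit == "day":
        return ts + timedelta(days=value)
    raise NotImplementedError(unit)

d = date(2024, 1, 1)
print(type(presto_style_date_add("day", 1, d)))  # <class 'datetime.date'>
print(type(timestampadd_style("day", 1, d)))     # <class 'datetime.datetime'>
```

Both calls compute the same calendar day, but the second one changes the result's type, which is the behavior difference at issue here.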
Apache Spark uses timestamp_add internally to compute date_add with three arguments.
The change has been in place since at least 2022, judging by the base grammar: apache/spark@6df10ce