test: fix integer compliance tests #20
Changes from 8 commits
```diff
@@ -16,18 +16,20 @@
 import pytest

-from sqlalchemy.testing import config
+from sqlalchemy.testing import config, db
 from sqlalchemy.testing import eq_
 from sqlalchemy.testing import provide_metadata
 from sqlalchemy.testing.schema import Column
 from sqlalchemy.testing.schema import Table
-from sqlalchemy import literal_column
-from sqlalchemy import select, case, bindparam
+from sqlalchemy import bindparam, case, literal, select, util
 from sqlalchemy import exists
 from sqlalchemy import Boolean
 from sqlalchemy import String
-from sqlalchemy.testing import requires
 from sqlalchemy.types import Integer
+from sqlalchemy.testing import requires

 from google.api_core.datetime_helpers import DatetimeWithNanoseconds

 from sqlalchemy.testing.suite.test_ddl import *  # noqa: F401, F403
```
```diff
@@ -45,6 +47,8 @@
 from sqlalchemy.testing.suite.test_select import ExistsTest as _ExistsTest
 from sqlalchemy.testing.suite.test_types import BooleanTest as _BooleanTest

+from sqlalchemy.testing.suite.test_types import IntegerTest as _IntegerTest
+
 from sqlalchemy.testing.suite.test_types import (  # noqa: F401, F403
     DateTest as _DateTest,
```
```diff
@@ -420,3 +424,89 @@ class TimeTests(_TimeMicrosecondsTest, _TimeTest):
 @pytest.mark.skip("Spanner doesn't coerce dates from datetime.")
 class DateTimeCoercedToDateTimeTest(_DateTimeCoercedToDateTimeTest):
     pass
+
+
+class IntegerTest(_IntegerTest):
+    @provide_metadata
+    def _round_trip(self, datatype, data):
+        """
+        SPANNER OVERRIDE:
+
+        This is the helper method for the integer class tests; it creates
+        a table and performs an insert operation.
+        Cloud Spanner supports tables with an empty primary key, but only
+        one row can be inserted into such a table; following insertions
+        fail with `400 id must not be NULL in table date_table`.
+        Override the tests to add a manual primary key value, avoiding
+        these failures, and delete the inserted rows at the end.
+        """
+        metadata = self.metadata
+        int_table = Table(
+            "integer_table",
+            metadata,
+            Column("id", Integer, primary_key=True, test_needs_autoincrement=True),
+            Column("integer_data", datatype),
+        )
+
+        metadata.create_all(config.db)
+
+        config.db.execute(int_table.insert(), {"id": 1, "integer_data": data})
+
+        row = config.db.execute(select([int_table.c.integer_data])).first()
+
+        eq_(row, (data,))
+
+        if util.py3k:
+            assert isinstance(row[0], int)
+        else:
+            assert isinstance(row[0], (long, int))  # noqa
+
+        config.db.execute(int_table.delete())
```
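The round-trip pattern this override uses (create a table with an explicit primary key, insert with a manual `id`, select the value back, then delete the rows rather than the table) is not Spanner-specific. As a sanity check, here is a minimal standalone sketch of the same flow against an in-memory SQLite database; the table and column names mirror the test but are otherwise illustrative:

```python
import sqlite3

# Mirror of the override's flow: explicit PK, manual id, round trip, cleanup.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE integer_table (id INTEGER PRIMARY KEY, integer_data INTEGER)"
)

# Insert with a manual primary key value, as the Spanner override does.
conn.execute("INSERT INTO integer_table (id, integer_data) VALUES (?, ?)", (1, 42))

row = conn.execute("SELECT integer_data FROM integer_table").fetchone()
assert row == (42,)              # eq_(row, (data,)) in the test
assert isinstance(row[0], int)   # the py3k branch of the test

# Clean up the rows (DELETE, not DROP), matching int_table.delete() above.
conn.execute("DELETE FROM integer_table")
assert conn.execute("SELECT COUNT(*) FROM integer_table").fetchone() == (0,)
```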
```diff
+    @provide_metadata
+    def _literal_round_trip(self, type_, input_, output, filter_=None):
+        """
+        TODO: Remove this override method once the DDL issue is resolved
+        in spanner_dbapi.
+
+        SPANNER OVERRIDE:
+
+        Spanner is not able to clean up data and drop the table correctly:
+        the table still exists after the related tests have finished, so no
+        new table is created, and when the tests for the next data type
+        start, the following insertions fail with
+        `400 Duplicate name in schema: t`.
+        Override the test to drop the table manually before creating a new
+        one, avoiding these failures.
+        """
+        # for literal, we test the literal render in an INSERT
+        # into a typed column. we can then SELECT it back as its
+        # official type; ideally we'd be able to use CAST here
+        # but MySQL in particular can't CAST fully
+        t = Table("int_t", self.metadata, Column("x", type_))
+        t.drop(checkfirst=True)
+        t.create()
+
+        with db.connect() as conn:
+            for value in input_:
+                ins = (
+                    t.insert()
+                    .values(x=literal(value))
+                    .compile(
+                        dialect=db.dialect, compile_kwargs=dict(literal_binds=True),
+                    )
+                )
+                conn.execute(ins)
+
+                if self.supports_whereclause:
+                    stmt = t.select().where(t.c.x == literal(value))
+                else:
+                    stmt = t.select()
+
+                stmt = stmt.compile(
+                    dialect=db.dialect, compile_kwargs=dict(literal_binds=True),
+                )
+                for row in conn.execute(stmt):
+                    value = row[0]
+                    if filter_ is not None:
+                        value = filter_(value)
+                    assert value in output
```

Review comments on `_literal_round_trip`:

**Comment:** Do you know why Spanner is not cleaning up the data and dropping the table correctly? Could #28 potentially resolve this? This override reason doesn't seem tied to a feature that Spanner doesn't support, such as temporary tables or multiple rows in empty primary key tables, so I think this may be a bug rather than something we should override.

**Comment:** Locally, I can get this test to pass without the override by changing the [...]. Please add a TODO to remove this override.

**Comment:** Ok, I implemented both of the PRs #28 and googleapis/python-spanner#277 locally, but it is still failing for me.

**Comment:** The required change is shown below. The docs here and some other places do seem to lean towards batching the DDL statements, since they might be quite time consuming; that was probably the original idea behind this implementation.

```diff
--- a/google/cloud/spanner_dbapi/cursor.py
+++ b/google/cloud/spanner_dbapi/cursor.py
         try:
             classification = parse_utils.classify_stmt(sql)
             if classification == parse_utils.STMT_DDL:
                 self.connection._ddl_statements.append(sql)
+                if self.connection.autocommit:
+                    self.connection.run_prior_DDL_statements()
                 return
             # For every other operation, we've got to ensure that
             # any prior DDL statements were run.
             # self._run_prior_DDL_statements()
             self.connection.run_prior_DDL_statements()
             if not self.connection.autocommit:
                 if classification == parse_utils.STMT_UPDATING:
                     sql = parse_utils.ensure_where_clause(sql)
```

@larkee WDYT?

**Comment:** @larkee Yes, I was talking about [...]

**Comment:** Correct. Please remove it and update the override explanation to mention that DDL statements aren't executed until a non-DDL statement is executed 👍

**Comment:** @larkee This still doesn't work for me with the changes recommended by alex; am I missing something here?

```python
def _literal_round_trip(self, type_, input_, output, filter_=None):
    t = Table("int_t", self.metadata, Column("x", type_))
    t.create()
    with db.connect() as conn:
        for value in input_:
            ins = (
                t.insert()
                .values(x=literal(value))
                .compile(
                    dialect=db.dialect, compile_kwargs=dict(literal_binds=True),
                )
            )
            conn.execute(ins)  # --> throws an error
            conn.execute("SELECT 1")  # didn't execute

            if self.supports_whereclause:
                stmt = t.select().where(t.c.x == literal(value))
            else:
                stmt = t.select()

            stmt = stmt.compile(
                dialect=db.dialect, compile_kwargs=dict(literal_binds=True),
            )
            for row in conn.execute(stmt):
                value = row[0]
                if filter_ is not None:
                    value = filter_(value)
                assert value in output
```

**Comment:** @larkee The test passes the very first time, when the table doesn't exist; after that it fails every time with an error [...]. A solution might be to execute a non-DDL statement after the [...]

**Comment:** This is what I would suggest, but I'm not sure where [...]

*larkee marked this conversation as resolved.*
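The behavior debated in this thread, DDL statements being queued by the connection and only flushed when a non-DDL statement runs, can be modeled with a toy mock. This is a sketch of the idea only; `MockConnection` and its methods are hypothetical stand-ins, not the real `spanner_dbapi` classes:

```python
class MockConnection:
    """Toy model of a connection that batches DDL, as discussed above."""

    def __init__(self, autocommit=False):
        self.autocommit = autocommit
        self._ddl_statements = []   # queued DDL, not yet executed
        self.executed = []          # statements actually run

    def run_prior_ddl_statements(self):
        # Flush the queued DDL, in order, before anything else runs.
        self.executed.extend(self._ddl_statements)
        self._ddl_statements = []

    def execute(self, sql):
        if sql.upper().startswith(("CREATE", "DROP", "ALTER")):
            self._ddl_statements.append(sql)
            # The fix proposed in the thread: flush immediately in autocommit.
            if self.autocommit:
                self.run_prior_ddl_statements()
            return
        # Any non-DDL statement first flushes pending DDL.
        self.run_prior_ddl_statements()
        self.executed.append(sql)


# Without the autocommit flush, CREATE TABLE is only queued...
conn = MockConnection(autocommit=False)
conn.execute("CREATE TABLE t (x INT64)")
assert conn.executed == []  # nothing has run yet

# ...and only executes once a non-DDL statement arrives.
conn.execute("INSERT INTO t (x) VALUES (1)")
assert conn.executed == ["CREATE TABLE t (x INT64)", "INSERT INTO t (x) VALUES (1)"]

# With the proposed autocommit fix, the DDL runs immediately.
conn2 = MockConnection(autocommit=True)
conn2.execute("CREATE TABLE t (x INT64)")
assert conn2.executed == ["CREATE TABLE t (x INT64)"]
```

This also illustrates why `conn.execute("SELECT 1")` was suggested as a workaround in the thread: any non-DDL statement forces the pending DDL batch to run.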