
test: fix integer compliance tests #20


Merged
merged 10 commits into from
Mar 19, 2021
3 changes: 3 additions & 0 deletions google/cloud/sqlalchemy_spanner/sqlalchemy_spanner.py
@@ -146,6 +146,9 @@ def visit_BOOLEAN(self, type_, **kw):
def visit_DATETIME(self, type_, **kw):
return "TIMESTAMP"

def visit_BIGINT(self, type_, **kw):
return "INT64"


class SpannerDialect(DefaultDialect):
"""Cloud Spanner dialect.
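The new `visit_BIGINT` hook above follows SQLAlchemy's type-compiler convention of dispatching on `visit_<TYPE>` method names to render a column type as dialect-specific DDL. As a rough illustration of that dispatch pattern, here is a minimal toy sketch (the `Toy*` classes are hypothetical stand-ins, not the real SQLAlchemy internals):

```python
class ToyTypeCompiler:
    """Toy stand-in for SQLAlchemy's type compiler: renders a SQL type
    name by dispatching to a visit_<TYPE> method, as the dialect above does."""

    def process(self, type_name, **kw):
        # e.g. "BIGINT" -> self.visit_BIGINT(**kw)
        return getattr(self, "visit_" + type_name)(**kw)


class ToySpannerTypeCompiler(ToyTypeCompiler):
    # Spanner has no BIGINT/DATETIME types; map them to its native types.
    def visit_DATETIME(self, **kw):
        return "TIMESTAMP"

    def visit_BIGINT(self, **kw):
        return "INT64"


compiler = ToySpannerTypeCompiler()
print(compiler.process("BIGINT"))    # INT64
print(compiler.process("DATETIME"))  # TIMESTAMP
```

With this override in place, a `BIGINT` column in a table definition is emitted as `INT64` in the generated `CREATE TABLE` statement.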
96 changes: 93 additions & 3 deletions test/test_suite.py
@@ -16,18 +16,20 @@

import pytest

from sqlalchemy.testing import config, db
from sqlalchemy.testing import eq_
from sqlalchemy.testing import provide_metadata
from sqlalchemy.testing.schema import Column
from sqlalchemy.testing.schema import Table

from sqlalchemy import bindparam, case, literal, select, util
from sqlalchemy import exists
from sqlalchemy import Boolean
from sqlalchemy import String
from sqlalchemy.types import Integer
from sqlalchemy.testing import requires

from google.api_core.datetime_helpers import DatetimeWithNanoseconds

from sqlalchemy.testing.suite.test_ddl import * # noqa: F401, F403
@@ -45,6 +47,8 @@
from sqlalchemy.testing.suite.test_select import ExistsTest as _ExistsTest
from sqlalchemy.testing.suite.test_types import BooleanTest as _BooleanTest

from sqlalchemy.testing.suite.test_types import IntegerTest as _IntegerTest


from sqlalchemy.testing.suite.test_types import ( # noqa: F401, F403
DateTest as _DateTest,
@@ -420,3 +424,89 @@ class TimeTests(_TimeMicrosecondsTest, _TimeTest):
@pytest.mark.skip("Spanner doesn't coerce dates from datetime.")
class DateTimeCoercedToDateTimeTest(_DateTimeCoercedToDateTimeTest):
pass


class IntegerTest(_IntegerTest):
    @provide_metadata
    def _round_trip(self, datatype, data):
        """
        SPANNER OVERRIDE:

        This is the helper method for the integer class tests; it creates a
        table and performs an insert operation.
        Cloud Spanner supports tables with an empty primary key, but only one
        row can be inserted into such a table - following insertions will fail
        with `400 id must not be NULL in table date_table`.
        Override the tests to add an explicit primary key value, avoiding
        those failures, and clear the table at the end.
        """
        metadata = self.metadata
        int_table = Table(
            "integer_table",
            metadata,
            Column("id", Integer, primary_key=True, test_needs_autoincrement=True),
            Column("integer_data", datatype),
        )

        metadata.create_all(config.db)

        config.db.execute(int_table.insert(), {"id": 1, "integer_data": data})

        row = config.db.execute(select([int_table.c.integer_data])).first()

        eq_(row, (data,))

        if util.py3k:
            assert isinstance(row[0], int)
        else:
            assert isinstance(row[0], (long, int))  # noqa

        config.db.execute(int_table.delete())

    @provide_metadata
    def _literal_round_trip(self, type_, input_, output, filter_=None):
        """
        TODO: Remove this override method once the DDL issue is resolved in
        spanner_dbapi.

        SPANNER OVERRIDE:

        Spanner is not able to clean up data and drop the table correctly;
Contributor:

Do you know why Spanner is not cleaning up the data and dropping the table correctly? Could #28 potentially resolve this?

This override reason doesn't seem tied to a feature that Spanner doesn't support, such as temporary tables or multiple rows in empty-primary-key tables, so I think this may be a bug rather than something we should override.

Contributor:

Locally, I can get this test to pass without the override by changing the db_api implementation to execute DDL statements immediately instead of queuing them until a non-DDL statement is executed.

Please add a TODO to remove this override

Contributor Author:

OK. I implemented both PRs locally (#28, googleapis/python-spanner#277), but it's still failing for me.

Contributor:

The required change is shown below:

--- a/google/cloud/spanner_dbapi/cursor.py
+++ b/google/cloud/spanner_dbapi/cursor.py
@@ -174,14 +174,9 @@ class Cursor(object):
         try:
             classification = parse_utils.classify_stmt(sql)
             if classification == parse_utils.STMT_DDL:
-                self.connection._ddl_statements.append(sql)
+                self.connection.database.update_ddl([sql]).result()
                 return
 
-            # For every other operation, we've got to ensure that
-            # any prior DDL statements were run.
-            # self._run_prior_DDL_statements()
-            self.connection.run_prior_DDL_statements()
-
             if not self.connection.autocommit:
                 if classification == parse_utils.STMT_UPDATING:
                     sql = parse_utils.ensure_where_clause(sql)

Contributor:

The docs here and in some other places do seem to lean towards batching the DDL statements, since they might be quite time-consuming. That was probably the original idea behind this implementation.
Perhaps we can offer a solution where we execute DDL statements immediately in autocommit=True mode and keep the current implementation in autocommit=False mode.
I think that will be intuitive from the user's perspective.

--- a/google/cloud/spanner_dbapi/cursor.py
+++ b/google/cloud/spanner_dbapi/cursor.py
         try:
             classification = parse_utils.classify_stmt(sql)
             if classification == parse_utils.STMT_DDL:
                 self.connection._ddl_statements.append(sql)
+                if self.connection.autocommit:
+                    self.connection.run_prior_DDL_statements()
                 return

             # For every other operation, we've got to ensure that
             # any prior DDL statements were run.
             # self._run_prior_DDL_statements()
             self.connection.run_prior_DDL_statements()

             if not self.connection.autocommit:
                 if classification == parse_utils.STMT_UPDATING:
                     sql = parse_utils.ensure_where_clause(sql)

@larkee WDYT?

Contributor Author:

@larkee Yes, I was talking about autocommit=False. I think we need to remove the TODO note then.

Contributor:

Correct. Please remove it and update the override explanation to mention that DDL statements aren't executed until a non-DDL statement is executed 👍

Contributor Author:

@larkee This still doesn't work for me with the changes recommended by alex. Am I missing something here?

def _literal_round_trip(self, type_, input_, output, filter_=None):
    t = Table("int_t", self.metadata, Column("x", type_))
    t.create()

    with db.connect() as conn:
        for value in input_:
            ins = (
                t.insert()
                .values(x=literal(value))
                .compile(
                    dialect=db.dialect, compile_kwargs=dict(literal_binds=True),
                )
            )
            conn.execute(ins)  # <-- throws an error
            conn.execute("SELECT 1")  # didn't execute
        if self.supports_whereclause:
            stmt = t.select().where(t.c.x == literal(value))
        else:
            stmt = t.select()

        stmt = stmt.compile(
            dialect=db.dialect, compile_kwargs=dict(literal_binds=True),
        )
        for row in conn.execute(stmt):
            value = row[0]
            if filter_ is not None:
                value = filter_(value)
            assert value in output

Contributor Author:

@larkee The test passes the very first time, when the table doesn't exist; after that it fails every time with the error `400 Duplicate name in schema: int_t`.
DROP TABLE is still not working properly after merging all the related branches (with the current master branches of spanner and spanner-sqlalchemy). I think the culprit is autocommit=False, because when I set autocommit=True in spanner_dbapi everything works fine and the table is deleted successfully after every test.

A solution might be to execute a non-DDL statement after the DROP TABLE, or to commit manually.
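To make the failure mode in this thread concrete, here is a toy sketch (the `ToyConnection` class is hypothetical, not the real spanner_dbapi) of a connection that queues DDL until the next non-DDL statement runs. This is why a `DROP TABLE` on its own leaves the table in place under autocommit=False, while autocommit=True (or a follow-up non-DDL statement) flushes it:

```python
class ToyConnection:
    """Toy model of a spanner_dbapi-style connection that queues DDL
    statements and only applies them before the next non-DDL statement."""

    def __init__(self, autocommit=False):
        self.autocommit = autocommit
        self._ddl_statements = []
        self.applied = []  # DDL that actually reached the database

    def execute(self, sql):
        if sql.upper().startswith(("CREATE ", "DROP ")):
            self._ddl_statements.append(sql)
            if self.autocommit:
                # the proposed fix: flush immediately in autocommit mode
                self.run_prior_ddl_statements()
            return
        # any non-DDL statement flushes the queued DDL first
        self.run_prior_ddl_statements()

    def run_prior_ddl_statements(self):
        self.applied.extend(self._ddl_statements)
        self._ddl_statements.clear()


conn = ToyConnection(autocommit=False)
conn.execute("DROP TABLE int_t")
print(conn.applied)       # [] - the DROP is still queued, the table survives
conn.execute("SELECT 1")  # a non-DDL statement flushes the queue
print(conn.applied)       # ['DROP TABLE int_t']
```

Under this model, a test that ends with a bare `DROP TABLE` and then closes the connection never applies the drop, so the next run hits `Duplicate name in schema`.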

Contributor:

> Solution might be execute non-DDL statement after the DROP TABLE

This is what I would suggest but I'm not sure where DROP TABLE is called between tests. For now I think the current implementation is fine. We can revisit this another time 👍

        the table already existed after the related tests finished, so it
        doesn't create a new table, and when the tests for another data type
        start, the following insertions fail with
        `400 Duplicate name in schema: t`.
        Override the test to drop the table manually before it creates a new
        one, avoiding the same failures.
        """

        # for literal, we test the literal render in an INSERT
        # into a typed column. we can then SELECT it back as its
        # official type; ideally we'd be able to use CAST here
        # but MySQL in particular can't CAST fully
        t = Table("int_t", self.metadata, Column("x", type_))
        t.drop(checkfirst=True)
        t.create()

        with db.connect() as conn:
            for value in input_:
                ins = (
                    t.insert()
                    .values(x=literal(value))
                    .compile(
                        dialect=db.dialect, compile_kwargs=dict(literal_binds=True),
                    )
                )
                conn.execute(ins)

            if self.supports_whereclause:
                stmt = t.select().where(t.c.x == literal(value))
            else:
                stmt = t.select()

            stmt = stmt.compile(
                dialect=db.dialect, compile_kwargs=dict(literal_binds=True),
            )
            for row in conn.execute(stmt):
                value = row[0]
                if filter_ is not None:
                    value = filter_(value)
                assert value in output