
Commit cbcb4b8

chore: Prepare for 2.0 release (#278)
* Remove BQ Storage v1beta1 compatibility code
* Adjust code to new BQ Storage 2.0
* Remove Python 2/3 compatibility code
* Bump test coverage to 100%
* Update supported Python versions in README
* Add UPGRADING guide
* Regenerate bigquery_v2 code with microgenerator
* Adjust hand-written unit tests to regenerated BQ v2
* Adjust samples to BQ v2 regenerated code
* Adjust system tests to regenerated BQ v2
* Skip failing generated unit test. The assertion seems to fail for a banal reason, i.e. an extra newline in the string representation.
* Delete Kokoro config for Python 2.7
* Fix docs build
* Undelete failing test, but mark as skipped
* Fix namespace name in docstrings and comments
* Define minimum dependency versions for Python 3.6
* Exclude autogenerated docs from docs index
* Exclude generated services from the library. There are currently no public API endpoints for the ModelServiceClient, thus there is no point in generating that code in the first place.
* Bump minimum proto-plus version to 1.10.0. The old pin (1.4.0) does not work; tests detected some problem.
* Include generated types in the docs and rebuild
* Ignore skipped test in coverage check
* Explain moved enums in UPGRADING guide
1 parent fbbe0cb commit cbcb4b8

69 files changed: +1974 additions, -1682 deletions

.kokoro/presubmit/presubmit.cfg

Lines changed: 1 addition & 7 deletions
@@ -1,7 +1 @@
-# Format: //devtools/kokoro/config/proto/build.proto
-
-# Disable system tests.
-env_vars: {
-    key: "RUN_SYSTEM_TESTS"
-    value: "false"
-}
+# Format: //devtools/kokoro/config/proto/build.proto

.kokoro/presubmit/system-2.7.cfg

Lines changed: 0 additions & 7 deletions
This file was deleted.

.kokoro/samples/python3.6/common.cfg

Lines changed: 6 additions & 0 deletions
@@ -13,6 +13,12 @@ env_vars: {
     value: "py-3.6"
 }
 
+# Declare build specific Cloud project.
+env_vars: {
+    key: "BUILD_SPECIFIC_GCLOUD_PROJECT"
+    value: "python-docs-samples-tests-py36"
+}
+
 env_vars: {
     key: "TRAMPOLINE_BUILD_FILE"
     value: "github/python-bigquery/.kokoro/test-samples.sh"

.kokoro/samples/python3.7/common.cfg

Lines changed: 6 additions & 0 deletions
@@ -13,6 +13,12 @@ env_vars: {
     value: "py-3.7"
 }
 
+# Declare build specific Cloud project.
+env_vars: {
+    key: "BUILD_SPECIFIC_GCLOUD_PROJECT"
+    value: "python-docs-samples-tests-py37"
+}
+
 env_vars: {
     key: "TRAMPOLINE_BUILD_FILE"
     value: "github/python-bigquery/.kokoro/test-samples.sh"

.kokoro/samples/python3.8/common.cfg

Lines changed: 6 additions & 0 deletions
@@ -13,6 +13,12 @@ env_vars: {
     value: "py-3.8"
 }
 
+# Declare build specific Cloud project.
+env_vars: {
+    key: "BUILD_SPECIFIC_GCLOUD_PROJECT"
+    value: "python-docs-samples-tests-py38"
+}
+
 env_vars: {
     key: "TRAMPOLINE_BUILD_FILE"
     value: "github/python-bigquery/.kokoro/test-samples.sh"

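The three sample configs above only declare `BUILD_SPECIFIC_GCLOUD_PROJECT` for Kokoro; how the samples consume it is not part of this commit. Below is a minimal sketch of the consuming side, assuming the usual pattern of reading the variable from the environment in a test helper; the function name and fallback value are illustrative, not taken from the repository.

```py
import os


def samples_project(default_project: str) -> str:
    """Return the build-specific Cloud project declared by Kokoro, if any.

    Falls back to ``default_project`` when the variable is not set, e.g. when
    the samples run locally rather than on one of the py-3.x Kokoro workers.
    """
    return os.environ.get("BUILD_SPECIFIC_GCLOUD_PROJECT", default_project)


# Example: prefer the per-version project such as python-docs-samples-tests-py38.
project_id = samples_project("my-default-project")
```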
CONTRIBUTING.rst

Lines changed: 0 additions & 19 deletions
@@ -80,25 +80,6 @@ We use `nox <https://nox.readthedocs.io/en/latest/>`__ to instrument our tests.
 
 .. nox: https://pypi.org/project/nox/
 
-Note on Editable Installs / Develop Mode
-========================================
-
-- As mentioned previously, using ``setuptools`` in `develop mode`_
-  or a ``pip`` `editable install`_ is not possible with this
-  library. This is because this library uses `namespace packages`_.
-  For context see `Issue #2316`_ and the relevant `PyPA issue`_.
-
-  Since ``editable`` / ``develop`` mode can't be used, packages
-  need to be installed directly. Hence your changes to the source
-  tree don't get incorporated into the **already installed**
-  package.
-
-.. _namespace packages: https://www.python.org/dev/peps/pep-0420/
-.. _Issue #2316: https://github.com/GoogleCloudPlatform/google-cloud-python/issues/2316
-.. _PyPA issue: https://github.com/pypa/packaging-problems/issues/12
-.. _develop mode: https://setuptools.readthedocs.io/en/latest/setuptools.html#development-mode
-.. _editable install: https://pip.pypa.io/en/stable/reference/pip_install/#editable-installs
-
 *****************************************
 I'm getting weird errors... Can you help?
 *****************************************

README.rst

Lines changed: 7 additions & 4 deletions
@@ -52,11 +52,14 @@ dependencies.
 
 Supported Python Versions
 ^^^^^^^^^^^^^^^^^^^^^^^^^
-Python >= 3.5
+Python >= 3.6
 
-Deprecated Python Versions
-^^^^^^^^^^^^^^^^^^^^^^^^^^
-Python == 2.7. Python 2.7 support will be removed on January 1, 2020.
+Unsupported Python Versions
+^^^^^^^^^^^^^^^^^^^^^^^^^^^
+Python == 2.7, Python == 3.5.
+
+The last version of this library compatible with Python 2.7 and 3.5 is
+`google-cloud-bigquery==1.28.0`.
 
 
 Mac/Linux

UPGRADING.md

Lines changed: 59 additions & 0 deletions
@@ -0,0 +1,59 @@
+<!--
+Copyright 2020 Google LLC
+Licensed under the Apache License, Version 2.0 (the "License");
+you may not use this file except in compliance with the License.
+You may obtain a copy of the License at
+https://www.apache.org/licenses/LICENSE-2.0
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+-->
+
+
+# 2.0.0 Migration Guide
+
+The 2.0 release of the `google-cloud-bigquery` client drops support for Python
+versions below 3.6. The client surface itself has not changed, but the 1.x series
+will not be receiving any more feature updates or bug fixes. You are thus
+encouraged to upgrade to the 2.x series.
+
+If you experience issues or have questions, please file an
+[issue](https://github.com/googleapis/python-bigquery/issues).
+
+
+## Supported Python Versions
+
+> **WARNING**: Breaking change
+
+The 2.0.0 release requires Python 3.6+.
+
+
+## Supported BigQuery Storage Clients
+
+The 2.0.0 release requires BigQuery Storage `>= 2.0.0`, which dropped support
+for `v1beta1` and `v1beta2` versions of the BigQuery Storage API. If you want to
+use a BigQuery Storage client, it must be the one supporting the `v1` API version.
+
+
+## Changed GAPIC Enums Path
+
+> **WARNING**: Breaking change
+
+Generated GAPIC enum types have been moved under `types`. Import paths need to be
+adjusted.
+
+**Before:**
+```py
+from google.cloud.bigquery_v2.gapic import enums
+
+distance_type = enums.Model.DistanceType.COSINE
+```
+
+**After:**
+```py
+from google.cloud.bigquery_v2 import types
+
+distance_type = types.Model.DistanceType.COSINE
+```

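Since the guide above only states the BigQuery Storage version requirement, here is a minimal usage sketch under those constraints (`google-cloud-bigquery>=2.0.0`, `google-cloud-bigquery-storage>=2.0.0`, plus pandas); the table ID is a placeholder, and passing the BQ Storage client explicitly is optional because the BigQuery client can create one on demand.

```py
from google.cloud import bigquery
from google.cloud import bigquery_storage  # the 2.x series exposes only the v1 API surface

bq_client = bigquery.Client()
bqstorage_client = bigquery_storage.BigQueryReadClient()

# Placeholder table ID; replace with a real project.dataset.table.
rows = bq_client.list_rows("my-project.my_dataset.my_table")

# The v1-based client accelerates the download of large results; without it
# the library falls back to fetching rows over the REST API.
df = rows.to_dataframe(bqstorage_client=bqstorage_client)
print(df.head())
```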
docs/UPGRADING.md

Lines changed: 1 addition & 0 deletions
@@ -0,0 +1 @@
+../UPGRADING.md

docs/bigquery_v2/services.rst

Lines changed: 6 additions & 0 deletions
@@ -0,0 +1,6 @@
+Services for Google Cloud Bigquery v2 API
+=========================================
+
+.. automodule:: google.cloud.bigquery_v2.services.model_service
+    :members:
+    :inherited-members:

docs/bigquery_v2/types.rst

Lines changed: 5 additions & 0 deletions
@@ -0,0 +1,5 @@
+Types for Google Cloud Bigquery v2 API
+======================================
+
+.. automodule:: google.cloud.bigquery_v2.types
+    :members:

docs/conf.py

Lines changed: 1 addition & 0 deletions
@@ -100,6 +100,7 @@
     "samples/AUTHORING_GUIDE.md",
     "samples/CONTRIBUTING.md",
     "samples/snippets/README.rst",
+    "bigquery_v2/services.rst",  # generated by the code generator
 ]
 
 # The reST default role (used for this markup: `text`) to use for all

docs/gapic/v2/enums.rst

Lines changed: 0 additions & 8 deletions
This file was deleted.

docs/gapic/v2/types.rst

Lines changed: 0 additions & 6 deletions
This file was deleted.

docs/index.rst

Lines changed: 10 additions & 0 deletions
@@ -27,6 +27,16 @@ API Reference
     reference
     dbapi
 
+Migration Guide
+---------------
+
+See the guide below for instructions on migrating to the 2.x release of this library.
+
+.. toctree::
+    :maxdepth: 2
+
+    UPGRADING
+
 Changelog
 ---------

docs/reference.rst

Lines changed: 2 additions & 2 deletions
@@ -182,6 +182,7 @@ Encryption Configuration
 
     encryption_configuration.EncryptionConfiguration
 
+
 Additional Types
 ================
 
@@ -190,5 +191,4 @@ Protocol buffer classes for working with the Models API.
 .. toctree::
     :maxdepth: 2
 
-    gapic/v2/enums
-    gapic/v2/types
+    bigquery_v2/types

google/cloud/bigquery/_pandas_helpers.py

Lines changed: 14 additions & 63 deletions
@@ -22,11 +22,6 @@
 import six
 from six.moves import queue
 
-try:
-    from google.cloud import bigquery_storage_v1
-except ImportError:  # pragma: NO COVER
-    bigquery_storage_v1 = None
-
 try:
     import pandas
 except ImportError:  # pragma: NO COVER
@@ -287,14 +282,6 @@ def dataframe_to_bq_schema(dataframe, bq_schema):
     """
     if bq_schema:
         bq_schema = schema._to_schema_fields(bq_schema)
-        if six.PY2:
-            for field in bq_schema:
-                if field.field_type in schema._STRUCT_TYPES:
-                    raise ValueError(
-                        "Uploading dataframes with struct (record) column types "
-                        "is not supported under Python2. See: "
-                        "https://github.com/googleapis/python-bigquery/issues/21"
-                    )
         bq_schema_index = {field.name: field for field in bq_schema}
         bq_schema_unused = set(bq_schema_index.keys())
     else:
@@ -578,19 +565,7 @@ def _bqstorage_page_to_dataframe(column_names, dtypes, page):
 def _download_table_bqstorage_stream(
     download_state, bqstorage_client, session, stream, worker_queue, page_to_item
 ):
-    # Passing a BQ Storage client in implies that the BigQuery Storage library
-    # is available and can be imported.
-    from google.cloud import bigquery_storage_v1beta1
-
-    # We want to preserve comaptibility with the v1beta1 BQ Storage clients,
-    # thus adjust constructing the rowstream if needed.
-    # The assumption is that the caller provides a BQ Storage `session` that is
-    # compatible with the version of the BQ Storage client passed in.
-    if isinstance(bqstorage_client, bigquery_storage_v1beta1.BigQueryStorageClient):
-        position = bigquery_storage_v1beta1.types.StreamPosition(stream=stream)
-        rowstream = bqstorage_client.read_rows(position).rows(session)
-    else:
-        rowstream = bqstorage_client.read_rows(stream.name).rows(session)
+    rowstream = bqstorage_client.read_rows(stream.name).rows(session)
 
     for page in rowstream.pages:
         if download_state.done:
@@ -625,8 +600,7 @@ def _download_table_bqstorage(
 
     # Passing a BQ Storage client in implies that the BigQuery Storage library
    # is available and can be imported.
-    from google.cloud import bigquery_storage_v1
-    from google.cloud import bigquery_storage_v1beta1
+    from google.cloud import bigquery_storage
 
     if "$" in table.table_id:
         raise ValueError(
@@ -637,41 +611,18 @@
 
     requested_streams = 1 if preserve_order else 0
 
-    # We want to preserve comaptibility with the v1beta1 BQ Storage clients,
-    # thus adjust the session creation if needed.
-    if isinstance(bqstorage_client, bigquery_storage_v1beta1.BigQueryStorageClient):
-        warnings.warn(
-            "Support for BigQuery Storage v1beta1 clients is deprecated, please "
-            "consider upgrading the client to BigQuery Storage v1 stable version.",
-            category=DeprecationWarning,
-        )
-        read_options = bigquery_storage_v1beta1.types.TableReadOptions()
-
-        if selected_fields is not None:
-            for field in selected_fields:
-                read_options.selected_fields.append(field.name)
-
-        session = bqstorage_client.create_read_session(
-            table.to_bqstorage(v1beta1=True),
-            "projects/{}".format(project_id),
-            format_=bigquery_storage_v1beta1.enums.DataFormat.ARROW,
-            read_options=read_options,
-            requested_streams=requested_streams,
-        )
-    else:
-        requested_session = bigquery_storage_v1.types.ReadSession(
-            table=table.to_bqstorage(),
-            data_format=bigquery_storage_v1.enums.DataFormat.ARROW,
-        )
-        if selected_fields is not None:
-            for field in selected_fields:
-                requested_session.read_options.selected_fields.append(field.name)
-
-        session = bqstorage_client.create_read_session(
-            parent="projects/{}".format(project_id),
-            read_session=requested_session,
-            max_stream_count=requested_streams,
-        )
+    requested_session = bigquery_storage.types.ReadSession(
+        table=table.to_bqstorage(), data_format=bigquery_storage.types.DataFormat.ARROW
+    )
+    if selected_fields is not None:
+        for field in selected_fields:
+            requested_session.read_options.selected_fields.append(field.name)
+
+    session = bqstorage_client.create_read_session(
+        parent="projects/{}".format(project_id),
+        read_session=requested_session,
+        max_stream_count=requested_streams,
+    )
 
     _LOGGER.debug(
         "Started reading table '{}.{}.{}' with BQ Storage API session '{}'.".format(

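For reference, the new session-creation pattern in `_download_table_bqstorage` can be exercised on its own roughly as follows. This is a sketch built from the code in the diff above; the project, dataset, table, and column names are placeholders rather than values from this commit, and reading the Arrow pages into dataframes requires pandas and pyarrow.

```py
from google.cloud import bigquery_storage

bqstorage_client = bigquery_storage.BigQueryReadClient()

# Build the read session request the same way the new code does.
requested_session = bigquery_storage.types.ReadSession(
    table="projects/my-project/datasets/my_dataset/tables/my_table",
    data_format=bigquery_storage.types.DataFormat.ARROW,
)
# Equivalent of the selected_fields handling above.
requested_session.read_options.selected_fields.append("column_a")

session = bqstorage_client.create_read_session(
    parent="projects/my-project",
    read_session=requested_session,
    max_stream_count=1,  # 0 lets the server decide; 1 preserves row ordering
)

# Streams are then consumed just like _download_table_bqstorage_stream does it.
for stream in session.streams:
    rowstream = bqstorage_client.read_rows(stream.name).rows(session)
    for page in rowstream.pages:
        print(page.to_dataframe().head())
```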
google/cloud/bigquery/client.py

Lines changed: 4 additions & 8 deletions
@@ -17,11 +17,7 @@
 from __future__ import absolute_import
 from __future__ import division
 
-try:
-    from collections import abc as collections_abc
-except ImportError:  # Python 2.7
-    import collections as collections_abc
-
+from collections import abc as collections_abc
 import copy
 import functools
 import gzip
@@ -435,19 +431,19 @@ def _create_bqstorage_client(self):
         warning and return ``None``.
 
         Returns:
-            Optional[google.cloud.bigquery_storage_v1.BigQueryReadClient]:
+            Optional[google.cloud.bigquery_storage.BigQueryReadClient]:
                 A BigQuery Storage API client.
         """
         try:
-            from google.cloud import bigquery_storage_v1
+            from google.cloud import bigquery_storage
         except ImportError:
            warnings.warn(
                 "Cannot create BigQuery Storage client, the dependency "
                 "google-cloud-bigquery-storage is not installed."
             )
             return None
 
-        return bigquery_storage_v1.BigQueryReadClient(credentials=self._credentials)
+        return bigquery_storage.BigQueryReadClient(credentials=self._credentials)
 
     def create_dataset(
         self, dataset, exists_ok=False, retry=DEFAULT_RETRY, timeout=None

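The `_create_bqstorage_client` change above boils down to the following optional-dependency pattern. This is a standalone sketch of that pattern; the helper name `make_bqstorage_client` is illustrative and not part of the library.

```py
import warnings


def make_bqstorage_client(credentials):
    """Create a BigQuery Storage read client, or return None if unavailable."""
    try:
        from google.cloud import bigquery_storage
    except ImportError:
        # The dependency is optional; degrade gracefully instead of failing.
        warnings.warn(
            "Cannot create BigQuery Storage client, the dependency "
            "google-cloud-bigquery-storage is not installed."
        )
        return None

    return bigquery_storage.BigQueryReadClient(credentials=credentials)
```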
google/cloud/bigquery/dbapi/_helpers.py

Lines changed: 1 addition & 4 deletions
@@ -12,11 +12,8 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
-try:
-    from collections import abc as collections_abc
-except ImportError:  # Python 2.7
-    import collections as collections_abc
 
+from collections import abc as collections_abc
 import datetime
 import decimal
 import functools
