
Commit 2d04683

[DataPipe] Adding an 's' to the functional names of open/list DataPipes
ghstack-source-id: 3bb1bce
Pull Request resolved: #479
1 parent 3d967c6 commit 2d04683

4 files changed: +13 -13 lines changed

torchdata/datapipes/iter/load/README.md

Lines changed: 3 additions & 3 deletions
@@ -35,8 +35,8 @@ Note: refer to the official documentation for detailed installtion instructions
 
 ### S3FileLister
 
-`S3FileLister` accepts a list of S3 prefixes and iterates all matching s3 urls. The functional API is `list_file_by_s3`.
-Acceptable prefixes include `s3://bucket-name`, `s3://bucket-name/`, `s3://bucket-name/folder`,
+`S3FileLister` accepts a list of S3 prefixes and iterates all matching s3 urls. The functional API is
+`list_files_by_s3`. Acceptable prefixes include `s3://bucket-name`, `s3://bucket-name/`, `s3://bucket-name/folder`,
 `s3://bucket-name/folder/`, and `s3://bucket-name/prefix`. You may also set `length`, `request_timeout_ms` (default 3000
 ms in aws-sdk-cpp), and `region`. Note that:
 
@@ -48,7 +48,7 @@ ms in aws-sdk-cpp), and `region`. Note that:
 ### S3FileLoader
 
 `S3FileLoader` accepts a list of S3 URLs and iterates all files in `BytesIO` format with `(url, BytesIO)` tuples. The
-functional API is `load_file_by_s3`. You may also set `request_timeout_ms` (default 3000 ms in aws-sdk-cpp), `region`,
+functional API is `load_files_by_s3`. You may also set `request_timeout_ms` (default 3000 ms in aws-sdk-cpp), `region`,
 `buffer_size` (default 120Mb), and `multi_part_download` (default to use multi-part downloading). Note that:
 
 1. Input **must** be a list and S3 URLs must be valid.
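
As a rough usage sketch of the renamed functional forms (the bucket, prefix, and region below are placeholders, and the S3 DataPipes require a torchdata build with AWSSDK support):

```python
from torchdata.datapipes.iter import IterableWrapper

# Placeholder prefix and region -- substitute a bucket you can access.
prefixes = IterableWrapper(["s3://bucket-name/folder/"])

# New functional names from this commit (plural "files").
urls_dp = prefixes.list_files_by_s3(region="us-west-2")
files_dp = urls_dp.load_files_by_s3(region="us-west-2")

for url, stream in files_dp:
    data = stream.read()  # bytes from the wrapped BytesIO
    print(url, len(data))
```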

torchdata/datapipes/iter/load/fsspec.py

Lines changed: 3 additions & 3 deletions
@@ -101,11 +101,11 @@ def __iter__(self) -> Iterator[str]:
             yield abs_path
 
 
-@functional_datapipe("open_file_by_fsspec")
+@functional_datapipe("open_files_by_fsspec")
 class FSSpecFileOpenerIterDataPipe(IterDataPipe[Tuple[str, StreamWrapper]]):
     r"""
     Opens files from input datapipe which contains `fsspec` paths and yields a tuple of
-    pathname and opened file stream (functional name: ``open_file_by_fsspec``).
+    pathname and opened file stream (functional name: ``open_files_by_fsspec``).
 
     Args:
         source_datapipe: Iterable DataPipe that provides the pathnames or URLs
@@ -114,7 +114,7 @@ class FSSpecFileOpenerIterDataPipe(IterDataPipe[Tuple[str, StreamWrapper]]):
     Example:
         >>> from torchdata.datapipes.iter import FSSpecFileLister
         >>> datapipe = FSSpecFileLister(root=dir_path)
-        >>> file_dp = datapipe.open_file_by_fsspec()
+        >>> file_dp = datapipe.open_files_by_fsspec()
     """
 
     def __init__(self, source_datapipe: IterDataPipe[str], mode: str = "r") -> None:
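
A minimal end-to-end sketch of the renamed fsspec opener; `dir_path` here is a placeholder for any local directory or fsspec-resolvable URL:

```python
from torchdata.datapipes.iter import FSSpecFileLister

dir_path = "."  # placeholder: any path or URL that fsspec can resolve

# List entries under dir_path, then open each one with the new functional name.
lister_dp = FSSpecFileLister(root=dir_path)
file_dp = lister_dp.open_files_by_fsspec(mode="r")

for path, stream in file_dp:  # (pathname, StreamWrapper) tuples
    print(path, stream.read()[:80])
```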

torchdata/datapipes/iter/load/iopath.py

Lines changed: 3 additions & 3 deletions
@@ -96,11 +96,11 @@ def __iter__(self) -> Iterator[str]:
             yield os.path.join(path, file_name)
 
 
-@functional_datapipe("open_file_by_iopath")
+@functional_datapipe("open_files_by_iopath")
 class IoPathFileOpenerIterDataPipe(IterDataPipe[Tuple[str, StreamWrapper]]):
     r"""
     Opens files from input datapipe which contains pathnames or URLs,
-    and yields a tuple of pathname and opened file stream (functional name: ``open_file_by_iopath``).
+    and yields a tuple of pathname and opened file stream (functional name: ``open_files_by_iopath``).
 
     Args:
         source_datapipe: Iterable DataPipe that provides the pathnames or URLs
@@ -114,7 +114,7 @@ class IoPathFileOpenerIterDataPipe(IterDataPipe[Tuple[str, StreamWrapper]]):
     Example:
         >>> from torchdata.datapipes.iter import IoPathFileLister
         >>> datapipe = IoPathFileLister(root=S3URL)
-        >>> file_dp = datapipe.open_file_by_iopath()
+        >>> file_dp = datapipe.open_files_by_iopath()
     """
 
     def __init__(self, source_datapipe: IterDataPipe[str], mode: str = "r", pathmgr=None) -> None:
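
The iopath variant follows the same pattern; the root below is a placeholder (iopath resolves local paths out of the box and other URL schemes through registered PathManager handlers):

```python
from torchdata.datapipes.iter import IoPathFileLister

root = "."  # placeholder: a local path, or a URL iopath knows how to handle

lister_dp = IoPathFileLister(root=root)
file_dp = lister_dp.open_files_by_iopath(mode="r")

for path, stream in file_dp:  # (pathname, StreamWrapper) tuples
    print(path, stream.read()[:80])
```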

torchdata/datapipes/iter/load/s3io.py

Lines changed: 4 additions & 4 deletions
@@ -13,10 +13,10 @@
 from torchdata.datapipes.utils import StreamWrapper
 
 
-@functional_datapipe("list_file_by_s3")
+@functional_datapipe("list_files_by_s3")
 class S3FileListerIterDataPipe(IterDataPipe[str]):
     r"""
-    Iterable DataPipe that lists Amazon S3 file URLs with the given prefixes (functional name: ``list_file_by_s3``).
+    Iterable DataPipe that lists Amazon S3 file URLs with the given prefixes (functional name: ``list_files_by_s3``).
     Acceptable prefixes include ``s3://bucket-name``, ``s3://bucket-name/``, ``s3://bucket-name/folder``,
     ``s3://bucket-name/folder/``, and ``s3://bucket-name/prefix``. You may also set ``length``, ``request_timeout_ms``
     (default 3000 ms in aws-sdk-cpp), and ``region``.
@@ -72,10 +72,10 @@ def __len__(self) -> int:
         return self.length
 
 
-@functional_datapipe("load_file_by_s3")
+@functional_datapipe("load_files_by_s3")
 class S3FileLoaderIterDataPipe(IterDataPipe[Tuple[str, StreamWrapper]]):
     r"""
-    Iterable DataPipe that loads Amazon S3 files from the given S3 URLs (functional name: ``load_file_by_s3``).
+    Iterable DataPipe that loads Amazon S3 files from the given S3 URLs (functional name: ``load_files_by_s3``).
     ``S3FileLoader`` iterates all given S3 URLs in ``BytesIO`` format with ``(url, BytesIO)`` tuples.
     You may also set ``request_timeout_ms`` (default 3000 ms in aws-sdk-cpp), ``region``,
     ``buffer_size`` (default 120Mb), and ``multi_part_download`` (default to use multi-part downloading).
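
For reference, the same S3 pipeline can be written against the DataPipe classes directly; the renamed functional forms are just the registered aliases for these classes, and the prefix, region, and timeout values below are illustrative only:

```python
from torchdata.datapipes.iter import IterableWrapper, S3FileLister, S3FileLoader

prefixes = IterableWrapper(["s3://bucket-name/prefix"])  # placeholder prefix

# Class form, equivalent to .list_files_by_s3() / .load_files_by_s3().
urls_dp = S3FileLister(prefixes, request_timeout_ms=3000, region="us-west-2")
files_dp = S3FileLoader(urls_dp, region="us-west-2")

for url, stream in files_dp:
    data = stream.read()  # bytes read from the wrapped BytesIO
    print(url, len(data))
```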
