✨ Add exporter code to storage #7218

Merged

Changes from all commits (167 commits)

a630375
added new accepted folder path
Feb 12, 2025
1503482
Merge remote-tracking branch 'upstream/master' into pr-osparc-s3-zip-…
Feb 12, 2025
ebe1b37
initial implementation
Feb 12, 2025
27831e9
Merge remote-tracking branch 'upstream/master' into pr-osparc-s3-zip-…
Feb 12, 2025
d33f717
removed
Feb 12, 2025
12440f7
refactor
Feb 14, 2025
74611a9
Merge remote-tracking branch 'upstream/master' into pr-osparc-s3-zip-…
Feb 14, 2025
edbc905
removed uneccessary
Feb 14, 2025
a7d1dbc
refactor
Feb 14, 2025
e884e54
Merge remote-tracking branch 'upstream/master' into pr-osparc-s3-zip-…
Feb 14, 2025
a1b8822
Merge remote-tracking branch 'upstream/master' into pr-osparc-s3-zip-…
Feb 14, 2025
650350e
remove duplicate import
Feb 14, 2025
fcf8d34
remove unused
Feb 14, 2025
a4d6b01
update comments
Feb 14, 2025
442c700
Merge remote-tracking branch 'upstream/master' into pr-osparc-s3-zip-…
Feb 17, 2025
58f0b85
refactor
Feb 17, 2025
d8f37a9
Merge remote-tracking branch 'upstream/master' into pr-osparc-s3-zip-…
Feb 17, 2025
a9feeb5
repalce with archiver
Feb 17, 2025
88fb792
rename
Feb 17, 2025
7b66adf
extended tests
Feb 17, 2025
81810c3
added progress_callbacks
Feb 17, 2025
72b9e2b
ensure progress
Feb 17, 2025
3eafcc1
remove else
Feb 17, 2025
3b0a073
added base task to use
Feb 18, 2025
7359f6d
Merge remote-tracking branch 'upstream/master' into pr-osparc-s3-zip-…
Feb 19, 2025
1dd6de2
extended base messages
Feb 19, 2025
b3ee27a
added user_notifications base
Feb 19, 2025
fc59ec1
Merge remote-tracking branch 'upstream/master' into pr-osparc-s3-zip-…
Feb 19, 2025
dbf3f29
updated openapispecs
Feb 19, 2025
411f9ee
add progress sending via rabbitmq
Feb 19, 2025
412bb0f
added cleanup fixtures
Feb 19, 2025
981cde9
mypy
Feb 19, 2025
e2b7e6f
fixed test
Feb 19, 2025
ded5d00
fixed missing field
Feb 19, 2025
f6a77cb
Merge remote-tracking branch 'upstream/master' into pr-osparc-s3-zip-…
Mar 19, 2025
7ced8e7
fixed
Mar 19, 2025
e5859f6
fixed export task
Mar 19, 2025
ed60ba7
fixed task
Mar 19, 2025
1c0d5ab
connected export job
Mar 19, 2025
f3ca256
purged unused
Mar 19, 2025
1cd7bbd
removed
Mar 20, 2025
949cb09
refactor
Mar 20, 2025
0c46c25
Merge remote-tracking branch 'upstream/master' into pr-osparc-s3-zip-…
Mar 20, 2025
335b550
pylint
Mar 20, 2025
53b257a
refactor progress
Mar 20, 2025
3a2b858
rename
Mar 20, 2025
c13ffb1
refactor
Mar 20, 2025
4abd21a
fixed test
Mar 20, 2025
85fb767
fixed tests
Mar 20, 2025
16dd061
rename
Mar 20, 2025
39fe1a7
fixed test
Mar 20, 2025
87ba884
refactor
Mar 20, 2025
a796af3
refactor
Mar 20, 2025
5db89cb
minor
Mar 20, 2025
65f6817
moved modules
Mar 20, 2025
1e7aa1c
revert change
Mar 20, 2025
9031e8d
refactored tests
Mar 20, 2025
f49232a
Merge remote-tracking branch 'upstream/master' into pr-osparc-s3-zip-…
Mar 20, 2025
3fdf056
rephrase message
Mar 20, 2025
2766d20
updated spec
Mar 20, 2025
2f72676
Merge remote-tracking branch 'upstream/master' into pr-osparc-s3-zip-…
Mar 21, 2025
1572713
using shared value
Mar 21, 2025
3d7373e
refactor messages
Mar 21, 2025
7d239dc
Merge remote-tracking branch 'upstream/master' into pr-osparc-s3-zip-…
Mar 24, 2025
7622e15
enforce UUID pattern on exports
Mar 24, 2025
1413afb
refactor
Mar 24, 2025
c17b21e
removed exception
Mar 24, 2025
3fde9ad
rename
Mar 24, 2025
899b4fa
renamed
Mar 24, 2025
596c3ac
limit permissions
Mar 24, 2025
0cbc148
rename
Mar 24, 2025
02208c5
feedback
Mar 24, 2025
e313bc6
Merge remote-tracking branch 'upstream/master' into pr-osparc-s3-zip-…
Mar 25, 2025
7af7f46
added note
Mar 25, 2025
5de596d
added note
Mar 25, 2025
b091c3e
using progress bar
Mar 25, 2025
92c6090
using existing fixture
Mar 25, 2025
bb38d8b
fixed regex
Mar 25, 2025
8badfd2
fixed tests
Mar 25, 2025
552de63
Merge remote-tracking branch 'upstream/master' into pr-osparc-s3-zip-…
Mar 25, 2025
afa7858
fixeed specs
Mar 25, 2025
938c6ae
Merge remote-tracking branch 'upstream/master' into pr-osparc-s3-zip-…
Mar 28, 2025
c6f48f1
fixed
Mar 28, 2025
1aa630e
fixed test
Mar 28, 2025
9f021e3
revert interfaces
Mar 28, 2025
5632add
revert changes
Mar 28, 2025
37c05ba
refactor
Mar 28, 2025
d3437c4
refactor
Mar 28, 2025
585ba54
restructured imports
Mar 28, 2025
4bae784
refactir
Mar 28, 2025
9f2d99f
fixed tests
Mar 28, 2025
8cbfeb1
not required
Mar 28, 2025
3094f87
limit to simcore_s3 only
Mar 28, 2025
b5c68e0
interface cleanup
Mar 28, 2025
9a589c0
aligned interface
Mar 28, 2025
4185157
fixed imports
Mar 28, 2025
d91899e
updated interface and tests
Mar 28, 2025
33aca4f
fixed zip, using zip_64 by default
Mar 31, 2025
79baf7b
Merge remote-tracking branch 'upstream/master' into pr-osparc-s3-zip-…
Mar 31, 2025
3965ff8
added failing test
Apr 1, 2025
2819eec
renamed module and tests
Apr 1, 2025
1277e0a
Merge remote-tracking branch 'upstream/master' into pr-osparc-s3-zip-…
Apr 1, 2025
8cd8e7e
rename
Apr 1, 2025
cf516f3
Merge remote-tracking branch 'upstream/master' into pr-osparc-s3-zip-…
Apr 1, 2025
692de11
changed palcement
Apr 1, 2025
d65881a
using spy instead of mock
Apr 1, 2025
ad92ddf
Merge remote-tracking branch 'upstream/master' into pr-osparc-s3-zip-…
Apr 1, 2025
c71deb6
refactor
Apr 2, 2025
0d5c6f9
removed timeout on soem tasks
Apr 2, 2025
41d60fa
made error more readable
Apr 2, 2025
84f60e5
fixed tests
Apr 2, 2025
a744836
renamed
Apr 2, 2025
4c444aa
fixed broken tests
Apr 2, 2025
fef2946
Merge remote-tracking branch 'upstream/master' into pr-osparc-s3-zip-…
Apr 2, 2025
7ed9d3a
unbounded tasks
Apr 2, 2025
a36ff26
no timeouts
Apr 2, 2025
cb99129
mypy
Apr 2, 2025
6b91cf1
fixed broken import
Apr 2, 2025
b9233da
fixed test
Apr 2, 2025
f9596f6
fixed tests
Apr 2, 2025
325b8ab
imports
Apr 2, 2025
87aa58e
fixed broken tests
Apr 2, 2025
c5c6bc0
fixed failing test
Apr 2, 2025
c5723af
Merge remote-tracking branch 'upstream/master' into pr-osparc-s3-zip-…
Apr 2, 2025
921d69f
catching unexpected cases
Apr 2, 2025
9a9026e
making exception handling safer
Apr 2, 2025
62a50d9
refactor error serialization
Apr 2, 2025
1bbd2c1
updated specs
Apr 3, 2025
52b67ed
rename
Apr 3, 2025
3c7d9f9
using proper types
Apr 3, 2025
ede05c6
rename
Apr 3, 2025
434e3aa
removed unused cases
Apr 3, 2025
e1b66d2
refactor export data progress
Apr 3, 2025
78836a6
refactor
Apr 4, 2025
daec075
using proper interface
Apr 4, 2025
b100eec
refactor
Apr 4, 2025
dbde203
using new types
Apr 4, 2025
5d36c64
fixed broken error handling
Apr 4, 2025
2833283
Merge remote-tracking branch 'upstream/master' into pr-osparc-s3-zip-…
Apr 4, 2025
640fd98
rephrase
Apr 4, 2025
53247a7
reverted default
Apr 4, 2025
8bdb286
fixed timeout and error handling with access
Apr 4, 2025
01670d3
renamed
Apr 4, 2025
972425c
fixed broken test
Apr 4, 2025
da15df2
accerrrightserror now points to the file which does not have access
Apr 7, 2025
042d093
changed types
Apr 7, 2025
213906a
added dont autoretry for
Apr 7, 2025
8f0088d
fixed typing
Apr 7, 2025
fc28308
remove not necessary
Apr 7, 2025
8646c78
typing
Apr 7, 2025
e64d30c
refactor access_rights
Apr 7, 2025
2de121a
revert change
Apr 7, 2025
e1c93b5
remove unused
Apr 7, 2025
b50d10f
enforcing proper error
Apr 7, 2025
c9a93f7
refactor
Apr 7, 2025
a3522b6
replaced tests with mocks
Apr 8, 2025
9348f7b
added corner case
Apr 8, 2025
761a3fb
Merge remote-tracking branch 'upstream/master' into pr-osparc-s3-zip-…
Apr 8, 2025
24189be
added issue reference
Apr 8, 2025
c02ae70
revert
Apr 8, 2025
7b5bec6
updated specs
Apr 8, 2025
3b324af
revert chane
Apr 8, 2025
40a81e9
rename
Apr 8, 2025
a279ad1
fixed broken test for autoretry
Apr 8, 2025
63f2f5c
extended tests for autoretry
Apr 8, 2025
2b55123
Merge remote-tracking branch 'upstream/master' into pr-osparc-s3-zip-…
Apr 8, 2025
be9b1e3
added missing
Apr 8, 2025
14 changes: 7 additions & 7 deletions api/specs/web-server/_long_running_tasks.py
@@ -13,7 +13,7 @@
from servicelib.long_running_tasks._models import TaskGet, TaskStatus
from simcore_service_webserver._meta import API_VTAG
from simcore_service_webserver.tasks._exception_handlers import (
_TO_HTTP_ERROR_MAP as data_export_http_error_map,
_TO_HTTP_ERROR_MAP as export_data_http_error_map,
)

router = APIRouter(
@@ -23,9 +23,9 @@
],
)

_data_export_responses: dict[int | str, dict[str, Any]] = {
_export_data_responses: dict[int | str, dict[str, Any]] = {
i.status_code: {"model": EnvelopedError}
for i in data_export_http_error_map.values()
for i in export_data_http_error_map.values()
}


@@ -34,7 +34,7 @@
response_model=Envelope[list[TaskGet]],
name="list_tasks",
description="Lists all long running tasks",
responses=_data_export_responses,
responses=_export_data_responses,
)
def get_async_jobs(): ...

@@ -44,7 +44,7 @@ def get_async_jobs(): ...
response_model=Envelope[TaskStatus],
name="get_task_status",
description="Retrieves the status of a task",
responses=_data_export_responses,
responses=_export_data_responses,
)
def get_async_job_status(
_path_params: Annotated[_PathParam, Depends()],
@@ -55,7 +55,7 @@ def get_async_job_status(
"/tasks/{task_id}",
name="cancel_and_delete_task",
description="Cancels and deletes a task",
responses=_data_export_responses,
responses=_export_data_responses,
status_code=status.HTTP_204_NO_CONTENT,
)
def abort_async_job(
@@ -67,7 +67,7 @@
"/tasks/{task_id}/result",
name="get_task_result",
description="Retrieves the result of a task",
responses=_data_export_responses,
responses=_export_data_responses,
)
def get_async_job_result(
_path_params: Annotated[_PathParam, Depends()],
10 changes: 5 additions & 5 deletions api/specs/web-server/_storage.py
@@ -36,7 +36,7 @@
from simcore_service_webserver._meta import API_VTAG
from simcore_service_webserver.storage.schemas import DatasetMetaData, FileMetaData
from simcore_service_webserver.tasks._exception_handlers import (
_TO_HTTP_ERROR_MAP as data_export_http_error_map,
_TO_HTTP_ERROR_MAP as export_data_http_error_map,
)

router = APIRouter(
@@ -221,9 +221,9 @@ async def is_completed_upload_file(


# data export
_data_export_responses: dict[int | str, dict[str, Any]] = {
_export_data_responses: dict[int | str, dict[str, Any]] = {
i.status_code: {"model": EnvelopedError}
for i in data_export_http_error_map.values()
for i in export_data_http_error_map.values()
}


@@ -232,7 +232,7 @@ async def is_completed_upload_file(
response_model=Envelope[TaskGet],
name="export_data",
description="Export data",
responses=_data_export_responses,
responses=_export_data_responses,
)
async def export_data(data_export: DataExportPost, location_id: LocationID):
async def export_data(export_data: DataExportPost, location_id: LocationID):
"""Trigger data export. Returns async job id for getting status and results"""
3 changes: 3 additions & 0 deletions packages/aws-library/src/aws_library/s3/_constants.py
@@ -9,6 +9,9 @@
MULTIPART_COPY_THRESHOLD: Final[ByteSize] = TypeAdapter(ByteSize).validate_python(
"100MiB"
)
STREAM_READER_CHUNK_SIZE: Final[ByteSize] = TypeAdapter(ByteSize).validate_python(
"10MiB"
)

PRESIGNED_LINK_MAX_SIZE: Final[ByteSize] = TypeAdapter(ByteSize).validate_python("5GiB")
S3_MAX_FILE_SIZE: Final[ByteSize] = TypeAdapter(ByteSize).validate_python("5TiB")
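
Not part of the diff: a quick, self-contained check of what the new constant resolves to, assuming only pydantic (the same construction the file itself uses).

from pydantic import ByteSize, TypeAdapter

# Same construction as in _constants.py: parse the human-readable size into bytes.
STREAM_READER_CHUNK_SIZE = TypeAdapter(ByteSize).validate_python("10MiB")

assert int(STREAM_READER_CHUNK_SIZE) == 10 * 1024**2  # 10_485_760 bytes
print(STREAM_READER_CHUNK_SIZE.human_readable())  # e.g. "10.0MiB"
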
4 changes: 2 additions & 2 deletions packages/aws-library/tests/test_s3_client.py
@@ -27,8 +27,8 @@
from aiohttp import ClientSession
from aws_library.s3._client import _AWS_MAX_ITEMS_PER_PAGE, S3ObjectKey, SimcoreS3API
from aws_library.s3._constants import (
MULTIPART_COPY_THRESHOLD,
MULTIPART_UPLOADS_MIN_TOTAL_SIZE,
STREAM_READER_CHUNK_SIZE,
)
from aws_library.s3._errors import (
S3BucketInvalidError,
@@ -1902,7 +1902,7 @@ async def test_workflow_compress_s3_objects_and_local_files_in_a_single_archive_
get_zip_bytes_iter(
archive_entries,
progress_bar=progress_bar,
chunk_size=MULTIPART_COPY_THRESHOLD,
chunk_size=STREAM_READER_CHUNK_SIZE,
)
),
)
@@ -27,5 +27,5 @@ class JobAbortedError(BaseAsyncjobRpcError):

class JobError(BaseAsyncjobRpcError):
msg_template: str = (
"Job {job_id} failed with exception type {exc_type} and message {exc_msg}"
"Job '{job_id}' failed with exception type '{exc_type}' and message: {exc_msg}"
)
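
Not part of the diff: a minimal sketch of how the reworded template renders, assuming the usual OsparcErrorMixin behaviour of formatting msg_template with the constructor's keyword arguments; the class below is a simplified stand-in for the real BaseAsyncjobRpcError hierarchy.

from common_library.errors_classes import OsparcErrorMixin


class JobError(OsparcErrorMixin, RuntimeError):  # simplified stand-in
    msg_template: str = (
        "Job '{job_id}' failed with exception type '{exc_type}' and message: {exc_msg}"
    )


error = JobError(job_id="7218", exc_type="ValueError", exc_msg="boom")
print(error)
# expected (assuming msg_template formatting):
# Job '7218' failed with exception type 'ValueError' and message: boom
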
@@ -1,14 +1,6 @@
# pylint: disable=R6301

from common_library.errors_classes import OsparcErrorMixin
from models_library.projects_nodes_io import LocationID, StorageFileID
from pydantic import BaseModel, Field


class DataExportTaskStartInput(BaseModel):
location_id: LocationID
file_and_folder_ids: list[StorageFileID] = Field(..., min_length=1)


### Exceptions

@@ -3,15 +3,12 @@

from pydantic import BaseModel, Field

from ..api_schemas_storage.data_export_async_jobs import DataExportTaskStartInput
from ..api_schemas_storage.storage_schemas import (
DEFAULT_NUMBER_OF_PATHS_PER_PAGE,
MAX_NUMBER_OF_PATHS_PER_PAGE,
)
from ..projects_nodes_io import LocationID, StorageFileID
from ..rest_pagination import (
CursorQueryParameters,
)
from ..projects_nodes_io import LocationID
from ..rest_pagination import CursorQueryParameters
from ._base import InputSchema


@@ -40,11 +37,8 @@ class BatchDeletePathsBodyParams(InputSchema):
paths: set[Path]


class DataExportPost(InputSchema):
paths: list[StorageFileID]
PathToExport = Path

def to_rpc_schema(self, location_id: LocationID) -> DataExportTaskStartInput:
return DataExportTaskStartInput(
file_and_folder_ids=self.paths,
location_id=location_id,
)

class DataExportPost(InputSchema):
paths: list[PathToExport]
5 changes: 4 additions & 1 deletion packages/models-library/src/models_library/basic_regex.py
Original file line number Diff line number Diff line change
Expand Up @@ -4,6 +4,7 @@

SEE tests_basic_regex.py for examples
"""

# TODO: for every pattern we should have a formatter function
# NOTE: some sites to manualy check ideas
# https://regex101.com/
@@ -45,7 +46,9 @@
)

# Storage basic file ID
SIMCORE_S3_FILE_ID_RE = rf"^(api|({UUID_RE_BASE}))\/({UUID_RE_BASE})\/(.+)$"
SIMCORE_S3_FILE_ID_RE = rf"^(exports\/\d+\/{UUID_RE_BASE}\.zip)|((api|({UUID_RE_BASE}))\/({UUID_RE_BASE})\/(.+)$)"


SIMCORE_S3_DIRECTORY_ID_RE = rf"^({UUID_RE_BASE})\/({UUID_RE_BASE})\/(.+)\/$"

# S3 - AWS bucket names [https://docs.aws.amazon.com/AmazonS3/latest/userguide/bucketnamingrules.html]
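
Not part of the diff: a minimal check of what the widened pattern now accepts, with a hypothetical stand-in value for UUID_RE_BASE (the real constant is defined in basic_regex.py).

import re

# Hypothetical stand-in for models_library.basic_regex.UUID_RE_BASE
UUID_RE_BASE = r"[0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{12}"

SIMCORE_S3_FILE_ID_RE = (
    rf"^(exports\/\d+\/{UUID_RE_BASE}\.zip)"
    rf"|((api|({UUID_RE_BASE}))\/({UUID_RE_BASE})\/(.+)$)"
)

UUID_0 = "00000000-0000-0000-0000-000000000000"

# newly accepted: export archives stored under exports/<user_id>/<uuid>.zip
assert re.match(SIMCORE_S3_FILE_ID_RE, f"exports/42/{UUID_0}.zip")
# still accepted: the pre-existing api/... and <project>/<node>/... object keys
assert re.match(SIMCORE_S3_FILE_ID_RE, f"api/{UUID_0}/some-random-file.png")
assert re.match(SIMCORE_S3_FILE_ID_RE, f"{UUID_0}/{UUID_0}/some-path/some-random-file.png")
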
24 changes: 12 additions & 12 deletions packages/models-library/src/models_library/rabbitmq_messages.py
@@ -95,9 +95,9 @@ class ProgressType(StrAutoEnum):


class ProgressMessageMixin(RabbitMessageBase):
channel_name: Literal[
channel_name: Literal["simcore.services.progress.v2"] = (
"simcore.services.progress.v2"
] = "simcore.services.progress.v2"
)
progress_type: ProgressType = (
ProgressType.COMPUTATION_RUNNING
) # NOTE: backwards compatible
@@ -118,9 +118,9 @@ def routing_key(self) -> str | None:


class InstrumentationRabbitMessage(RabbitMessageBase, NodeMessageBase):
channel_name: Literal[
channel_name: Literal["simcore.services.instrumentation"] = (
"simcore.services.instrumentation"
] = "simcore.services.instrumentation"
)
metrics: str
service_uuid: NodeID
service_type: str
@@ -210,9 +210,9 @@ def routing_key(self) -> str | None:


class RabbitResourceTrackingStartedMessage(RabbitResourceTrackingBaseMessage):
message_type: Literal[
message_type: Literal[RabbitResourceTrackingMessageType.TRACKING_STARTED] = (
RabbitResourceTrackingMessageType.TRACKING_STARTED
] = RabbitResourceTrackingMessageType.TRACKING_STARTED
)

wallet_id: WalletID | None
wallet_name: str | None
@@ -250,9 +250,9 @@ class RabbitResourceTrackingStartedMessage(RabbitResourceTrackingBaseMessage):


class RabbitResourceTrackingHeartbeatMessage(RabbitResourceTrackingBaseMessage):
message_type: Literal[
message_type: Literal[RabbitResourceTrackingMessageType.TRACKING_HEARTBEAT] = (
RabbitResourceTrackingMessageType.TRACKING_HEARTBEAT
] = RabbitResourceTrackingMessageType.TRACKING_HEARTBEAT
)


class SimcorePlatformStatus(StrAutoEnum):
@@ -261,9 +261,9 @@ class SimcorePlatformStatus(StrAutoEnum):


class RabbitResourceTrackingStoppedMessage(RabbitResourceTrackingBaseMessage):
message_type: Literal[
message_type: Literal[RabbitResourceTrackingMessageType.TRACKING_STOPPED] = (
RabbitResourceTrackingMessageType.TRACKING_STOPPED
] = RabbitResourceTrackingMessageType.TRACKING_STOPPED
)

simcore_platform_status: SimcorePlatformStatus = Field(
...,
@@ -297,9 +297,9 @@ class CreditsLimit(IntEnum):


class WalletCreditsLimitReachedMessage(RabbitMessageBase):
channel_name: Literal[
channel_name: Literal["io.simcore.service.wallets-credit-limit-reached"] = (
"io.simcore.service.wallets-credit-limit-reached"
] = "io.simcore.service.wallets-credit-limit-reached"
)
created_at: datetime.datetime = Field(
default_factory=lambda: arrow.utcnow().datetime,
description="message creation datetime",
26 changes: 22 additions & 4 deletions packages/models-library/tests/test_project_nodes_io.py
@@ -2,7 +2,8 @@
# pylint:disable=unused-argument
# pylint:disable=redefined-outer-name

from typing import Any
from typing import Any, Final
from uuid import UUID

import pytest
from faker import Faker
@@ -11,9 +12,14 @@
DatCoreFileLink,
SimCoreFileLink,
SimcoreS3DirectoryID,
SimcoreS3FileID,
)
from models_library.users import UserID
from pydantic import TypeAdapter, ValidationError

UUID_0: Final[str] = f"{UUID(int=0)}"
USER_ID_0: Final[UserID] = 0


@pytest.fixture()
def minimal_simcore_file_link(faker: Faker) -> dict[str, Any]:
@@ -115,9 +121,6 @@ def test_store_discriminator():
assert isinstance(rawgraph_node.inputs["input_1"], PortLink)


UUID_0: str = "00000000-0000-0000-0000-000000000000"


def test_simcore_s3_directory_id():
# the only allowed path is the following
result = TypeAdapter(SimcoreS3DirectoryID).validate_python(
@@ -180,3 +183,18 @@ def test_simcore_s3_directory_get_parent():
SimcoreS3DirectoryID._get_parent( # noqa SLF001
"/hello/object/", parent_index=4
)


@pytest.mark.parametrize(
"object_key",
[
f"api/{UUID_0}/some-random-file.png",
f"exports/{USER_ID_0}/{UUID_0}.zip",
f"{UUID_0}/{UUID_0}/some-random-file.png",
f"api/{UUID_0}/some-path/some-random-file.png",
f"{UUID_0}/{UUID_0}/some-path/some-random-file.png",
],
)
def test_simcore_s3_file_id_accepted_patterns(object_key: str):
file_id = TypeAdapter(SimcoreS3FileID).validate_python(object_key)
assert f"{file_id}" == object_key
6 changes: 2 additions & 4 deletions packages/models-library/tests/test_rabbit_messages.py
Original file line number Diff line number Diff line change
@@ -1,5 +1,3 @@
from typing import Union

import pytest
from faker import Faker
from models_library.progress_bar import ProgressReport
@@ -41,6 +39,6 @@
)
async def test_raw_message_parsing(raw_data: str, class_type: type):
result = TypeAdapter(
Union[ProgressRabbitMessageNode, ProgressRabbitMessageProject]
ProgressRabbitMessageNode | ProgressRabbitMessageProject
).validate_json(raw_data)
assert type(result) == class_type
assert type(result) is class_type
@@ -1,8 +1,7 @@
from typing import Final

from pydantic import ByteSize, NonNegativeFloat, NonNegativeInt
from pydantic import ByteSize, NonNegativeInt

_UNIT_MULTIPLIER: Final[NonNegativeFloat] = 1024.0
TQDM_FILE_OPTIONS: Final[dict] = {
"unit": "byte",
"unit_scale": True,
@@ -1,14 +1,17 @@
import logging
from collections.abc import AsyncIterable
from datetime import UTC, datetime
from stat import S_IFREG
from typing import TypeAlias

from models_library.bytes_iters import BytesIter, DataSize
from stream_zip import ZIP_32, AsyncMemberFile, async_stream_zip
from stream_zip import ZIP_64, AsyncMemberFile, async_stream_zip

from ..progress_bar import ProgressBarData
from ._models import BytesStreamer

_logger = logging.getLogger(__name__)

FileNameInArchive: TypeAlias = str
ArchiveFileEntry: TypeAlias = tuple[FileNameInArchive, BytesStreamer]
ArchiveEntries: TypeAlias = list[ArchiveFileEntry]
@@ -22,7 +25,7 @@ async def _member_files_iter(
file_name,
datetime.now(UTC),
S_IFREG | 0o600,
ZIP_32,
ZIP_64,
byte_streamer.with_progress_bytes_iter(progress_bar=progress_bar),
)

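
Not part of the diff: a small, self-contained sketch of the ZIP_32 to ZIP_64 switch with stream_zip, using the same member-file tuple shape as the module above; ZIP_64 lifts the 4 GiB per-entry and archive-size limits of ZIP_32, which large S3 exports can easily exceed.

import asyncio
from datetime import UTC, datetime
from stat import S_IFREG

from stream_zip import ZIP_64, async_stream_zip


async def _payload():
    # In the real exporter these bytes come from streamed S3 objects.
    yield b"payload that could exceed 4GiB in a real export"


async def _member_files():
    # Same (name, mtime, mode, method, bytes-iterable) shape as _member_files_iter yields.
    yield (
        "exports/report.bin",
        datetime.now(UTC),
        S_IFREG | 0o600,
        ZIP_64,  # 64-bit entries instead of ZIP_32
        _payload(),
    )


async def _run():
    async for chunk in async_stream_zip(_member_files()):
        pass  # in the exporter, each chunk is forwarded (e.g. uploaded back to S3)


asyncio.run(_run())
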