
Commit b569d7a

Merge branch 'master' into feature/pause-put-while-adding-removing-nodes

2 parents: 9bb8d3d + c0db38f

File tree: 11 files changed, +147 −137 lines


docs/coding-conventions.md

Lines changed: 37 additions & 22 deletions

@@ -1,15 +1,44 @@
-# Coding Conventions and Linters
+# Coding Conventions, Linters and Definitions
 
 Some conventions on coding style and tools for the Javascript and Python in this repository.
 
 ----
-## Javascript
 
-In general the `qooxdoo` naming convention/style is followed. The [Access](http://qooxdoo.org/docs/#/core/oo_feature_summary?id=access) paragraph is the most notable. It is recommended to read the entire document.
+## Definitions
+
+What is a ...
+
+- **Controller-Service-Repository** design-pattern ?
+- An introduction: https://tom-collings.medium.com/controller-service-repository-16e29a4684e5
+- Example of adopted convention: https://github.com/ITISFoundation/osparc-simcore/pull/4389
 
-Have a look at `ESLint`'s configuration files [.eslintrc.json](.eslintrc.json) and [.eslintignore](.eslintignore).
 
 
+----
+## General Coding Conventions
+
+<!-- Add below this line coding agreed coding conventions and give them a number !-->
+
+### CC1: Can I use ``TODO:``, ``FIXME:``?
+
+We should avoid merging PRs with ``TODO:`` and ``FIXME:`` into master. One of our bots detects those and flag them as code-smells. If we still want to keep this idea/fix noted in the code, those can be rewritten as ``NOTE:`` and should be extended with a link to a github issue with more details. For a context, see [discussion here](https://github.com/ITISFoundation/osparc-simcore/pull/3380#discussion_r979893502).
+
+
+### CC2: No commented code
+
+Avoid commented code, but if you *really* want to keep it then add an explanatory `NOTE:`
+```python
+import os
+# import bar
+# x = "not very useful"
+
+# NOTE: I need to keep this becase ...
+# import foo
+# x = "ok"
+```
+
+### CC3 ...
+
 ----
 ## Python
 

@@ -62,28 +91,14 @@ In short we use the following naming convention ( roughly [PEP8](https://peps.p
 - Recommended inside of a ``scripts`` folder
 
 
-----
-## General
-
-<!-- Add below this line coding agreed coding conventions and give them a number !-->
-
-### CC1: Can I use ``TODO:``, ``FIXME:``?
-
-We should avoid merging PRs with ``TODO:`` and ``FIXME:`` into master. One of our bots detects those and flag them as code-smells. If we still want to keep this idea/fix noted in the code, those can be rewritten as ``NOTE:`` and should be extended with a link to a github issue with more details. For a context, see [discussion here](https://github.com/ITISFoundation/osparc-simcore/pull/3380#discussion_r979893502).
 
 
-### CC2: No commented code
+----
+## Javascript
 
-Avoid commented code, but if you *really* want to keep it then add an explanatory `NOTE:`
-```python
-import os
-# import bar
-# x = "not very useful"
+In general the `qooxdoo` naming convention/style is followed. The [Access](http://qooxdoo.org/docs/#/core/oo_feature_summary?id=access) paragraph is the most notable. It is recommended to read the entire document.
 
-# NOTE: I need to keep this becase ...
-# import foo
-# x = "ok"
-```
+Have a look at `ESLint`'s configuration files [.eslintrc.json](.eslintrc.json) and [.eslintignore](.eslintignore).
 
 
 
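The Controller-Service-Repository pattern named in the new "Definitions" section above can be sketched in a few lines. This is only an illustration with invented names (`UserController`, `UserService`, `UserRepository`, the `User` entity); the convention actually adopted in the repository is the one shown in the linked article and PR #4389.

```python
# Hypothetical minimal Controller-Service-Repository layering.
# All names here are invented for illustration only.
from dataclasses import dataclass


@dataclass
class User:
    user_id: int
    name: str


class UserRepository:
    """Data access only: knows how to fetch/store entities."""

    def __init__(self) -> None:
        self._rows = {42: User(user_id=42, name="Ada")}

    def get(self, user_id: int) -> User:
        return self._rows[user_id]


class UserService:
    """Business logic: composes repository calls, no transport concerns."""

    def __init__(self, repository: UserRepository) -> None:
        self._repository = repository

    def display_name(self, user_id: int) -> str:
        return self._repository.get(user_id).name.title()


class UserController:
    """Transport layer: translates incoming requests into service calls."""

    def __init__(self, service: UserService) -> None:
        self._service = service

    def get_user_name(self, user_id: int) -> str:
        return self._service.display_name(user_id)


controller = UserController(UserService(UserRepository()))
assert controller.get_user_name(42) == "Ada"
```

The point of the layering is that each class depends only on the layer directly below it, so repositories can be swapped (e.g. for a fake in tests) without touching controllers.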

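The CC1 rule added above (no `TODO:`/`FIXME:` in master; keep ideas as `NOTE:` plus an issue link) can be expressed as a tiny check. This is only a sketch of the idea, not the actual bot, whose implementation is not part of this commit:

```python
# Illustrative sketch of the CC1 convention: a line passes the bot check
# only if it carries no TODO:/FIXME: markers; ideas worth keeping are
# rewritten as NOTE: with a link to a github issue.
FLAGGED = ("TODO:", "FIXME:")


def passes_cc1(line: str) -> bool:
    """Return True if the line would not be flagged as a code-smell."""
    return not any(marker in line for marker in FLAGGED)


assert not passes_cc1("# TODO: handle timeouts here")
assert passes_cc1("# NOTE: handle timeouts here, see issue <number>")
```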
services/api-server/openapi.json

Lines changed: 1 addition & 1 deletion

@@ -1570,8 +1570,8 @@
 "PUBLISHED",
 "NOT_STARTED",
 "PENDING",
+"WAITING_FOR_RESOURCES",
 "STARTED",
-"RETRY",
 "SUCCESS",
 "FAILED",
 "ABORTED"

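The enum change above swaps the `RETRY` state for `WAITING_FOR_RESOURCES`. As a sketch, the resulting state set could be mirrored as a Python enum; the name `RunningState` and this shape are assumptions for illustration, not the repository's actual model:

```python
# Hypothetical mirror of the openapi.json task-state enum after this commit:
# "WAITING_FOR_RESOURCES" is in, the removed "RETRY" state is gone.
from enum import Enum, unique


@unique
class RunningState(str, Enum):
    PUBLISHED = "PUBLISHED"
    NOT_STARTED = "NOT_STARTED"
    PENDING = "PENDING"
    WAITING_FOR_RESOURCES = "WAITING_FOR_RESOURCES"
    STARTED = "STARTED"
    SUCCESS = "SUCCESS"
    FAILED = "FAILED"
    ABORTED = "ABORTED"


assert "RETRY" not in RunningState.__members__
```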
services/api-server/src/simcore_service_api_server/api/routes/files.py

Lines changed: 85 additions & 96 deletions

@@ -54,7 +54,10 @@
 }
 
 
-@router.get("", response_model=list[File])
+@router.get(
+    "",
+    response_model=list[File],
+)
 async def list_files(
     storage_client: Annotated[StorageApi, Depends(get_api_client(StorageApi))],
     user_id: Annotated[int, Depends(get_current_user_id)],

@@ -74,7 +77,7 @@ async def list_files(
 
             file_meta: File = to_file_api_model(stored_file_meta)
 
-        except (ValidationError, ValueError, AttributeError) as err:
+        except (ValidationError, ValueError, AttributeError) as err:  # noqa: PERF203
             _logger.warning(
                 "Skipping corrupted entry in storage '%s' (%s)"
                 "TIP: check this entry in file_meta_data table.",

@@ -112,7 +115,10 @@ def _get_spooled_file_size(file_io: IO) -> int:
     return file_size
 
 
-@router.put("/content", response_model=File)
+@router.put(
+    "/content",
+    response_model=File,
+)
 @cancel_on_disconnect
 async def upload_file(
     request: Request,

@@ -129,6 +135,9 @@ async def upload_file(
 
     assert request  # nosec
 
+    if file.filename is None:
+        file.filename = "Undefined"
+
     file_size = await asyncio.get_event_loop().run_in_executor(
         None, _get_spooled_file_size, file.file
     )

@@ -151,16 +160,20 @@ async def upload_file(
         store_id=SIMCORE_LOCATION,
         store_name=None,
         s3_object=file_meta.storage_file_id,
-        path_to_upload=UploadableFileObject(file.file, file.filename, file_size),
+        path_to_upload=UploadableFileObject(
+            file_object=file.file,
+            file_name=file.filename,
+            file_size=file_size,
+        ),
         io_log_redirect_cb=None,
     )
-    assert isinstance(upload_result, UploadedFile)
+    assert isinstance(upload_result, UploadedFile)  # nosec
 
     file_meta.checksum = upload_result.etag
     return file_meta
 
 
-# MaG suggested a single function that can upload one or multiple files instead of having
+# NOTE: MaG suggested a single function that can upload one or multiple files instead of having
 # two of them. Tried something like upload_file( files: Union[list[UploadFile], File] ) but it
 # produces an error in the generated openapi.json
 #

@@ -183,16 +196,7 @@ async def get_upload_links(
     client_file: ClientFile,
     user_id: Annotated[PositiveInt, Depends(get_current_user_id)],
 ):
-    """Get upload links for uploading a file to storage
-
-    Arguments:
-        request -- Request object
-        client_file -- ClientFile object
-        user_id -- User Id
-
-    Returns:
-        FileUploadSchema
-    """
+    """Get upload links for uploading a file to storage"""
     assert request  # nosec
     file_meta: File = await File.create_from_client_file(
         client_file,

@@ -208,95 +212,27 @@ async def get_upload_links(
         file_size=ByteSize(client_file.filesize),
         is_directory=False,
     )
-    query = str(upload_links.links.complete_upload.query).removesuffix(":complete")
-    complete_url = request.url_for(
+
+    query = f"{upload_links.links.complete_upload.query}".removesuffix(":complete")
+    url = request.url_for(
         "complete_multipart_upload", file_id=file_meta.id
     ).include_query_params(**dict(item.split("=") for item in query.split("&")))
-    query = str(upload_links.links.abort_upload.query).removesuffix(":abort")
-    abort_url = request.url_for(
+    upload_links.links.complete_upload = parse_obj_as(AnyUrl, f"{url}")
+
+    query = f"{upload_links.links.abort_upload.query}".removesuffix(":abort")
+    url = request.url_for(
         "abort_multipart_upload", file_id=file_meta.id
     ).include_query_params(**dict(item.split("=") for item in query.split("&")))
+    upload_links.links.abort_upload = parse_obj_as(AnyUrl, f"{url}")
 
-    upload_links.links.complete_upload = parse_obj_as(AnyUrl, f"{complete_url}")
-    upload_links.links.abort_upload = parse_obj_as(AnyUrl, f"{abort_url}")
     return ClientFileUploadSchema(file_id=file_meta.id, upload_schema=upload_links)
 
 
-@router.post(
-    "/{file_id}:complete",
+@router.get(
+    "/{file_id}",
     response_model=File,
-    include_in_schema=API_SERVER_DEV_FEATURES_ENABLED,
-)
-@cancel_on_disconnect
-async def complete_multipart_upload(
-    request: Request,
-    file_id: UUID,
-    client_file: Annotated[ClientFile, Body(...)],
-    uploaded_parts: Annotated[FileUploadCompletionBody, Body(...)],
-    storage_client: Annotated[StorageApi, Depends(get_api_client(StorageApi))],
-    user_id: Annotated[PositiveInt, Depends(get_current_user_id)],
-):
-    """Complete multipart upload
-
-    Arguments:
-        request: The Request object
-        file_id: The Storage id
-        file: The File object which is to be completed
-        uploaded_parts: The uploaded parts
-        completion_link: The completion link
-        user_id: The user id
-
-    Returns:
-        The completed File object
-    """
-    assert request  # nosec
-    assert user_id  # nosec
-
-    file: File = File(id=file_id, filename=client_file.filename, checksum=None)
-    complete_link: URL = await storage_client.generate_complete_upload_link(
-        file, dict(request.query_params)
-    )
-
-    e_tag: ETag = await complete_file_upload(
-        uploaded_parts=uploaded_parts.parts,
-        upload_completion_link=parse_obj_as(AnyUrl, f"{complete_link}"),
-    )
-
-    file.checksum = e_tag
-    return file
-
-
-@router.post(
-    "/{file_id}:abort",
-    include_in_schema=API_SERVER_DEV_FEATURES_ENABLED,
+    responses={**_COMMON_ERROR_RESPONSES},
 )
-@cancel_on_disconnect
-async def abort_multipart_upload(
-    request: Request,
-    file_id: UUID,
-    client_file: Annotated[ClientFile, Body(..., embed=True)],
-    storage_client: Annotated[StorageApi, Depends(get_api_client(StorageApi))],
-    user_id: Annotated[PositiveInt, Depends(get_current_user_id)],
-):
-    """Abort a multipart upload
-
-    Arguments:
-        request: The Request
-        file_id: The StorageFileID
-        upload_links: The FileUploadSchema
-        user_id: The user id
-
-    """
-    assert request  # nosec
-    assert user_id  # nosec
-    file: File = File(id=file_id, filename=client_file.filename, checksum=None)
-    abort_link: URL = await storage_client.generate_abort_upload_link(
-        file, query=dict(request.query_params)
-    )
-    await abort_upload(abort_upload_link=parse_obj_as(AnyUrl, str(abort_link)))
-
-
-@router.get("/{file_id}", response_model=File, responses={**_COMMON_ERROR_RESPONSES})
 async def get_file(
     file_id: UUID,
     storage_client: Annotated[StorageApi, Depends(get_api_client(StorageApi))],

@@ -343,6 +279,57 @@ async def delete_file(
     raise NotImplementedError(msg)
 
 
+@router.post(
+    "/{file_id}:abort",
+    include_in_schema=API_SERVER_DEV_FEATURES_ENABLED,
+)
+async def abort_multipart_upload(
+    request: Request,
+    file_id: UUID,
+    client_file: Annotated[ClientFile, Body(..., embed=True)],
+    storage_client: Annotated[StorageApi, Depends(get_api_client(StorageApi))],
+    user_id: Annotated[PositiveInt, Depends(get_current_user_id)],
+):
+    assert request  # nosec
+    assert user_id  # nosec
+    file: File = File(id=file_id, filename=client_file.filename, checksum=None)
+    abort_link: URL = await storage_client.create_abort_upload_link(
+        file, query=dict(request.query_params)
+    )
+    await abort_upload(abort_upload_link=parse_obj_as(AnyUrl, str(abort_link)))
+
+
+@router.post(
+    "/{file_id}:complete",
+    response_model=File,
+    include_in_schema=API_SERVER_DEV_FEATURES_ENABLED,
+)
+@cancel_on_disconnect
+async def complete_multipart_upload(
+    request: Request,
+    file_id: UUID,
+    client_file: Annotated[ClientFile, Body(...)],
+    uploaded_parts: Annotated[FileUploadCompletionBody, Body(...)],
+    storage_client: Annotated[StorageApi, Depends(get_api_client(StorageApi))],
+    user_id: Annotated[PositiveInt, Depends(get_current_user_id)],
+):
+    assert request  # nosec
+    assert user_id  # nosec
+
+    file: File = File(id=file_id, filename=client_file.filename, checksum=None)
+    complete_link: URL = await storage_client.create_complete_upload_link(
+        file, dict(request.query_params)
+    )
+
+    e_tag: ETag = await complete_file_upload(
+        uploaded_parts=uploaded_parts.parts,
+        upload_completion_link=parse_obj_as(AnyUrl, f"{complete_link}"),
+    )
+
+    file.checksum = e_tag
+    return file
+
+
 @router.get(
     "/{file_id}/content",
     response_class=RedirectResponse,

@@ -370,7 +357,9 @@ async def download_file(
 
     # download from S3 using pre-signed link
     presigned_download_link = await storage_client.get_download_link(
-        user_id, file_meta.id, file_meta.filename
+        user_id=user_id,
+        file_id=file_meta.id,
+        file_name=file_meta.filename,
    )
 
     _logger.info("Downloading %s to %s ...", file_meta, presigned_download_link)
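The query-string rewriting kept in `get_upload_links` above (drop the trailing `:complete`/`:abort` action suffix, then re-encode each `key=value` pair as a query parameter) can be exercised standalone. The values below are made up purely for illustration:

```python
# Sketch of the rewriting done in get_upload_links, with made-up values:
# strip the ":complete" action suffix, then split the remaining query
# string into a parameter dict, as fed to include_query_params(**params).
raw_query = "uploadId=abc123&partNumber=3:complete"
query = raw_query.removesuffix(":complete")
params = dict(item.split("=") for item in query.split("&"))
assert params == {"uploadId": "abc123", "partNumber": "3"}
```

Note this simple split assumes each parameter contains exactly one `=` and no URL-encoded `&`, which holds for the storage upload links handled here.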

services/api-server/src/simcore_service_api_server/api/routes/solvers_jobs.py

Lines changed: 4 additions & 8 deletions

@@ -32,13 +32,13 @@
 from ...models.schemas.solvers import Solver, SolverKeyId
 from ...services.catalog import CatalogApi
 from ...services.director_v2 import DirectorV2Api, DownloadLink, NodeName
-from ...services.storage import StorageApi, to_file_api_model
-from ...utils.solver_job_models_converters import (
+from ...services.solver_job_models_converters import (
     create_job_from_project,
     create_jobstatus_from_task,
     create_new_project_for_job,
 )
-from ...utils.solver_job_outputs import ResultsTypes, get_solver_output_results
+from ...services.solver_job_outputs import ResultsTypes, get_solver_output_results
+from ...services.storage import StorageApi, to_file_api_model
 from ..dependencies.application import get_product_name, get_reverse_url_mapper
 from ..dependencies.authentication import get_current_user_id
 from ..dependencies.database import Engine, get_db_engine

@@ -127,11 +127,7 @@ async def get_jobs_page(
     url_for: Annotated[Callable, Depends(get_reverse_url_mapper)],
     product_name: Annotated[str, Depends(get_product_name)],
 ):
-    """List of jobs on a specific released solver (includes pagination)
-
-
-    Breaking change in *version 0.5*: response model changed from list[Job] to pagination Page[Job].
-    """
+    """List of jobs on a specific released solver (includes pagination)"""
 
     # NOTE: Different entry to keep backwards compatibility with list_jobs.
     # Eventually use a header with agent version to switch to new interface
