Commit 7cfe363

Add working 2D example
1 parent 1a78f16 commit 7cfe363

File tree

4 files changed, +158 -0 lines changed

examples/14_2D_only_test/README.md

Lines changed: 6 additions & 0 deletions
@@ -0,0 +1,6 @@
**IMPORTANT**: The `BASE_FOLDER_EXAMPLE` variable at the top of the script needs to be adapted so that it points to the examples folder of the user who submits the example.

This example ran successfully with:
* `fractal-server==0.3.5, fractal-client==0.2.9, fractal-tasks-core==0.2.6`

The features were not calculated yet, and we could not find the issue in the first tests. Let's revisit this with the server 1.x version, which has better logs.
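
As a rough sketch of the setup (the path below is only a placeholder, and the packages are assumed to be installable under exactly the names given above):

pip install fractal-server==0.3.5 fractal-client==0.2.9 fractal-tasks-core==0.2.6
# then edit the main script so that BASE_FOLDER_EXAMPLE points to your own checkout, e.g.:
BASE_FOLDER_EXAMPLE="/data/homes/<your-username>/fractal-demos/examples"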

examples/14_2D_only_test/aux_extract_id_from_project_json.py

Lines changed: 32 additions & 0 deletions
@@ -0,0 +1,32 @@
import json
import sys


msg = (
    "Usage 1: python aux_extract_id_from_project_json.py FILENAME"
    " project_id\nUsage 2: python aux_extract_id_from_project_json.py"
    " FILENAME dataset_id DATASET_NAME"
)
if len(sys.argv[1:]) < 2:
    raise Exception(msg)

filename, which_id = sys.argv[1:3]
if which_id == "dataset_id":
    if len(sys.argv[1:]) != 3:
        raise Exception(msg)
    dataset_name = sys.argv[3]

with open(filename, "r") as fin:
    d = json.load(fin)

if which_id == "project_id":
    # Print the project id
    print(d["id"])
elif which_id == "dataset_id":
    dataset_list = d["dataset_list"]
    dataset_id = next(
        ds["id"] for ds in dataset_list if ds["name"] == dataset_name
    )
    print(dataset_id)
else:
    raise Exception(f"ERROR: {which_id=}")
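
For reference, the two usages described in the usage message correspond to invocations like the following (tmp.json is the file the main script writes via `fractal -j project new ... > $TMPJSON`, and "default" is the dataset name the main script queries):

python aux_extract_id_from_project_json.py tmp.json project_id
python aux_extract_id_from_project_json.py tmp.json dataset_id "default"

The helper only reads the top-level "id" field and the "name"/"id" fields of the entries in "dataset_list", so it works with any project JSON that contains those keys.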

examples/14_2D_only_test/regionprops_from_existing_labels_feature.yaml

Lines changed: 12 additions & 0 deletions
@@ -0,0 +1,12 @@
!!python/object:napari_workflows._workflow.Workflow
_tasks:
  regionprops_DAPI: !!python/tuple
  - !!python/name:napari_skimage_regionprops._regionprops.regionprops_table ''
  - dapi_img
  - label_img
  - true
  - true
  - false
  - false
  - false
  - false
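
In the `!!python/tuple` above, the first entry is the function that napari-workflows calls (`napari_skimage_regionprops._regionprops.regionprops_table`) and the remaining entries are its positional arguments: the intensity image (`dapi_img`), the label image (`label_img`), and six boolean flags selecting which measurement groups to compute (presumably size, intensity, perimeter, shape, position, and moments, in that order; this mapping is an assumption based on the function's signature).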
Lines changed: 108 additions & 0 deletions
@@ -0,0 +1,108 @@
# For the demo: Run everything as the current user
USERNAME="$(whoami)"
PORT=8001
echo -e "FRACTAL_USER=$USERNAME@me.com\nFRACTAL_PASSWORD=test\nFRACTAL_SERVER=http://localhost:$PORT" > .fractal.env

###############################################################################
# THINGS TO BE CHANGED BY THE USER
# Adapt these settings to where you run the example
BASE_FOLDER_EXAMPLE="/data/homes/$USERNAME/03_fractal/fractal-demos/examples"
###############################################################################

LABEL="2d_only"

###############################################################################
# IMPORTANT: modify the following lines, depending on your preferences
# 1. They MUST include a `cd` command to a path where your user can write. The
#    simplest is to use `cd $HOME`, but notice that this will create many sh
#    scripts in your folder. You can also use `cd $HOME/.fractal_scripts`, but
#    first make sure that such a folder exists.
# 2. They MAY include additional commands to load a python environment. The ones
#    used in the current example are appropriate for the UZH Pelkmans lab setup.
TARGET_DIR=~/.tmp_fractal
rm -r $TARGET_DIR
mkdir $TARGET_DIR
WORKER_INIT="\
export HOME=$HOME; \
cd $TARGET_DIR; \
source /opt/easybuild/software/Anaconda3/2019.07/etc/profile.d/conda.sh; \
conda activate /data/homes/fractal/sharedfractal; \
"
###############################################################################
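# For reference, a minimal alternative WORKER_INIT for a setup that uses a plain
# virtual environment instead of the shared conda environment could look like the
# following (hypothetical path, adapt to your installation):
# WORKER_INIT="\
# export HOME=$HOME; \
# cd $TARGET_DIR; \
# source /path/to/your/fractal-venv/bin/activate; \
# "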

# Set useful variables
PRJ_NAME="2proj-"$USERNAME
DS_IN_NAME="input-ds-"$USERNAME
DS_OUT_NAME="output-ds-"$USERNAME
WF_NAME="2WF-2x2-"$USERNAME

# Set cache path and remove any previous file from there
export FRACTAL_CACHE_PATH=`pwd`/".cache"
rm -v ${FRACTAL_CACHE_PATH}/session
rm -v ${FRACTAL_CACHE_PATH}/tasks

# Define/initialize empty project folder and temporary file
PROJ_DIR=$BASE_FOLDER_EXAMPLE/tmp_${LABEL}
rm -r $PROJ_DIR
mkdir $PROJ_DIR
TMPJSON=${PROJ_DIR}/tmp.json

###############################################################################
# IMPORTANT: modify the following lines so that they point to absolute paths
INPUT_PATH="/data/active/fractal/2D/hiPSC_Slice/2D_test_set"
OUTPUT_PATH='/data/active/jluethi/Fractal/20221110-2D-only-test/'
###############################################################################

# Define useful auxiliary command (this will be removed in the future)
CMD_JSON="python aux_extract_id_from_project_json.py $TMPJSON"

# Create project
fractal -j project new $PRJ_NAME $PROJ_DIR > $TMPJSON
PRJ_ID=`$CMD_JSON project_id`
DS_IN_ID=`$CMD_JSON dataset_id "default"`
echo "PRJ_ID: $PRJ_ID"
echo "DS_IN_ID: $DS_IN_ID"

# Update dataset name/type, and add a resource
fractal dataset edit --name "$DS_IN_NAME" -t image --read-only $PRJ_ID $DS_IN_ID
fractal dataset add-resource -g "*.png" $PRJ_ID $DS_IN_ID $INPUT_PATH

# Add output dataset, and add a resource to it
DS_OUT_ID=`fractal --batch project add-dataset $PRJ_ID "$DS_OUT_NAME"`
fractal dataset edit -t zarr --read-write $PRJ_ID $DS_OUT_ID
fractal dataset add-resource -g "*.zarr" $PRJ_ID $DS_OUT_ID $OUTPUT_PATH

# Create workflow
WF_ID=`fractal --batch task new "$WF_NAME" workflow image zarr`
echo "WF_ID: $WF_ID"

# Add subtasks

echo "{\"num_levels\": 5, \"executor\": \"cpu-low\", \"coarsening_xy\": 2, \"channel_parameters\": {\"A01_C01\": {\"label\": \"Channel 1\",\"colormap\": \"00FFFF\",\"start\": 110,\"end\": 700 }, \"A01_C02\": {\"label\": \"Channel 2\",\"colormap\": \"FF00FF\",\"start\": 110,\"end\": 1500 }, \"A02_C03\": {\"label\": \"Channel 3\",\"colormap\": \"FFFF00\",\"start\": 110,\"end\": 1500 }}}" > ${PROJ_DIR}/args_create.json
fractal task add-subtask $WF_ID "Create OME-ZARR structure" --args-file ${PROJ_DIR}/args_create.json
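# For readability, the args_create.json written by the echo above is equivalent to
# the following (same content, just reformatted; kept commented out):
# cat > ${PROJ_DIR}/args_create.json << 'EOF'
# {
#   "num_levels": 5,
#   "executor": "cpu-low",
#   "coarsening_xy": 2,
#   "channel_parameters": {
#     "A01_C01": {"label": "Channel 1", "colormap": "00FFFF", "start": 110, "end": 700},
#     "A01_C02": {"label": "Channel 2", "colormap": "FF00FF", "start": 110, "end": 1500},
#     "A02_C03": {"label": "Channel 3", "colormap": "FFFF00", "start": 110, "end": 1500}
#   }
# }
# EOF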

echo "{\"executor\": \"cpu-mid\"}" > ${PROJ_DIR}/args_yoko.json
fractal task add-subtask $WF_ID "Yokogawa to Zarr" --args-file ${PROJ_DIR}/args_yoko.json

# # Paths of illumination correction images need to be accessible on the server.
# # This works if one runs the client from the same machine as the server. Otherwise, change `root_path_corr`
# echo "{\"overwrite\": true, \"executor\": \"cpu-low\", \"dict_corr\": {\"root_path_corr\": \"$BASE_FOLDER_EXAMPLE/../demo-october-2022/illum_corr_images/\", \"A01_C01\": \"20220621_UZH_manual_illumcorr_40x_A01_C01.png\", \"A01_C02\": \"20220621_UZH_manual_illumcorr_40x_A01_C02.png\", \"A02_C03\": \"20220621_UZH_manual_illumcorr_40x_A02_C03.png\"}}" > ${PROJ_DIR}/args_illum.json
# fractal task add-subtask $WF_ID "Illumination correction" --args-file ${PROJ_DIR}/args_illum.json

# Doing an MIP on a 2D dataset isn't meaningful. It is included here only to check
# that it doesn't break anything; it was fine in my tests.
echo "{\"executor\": \"cpu-low\"}" > ${PROJ_DIR}/args_replicate.json
fractal task add-subtask $WF_ID "Replicate Zarr structure" --args-file ${PROJ_DIR}/args_replicate.json

echo "{\"executor\": \"cpu-low\"}" > ${PROJ_DIR}/args_mip.json
fractal task add-subtask $WF_ID "Maximum Intensity Projection" --args-file ${PROJ_DIR}/args_mip.json

echo "{\"labeling_level\": 2, \"executor\": \"cpu-low\", \"ROI_table_name\": \"well_ROI_table\"}" > ${PROJ_DIR}/args_labeling.json
fractal task add-subtask $WF_ID "Cellpose Segmentation" --args-file ${PROJ_DIR}/args_labeling.json

echo "{\"level\": 0, \"measurement_table_name\": \"nuclei\", \"executor\": \"cpu-mid\", \"ROI_table_name\": \"well_ROI_table\",\"workflow_file\": \"$BASE_FOLDER_EXAMPLE/14_2D_only_test/regionprops_from_existing_labels_feature.yaml\"}" > ${PROJ_DIR}/args_measurement.json
fractal task add-subtask $WF_ID "Measurement" --args-file ${PROJ_DIR}/args_measurement.json

# Apply workflow
fractal task apply $PRJ_ID $DS_IN_ID $DS_OUT_ID $WF_ID --worker_init "$WORKER_INIT"
