Commit f7b2edfc authored by Peter Steinbach

Merge pull request #6 from schmiedc/Documentation

Documentation
parents 6f9d770c cd05e2c3
Showing 787 additions and 134 deletions
Datasets
========================
The scripts now support multiple angles, multiple channels, and multiple illumination directions without adjusting the Snakefile or .bsh scripts.
Based on SPIM registration version 3.3.9
Supported datasets are in the following format:
@@ -25,27 +25,72 @@ Timelapse based workflow
Expected setup
--------------
Clone the repository:
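A minimal sketch of the checkout; the URL and paths below are placeholders, substitute the actual location of the workflow repository:

```bash
# URL and paths are placeholders; adjust to your own fork/location
git clone https://github.com/user/snakemake-workflows.git
cd snakemake-workflows/spim_registration/timelapse
```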
The repository contains the example configuration files for single and dual channel datasets, the Snakefile that defines the workflow, the beanshell scripts that drive the processing via Fiji, and a cluster.json file that contains information for the cluster queuing system.
```bash
/path/to/repo/timelapse
├── single_test.yaml
├── dual_OneChannel.yaml
├── Snakefile
├── cluster.json
├── define_tif_zip.bsh
├── define_czi.bsh
├── registration.bsh
├── deconvolution.bsh
├── transform.bsh
└── xml_merge.bsh
```
The data directory contains the .yaml file for the specific dataset; you can either copy it there, if you want to keep it together with the dataset, or symlink it from the processing repository (see the sketch after the list below). A data directory looks, e.g., like this:
```bash
/path/to/data
├── dataset.czi
├── dataset(1).czi
├── dataset(2).czi
├── dataset(3).czi
├── dataset(4).czi
└── dataset.yaml # copied/symlinked from this repo
```
* `tomancak.yaml` contains the parameters that configure the beanshell scripts found in the data directory
* the `Snakefile` from this directory
* a `cluster.json` that resides in the same directory as the `Snakefile`
* a cluster running LSF
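For instance, linking the shipped single-channel config into the data directory could look like this (the paths are placeholders taken from the trees above):

```bash
# paths are placeholders; adjust them to your checkout and data location
ln -s /path/to/repo/timelapse/single_test.yaml /path/to/data/dataset.yaml
```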
Tools
--------------
The tools directory contains scripts for common file format pre-processing.
Some datasets are currently only usable after resaving them into .tif:
* discontinuous .czi datasets
* .czi datasets with multiple groups

The master_preprocessing.sh file is the configuration script that contains the information about the dataset that needs to be resaved or split. rename-zeiss-file.sh renames the .czi files into the .tif naming convention for SPIM processing: SPIM_TL{t}_Angle{a}.tif. The individual resaving steps are then carried out by creating the jobs and submitting them to the cluster (see the sketch after the directory listing below).
```bash
/path/to/repo/tools
├── master_preprocessing.sh
├── rename-zeiss-file.sh
├── compress
│   ├── create-compress-jobs.sh
│   ├── for_czi.bsh
│   └── submit-jobs
├── czi_resave
│   ├── create-resaving-jobs.sh
│   ├── resaving.bsh
│   └── submit-jobs
└── split_channels
    ├── create-split-jobs.sh
    ├── split.bsh
    └── submit.jobs
```
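A resave run then boils down to generating the job files and handing them to LSF. A rough sketch; note that the scripts locate master_preprocessing.sh via relative paths, so run them from the directory they live in:

```bash
# sketch only: assumes master_preprocessing.sh has been edited for the dataset
cd czi_resave
./create-resaving-jobs.sh   # writes one .job file per timepoint/angle pair
./submit-jobs .             # bsubs every .job file found in the given directory
```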
Submitting Jobs
---------------
@@ -61,3 +106,12 @@ If not:
```bash
snakemake -j2 -d /path/to/data/ --cluster-config ./cluster.json --cluster "bsub -q {cluster.lsf_q} {cluster.lsf_extra}"
```
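A dry run is a cheap sanity check before submitting anything; a sketch:

```bash
# -n performs a dry run: print the jobs snakemake would execute without running them
snakemake -n -d /path/to/data/
```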
Log files and supervision of the pipeline
---------------
The log files are written into a new directory called "logs" inside the data directory.
They are ordered according to their position in the workflow; multiple or alternative steps in the pipeline are distinguished by numbers.
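The resulting layout could look like this, following the log names defined in the Snakefile (exact ids depend on the timepoint padding of the dataset):

```bash
ls /path/to/data/logs/
# a1_define_xml_czi.log   b1_hdf5_xml.log   b2_resave_hdf5-0.log
# c_hdf5_single-0-registration.log   d1_hdf5_single_merge.log   ...
```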
To force certain rules, use the `-R` flag to rerun a particular rule and everything downstream of it: `snakemake -R <name of rule>`
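For example, to redo the registration step and everything that depends on it (rule names as defined in the Snakefile, paths as above):

```bash
# rerun the registration rule and everything downstream of it
snakemake -j2 -d /path/to/data/ -R registration
```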
@@ -7,14 +7,13 @@ if JOBDIR[-1] != "/": # this checks if JOBDIR ends with a slash; if not, it adds one
JOBDIR+="/"
# Test config file single Channel:
configfile: "single_test.yaml"

# Test config file dual channel one channel contains beads:
# configfile: "dual_OneChannel.yaml"
# data specific config file, expected to be inside JOBDIR
# configfile: "tomancak_test_cluster.yaml"
padding_format = "{0:0"+str(padding_of_file_id(int(config["common"]["ntimepoints"])))+"d}"
ds_format = "-"+padding_format+"-00.h5"
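# note: padding_format pads the timepoint id to a fixed width (the width comes from
# padding_of_file_id(), a helper defined elsewhere), so the per-timepoint datasets
# resolve to names like "<xml_base>-<padded id>-00.h5"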
@@ -27,14 +26,20 @@ rule done:
input: [ ds + "_output_hdf5" for ds in datasets ]
#input: [ ds + "_fusion" for ds in datasets ]
localrules: define_xml_czi, define_xml_tif, hdf5_xml, xml_merge, timelapse,
duplicate_transformations, external_transform, define_output,
hdf5_xml_output
rule resave_prepared:
input: expand("{dataset}.{suffix}",dataset=[ config["common"]["hdf5_xml_filename"] ], suffix=["xml","h5"])
# defining xml for czi dataset
rule define_xml_czi:
input: glob.glob('*.czi'), config["common"]["first_czi"]
output: config["common"]["first_xml_filename"] + ".xml"
log: "define_xml_czi.log"
log: "logs/a1_define_xml_czi.log"
run:
cmd_string = produce_string("""{fiji-prefix} {fiji-app} \
-Dimage_file_directory={jdir} \
@@ -60,9 +65,9 @@ rule define_xml_czi:
# defining xml for tif dataset
rule define_xml_tif:
input: glob.glob(re.sub("{{.}}","*",config["common"]['image_file_pattern'])) # replaces all occurrences of {{a}} (a can be any character) with * to use the string for globbing
output: config["common"]["first_xml_filename"] + ".xml"
log: "define_xml_tif.log"
log: "logs/a2_define_xml_tif.log"
run:
cmd_string = produce_string(
"""{fiji-prefix} {fiji-app} \
@@ -101,7 +106,7 @@ rule hdf5_xml:
input: config["common"]["first_xml_filename"] + ".xml"
output: expand("{dataset}.{suffix}",dataset=[ config["common"]["hdf5_xml_filename"].strip('\"')], suffix=["xml","h5"]),
[ item+"_xml" for item in datasets ]
log: "hdf5_xml.log"
log: "logs/b1_hdf5_xml.log"
run:
part_string = produce_string(
"""{fiji-prefix} {fiji-app} \
@@ -119,7 +124,6 @@ rule hdf5_xml:
-Drun_only_job_number=0 \
-- --no-splash {path_bsh}""",
config["common"],
config["define_xml_czi"],
config["resave_hdf5"],
jdir=JOBDIR,
path_bsh=config["common"]["bsh_directory"] + config["resave_hdf5"]["bsh_file"])
@@ -132,7 +136,7 @@ rule hdf5_xml:
rule resave_hdf5:
input: rules.hdf5_xml.output # "{xml_base}-{file_id,\d+}-00.h5_xml"
output: "{xml_base}-{file_id,\d+}-00.h5", "{xml_base}-{file_id,\d+}-00.h5_hdf5"
log: "resave_hdf5-{file_id}.log"
log: "logs/b2_resave_hdf5-{file_id}.log"
run:
part_string = produce_string(
"""{fiji-prefix} {fiji-app} \
@@ -150,7 +154,6 @@ rule resave_hdf5:
-Drun_only_job_number={job_number} \
-- --no-splash {path_bsh}""", # the & submits everything at once
config["common"],
config["define_xml_czi"],
config["resave_hdf5"],
jdir=JOBDIR,
path_bsh=config["common"]["bsh_directory"] + config["resave_hdf5"]["bsh_file"],
@@ -162,7 +165,7 @@ rule resave_hdf5:
rule registration:
input: "{xml_base}-{file_id}-00.h5"
output: "{xml_base}.job_{file_id,\d+}.xml"#, "{xml_base}-{file_id,\d+}-00.h5_registered",
log: "{xml_base}-{file_id}-registration.log"
log: "logs/c_{xml_base}-{file_id}-registration.log"
run:
cmd_string = produce_string(
"""{fiji-prefix} {fiji-app} \
@@ -216,7 +219,7 @@ rule registration:
rule xml_merge:
input: [ str(config["common"]["hdf5_xml_filename"].strip('\"')+".job_"+(padding_format.format(item))+".xml") for item in range(int(config["common"]["ntimepoints"])) ] #[ item+"_registered" for item in datasets ]
output: "{xml_base}_merge.xml"
log: "{xml_base}_merge.log"
log: "logs/d1_{xml_base}_merge.log"
run:
cmd_string = produce_string(
"""{fiji-prefix} {fiji-app} \
@@ -234,7 +237,7 @@ rule xml_merge:
rule timelapse:
input: rules.xml_merge.output
output: rules.xml_merge.output[0] + "_timelapse"
log: "{xml_base}_timelapse.log"
log: "logs/d2_{xml_base}_timelapse.log"
run:
cmd_string = produce_string(
"""{fiji-prefix} {fiji-app} \
@@ -267,7 +270,7 @@ rule timelapse:
rule duplicate_transformations:
input: rules.timelapse.output, merged_xml="{xml_base}_merge.xml"
output: rules.timelapse.output[0] + "_duplicate"
log: "{xml_base}_duplicate_transformations.log"
log: "logs/d3_{xml_base}_duplicate_transformations.log"
run:
cmd_string = produce_string(
"""{fiji-prefix} {fiji-app} \
@@ -294,7 +297,7 @@ rule duplicate_transformations:
rule fusion:
input: [ str("{xml_base}_merge.xml_" + config["common"]["transformation_switch"] ) ], "{xml_base}-{file_id,\d+}-00.h5", merged_xml="{xml_base}_merge.xml" # rules.timelapse.output, "{xml_base}-{file_id,\d+}-00.h5", merged_xml="{xml_base}_merge.xml"
output: "{xml_base}-{file_id,\d+}-00.h5_fusion"
log: "{xml_base}-{file_id,\d+}-00-fusion.log"
log: "logs/e1_{xml_base}-{file_id,\d+}-00-fusion.log"
run:
cmd_string = produce_string(
"""{fiji-prefix} {fiji-app} \
@@ -337,7 +340,7 @@ rule fusion:
rule external_transform:
input: rules.timelapse.output, merged_xml="{xml_base}_merge.xml"
output: rules.timelapse.output[0] + "_external_trafo"
log: "external_transform.log"
log: "logs/e2_external_transform.log"
run:
cmd_string = produce_string(
"""{fiji-prefix} {fiji-app} \
@@ -365,7 +368,7 @@ rule external_transform:
rule deconvolution:
input: [ str("{xml_base}_merge.xml_" + config["common"]["transformation_switch"] ) ], "{xml_base}-{file_id,\d+}-00.h5", merged_xml="{xml_base}_merge.xml" # rules.timelapse.output, "{xml_base}-{file_id,\d+}-00.h5", merged_xml="{xml_base}_merge.xml" # rules.external_transform.output, "{xml_base}-{file_id,\d+}-00.h5", merged_xml="{xml_base}_merge.xml"
output: "{xml_base}-{file_id,\d+}-00.h5_deconvolution"
log: "{xml_base}-{file_id,\d+}-00-deconvolution.log"
log: "logs/e2_{xml_base}-{file_id,\d+}-00-deconvolution.log"
run:
cmd_string = produce_string(
"""{fiji-prefix} {fiji-app} \
@@ -411,8 +414,8 @@ rule deconvolution:
rule define_output:
input: [ item + "_" + config["common"]["fusion_switch"] for item in datasets ], glob.glob('TP*')
output: config["hdf5_output"]["output_xml"].strip('\"') + ".xml"
log: "define_output.log"
output: config["common"]["output_xml"].strip('\"') + ".xml"
log: "logs/f1_define_output.log"
run:
cmd_string = produce_string(
"""{fiji-prefix} {fiji-app} \
@@ -442,10 +445,10 @@ rule define_output:
# create mother .xml/.h5
rule hdf5_xml_output:
input: config["hdf5_output"]["output_xml"].strip('\"') + ".xml"
output: expand("{dataset}.{suffix}",dataset=[ config["common"]["hdf5_xml_filename"].strip('\"')], suffix=["xml","h5"]),
input: config["common"]["output_xml"].strip('\"') + ".xml"
output: expand("{dataset}.{suffix}",dataset=[ config["common"]["output_hdf5_xml"].strip('\"')], suffix=["xml","h5"]),
[ item+"_output" for item in datasets ]
log: "output_hdf5_xml.log"
log: "logs/f2_output_hdf5_xml.log"
run:
part_string = produce_string(
"""{fiji-prefix} {fiji-app} \
@@ -476,7 +479,7 @@ rule hdf5_xml_output:
rule resave_hdf5_output:
input: rules.hdf5_xml_output.output
output: "{xml_base}-{file_id,\d+}-00.h5_output_hdf5"
log: "resave_output-{file_id}.log"
log: "logs/f3_resave_output-{file_id}.log"
run:
part_string = produce_string(
"""{fiji-prefix} {fiji-app} \
......
common: {
# ============================================================================
#
# yaml example file for single channel processing
#
# General settings for processing
#
# ============================================================================
# directory that contains the bean shell scripts and Snakefile
bsh_directory: "/projects/pilot_spim/Christopher/snakemake-workflows/spim_registration/timelapse/",
# Directory that contains the cuda libraries
directory_cuda: "/sw/users/schmied/cuda/",
# Directory that contains the current working Fiji
fiji-app: "/sw/users/schmied/packages/2015-06-30_Fiji.app.cuda/ImageJ-linux64",
fiji-prefix: "/sw/users/schmied/packages/xvfb-run -a", # calls xvfb for Fiji headless mode
# ============================================================================
# Processing switches
# Description: Use switches to decide which processing steps you need:
#
# Options:
# transformation_switch: "timelapse" standard processing
# after timelapse registration directly goes into fusion, timelapse_duplicate
# "timelapse_duplicate" for dual channel processing one channel contains the beads
#
# Switches between content based fusion and deconvolution
# "deconvolution" > for deconvolution
# "fusion" > for content based fusion
# ============================================================================
#
# Transformation switch:
transformation_switch: "timelapse",
# Fusion switch:
fusion_switch: "fusion",
# ============================================================================
# xml file names
#
# xml file names without .xml suffix
# ============================================================================
first_xml_filename: 'single', # Name of the xml file for the .czi or .tif files
hdf5_xml_filename: '"hdf5_single"', # Name of .xml file for the hdf5 data after resave_hdf5
merged_xml: 'hdf5_single_merge', # Name of .xml file after merge
# ============================================================================
# Describe the dataset
#
# Options: number of timepoints
# angles
# channels
# illuminations
# pixel size
# ============================================================================
ntimepoints: 2, # number of timepoints of dataset
angles: "0,72,144,216,288", # angles
angles: "0,72,144,216,288", # angles
channels: "green", # channels
illumination: "0", # illuminations
pixel_distance_x: '0.28590106964', # Manual calibration x
pixel_distance_y: '0.28590106964', # Manual calibration y
pixel_distance_z: '1.50000', # Manual calibration z
pixel_unit: "um", # unit of manual calibration
# ----------------------------------------------------------------------------
# For .czi datasets
# master .czi file
first_czi: "2015-02-21_LZ1_Stock68_3.czi",
# ----------------------------------------------------------------------------
# For .tif datasets
# file pattern of .tif files:
# for multi channel give spim_TL{tt}_Angle{a}_Channel{c}.tif
# for padded zeros use tt
image_file_pattern: 'img_TL{{t}}_Angle{{a}}.tif',
# ============================================================================
# Detection and registration
#
# Description: settings for interest point detection and registration
# Options: Single channel and dual channel processing
# Difference-of-mean or difference-of-gaussian detection
# ============================================================================
# reg_process_channel:
# Single Channel: '"All channels"'
# Dual Channel: '"All channels"'
# Dual Channel one Channel contains beads: '"Single channel (Select from List)"'
reg_process_channel: '"All channels"',
#
# Dual channel 1 Channel contains the beads: which channel contains the beads?
reg_processing_channel: '"green"',
#
# reg_interest_points_channel:
# Single Channel: '"beads"'
# Dual Channel: '"beads,beads"'
# Dual Channel: Channel does not contain the beads '"[DO NOT register this channel],beads"'
reg_interest_points_channel: '"beads"',
#
# type of detection: '"Difference-of-Mean (Integral image based)"' or '"Difference-of-Gaussian"'
type_of_detection: '"Difference-of-Mean (Integral image based)"',
# Settings for Difference-of-Mean
# For multiple channels 'value1,value2' delimiter is ,
reg_radius_1: '2',
reg_radius_2: '3',
reg_threshold: '0.005',
# Settings for Difference-of-Gaussian
# For multiple channels 'value1,value2' delimiter is ,
sigma: '1.8',
threshold_gaussian: '0.0080',
# ============================================================================
# Timelapse registration
#
# Description: settings for timelapse registration
# Options: reference timepoint
# ============================================================================
reference_timepoint: '0', # Reference timepoint
# ============================================================================
# Content-based multiview fusion
#
# Description: settings for content-based multiview fusion
# Options: downsampling
# Cropping parameters based on full resolution
# ============================================================================
downsample: '2', # set downsampling
minimal_x: '190', # Cropping parameters of full resolution
minimal_y: '-16',
minimal_z: '-348',
maximal_x: '1019',
maximal_y: '1941',
maximal_z: '486',
# ============================================================================
# Multiview deconvolution
#
# Description: settings for multiview deconvolution
# Options: number of iterations
# Cropping parameters taking downsampling into account
# Channel settings for deconvolution
# ============================================================================
iterations: '5', # number of iterations
minimal_x_deco: '190', # Cropping parameters: take downsampling into account
minimal_y_deco: '-16',
minimal_z_deco: '-348',
maximal_x_deco: '1019',
maximal_y_deco: '1941',
maximal_z_deco: '486',
#
# Channel settings for deconvolution
# Single Channel: '"beads"'
# Dual Channel: '"beads,beads"'
# Dual Channel one channel contains beads: '"[Same PSF as channel red],beads"'
detections_to_extract_psf_for_channel: '"[Same PSF as channel red],beads"',
#
# ============================================================================
# Resave output
#
# Description: writes new hdf5 dataset for fusion output
# Options: Naming pattern of output based on channel number
# Channel settings
# File name for resaving output into hdf5
# Pixel size > isotropic resolution
# Image type (16Bit from content-based fusion, 32Bit from deconvolution)
# ============================================================================
# Number of timepoints
output_timepoints: '0-1', # Timepoints format: '1-2'
#
# Naming pattern:
# Single Channel: TP{{t}}_Chgreen_Ill0_Ang0,72,144,216,288.tif > Ch{name} is added here
# Dual Channel: TP{{t}}_Ch{{0}}_Ill0_Ang0,72,144,216,288.tif > Ch{name} is added here
output_image_file_pattern: '"TP{{t}}_Chgreen_Ill0_Ang0,72,144,216,288.tif"',
#
# channel setting:
# Single channel: '"NO (one channel)"'
# Dual channel: '"YES (one file per channel)"'
output_multiple_channels: '"NO (one channel)"',
output_channels: "green",
#
# .xml file names
output_xml: '"fused_Single"',
output_hdf5_xml: '"hdf5_fused_Single"',
#
# pixel size of output: take downsampling into account!
output_pixel_distance_x: 0.28590106964,
output_pixel_distance_y: 0.28590106964,
output_pixel_distance_z: 0.28590106964,
output_pixel_unit: 'um',
#
# File type
output_data_type: "16Bit" # "32Bit" or "16Bit"
}
define_xml_czi: {
rotation_around: "X-Axis", # axis of acquisition
bsh_file: "define_czi.bsh" # .bsh script for defining the .czi dataset
}
define_xml_tif: {
# Settings for ImageJ Opener
type_of_dataset: '"Image Stacks (ImageJ Opener)"',
multiple_timepoints: '"YES (one file per time-point)"', # or NO (one time-point)
@@ -49,18 +193,18 @@ define_xml_tif: {
multiple_channels: '"NO (one channel)"', # or "\"NO (one channel)\""
multiple_illumination_directions: '"NO (one illumination direction)"', # or YES (one file per illumination direction)
imglib_container: '"ArrayImg (faster)"', # '"ArrayImg (faster)"'
bsh_file: "define_tif_zip.bsh"
bsh_file: "define_tif_zip.bsh"
}
resave_hdf5: {
# Resaves .tif or .czi data into hdf5
# Subsampling and resolution settings for hdf5: data dependent
hdf5_chunk_sizes: '"{{ {{32,32,4}}, {{32,32,4}}, {{16,16,16}}, {{16,16,16}} }}"',
subsampling_factors: '"{{ {{1,1,1}}, {{2,2,1}}, {{4,4,1}}, {{8,8,1}} }}"',
# Standard settings for cluster processing
setups_per_partition: '0',
timepoints_per_partition: '1',
resave_timepoint: '"All Timepoints"',
resave_angle: '"All angles"',
resave_channel: '"All channels"',
resave_illumination: '"All illuminations"',
@@ -68,30 +212,6 @@ resave_hdf5: {
}
registration: {
# Processing setting for Difference-of-Gaussian detection
# compute_on:
compute_on: '"GPU accurate (Nvidia CUDA via JNA)"',
@@ -125,7 +245,6 @@ xml_merge: {
}
timelapse: {
# Standard settings for timelapse registration
type_of_registration_timelapse: '"Match against one reference timepoint (no global optimization)"',
timelapse_process_timepoints: '"All Timepoints"',
@@ -143,16 +262,6 @@ dublicate_transformations: {
}
fusion: {
# content based multiview fusion
# supports multi channel without new settings
# fused_image: '"Append to current XML Project"', does not work yet
process_timepoint: '"Single Timepoint (Select from List)"',
process_angle: '"All angles"',
@@ -179,25 +288,12 @@ external_transform: {
apply_transformation: '"Current view transformations (appends to current transforms)"',
define_mode_transform: '"Matrix"',
# Matrix for downsampling
matrix_transform: '"0.5, 0.0, 0.0, 0.0, 0.0, 0.5, 0.0, 0.0, 0.0, 0.0, 0.5, 0.0"',
transformation: '"Rigid"',
bsh_file: "transform.bsh"
}
deconvolution: {
# Settings for GPU or CPU processing
# '"CPU (Java)"' or '"GPU (Nvidia CUDA via JNA)"'
compute_on: '"GPU (Nvidia CUDA via JNA)"',
@@ -220,26 +316,6 @@ deconvolution: {
}
hdf5_output: {
# if data is 32Bit then the data is converted into 16Bit data
convert_32bit: '"[Use min/max of first image (might saturate intenities over time)]"',
# subsampling and chunk size settings: dataset dependent
......
#!/bin/bash
#===============================================================================
#
# FILE: master_preprocessing.sh
#
# DESCRIPTION: source file for pre-processing of file formats
#
# AUTHOR: Christopher Schmied, schmied@mpi-cbg.de
# INSTITUTE: Max Planck Institute for Molecular Cell Biology and Genetics
# BUGS:
# NOTES:
# Version: 1.0
# CREATED: 2015-07-05
# REVISION: 2015-07-05
#
# Preprocessing
# 1) rename .czi files
# 2) resave .czi files into .tif or .zip
# 3) resave ome.tiff files into .tif
# 4) split the output per channel, with
#    c=0,1 etc:
#    spim_TL{tt}_Angle{a}_Channel{c}.tif
#===============================================================================
image_file_directory="/projects/pilot_spim/Christopher/Test_pipeline_3.0/czi/"
# --- jobs directory -----------------------------------------------------------
job_directory="/projects/pilot_spim/Christopher/snakemake-workflows/spim_registration/tools/"
#-------------------------------------------------------------------------------
# Resaving, Renaming files and Splitting: General
#
# Important: For renaming and resaving .czi files the first .czi file has to
# carry the index (0)
#-------------------------------------------------------------------------------
pad="3" # for padded zeros
angle_prep="1" # angles format: "1 2 3"
#--- Renaming ------------------------------------------------------------------
first_index="0" # First index of czi files
last_index="391" # Last index of czi files
first_timepoint="0" # The first timepoint
angles_renaming=(1) # Angles format: (1 2 3)
source_pattern=2014-10-23_H2A_gsb_G3\(\{index\}\).czi # Name of .czi files
target_pattern=spim_TL\{timepoint\}_Angle\{angle\}.czi # The output pattern of renaming
#-------------------------------------------------------------------------------
# Fiji settings
#-------------------------------------------------------------------------------
XVFB_RUN="/sw/bin/xvfb-run" # virtual frame buffer
# working Fiji
#Fiji="/sw/users/schmied/packages/2015-05-21_Fiji.app.cuda/ImageJ-linux64" # woriking Fiji
Fiji="/sw/users/schmied/packages/2015-06-08_Fiji.app.cuda/ImageJ-linux64"
#Fiji for Dual Channel timelapse and Dual Channel Deconvolution
FijiDualTimelapse="/sw/users/schmied/packages/2015-05-29_Fiji-2.3.9-SNAP.app.cuda/ImageJ-linux64"
Fiji_resave="/sw/users/schmied/lifeline/Fiji.app.lifeline2/ImageJ-linux64" # Fiji that works for resaving
Fiji_Deconvolution=${FijiDualTimelapse} # Fiji that works for deconvolution
#-------------------------------------------------------------------------------
# Pre-processing
#-------------------------------------------------------------------------------
#--- Resaving .czi into .tif files----------------------------------------------
jobs_resaving=${job_directory}"czi_resave" # directory .czi resaving
resaving=${jobs_resaving}"/resaving.bsh" # script .czi resaving
#--- Resaving ome.tiff into .tif files -----------------------------------------
jobs_resaving_ometiff=${job_directory}"ometiff_resave" # directory .ome.tiff resaving
resaving_ometiff=${jobs_resaving_ometiff}"/resaving-ometiff.bsh" # script .ome.tiff resaving
#--- Compress dataset -----------------------------------------------------------
jobs_compress=${job_directory}"compress" # directory .czi to .zip resaving
czi_compress=${jobs_compress}"/for_czi.bsh" # script .czi to .zip resaving
#--- Split channels-------------------------------------------------------------
jobs_split=${job_directory}"split_channels" # directory
split=${jobs_split}"/split.bsh" # script
#!/bin/bash
# path of master file
source ../master_preprocessing.sh
# path of source and target files
source_pattern=${image_file_directory}${source_pattern}
target_pattern=${image_file_directory}${target_pattern}
# ------------------------------------------------------------------------------
i=${first_index}
t=${first_timepoint}
t=`printf "%0${pad}d" "${t}"`
while [ $i -le $last_index ]; do
for a in "${angles_renaming[@]}"; do
source=${source_pattern/\{index\}/${i}}
tmp=${target_pattern/\{timepoint\}/${t}}
target=${tmp/\{angle\}/${a}}
echo ${source} ${target} # displays source file and target file with path
mv ${source} ${target} # renames source file into target pattern
#cp ${source} ${target} # alternatively copy source file and resave into target pattern
let i=i+1
done
t=$(( 10#${t} ))
let t=t+1
t=`printf "%0${pad}d" "${t}"`
done
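As an aside, the `{index}`/`{timepoint}`/`{angle}` replacements above are plain bash pattern substitution; a minimal standalone sketch:

```bash
# minimal sketch of the ${var/pattern/replacement} expansion used by the renaming loop
source_pattern='dataset({index}).czi'
i=7
echo "${source_pattern/\{index\}/${i}}"   # prints: dataset(7).czi
```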
#!/bin/bash
# path of master file
source ../../master_preprocessing.sh
# creates directory for job files if not present
mkdir -p $jobs_compress
echo $jobs_compress
echo $czi_compress
# splits up resaving into 1 job per .czi file and writes the given parameters
# into the job file
for i in $parallel_timepoints
do
for a in $angle_prep
do
job="$jobs_compress/compress-$i-$a.job"
echo $job
echo "$XVFB_RUN -a $Fiji_resave \
-Ddir=$image_file_directory \
-Dtimepoint=$i \
-Dangle=$a \
-Dpad=$pad \
-- --no-splash $czi_compress" >> "$job"
chmod a+x "$job"
done
done
// Loads Fiji dependencies
import ij.IJ;
import ij.ImagePlus;
import java.lang.Runtime;
import java.io.File;
import java.io.FilenameFilter;
runtime = Runtime.getRuntime();
// Loads parameters from job file
System.out.println( "=======================================================" );
System.out.println( "Load Parameters" ) ;
dir = System.getProperty( "dir" );
int timepoint = Integer.parseInt( System.getProperty( "timepoint" ) );
angle = System.getProperty( "angle" );
int pad = Integer.parseInt( System.getProperty( "pad" ) );
// Prints Parameters into output file
System.out.println( "directory = " + dir );
System.out.println( "timepoint = " + timepoint );
System.out.println( "angle = " + angle );
System.out.println( "pad = " + pad );
// Executes Fiji Plugin "Bio-Formats Importer" to open .czi file
System.out.println( "=======================================================" );
System.out.println( "Opening Image" ) ;
IJ.run("Bio-Formats Importer",
"open=" + dir + "spim_TL" + IJ.pad( timepoint, pad ) + "_Angle" + angle + ".czi" + " " +
"autoscale " +
"color_mode=Default " +
"specify_range " +
"view=[Standard ImageJ] " +
"stack_order=Default " +
"t_begin=1000 " +
"t_end=1000 " +
"t_step=1");
// Resaves .czi files as .zip file
System.out.println( "Save as compressed image" ) ;
IJ.saveAs("ZIP ", dir + "spim_TL" + IJ.pad( timepoint, pad ) + "_Angle" + angle + ".zip");
/* shutdown */
runtime.exit(0);
#!/bin/bash
# resubmits jobs whose LSF output files (out.*) report "exit code 1"
names=`grep "exit code 1" out.* -l`
for name in $names; do
job=`sed -n '4,4p' $name | sed -n "s/Job <\([^>]*\)>.*/\1/p"`
echo bsub -q short -n 12 -R rusage[mem=110000] -R span[hosts=1] -o "out.%J" -e "err.%J" ${job}
bsub -q short -n 12 -R rusage[mem=110000] -R span[hosts=1] -o "out.%J" -e "err.%J" ${job}
done
#!/bin/bash
for file in `ls ${1} | grep "\.job$"`
do
bsub -q short -n 4 -R span[hosts=1] -o "out.%J" -e "err.%J" ${1}/$file
done
#!/bin/bash
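# checks whether every timepoint has the expected number of resaved .tif angle files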
source ../../master_preprocessing.sh
timepoint=`seq 0 391`
dir=/projects/pilot_spim/Christopher/2014-10-23_H2A_gsb_G3/
num_angles=1
pad=3
job_dir=/projects/pilot_spim/Christopher/pipeline_3.0/jobs_alpha_3.1/czi_resave/
for i in $timepoint
do
i=`printf "%0${pad}d" "$i"`
num=$(ls $dir/spim_TL"$i"_Angle*.tif |wc -l)
if [ $num -ne $num_angles ]
then
echo "TL"$i": TP or angles missing"
#bsub -q short -n 4 -R span[hosts=1] -o "out.%J" -e "err.%J" ${1}/*${i}*
else
echo "TL"$i": Correct"
fi
done
#!/bin/bash
# path of master file
source ../../master_preprocessing.sh
# creates directory for job files if not present
mkdir -p $jobs_resaving
# splits up resaving into 1 job per .czi file and writes the given parameters
# into the job file
for i in $parallel_timepoints
do
for a in $angle_prep
do
job="$jobs_resaving/resave-$i-$a.job"
echo $job
echo "$XVFB_RUN -a $Fiji_resave \
-Ddir=$image_file_directory \
-Dtimepoint=$i \
-Dangle=$a \
-Dpad=$pad \
-- --no-splash $resaving" >> "$job"
chmod a+x "$job"
done
done
// Loads Fiji dependencies
import ij.IJ;
import ij.ImagePlus;
import java.lang.Runtime;
import java.io.File;
import java.io.FilenameFilter;
runtime = Runtime.getRuntime();
// Loads parameters from job file
System.out.println( "=======================================================" );
System.out.println( "Load Parameters" ) ;
dir = System.getProperty( "dir" );
int timepoint = Integer.parseInt( System.getProperty( "timepoint" ) );
angle = System.getProperty( "angle" );
int pad = Integer.parseInt( System.getProperty( "pad" ) );
// Prints Parameters into output file
System.out.println( "directory = " + dir );
System.out.println( "timepoint = " + timepoint );
System.out.println( "angle = " + angle );
System.out.println( "pad = " + pad );
// Executes Fiji Plugin "Bio-Formats Importer" to open .czi file
System.out.println( "=======================================================" );
System.out.println( "Opening Image" ) ;
IJ.run("Bio-Formats Importer",
"open=" + dir + "spim_TL" + IJ.pad( timepoint, pad ) + "_Angle" + angle + ".czi" + " " +
"autoscale " +
"color_mode=Default " +
"specify_range " +
"view=[Standard ImageJ] " +
"stack_order=Default " +
"t_begin=1000 " +
"t_end=1000 " +
"t_step=1");
// Resaves .czi files as .tif file
System.out.println( "Save as .tif" ) ;
IJ.saveAs("Tiff ", dir + "spim_TL" + IJ.pad( timepoint, pad ) + "_Angle" + angle + ".tif");
/* shutdown */
runtime.exit(0);
#!/bin/bash
# resubmits jobs whose LSF output files (out.*) report "exit code 1"
names=`grep "exit code 1" out.* -l`
for name in $names; do
job=`sed -n '4,4p' $name | sed -n "s/Job <\([^>]*\)>.*/\1/p"`
echo bsub -q short -n 12 -R rusage[mem=110000] -R span[hosts=1] -o "out.%J" -e "err.%J" ${job}
bsub -q short -n 12 -R rusage[mem=110000] -R span[hosts=1] -o "out.%J" -e "err.%J" ${job}
done
#!/bin/bash
for file in `ls ${1} | grep "\.job$"`
do
bsub -q short -n 4 -R span[hosts=1] -o "out.%J" -e "err.%J" ${1}/$file
done
#!/bin/bash
#===============================================================================
#
# FILE: master_preprocessing.sh
#
# DESCRIPTION: source file for pre-processing of file formats
#
# AUTHOR: Christopher Schmied, schmied@mpi-cbg.de
# INSTITUTE: Max Planck Institute for Molecular Cell Biology and Genetics
# BUGS:
# NOTES:
# Version: 1.0
# CREATED: 2015-07-05
# REVISION: 2015-07-05
#
# Preprocessing
# 1) rename .czi files
# 2) resave .czi files into .tif or .zip
# 3) resave ome.tiff files into .tif
# 4) split the output per channel, with
#    c=0,1 etc:
#    spim_TL{tt}_Angle{a}_Channel{c}.tif
#===============================================================================
image_file_directory="/projects/pilot_spim/Christopher/Test_pipeline_3.0/czi/"
# --- jobs directory -----------------------------------------------------------
job_directory="/projects/pilot_spim/Christopher/snakemake-workflows/spim_registration/tools/"
#-------------------------------------------------------------------------------
# Resaving, Renaming files and Splitting: General
#
# Important: For renaming and resaving .czi files the first .czi file has to
# carry the index (0)
#-------------------------------------------------------------------------------
pad="3" # for padded zeros
angle_prep="1" # angles format: "1 2 3"
#--- Renaming ------------------------------------------------------------------
first_index="0" # First index of czi files
last_index="391" # Last index of czi files
first_timepoint="0" # Starts with 0
angles_renaming=(1 2 3 4 5) # Angles format: (1 2 3)
source_pattern=2014-10-23_H2A_gsb_G3\(\{index\}\).czi # Name of .czi files
target_pattern=spim_TL\{timepoint\}_Angle\{angle\}.czi # The output pattern of renaming
#-------------------------------------------------------------------------------
# Fiji settings
#-------------------------------------------------------------------------------
XVFB_RUN="/sw/bin/xvfb-run" # virtual frame buffer
Fiji_resave="/sw/users/schmied/lifeline/Fiji.app.lifeline2/ImageJ-linux64" # Fiji that works for resaving
#-------------------------------------------------------------------------------
# Pre-processing
#-------------------------------------------------------------------------------
#--- Resaving .czi into .tif files----------------------------------------------
jobs_resaving=${job_directory}"czi_resave" # directory .czi resaving
resaving=${jobs_resaving}"/resaving.bsh" # script .czi resaving
#--- Resaving ome.tiff into .tif files -----------------------------------------
jobs_resaving_ometiff=${job_directory}"ometiff_resave" # directory .ome.tiff resaving
resaving_ometiff=${jobs_resaving_ometiff}"/resaving-ometiff.bsh" # script .ome.tiff resaving
#--- Compress dataset -----------------------------------------------------------
jobs_compress=${job_directory}"compress" # directory .czi to .zip resaving
czi_compress=${jobs_compress}"/for_czi.bsh" # script .czi to .zip resaving
#--- Split channels-------------------------------------------------------------
jobs_split=${job_directory}"split_channels" # directory
split=${jobs_split}"/split.bsh" # script
#!/bin/bash
# path of master file
source ../master_preprocessing.sh
# path of source and target files
source_pattern=${image_file_directory}${source_pattern}
target_pattern=${image_file_directory}${target_pattern}
# ------------------------------------------------------------------------------
i=${first_index}
t=${first_timepoint}
t=`printf "%0${pad}d" "${t}"`
while [ $i -le $last_index ]; do
for a in "${angles_renaming[@]}"; do
source=${source_pattern/\{index\}/${i}}
tmp=${target_pattern/\{timepoint\}/${t}}
target=${tmp/\{angle\}/${a}}
echo ${source} ${target} # displays source file and target file with path
mv ${source} ${target} # renames source file into target pattern
#cp ${source} ${target} # alternatively copy source file and resave into target pattern
let i=i+1
done
t=$(( 10#${t} ))
let t=t+1
t=`printf "%0${pad}d" "${t}"`
done
#!/bin/bash
source ../../master_preprocessing.sh
mkdir -p ${jobs_split}
for i in $parallel_timepoints
do
for a in $angle_prep
do
job="$jobs_split/split-$i-$a.job"
echo $job
echo "#!/bin/bash" > "$job"
echo "$XVFB_RUN -a $Fiji \
-Dimage_file_directory=$image_file_directory \
-Dparallel_timepoints=$i \
-Dangle_prep=$a \
-Dpad=$pad \
-Dtarget_split=$image_file_directory \
-- --no-splash \
$split" >> "$job"
chmod a+x "$job"
done
done
import ij.IJ;
import ij.ImagePlus;
import ij.ImageStack;
import java.lang.Runtime;
import java.io.File;
import java.io.FilenameFilter;
runtime = Runtime.getRuntime();
image_file_directory = System.getProperty( "image_file_directory" );
int parallel_timepoints = Integer.parseInt( System.getProperty( "parallel_timepoints" ) );
angle_prep = System.getProperty( "angle_prep" );
target_split = System.getProperty( "target_split" );
int pad = Integer.parseInt( System.getProperty( "pad" ) );
System.out.println( "directory = " + image_file_directory );
System.out.println( "timpoint = " + parallel_timepoints );
System.out.println( "angles = " + angle_prep );
System.out.println( "target_split = " + target_split );
System.out.println( "pad = " + pad );
//open image
imp = new ImagePlus( image_file_directory + "spim_TL" + IJ.pad( parallel_timepoints , pad ) + "_Angle" + angle_prep + ".tif" );
System.out.println( imp.getTitle() );
/* split channels */
stack = imp.getStack();
for ( c = 0; c < imp.getNChannels(); ++c )
{
channelStack = new ImageStack( imp.getWidth(), imp.getHeight() );
for ( z = 0; z < imp.getNSlices(); ++z )
channelStack.addSlice(
"",
stack.getProcessor(
imp.getStackIndex( c + 1, z + 1, 1 ) ) );
impc = new ImagePlus( imp.getTitle() + " #" + ( c + 1 ), channelStack );
IJ.save( impc, target_split + imp.getTitle().replaceFirst( "\\.tif$", "_Channel" + ( c ) + ".tif" ) );
}
/* shutdown */
runtime.exit(0);
#!/bin/bash
for file in `ls ${1} | grep "\.job$"`
do
bsub -q short -n 3 -R span[hosts=1] -o "out.%J" -e "err.%J" ${1}/$file
done