
Three dimensional cross-modal image inference

Label-free prediction of three-dimensional fluorescence images from transmitted-light microscopy


Support

This code is in active development and is used within our organization. We are currently not supporting this code for external use and are simply releasing the code to the community AS IS. The community is welcome to submit issues, but you should not expect an active response.

For the code corresponding to our Nature Methods paper, please use the release_1 branch of this repository.

System requirements

We recommend installation on Linux with an NVIDIA graphics card with 12+ GB of video memory (e.g., NVIDIA Titan X Pascal) and the latest drivers installed.

Installation

  • We recommend an environment manager such as Conda.
  • Install Python 3.6+ if necessary.
  • All commands listed below assume the bash shell.
  • Clone and install the repo:
git clone https://github.com/AllenCellModeling/pytorch_fnet.git
cd pytorch_fnet
pip install .
  • If you would like to instead install for development:
pip install -e .[dev]
  • If you want to run the demos in the examples directory:
pip install .[examples]

Demo on Canned AICS Data

This will download some images from our Integrated Cell Quilt repository and start training a model:

cd examples
python download_and_train.py

When training is complete, you can predict on the held-out data with:

python predict.py

Command-line tool

Once the package is installed, users can train and use models through the fnet command-line tool. To see what commands are available, use the -h flag.

fnet -h

The -h flag is also available for all fnet commands. For example,

fnet train -h

Train a model

Model training is done through the fnet train command, which requires a JSON file specifying various training parameters, e.g., which dataset to use, where to save the model, and how the hyperparameters should be set. To create a template JSON:

fnet train --json /path/to/train_options.json

Users are expected to modify this JSON to suit their needs. At a minimum, users should verify the following fields and change them if necessary:

  • "dataset_train": The name of the training dataset.
  • "path_save_dir": The directory where the model will be saved. We recommend that the model be saved in the same directory as the training options json.
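
As an illustration, the relevant portion of the template might look like the fragment below. The two field names come from this document; the values are placeholders, and the generated template will contain additional fields not shown here.

```json
{
    "dataset_train": "some.training.dataset",
    "path_save_dir": "models/model_0",
    ...
}
```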

Once any modifications are complete, initiate training by repeating the above command:

fnet train --json /path/to/train_options.json

Since the JSON already exists this time, training will commence.

Perform predictions with a trained model

Users can perform predictions using a trained model with the fnet predict command. A path to a saved model and a data source must be specified. For example:

fnet predict --json path/to/predict_options.json

As above, users are expected to modify this JSON to suit their needs. At a minimum, populate the following fields and/or copy the corresponding dataset values from /path/to/train_options.json, e.g.:

    "dataset_kwargs": {
        "col_index": "Index",
        "col_signal": "signal",
        "col_target": "target",
        "path_csv": "path/to/my/train.csv",
    ...
    "path_model_dir": [
        "models/model_0"
    ],
    "path_save_dir": "path/to/predictions/dir",

This will use the model saved in models/model_0 to perform predictions on the dataset specified in the JSON. To see additional command options, use fnet predict -h.
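
The "path_csv" field above points to a CSV that pairs signal (transmitted-light) images with target (fluorescence) images, using the column names given in "dataset_kwargs". A minimal sketch of building such a file, with purely illustrative image paths, is:

```python
import csv

# Placeholder rows: each pairs a transmitted-light (signal) image with a
# fluorescence (target) image. Paths and filenames are illustrative only;
# the column names match the "dataset_kwargs" fields above.
rows = [
    {"Index": 0, "signal": "images/signal_000.tiff", "target": "images/target_000.tiff"},
    {"Index": 1, "signal": "images/signal_001.tiff", "target": "images/target_001.tiff"},
]

with open("train.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["Index", "signal", "target"])
    writer.writeheader()
    writer.writerows(rows)
```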

Once any modifications are complete, initiate prediction by repeating the above command:

fnet predict --json path/to/predict_options.json

Citation

If you find this code useful in your research, please consider citing our manuscript in Nature Methods:

@article{Ounkomol2018,
  doi = {10.1038/s41592-018-0111-2},
  url = {https://doi.org/10.1038/s41592-018-0111-2},
  year  = {2018},
  month = {sep},
  publisher = {Springer Nature America, Inc},
  volume = {15},
  number = {11},
  pages = {917--920},
  author = {Chawin Ounkomol and Sharmishtaa Seshamani and Mary M. Maleckar and Forrest Collman and Gregory R. Johnson},
  title = {Label-free prediction of three-dimensional fluorescence images from transmitted-light microscopy},
  journal = {Nature Methods}
}

Contact

Gregory Johnson
E-mail: [email protected]

Allen Institute Software License

Allen Institute Software License – This software license is the 2-clause BSD license plus a third clause that prohibits redistribution and use for commercial purposes without further permission.
Copyright © 2018. Allen Institute. All rights reserved.

Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met:

  1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer.
  2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.
  3. Redistributions and use for commercial purposes are not permitted without the Allen Institute’s written permission. For purposes of this license, commercial purposes are the incorporation of the Allen Institute's software into anything for which you will charge fees or other compensation or use of the software to perform a commercial service for a third party. Contact [email protected] for commercial licensing opportunities.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
