library(dynwrap)
library(dplyr)
Once you have wrapped a method using a script and definition, all you need to share your method is a Dockerfile which lists all the dependencies that need to be installed.
We’ll work with the following definition.yml:
method:
  id: comp_1

parameters:
  - id: component
    default: 1
    type: integer
    distribution:
      type: uniform
      lower: 1
      upper: 10
    description: The nth component to use

wrapper:
  input_required: expression
  input_optional: start_id
and run.py:
#!/usr/bin/env python

import dynclipy
dataset = dynclipy.main()

import pandas as pd
import sklearn.decomposition

# infer trajectory
pca = sklearn.decomposition.PCA()
dimred = pca.fit_transform(dataset['expression'])
pseudotime = pd.Series(
  dimred[:, dataset['parameters']['component'] - 1],
  index = dataset['expression'].index
)

# build trajectory
trajectory = dynclipy.wrap_data(cell_ids = dataset['expression'].index)
trajectory.add_linear_trajectory(pseudotime = pseudotime)

# save output
trajectory.write_output(dataset['output'])
Make sure it is executable.
chmod +x run.py
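Before building the container, you can sanity-check that definition.yml parses as expected, for example from R (a quick sketch, assuming the yaml package is installed):
# parse the method definition and inspect the declared parameters
definition <- yaml::read_yaml("definition.yml")
definition$method$id
definition$parameters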
Assuming that the definition.yml and run.py are located in the current directory, a minimal Dockerfile would look like:
FROM dynverse/dynwrappy_tester:latest
COPY definition.yml run.py /code/
ENTRYPOINT ["/code/run.py"]
dynverse/dynwrappy is the base image here; it contains the latest versions of R, Python, dynwrap, dyncli and most tidyverse dependencies. For R methods, you can use the dynverse/dynwrapr base image instead. While not required, it's recommended to start from these base images, because dyncli provides an interface to run each method from the command line using the docker container. As discussed before, wrapping is also a lot easier using dynwrap.
For reproducibility, it's best to specify the tag of the base image. You can find these tags on Docker Hub: https://hub.docker.com/r/dynverse/dynwrapr/tags.
The Dockerfile then copies the definition.yml and run.py files into the /code directory. Typically, you won't change the locations of these files, simply to maintain consistency with the rest of the method wrappers included in dynmethods.
Finally, we specify the entrypoint, which is the script that will be executed when the docker container is run. Do not specify this entrypoint as ENTRYPOINT /code/run.py (without the JSON array syntax), because this will create issues with specifying command-line arguments.
That’s it! Assuming that you have a functioning docker installation, you can build this container using
system("docker build -t my_ti_method .")
This method can now be loaded inside R using dynwrap:
method <- create_ti_method_container("my_ti_method")
## Warning in readLines(path_local): incomplete final line found on '/tmp/
## RtmpDSWRR7/file261d8c13bee49d//tmpfile'
dataset <- dynwrap::example_dataset
trajectory <- infer_trajectory(dataset, method(), verbose = TRUE)
## Executing 'comp_1' on 'example'
## With parameters: list(component = 1L)
## inputs: expression
## priors :
## Loading required namespace: hdf5r
## Input saved to /tmp/RtmpDSWRR7/file261d8c1ee593ba/ti
## Running method using babelwhale
## Running /usr/bin/docker run --name 20210323_230531__container__GVOTvvJdZn -e \
## 'TMPDIR=/tmp2' --workdir /ti/workspace -v \
## '/tmp/RtmpDSWRR7/file261d8c1ee593ba/ti:/ti' -v \
## '/tmp/RtmpDSWRR7/file261d8c1b1f80bb/tmp:/tmp2' my_ti_method --dataset \
## /ti/input.h5 --output /ti/output.h5
## /usr/local/lib/python3.8/site-packages/pandas/compat/__init__.py:117: UserWarning: Could not import the lzma module. Your installed Python is incomplete. Attempting to use lzma compression will result in a RuntimeError.
## warnings.warn(msg)
## R[write to console]: Warning messages:
##
## R[write to console]: 1:
## R[write to console]: In readLines(file) :
## R[write to console]:
##
## R[write to console]: incomplete final line found on '/code/definition.yml'
##
## R[write to console]: 2: Column `input_id` has different attributes on LHS and RHS of join
##
## Output saved to /tmp/RtmpDSWRR7/file261d8c1ee593ba/ti/output.h5
## Attempting to read in output with hdf5
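The parameters declared in definition.yml become arguments of the generated method function, so you can also run the method with a non-default value (a sketch, assuming dynwrap's usual parameter interface):
# use the second principal component instead of the first
trajectory2 <- infer_trajectory(dataset, method(component = 2), verbose = TRUE)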
If you have dynplot installed, you can also plot the trajectory:
library(dynplot)
# for now, install from github using:
# remotes::install_github("dynverse/dynplot")
plot_graph(trajectory)
plot_heatmap(trajectory, expression_source = dataset$expression)
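The dynplot functions return ggplot2 objects (in recent versions), so a plot can be saved like any other ggplot; a sketch, assuming a ggplot-based return value:
library(ggplot2)
ggsave("trajectory.png", plot_graph(trajectory), width = 6, height = 6)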
Congratulations! You now have a TI method that can be easily installed anywhere without dependency issues, and that can be included within the whole dynverse pipeline.
So what’s left?
To make a project like this maintainable in the long run, it is important that every time something is changed, the method is tested to make sure it still works correctly.
To do this, we first add an example.sh file, which generates an example dataset on which the method is guaranteed to run without errors. In this case, the example is just the example dataset included in dynwrap.
#!/usr/bin/env Rscript

dataset <- dynwrap::example_dataset

file <- commandArgs(trailingOnly = TRUE)[[1]]
dynutils::write_h5(dataset, file)
You can of course provide your own example data here, but make sure it doesn't take too long to generate, and that the TI method doesn't take too long to run on it.
You can also add extra parameters and a fixed seed to the example data, e.g. dataset$seed <- 1 and dataset$parameters <- list(component = 42).
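For instance, an example.sh that fixes both the seed and the parameters could look like this (a sketch combining the script above with those two fields):
#!/usr/bin/env Rscript

# build the example dataset and fix the seed and parameters
dataset <- dynwrap::example_dataset
dataset$seed <- 1
dataset$parameters <- list(component = 42)

# write it to the location given as the first command-line argument
file <- commandArgs(trailingOnly = TRUE)[[1]]
dynutils::write_h5(dataset, file)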
Now run the example script and test it:
chmod +x example.sh
./example.sh example.h5
## Loading required namespace: hdf5r
dataset <- dynutils::read_h5("example.h5")
trajectory <- infer_trajectory(dataset, method())
To automate the testing (and building) of containers, you’ll have to use continuous integration. We use travis-ci for this, a free service for open-source projects. Because the exact code to use this continuous integration requires some manual steps, we suggest you create an issue at dynmethods so that we can help you further. If you’re really adventurous, you can have a look at some of our GitHub Actions Workflow files, e.g.: https://github.com/dynverse/ti_paga/blob/master/.github/workflows/make.yml.
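Locally, you can approximate what such a workflow does with a short R script that rebuilds the container, regenerates the example and checks that the method still produces a trajectory (a sketch; the final check assumes dynwrap's is_wrapper_with_trajectory() predicate):
# rebuild the container image
system("docker build -t my_ti_method .")

# regenerate the example dataset
system("./example.sh example.h5")

# run the method on the example and check that a trajectory is produced
method <- dynwrap::create_ti_method_container("my_ti_method")
dataset <- dynutils::read_h5("example.h5")
trajectory <- dynwrap::infer_trajectory(dataset, method())
stopifnot(dynwrap::is_wrapper_with_trajectory(trajectory))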
Once the continuous integration works, your method is ready to be included in dynmethods!