Develop (#19)
* Add GitHub action to publish package to PyPI

* Feature GitHub action conda build (#1)

* Add conda GitHub action that builds the package with conda. It is triggered on pull requests.

* Add pyro-ppl and pip as conda dependencies. Trigger setup automatically via the conda YAML. Split steps in the conda build GitHub action.

* Relax the setup.py dependency constraints for torch. Small change to the installation instructions in the documentation.

* Feature - Add basic unit tests to cover activation_functions.py (#3)

* adding first unit tests

* Format conda-build YAML file

* Update action name

* Fix path to __version__

* Test commit - Trigger workflow

* Revert dummy change

Co-authored-by: Mathias Rechenmann <[email protected]>

* Feature clean helper functions (#4)

* delete _helper_functions.py

* delete _hyperparams_optimization.py

* delete plot_every_epoch function

* delete import _helper_functions

* delete gpyopt dependency

* edit only README

* Add pylint GitHub action

* Remove PyPI GitHub action (#9)

* support CPU, modify README (#10)

* support CPU, modify README

* shorten the import, add model to __init__

* More lenient scar installation specifications (#8)

* Split scAR.yml into scAR-gpu.yaml and scAR-cpu.yaml.
* More lenient installation specifications.
* Bump version to 0.2.0

* fix a typo, reorganise __init__.py

* fix bugs

* add synthetic data for integration test (#13)

* add synthetic data for integration test

* change paths

* change paths

* Remove torchaudio. Bump version to 0.2.2 (#14)

* Remove torchaudio
* Bump version to 0.2.2

* Update readme (#16)

* update README

* Black github action (#17)

Addition of a black GitHub action that runs on every push and every pull request. It prints to stdout all the changes that need to be made (--diff) but returns exit code 0 even if errors are observed.

* Addition of integration test (#18)

* Add integration test as unit test

* update version

Co-authored-by: Gypas, Foivos <[email protected]>
Co-authored-by: Foivos Gypas <[email protected]>
Co-authored-by: Mathias Rechenmann <[email protected]>
4 people authored Apr 19, 2022
1 parent e432ad1 commit 45cbeec
Showing 4 changed files with 62 additions and 8 deletions.
14 changes: 14 additions & 0 deletions .github/workflows/black.yaml
@@ -0,0 +1,14 @@
name: Black lint

on: [push, pull_request]

jobs:
  lint:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: psf/black@stable
        with:
          options: "--diff --verbose --line-length 127"
          src: "./scAR"
          version: "22.3.0"
30 changes: 23 additions & 7 deletions README.md
@@ -1,10 +1,8 @@
# scAR

[![scAR](https://img.shields.io/badge/scAR-005AF0?style=for-the-badge&logo=dependabot&logoColor=white.svg)](https://github.com/Novartis/scAR)
![single-cell omics](https://img.shields.io/badge/single_cell_omics-005AF0?style=for-the-badge.svg)
![machine learning](https://img.shields.io/badge/machine_learning-005AF0?style=for-the-badge.svg)
![variational autoencoders](https://img.shields.io/badge/variational_autoencoders-005AF0?style=for-the-badge.svg)
![denoising](https://img.shields.io/badge/denoising-005AF0?style=for-the-badge.svg)
[![scAR](https://anaconda.org/bioconda/scar/badges/version.svg)](https://anaconda.org/bioconda/scar)
[![Stars](https://img.shields.io/github/stars/Novartis/scar?logo=GitHub&color=red)](https://github.com/Novartis/scAR)
[![Downloads](https://anaconda.org/bioconda/scar/badges/downloads.svg)](https://anaconda.org/bioconda/scar/files)

**scAR** (single cell Ambient Remover) is a package for denoising multiple single-cell omics data types. It can be used for multiple tasks, such as **sgRNA assignment** for scCRISPRseq, **identity barcode assignment** for cell indexing, **protein denoising** for CITE-seq, and **mRNA denoising** for scRNAseq. It is built using probabilistic deep learning, illustrated as follows:

@@ -22,7 +20,25 @@

## Installation

Clone this repository,
#### Conda
1. Install [conda](https://docs.conda.io/projects/conda/en/latest/user-guide/install/index.html)
2. Create a conda environment
```sh
$ conda create -n scAR_env
```

3. Activate the conda environment
```sh
$ conda activate scAR_env
```

4. Install scar
```sh
$ conda install -c bioconda scar
```
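To verify the installation (a hypothetical check, not part of the original instructions), list the installed package:

```sh
$ conda list scar
```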

#### Git+pip
Alternatively, clone this repository,

```sh
$ git clone https://github.com/Novartis/scAR.git
@@ -61,7 +77,7 @@ There are two ways to run scAR.
1) Use the scAR API if you are a Python user

```python
>>> from scAR import model
>>> from scar import model
>>> scarObj = model(adata.X.to_df(), empty_profile)
>>> scarObj.train()
>>> scarObj.inference()
3 changes: 2 additions & 1 deletion scAR/main/__init__.py
@@ -1,2 +1,3 @@
# -*- coding: utf-8 -*-
__version__ = '0.2.2'

__version__ = '0.2.3'
23 changes: 23 additions & 0 deletions scAR/test/test_scar.py
@@ -0,0 +1,23 @@
import pandas as pd
from scAR import model
import unittest


class ScarIntegration(unittest.TestCase):
    """
    Integration test: train the scAR model on synthetic data and compare
    its feature assignment against a reference output.
    """

    def test_scar(self):
        raw_count = pd.read_pickle("scAR/test/raw_counts.pickle")
        empty_profile = pd.read_pickle("scAR/test/ambient_profile.pickle")
        expected_output = pd.read_pickle("scAR/test/output_assignment.pickle")

        scarObj = model(
            raw_count=raw_count.values, empty_profile=empty_profile, scRNAseq_tech="CROPseq"
        )

        scarObj.train(epochs=40, batch_size=64)

        scarObj.inference()

        self.assertTrue(scarObj.feature_assignment.equals(expected_output))
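To run this integration test locally, a command along these lines should work (a sketch; it assumes the repository root is the working directory, scAR/test is importable as a package, and the pickled fixtures are present):

```sh
$ python -m unittest scAR.test.test_scar
```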
