merck / biophi

BioPhi is an open-source antibody design platform. It features methods for automated antibody humanization (Sapiens), humanness evaluation (OASis) and an interface for computer-assisted antibody sequence design.

Home Page: https://biophi.dichlab.org/

License: MIT License

Languages: Python 87.43%, HTML 8.98%, CSS 3.03%, JavaScript 0.45%, Makefile 0.08%, Dockerfile 0.04%
Topics: antibody, humanization, humanness, sapiens, oasis

biophi's People

Contributors

chrisxu2016, erikr, hrifanov, jasonmvictor, kvetab, prihoda, tony-res


biophi's Issues

Downloading the result file from the BioPhi server raises an error

The failing call is `return send_file(`. The detailed traceback is as follows:

Traceback (most recent call last):
  File "/data/christianxu/miniconda3/envs/biophi-dev/lib/python3.8/site-packages/flask/app.py", line 2548, in __call__
    return self.wsgi_app(environ, start_response)
  File "/data/christianxu/miniconda3/envs/biophi-dev/lib/python3.8/site-packages/flask/app.py", line 2528, in wsgi_app
    response = self.handle_exception(e)
  File "/data/christianxu/miniconda3/envs/biophi-dev/lib/python3.8/site-packages/flask/app.py", line 2525, in wsgi_app
    response = self.full_dispatch_request()
  File "/data/christianxu/miniconda3/envs/biophi-dev/lib/python3.8/site-packages/flask/app.py", line 1822, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "/data/christianxu/miniconda3/envs/biophi-dev/lib/python3.8/site-packages/flask/app.py", line 1820, in full_dispatch_request
    rv = self.dispatch_request()
  File "/data/christianxu/miniconda3/envs/biophi-dev/lib/python3.8/site-packages/flask/app.py", line 1796, in dispatch_request
    return self.ensure_sync(self.view_functions[rule.endpoint])(**view_args)
  File "/data/christianxu/AntiBody/algorithm/BioPhi/biophi/humanization/web/views.py", line 217, in humanize_detail_export_humanized_fasta
    return send_fasta(
  File "/data/christianxu/AntiBody/algorithm/BioPhi/biophi/common/utils/io.py", line 386, in send_fasta
    return send_text(stringio.getvalue(), name=name, extension='fa', timestamp=timestamp)
  File "/data/christianxu/AntiBody/algorithm/BioPhi/biophi/common/utils/io.py", line 374, in send_text
    return send_file(
TypeError: send_file() got an unexpected keyword argument 'attachment_filename'

Refer to Flask issue 4753:

Old names for some send_file parameters have been removed. download_name replaces attachment_filename, max_age replaces cache_timeout, and etag replaces add_etags. Additionally, path replaces filename in send_from_directory.
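One way to cope with the rename without pinning Flask is to inspect the installed signature and pick the keyword argument accordingly. This is only a sketch: the two stand-in functions below mimic the old and new Flask signatures for illustration and are not Flask itself.

```python
import inspect

# Stand-ins mimicking the two Flask send_file signatures (hypothetical):
def new_send_file(path, *, download_name=None, as_attachment=False):
    return ("new", download_name)

def old_send_file(path, *, attachment_filename=None, as_attachment=False):
    return ("old", attachment_filename)

def filename_kwargs(send_file_func, filename):
    """Pick whichever filename keyword the given send_file accepts:
    'download_name' on Flask >= 2.0, 'attachment_filename' before."""
    params = inspect.signature(send_file_func).parameters
    key = "download_name" if "download_name" in params else "attachment_filename"
    return {key: filename, "as_attachment": True}

result = new_send_file("out.fa", **filename_kwargs(new_send_file, "out.fa"))
# result == ("new", "out.fa")
```

The same `filename_kwargs` helper would pick `attachment_filename` when handed the old-style function, so one call site can serve both Flask versions.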

The web server Humanize doesn't work

Hi,
I've used the BioPhi web server to humanize an antibody sequence, but "Task is waiting in queue..." has been showing for about 5 hours. I wonder if the web server is still working?

thanks!!

Smaller docker images

Currently the container images are quite big. A few things could be done, such as using conda-pack and a two-stage build, which would hopefully decrease the image size. Is this something that would be helpful?
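The two ideas could be combined along these lines; a rough, untested two-stage sketch where the environment name and paths are illustrative:

```dockerfile
# Stage 1: build the conda env and pack it into a relocatable archive
FROM continuumio/miniconda3 AS build
COPY environment.yml .
RUN conda env create -n biophi -f environment.yml && \
    conda install -n base conda-pack && \
    conda pack -n biophi -o /tmp/env.tar.gz

# Stage 2: unpack into a slim base image, leaving conda itself behind
FROM debian:bullseye-slim
COPY --from=build /tmp/env.tar.gz /tmp/env.tar.gz
RUN mkdir /opt/env && tar -xzf /tmp/env.tar.gz -C /opt/env && \
    rm /tmp/env.tar.gz && /opt/env/bin/conda-unpack
ENV PATH=/opt/env/bin:$PATH
```

The final image then carries only the packed environment, not the conda installation or the package caches from the solve.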

NOT WORKING

Task is waiting in queue. This seems to be a very recurrent problem over the last few days. I can't seem to get the site to work!!

Observation and Fix: SQLAlchemy 2.0 Compatibility - Downgrade to SQLAlchemy 1.x

Just for information, to help other BioPhi users:

I installed BioPhi using miniforge by following the instructions on the project's GitHub page.
However, there was a runtime error:
AttributeError: 'Engine' object has no attribute 'execute'

I tracked the issue down to a change in SQLAlchemy 2.0: engine.execute() was deprecated in 1.4 and removed in 2.0 (see https://docs.sqlalchemy.org/en/14/changelog/migration_20.html). Downgrading SQLAlchemy via conda did the trick:

$ conda install "sqlalchemy<2.0"

That installed sqlalchemy-1.4.46.
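As an alternative to downgrading, the call site can be moved to the 2.0-style API, which also runs on 1.4. A minimal sketch against an in-memory SQLite database (not BioPhi's actual call site):

```python
from sqlalchemy import create_engine, text

# SQLAlchemy 1.x allowed calling engine.execute(...) directly; 2.0 removed
# it in favour of an explicit connection context plus a text() construct.
engine = create_engine("sqlite://")  # in-memory database for illustration

with engine.connect() as conn:
    value = conn.execute(text("SELECT 1 + 1")).scalar()
# value == 2
```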

Kind regards, and many thanks for providing BioPhi, a great application!

Vernier CDR Grafting

Hi, I see that the CDR grafting option in the server performs straight grafting. Is there any way to automatically perform "Vernier" CDR grafting while retaining key "back-mutations" in the Vernier zones (as you describe in the paper)?
Thank you!!
SG

AttributeError: 'Engine' object has no attribute 'cursor'

Following the conda install instructions, the system throws an error when computing humanness:
AttributeError: 'Engine' object has no attribute 'cursor'

[2024-02-06 13:58:11,833: ERROR/ForkPoolWorker-14] Task biophi.humanization.web.tasks.humanness_task[9fe7dc93-8651-49ab-baec-f3a0386376f5] raised unexpected: HumannessTaskError("'Engine' object has no attribute 'cursor'")
Traceback (most recent call last):
  File "/home/mclark/BioPhi/biophi/humanization/web/tasks.py", line 305, in humanness_task
    humanness=get_antibody_humanness(
  File "/home/mclark/BioPhi/biophi/humanization/methods/humanness.py", line 275, in get_antibody_humanness
    vh=get_chain_humanness(vh, params=params) if vh else None,
  File "/home/mclark/BioPhi/biophi/humanization/methods/humanness.py", line 321, in get_chain_humanness
    peptides = get_chain_oasis_peptides(chain, params=params)
  File "/home/mclark/BioPhi/biophi/humanization/methods/humanness.py", line 303, in get_chain_oasis_peptides
    oas_hits = get_oas_hits(
  File "/home/mclark/BioPhi/biophi/humanization/methods/humanness.py", line 367, in get_oas_hits
    return pd.read_sql(statement, params=peptides, con=engine)
  File "/home/mclark/.conda/envs/biophi-dev/lib/python3.9/site-packages/pandas/io/sql.py", line 706, in read_sql
    return pandas_sql.read_query(
  File "/home/mclark/.conda/envs/biophi-dev/lib/python3.9/site-packages/pandas/io/sql.py", line 2739, in read_query
    cursor = self.execute(sql, params)
  File "/home/mclark/.conda/envs/biophi-dev/lib/python3.9/site-packages/pandas/io/sql.py", line 2673, in execute
    cur = self.con.cursor()
AttributeError: 'Engine' object has no attribute 'cursor'

Use the biophi package locally in Python

Is there any documentation showing how to run analyses locally from Python code instead of through the local web server, e.g. an API to call or packages to import?

Implement sorting on the backend

Result tables should be sortable. Sorting is currently implemented in JavaScript, so it only works when there are <= 10 results. We should sort on the backend using a query argument like ?sort=my_column.
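A small sketch of what handling ?sort= on the backend could look like. The column names are hypothetical; the key point is whitelisting them, so the ORDER BY clause can never be injected, with a leading '-' requesting descending order:

```python
from typing import Optional

# Hypothetical sortable columns for a result table:
ALLOWED_SORT_COLUMNS = {"name", "oasis_identity", "germline_content"}

def build_order_by(sort_arg: Optional[str]) -> str:
    """Translate a ?sort= query argument into a safe ORDER BY clause.
    Only whitelisted column names are accepted; '-col' means descending."""
    if not sort_arg:
        return ""
    descending = sort_arg.startswith("-")
    column = sort_arg.lstrip("-")
    if column not in ALLOWED_SORT_COLUMNS:
        raise ValueError(f"Cannot sort by {column!r}")
    return f"ORDER BY {column} {'DESC' if descending else 'ASC'}"

# build_order_by("name")            -> "ORDER BY name ASC"
# build_order_by("-oasis_identity") -> "ORDER BY oasis_identity DESC"
```

The returned fragment could then be appended to the existing query, paging and all, so sorting works regardless of how many results there are.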

make docker-build fails

Error description

I freshly cloned this repo on a new Ubuntu 20.04 VM, installed miniconda3, and followed the instructions at Run BioPhi dev server through Docker Compose.

Running make docker-build throws an error:

$ make docker-build
docker-compose build
redis uses an image, skipping
mongo uses an image, skipping
Building worker
Sending build context to Docker daemon  29.63MB
Step 1/12 : FROM continuumio/miniconda
latest: Pulling from continuumio/miniconda
Digest: sha256:fee1354ae2435522b9a8a79c5f1c406facc07ec5c44d730d8053600b37c924f0
Status: Downloaded newer image for continuumio/miniconda:latest
 ---> b8ea69b5c41c
Step 2/12 : RUN apt-get update -y && apt-get install -y --no-install-recommends         build-essential
 ---> Using cache
 ---> 9a6c9d2094d4
Step 3/12 : RUN conda config --add channels bioconda     && conda config --add channels conda-forge
 ---> Using cache
 ---> c3292b2e0f58
Step 4/12 : WORKDIR /opt/biophi
 ---> Using cache
 ---> 188452c8dc8c
Step 5/12 : COPY environment.yml .
 ---> 40000c92604e
Step 6/12 : COPY Makefile .
 ---> 35a9f24f85cf
Step 7/12 : RUN make env-update ENV_NAME=base
 ---> Running in 74fb699b3d87
conda env update -n base -f environment.yml
Collecting package metadata (repodata.json): ...working... make: *** [Makefile:12: env-update] Killed
The command '/bin/sh -c make env-update ENV_NAME=base' returned a non-zero code: 2
ERROR: Service 'worker' failed to build : Build failed
make: *** [Makefile:18: docker-build] Error 1

So I ran make env-create:

base $ make env-create
conda env create -n biophi-dev -f environment.yml
Collecting package metadata (repodata.json): done
Solving environment: failed

ResolvePackageNotFound:
  - abnumber==0.2.7

make: *** [Makefile:9: env-create] Error 1

So I added the env name and channels to environment.yml:

name: biophi-test
channels:
    - defaults
    - bioconda
    - anaconda
    - conda-forge
dependencies:
  - python = 3.8
  - pip
  - abnumber == 0.2.7
  - pip:
    - click >= 7
    - pandas
    - sqlalchemy
    - flask
    - redis
    - celery
    - biopython
    - pytest
    - requests
    - tqdm
    - xlsxwriter
    - humanize
    - fairseq == 0.10.2

and re-ran make env-create, which now gives a new error:

base $ make env-create
conda env create -n biophi-dev -f environment.yml
Collecting package metadata (repodata.json): done
Solving environment: done

Downloading and Extracting Packages
pip-21.2.2           | 1.8 MB    | ################################################ | 100%
python-dateutil-2.8. | 233 KB    | ################################################ | 100%
pytz-2021.1          | 181 KB    | ################################################ | 100%
mkl_fft-1.3.0        | 180 KB    | ################################################ | 100%
bottleneck-1.3.2     | 125 KB    | ################################################ | 100%
pandas-1.3.1         | 9.6 MB    | ################################################ | 100%
python-3.8.11        | 18.2 MB   | ################################################ | 100%
biopython-1.78       | 2.1 MB    | ################################################ | 100%
mkl-2021.3.0         | 141.2 MB  | ################################################ | 100%
anarci-2021.02.04    | 1.1 MB    | ################################################ | 100%
blas-1.0             | 6 KB      | ################################################ | 100%
numpy-base-1.20.3    | 4.5 MB    | ################################################ | 100%
mkl-service-2.4.0    | 59 KB     | ################################################ | 100%
hmmer-3.3.2          | 9.6 MB    | ################################################ | 100%
numpy-1.20.3         | 23 KB     | ################################################ | 100%
numexpr-2.7.3        | 188 KB    | ################################################ | 100%
mkl_random-1.2.2     | 308 KB    | ################################################ | 100%
abnumber-0.2.7       | 33 KB     | ################################################ | 100%
intel-openmp-2021.3. | 1.4 MB    | ################################################ | 100%
Preparing transaction: done
Verifying transaction: done
Executing transaction: done
Installing pip dependencies: - Ran pip subprocess with arguments:
['/home/ubuntu/miniconda3/envs/biophi-dev/bin/python', '-m', 'pip', 'install', '-U', '-r', '/home/ubuntu/BioPhi/condaenv.4ik288eg.requirements.txt']
Pip subprocess output:
Collecting click>=7
  Using cached click-8.0.1-py3-none-any.whl (97 kB)
Requirement already satisfied: pandas in /home/ubuntu/miniconda3/envs/biophi-dev/lib/python3.8/site-packages (from -r /home/ubuntu/BioPhi/condaenv.4ik288eg.requirements.txt (line 2)) (1.3.1)
Collecting sqlalchemy
  Using cached SQLAlchemy-1.4.22-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.6 MB)
Collecting flask
  Using cached Flask-2.0.1-py3-none-any.whl (94 kB)
Collecting redis
  Using cached redis-3.5.3-py2.py3-none-any.whl (72 kB)
Collecting celery
  Using cached celery-5.1.2-py3-none-any.whl (401 kB)
Requirement already satisfied: biopython in /home/ubuntu/miniconda3/envs/biophi-dev/lib/python3.8/site-packages (from -r /home/ubuntu/BioPhi/condaenv.4ik288eg.requirements.txt (line 7)) (1.78)
Collecting biopython
  Using cached biopython-1.79-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.whl (2.3 MB)
Collecting pytest
  Using cached pytest-6.2.4-py3-none-any.whl (280 kB)
Collecting requests
  Using cached requests-2.26.0-py2.py3-none-any.whl (62 kB)
Collecting tqdm
  Using cached tqdm-4.62.0-py2.py3-none-any.whl (76 kB)
Collecting xlsxwriter
  Using cached XlsxWriter-3.0.1-py3-none-any.whl (148 kB)
Collecting humanize
  Using cached humanize-3.11.0-py3-none-any.whl (90 kB)
Collecting fairseq==0.10.2
  Using cached fairseq-0.10.2-cp38-cp38-manylinux1_x86_64.whl (1.7 MB)
Collecting dataclasses
  Using cached dataclasses-0.6-py3-none-any.whl (14 kB)
Collecting cython
  Using cached Cython-0.29.24-cp38-cp38-manylinux1_x86_64.whl (1.9 MB)
Collecting regex
  Using cached regex-2021.8.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (738 kB)
Collecting sacrebleu>=1.4.12
  Using cached sacrebleu-2.0.0-py3-none-any.whl (90 kB)
Collecting hydra-core
  Using cached hydra_core-1.1.0-py3-none-any.whl (144 kB)
Collecting torch

Pip subprocess error:
/home/ubuntu/miniconda3/envs/biophi-dev/.tmp4b21tyj7: line 3: 32406 Killed                  /home/ubuntu/miniconda3/envs/biophi-dev/bin/python -m pip install -U -r /home/ubuntu/BioPhi/condaenv.4ik288eg.requirements.txt

failed

CondaEnvException: Pip failed

make: *** [Makefile:9: env-create] Error 1

Expected behavior

make env-create should not throw an error. Strangely, this worked last night; the errors began around 10 pm EST.

Computing Embeddings from Transformer?

Hello,

Is there any plan to provide an API for producing embeddings from input sequences? Users could implement this themselves if #19 turns out to be true. If not, perhaps an embedding API could be exposed without making the weights publicly available?

Thanks!

Compatibility with SQLAlchemy 1.4.48

I'm not sure if what I'm going to showcase here is 100% correct, but here it goes.
While attempting to build and run my local version of the BioPhi backend, I ran into some weird errors, including sqlalchemy.exc.ArgumentError: List argument must consist only of tuples or dictionaries. In particular, the error pointed me to the get_oas_hits function (below).

def get_oas_hits(peptides: Union[str, List[str]], engine: Engine, filter_chain=None):
    if not isinstance(peptides, tuple):
        peptides = tuple(peptides)
    filter_chain_statement = ""
    if filter_chain:
        assert filter_chain in ['Heavy', 'Light']
        filter_chain_statement = f"AND Complete{filter_chain}Seqs >= 10000"

    statement = "SELECT peptides.* FROM peptides " \
                "LEFT JOIN subjects ON peptides.subject=subjects.id " \
                "WHERE peptide IN (" + ",".join("?" * len(peptides)) + ") AND subjects.StudyPath <> 'Corcoran_2016' " \
                + filter_chain_statement
    return pd.read_sql(statement, params=peptides, con=engine)

According to this Stack Overflow post, the problem seems to derive from changes in the SQLAlchemy source code itself. The environment.yml specifies sqlalchemy < 2, and I seem to be running SQLAlchemy 1.4.48 in my Docker container. Nonetheless, I was hitting this error, and I managed to fix it by changing the get_oas_hits function to:

def get_oas_hits(peptides: Union[str, List[str]], engine: Engine, filter_chain=None):
    if not isinstance(peptides, list):
        peptides = list(peptides)
    filter_chain_statement = ""
    if filter_chain:
        assert filter_chain in ['Heavy', 'Light']
        filter_chain_statement = f"AND Complete{filter_chain}Seqs >= 10000"

    params = {f'param_{i}': peptide for i, peptide in enumerate(peptides)}
    statement = "SELECT peptides.* FROM peptides " \
                "LEFT JOIN subjects ON peptides.subject=subjects.id " \
                f"WHERE peptide IN ({','.join(':' + key for key in params.keys())}) AND subjects.StudyPath <> 'Corcoran_2016' " \
                + filter_chain_statement
    return pd.read_sql(statement, params=params, con=engine)

Not sure if this is something you'd like to take a look at.
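The named-parameter IN (...) pattern from the fix can be demonstrated in isolation against the stdlib sqlite3 driver (table contents here are invented, not the OASis database):

```python
import sqlite3

# In-memory table standing in for the peptides table:
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE peptides (peptide TEXT)")
conn.executemany("INSERT INTO peptides VALUES (?)",
                 [("AAAAAAAAA",), ("CCCCCCCCC",), ("DDDDDDDDD",)])

wanted = ["AAAAAAAAA", "DDDDDDDDD"]
# Build one named bind parameter per peptide, as in the fix:
params = {f"param_{i}": p for i, p in enumerate(wanted)}
placeholders = ",".join(f":{key}" for key in params)
rows = conn.execute(
    f"SELECT peptide FROM peptides "
    f"WHERE peptide IN ({placeholders}) ORDER BY peptide",
    params,
).fetchall()
# rows == [('AAAAAAAAA',), ('DDDDDDDDD',)]
```

Passing a dict of named parameters rather than a bare tuple is what sidesteps the "List argument must consist only of tuples or dictionaries" check.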

Specify pandas version

Upgrading to pandas 2.0 makes the program raise a HumannessTaskError:

  • If you submit a light sequence it will say it expected a heavy sequence
  • If you submit a heavy sequence it will say it expected a light sequence

I traced the error back to pandas 2.0, which triggers the following AttributeError in SQLAlchemy:

AttributeError: 'Connection' object has no attribute 'exec_driver_sql'

Reverting to pandas 1.5.2 solved the issue.

Get humanness from Python script

I wasn't sure how to get the OASis and germline scores from within a Python script. It works with the following code, but I'm wondering if there is a simpler or more preferred way?

from biophi.common.utils.io import parse_antibody_files
from biophi.humanization.methods.humanness import OASisParams
from biophi.humanization.web.tasks import humanness_task, HumannessTaskResult

(
    antibody_inputs,
    invalid_names,
    duplicate_names,
    unrecognized_files,
) = parse_antibody_files(files=["./input.fasta"], verbose=False)

min_percent_subjects = 50
oasis_params = OASisParams(
    oasis_db_path="/general/biophi/OASis_9mers_v1.db",
    min_fraction_subjects=min_percent_subjects / 100,
)

file_index = 0
out = humanness_task(
    input=antibody_inputs[file_index],
    oasis_params=oasis_params,
    scheme="kabat",
    cdr_definition="kabat",
)

df = HumannessTaskResult.to_overview_dataframe([out])

where input.fasta is:

>Test1 H
QVQLQQSGAELARPGASVKMSCKASGYTFTRYTMHWVKQRPGQGLEWIGYINPSRGYTNYNQKFKDKATLTTDKSSSTAYMQLSSLTSEDSAVYYCARYYDDHYCLDYWGQGTTLTVSS

Control number of mutations when humanizing

Hello, I tried multiple options, but regardless of the number of iterations or the limit parameter, I always get the same number of mutations introduced. The humanness score does increase, but is there an option to control how many mutations are introduced?

Compatibility with Python 3.9?

Hi. I'm attempting to install biophi with conda into an environment using Python 3.9. I know the instructions specifically say to use Python 3.8. I was wondering if there are any plans to ensure compatibility with Python 3.9 any time soon. Thank you.

Found conflicts! Looking for incompatible packages.
This can take several minutes.  Press CTRL-C to abort.
failed                                                                                                                                                                  

UnsatisfiableError: The following specifications were found to be incompatible with each other:

Output in format: Requested package -> Available versions

The following specifications were found to be incompatible with your system:

  - feature:/linux-64::__glibc==2.31=0
  - feature:|@/linux-64::__glibc==2.31=0

Your installed version is: 2.31

Jobs stay at "waiting in queue" at BioPhi server

I used the server sometime earlier this year and it went fine, returning results within a couple of minutes.
However, lately I submitted a couple of calculations to either Evaluate Humanness or Humanize Antibody, and the page kept refreshing with "Tasks are waiting in queue" for 24 hours without showing any results. When I went back to the link a day later, it said "Results are only stored a limited amount of time, please submit the form again".

conda installation fails from README instructions

Just a quick issue: following the conda installation instructions in the README fails with a strange glibc version error for me (I think it's related to this issue).

Looks like the problem is that you actually need the additional channels from the environment.yml file ( conda-forge and/or nodefaults) for the installation to succeed, so the command should be

conda install biophi -c bioconda -c conda-forge -c nodefaults

Availability of Model Weights?

Hello,

Thanks for this great work. I was wondering if the actual weights of the Transformer are publicly available or if there are plans to make them available?

-Nick

Task is waiting in queue...

Recently the server was reset to resolve this issue. After 3 weeks of use, the issue has arisen once more.

Could you please reset the server on the website?

Cannot build docker

I am trying to run the dev server using the Docker Compose development instructions. When I try to build the image, it fails at the stage => ERROR [biophi-main-web 6/9] RUN make env-update ENV_NAME=base with the following error:
#0 0.237 conda env update -n base -f environment.yml
#0 0.718 Collecting package metadata (repodata.json): ...working... done
#0 46.40 Solving environment: ...working... failed
#0 46.40
#0 46.40 ResolvePackageNotFound:
#0 46.40 - hmmer[version='>=3.1']
#0 46.40
#0 49.63 make: *** [Makefile:12: env-update] Error 1

I am also not able to install this package locally using conda, as the package cannot be found (I specify bioconda as a source).

I thought the problem was in the hmmer package, so I downloaded its 3.1 version, placed it in the BioPhi directory, and added the installation to the Dockerfile. This solved the hmmer problem but raised a lot of package incompatibility errors, which eventually lead to the same result: Docker fails to build. Is there a workaround to make this work? Thanks a lot for any help.

P.S. This is just a small part of the error messages I receive:

[worker 8/10] RUN make env-update ENV_NAME=base:
0.108 conda env update -n base -f environment.yml
0.528 Collecting package metadata (repodata.json): ...working... done
27.19 Solving environment: ...working...
54.07 Found conflicts! Looking for incompatible packages.
54.07 This can take several minutes. Press CTRL-C to abort.
Examining conflict for pycparser packaging brotlipy charset-normalizer conda-libmamba-solver truststore idna conda cffi conda-content-trust pluggy requests pycosat pysocks tk conda-package-streaming pyopenssl jsonpatch libmambapy wheel cryptography jsonpointer zstandard ruamel.yaml urllib3 anarci pip setuptools certifi abnumber conda-package-handling tqdm python ...
(several more similar "Examining conflict for ..." progress lines, overwritten in the terminal and garbled here)
Examining conflict for conda-package-streaming zstandard pycparser brotlipy urllib3 conda anarci charset-normalizer pip setuptools requests certifi jsonpatch abnumber wheel conda-package-handling cryptography jsonpointer: 7%|โ–‹ | 5/76 [00:02<00:28, 2.48it/s]

Allowing humanization to make multiple outputs

Currently, the humanization step returns a single FASTA file. I would be interested in the output containing multiple sequences sampled from the model output.
I have a POC doing this in the CLI; we could discuss what this would look like in the web client if needed.
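The sampling idea can be sketched generically (this is not BioPhi's actual API; the per-position probabilities below are invented): instead of taking the argmax residue at every position, draw from the model's per-position distribution, so repeated draws yield different candidate variants.

```python
import random

# Hypothetical per-position amino-acid probabilities from a humanization model:
position_probs = [
    {"Q": 0.9, "E": 0.1},
    {"V": 0.7, "I": 0.3},
    {"Q": 0.6, "K": 0.4},
]

def sample_variant(probs, rng):
    """Draw one sequence by sampling each position independently."""
    return "".join(
        rng.choices(list(p.keys()), weights=list(p.values()))[0] for p in probs
    )

rng = random.Random(0)  # fixed seed for reproducibility
variants = [sample_variant(position_probs, rng) for _ in range(5)]
# several (usually distinct) variants instead of the single argmax sequence
```

Writing each sampled variant as a separate FASTA record would then give the multi-output behaviour described above.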

make docker-build fails on machines with <=4GB RAM, but succeeds with >=8GB RAM

Provisioned a fresh VM with Ubuntu 20.04, git v2.25.1, and the latest clone of BioPhi's main branch.

With 2 vCPUs and 4 GB RAM, the build throws the following error:

Console output:
base $ git clone https://github.com/Merck/BioPhi.git
Cloning into 'BioPhi'...
remote: Enumerating objects: 1513, done.
remote: Counting objects: 100% (1513/1513), done.
remote: Compressing objects: 100% (1103/1103), done.
remote: Total 1513 (delta 467), reused 1419 (delta 385), pack-reused 0
Receiving objects: 100% (1513/1513), 14.62 MiB | 43.79 MiB/s, done.
Resolving deltas: 100% (467/467), done.

base $ cd BioPhi
base $ make docker-build
docker-compose build
redis uses an image, skipping
Building web
Sending build context to Docker daemon  29.64MB
Step 1/11 : FROM continuumio/miniconda3
latest: Pulling from continuumio/miniconda3
33847f680f63: Pull complete
f5a80bcd1413: Pull complete
8d0d14d1334a: Pull complete
Digest: sha256:592a60b95b547f31c11dc6593832e962952e3178f1fa11db37f43a2afe8df8d7
Status: Downloaded newer image for continuumio/miniconda3:latest
 ---> 67414e5844b6
Step 2/11 : RUN apt-get update -y && apt-get install -y --no-install-recommends         build-essential
 ---> Running in 4c3edc93c3d5
Get:1 http://deb.debian.org/debian buster InRelease [122 kB]
Get:2 http://deb.debian.org/debian buster-updates InRelease [51.9 kB]
Get:3 http://security.debian.org/debian-security buster/updates InRelease [65.4 kB]
Get:4 http://deb.debian.org/debian buster/main amd64 Packages [7907 kB]
Get:5 http://deb.debian.org/debian buster-updates/main amd64 Packages [15.2 kB]
Get:6 http://security.debian.org/debian-security buster/updates/main amd64 Packages [302 kB]
Fetched 8463 kB in 2s (5427 kB/s)
Reading package lists...
Reading package lists...
Building dependency tree...
Reading state information...
The following additional packages will be installed:
  binutils binutils-common binutils-x86-64-linux-gnu cpp cpp-8 dpkg-dev g++
  g++-8 gcc gcc-8 libasan5 libatomic1 libbinutils libc-dev-bin libc6-dev
  libcc1-0 libdpkg-perl libgcc-8-dev libgomp1 libisl19 libitm1 liblsan0
  libmpc3 libmpfr6 libmpx2 libquadmath0 libstdc++-8-dev libtsan0 libubsan1
  linux-libc-dev make patch xz-utils
Suggested packages:
  binutils-doc cpp-doc gcc-8-locales debian-keyring g++-multilib
  g++-8-multilib gcc-8-doc libstdc++6-8-dbg gcc-multilib manpages-dev autoconf
  automake libtool flex bison gdb gcc-doc gcc-8-multilib libgcc1-dbg
  libgomp1-dbg libitm1-dbg libatomic1-dbg libasan5-dbg liblsan0-dbg
  libtsan0-dbg libubsan1-dbg libmpx2-dbg libquadmath0-dbg glibc-doc gnupg
  | gnupg2 bzr libstdc++-8-doc make-doc ed diffutils-doc
Recommended packages:
  fakeroot gnupg | gnupg2 libalgorithm-merge-perl manpages manpages-dev
  libfile-fcntllock-perl liblocale-gettext-perl
The following NEW packages will be installed:
  binutils binutils-common binutils-x86-64-linux-gnu build-essential cpp cpp-8
  dpkg-dev g++ g++-8 gcc gcc-8 libasan5 libatomic1 libbinutils libc-dev-bin
  libc6-dev libcc1-0 libdpkg-perl libgcc-8-dev libgomp1 libisl19 libitm1
  liblsan0 libmpc3 libmpfr6 libmpx2 libquadmath0 libstdc++-8-dev libtsan0
  libubsan1 linux-libc-dev make patch xz-utils
0 upgraded, 34 newly installed, 0 to remove and 4 not upgraded.
Need to get 47.3 MB of archives.
After this operation, 181 MB of additional disk space will be used.
Get:1 http://deb.debian.org/debian buster/main amd64 xz-utils amd64 5.2.4-1 [183 kB]
Get:2 http://security.debian.org/debian-security buster/updates/main amd64 linux-libc-dev amd64 4.19.194-3 [1459 kB]
Get:3 http://deb.debian.org/debian buster/main amd64 binutils-common amd64 2.31.1-16 [2073 kB]
Get:4 http://deb.debian.org/debian buster/main amd64 libbinutils amd64 2.31.1-16 [478 kB]
Get:5 http://deb.debian.org/debian buster/main amd64 binutils-x86-64-linux-gnu amd64 2.31.1-16 [1823 kB]
Get:6 http://deb.debian.org/debian buster/main amd64 binutils amd64 2.31.1-16 [56.8 kB]
Get:7 http://deb.debian.org/debian buster/main amd64 libc-dev-bin amd64 2.28-10 [275 kB]
Get:8 http://deb.debian.org/debian buster/main amd64 libc6-dev amd64 2.28-10 [2691 kB]
Get:9 http://deb.debian.org/debian buster/main amd64 libisl19 amd64 0.20-2 [587 kB]
Get:10 http://deb.debian.org/debian buster/main amd64 libmpfr6 amd64 4.0.2-1 [775 kB]
Get:11 http://deb.debian.org/debian buster/main amd64 libmpc3 amd64 1.1.0-1 [41.3 kB]
Get:12 http://deb.debian.org/debian buster/main amd64 cpp-8 amd64 8.3.0-6 [8914 kB]
Get:13 http://deb.debian.org/debian buster/main amd64 cpp amd64 4:8.3.0-1 [19.4 kB]
Get:14 http://deb.debian.org/debian buster/main amd64 libcc1-0 amd64 8.3.0-6 [46.6 kB]
Get:15 http://deb.debian.org/debian buster/main amd64 libgomp1 amd64 8.3.0-6 [75.8 kB]
Get:16 http://deb.debian.org/debian buster/main amd64 libitm1 amd64 8.3.0-6 [27.7 kB]
Get:17 http://deb.debian.org/debian buster/main amd64 libatomic1 amd64 8.3.0-6 [9032 B]
Get:18 http://deb.debian.org/debian buster/main amd64 libasan5 amd64 8.3.0-6 [362 kB]
Get:19 http://deb.debian.org/debian buster/main amd64 liblsan0 amd64 8.3.0-6 [131 kB]
Get:20 http://deb.debian.org/debian buster/main amd64 libtsan0 amd64 8.3.0-6 [283 kB]
Get:21 http://deb.debian.org/debian buster/main amd64 libubsan1 amd64 8.3.0-6 [120 kB]
Get:22 http://deb.debian.org/debian buster/main amd64 libmpx2 amd64 8.3.0-6 [11.4 kB]
Get:23 http://deb.debian.org/debian buster/main amd64 libquadmath0 amd64 8.3.0-6 [133 kB]
Get:24 http://deb.debian.org/debian buster/main amd64 libgcc-8-dev amd64 8.3.0-6 [2298 kB]
Get:25 http://deb.debian.org/debian buster/main amd64 gcc-8 amd64 8.3.0-6 [9452 kB]
Get:26 http://deb.debian.org/debian buster/main amd64 gcc amd64 4:8.3.0-1 [5196 B]
Get:27 http://deb.debian.org/debian buster/main amd64 libstdc++-8-dev amd64 8.3.0-6 [1532 kB]
Get:28 http://deb.debian.org/debian buster/main amd64 g++-8 amd64 8.3.0-6 [9752 kB]
Get:29 http://deb.debian.org/debian buster/main amd64 g++ amd64 4:8.3.0-1 [1644 B]
Get:30 http://deb.debian.org/debian buster/main amd64 make amd64 4.2.1-1.2 [341 kB]
Get:31 http://deb.debian.org/debian buster/main amd64 libdpkg-perl all 1.19.7 [1414 kB]
Get:32 http://deb.debian.org/debian buster/main amd64 patch amd64 2.7.6-3+deb10u1 [126 kB]
Get:33 http://deb.debian.org/debian buster/main amd64 dpkg-dev all 1.19.7 [1773 kB]
Get:34 http://deb.debian.org/debian buster/main amd64 build-essential amd64 12.6 [7576 B]
debconf: delaying package configuration, since apt-utils is not installed
Fetched 47.3 MB in 1s (68.9 MB/s)
Selecting previously unselected package xz-utils.
(Reading database ... 12133 files and directories currently installed.)
Preparing to unpack .../00-xz-utils_5.2.4-1_amd64.deb ...
Unpacking xz-utils (5.2.4-1) ...
Selecting previously unselected package binutils-common:amd64.
Preparing to unpack .../01-binutils-common_2.31.1-16_amd64.deb ...
Unpacking binutils-common:amd64 (2.31.1-16) ...
Selecting previously unselected package libbinutils:amd64.
Preparing to unpack .../02-libbinutils_2.31.1-16_amd64.deb ...
Unpacking libbinutils:amd64 (2.31.1-16) ...
Selecting previously unselected package binutils-x86-64-linux-gnu.
Preparing to unpack .../03-binutils-x86-64-linux-gnu_2.31.1-16_amd64.deb ...
Unpacking binutils-x86-64-linux-gnu (2.31.1-16) ...
Selecting previously unselected package binutils.
Preparing to unpack .../04-binutils_2.31.1-16_amd64.deb ...
Unpacking binutils (2.31.1-16) ...
Selecting previously unselected package libc-dev-bin.
Preparing to unpack .../05-libc-dev-bin_2.28-10_amd64.deb ...
Unpacking libc-dev-bin (2.28-10) ...
Selecting previously unselected package linux-libc-dev:amd64.
Preparing to unpack .../06-linux-libc-dev_4.19.194-3_amd64.deb ...
Unpacking linux-libc-dev:amd64 (4.19.194-3) ...
Selecting previously unselected package libc6-dev:amd64.
Preparing to unpack .../07-libc6-dev_2.28-10_amd64.deb ...
Unpacking libc6-dev:amd64 (2.28-10) ...
Selecting previously unselected package libisl19:amd64.
Preparing to unpack .../08-libisl19_0.20-2_amd64.deb ...
Unpacking libisl19:amd64 (0.20-2) ...
Selecting previously unselected package libmpfr6:amd64.
Preparing to unpack .../09-libmpfr6_4.0.2-1_amd64.deb ...
Unpacking libmpfr6:amd64 (4.0.2-1) ...
Selecting previously unselected package libmpc3:amd64.
Preparing to unpack .../10-libmpc3_1.1.0-1_amd64.deb ...
Unpacking libmpc3:amd64 (1.1.0-1) ...
Selecting previously unselected package cpp-8.
Preparing to unpack .../11-cpp-8_8.3.0-6_amd64.deb ...
Unpacking cpp-8 (8.3.0-6) ...
Selecting previously unselected package cpp.
Preparing to unpack .../12-cpp_4%3a8.3.0-1_amd64.deb ...
Unpacking cpp (4:8.3.0-1) ...
Selecting previously unselected package libcc1-0:amd64.
Preparing to unpack .../13-libcc1-0_8.3.0-6_amd64.deb ...
Unpacking libcc1-0:amd64 (8.3.0-6) ...
Selecting previously unselected package libgomp1:amd64.
Preparing to unpack .../14-libgomp1_8.3.0-6_amd64.deb ...
Unpacking libgomp1:amd64 (8.3.0-6) ...
Selecting previously unselected package libitm1:amd64.
Preparing to unpack .../15-libitm1_8.3.0-6_amd64.deb ...
Unpacking libitm1:amd64 (8.3.0-6) ...
Selecting previously unselected package libatomic1:amd64.
Preparing to unpack .../16-libatomic1_8.3.0-6_amd64.deb ...
Unpacking libatomic1:amd64 (8.3.0-6) ...
Selecting previously unselected package libasan5:amd64.
Preparing to unpack .../17-libasan5_8.3.0-6_amd64.deb ...
Unpacking libasan5:amd64 (8.3.0-6) ...
Selecting previously unselected package liblsan0:amd64.
Preparing to unpack .../18-liblsan0_8.3.0-6_amd64.deb ...
Unpacking liblsan0:amd64 (8.3.0-6) ...
Selecting previously unselected package libtsan0:amd64.
Preparing to unpack .../19-libtsan0_8.3.0-6_amd64.deb ...
Unpacking libtsan0:amd64 (8.3.0-6) ...
Selecting previously unselected package libubsan1:amd64.
Preparing to unpack .../20-libubsan1_8.3.0-6_amd64.deb ...
Unpacking libubsan1:amd64 (8.3.0-6) ...
Selecting previously unselected package libmpx2:amd64.
Preparing to unpack .../21-libmpx2_8.3.0-6_amd64.deb ...
Unpacking libmpx2:amd64 (8.3.0-6) ...
Selecting previously unselected package libquadmath0:amd64.
Preparing to unpack .../22-libquadmath0_8.3.0-6_amd64.deb ...
Unpacking libquadmath0:amd64 (8.3.0-6) ...
Selecting previously unselected package libgcc-8-dev:amd64.
Preparing to unpack .../23-libgcc-8-dev_8.3.0-6_amd64.deb ...
Unpacking libgcc-8-dev:amd64 (8.3.0-6) ...
Selecting previously unselected package gcc-8.
Preparing to unpack .../24-gcc-8_8.3.0-6_amd64.deb ...
Unpacking gcc-8 (8.3.0-6) ...
Selecting previously unselected package gcc.
Preparing to unpack .../25-gcc_4%3a8.3.0-1_amd64.deb ...
Unpacking gcc (4:8.3.0-1) ...
Selecting previously unselected package libstdc++-8-dev:amd64.
Preparing to unpack .../26-libstdc++-8-dev_8.3.0-6_amd64.deb ...
Unpacking libstdc++-8-dev:amd64 (8.3.0-6) ...
Selecting previously unselected package g++-8.
Preparing to unpack .../27-g++-8_8.3.0-6_amd64.deb ...
Unpacking g++-8 (8.3.0-6) ...
Selecting previously unselected package g++.
Preparing to unpack .../28-g++_4%3a8.3.0-1_amd64.deb ...
Unpacking g++ (4:8.3.0-1) ...
Selecting previously unselected package make.
Preparing to unpack .../29-make_4.2.1-1.2_amd64.deb ...
Unpacking make (4.2.1-1.2) ...
Selecting previously unselected package libdpkg-perl.
Preparing to unpack .../30-libdpkg-perl_1.19.7_all.deb ...
Unpacking libdpkg-perl (1.19.7) ...
Selecting previously unselected package patch.
Preparing to unpack .../31-patch_2.7.6-3+deb10u1_amd64.deb ...
Unpacking patch (2.7.6-3+deb10u1) ...
Selecting previously unselected package dpkg-dev.
Preparing to unpack .../32-dpkg-dev_1.19.7_all.deb ...
Unpacking dpkg-dev (1.19.7) ...
Selecting previously unselected package build-essential.
Preparing to unpack .../33-build-essential_12.6_amd64.deb ...
Unpacking build-essential (12.6) ...
Setting up binutils-common:amd64 (2.31.1-16) ...
Setting up linux-libc-dev:amd64 (4.19.194-3) ...
Setting up libgomp1:amd64 (8.3.0-6) ...
Setting up libasan5:amd64 (8.3.0-6) ...
Setting up make (4.2.1-1.2) ...
Setting up libmpfr6:amd64 (4.0.2-1) ...
Setting up xz-utils (5.2.4-1) ...
update-alternatives: using /usr/bin/xz to provide /usr/bin/lzma (lzma) in auto mode
update-alternatives: warning: skip creation of /usr/share/man/man1/lzma.1.gz because associated file /usr/share/man/man1/xz.1.gz (of link group lzma) doesn't exist
update-alternatives: warning: skip creation of /usr/share/man/man1/unlzma.1.gz because associated file /usr/share/man/man1/unxz.1.gz (of link group lzma) doesn't exist
update-alternatives: warning: skip creation of /usr/share/man/man1/lzcat.1.gz because associated file /usr/share/man/man1/xzcat.1.gz (of link group lzma) doesn't exist
update-alternatives: warning: skip creation of /usr/share/man/man1/lzmore.1.gz because associated file /usr/share/man/man1/xzmore.1.gz (of link group lzma) doesn't exist
update-alternatives: warning: skip creation of /usr/share/man/man1/lzless.1.gz because associated file /usr/share/man/man1/xzless.1.gz (of link group lzma) doesn't exist
update-alternatives: warning: skip creation of /usr/share/man/man1/lzdiff.1.gz because associated file /usr/share/man/man1/xzdiff.1.gz (of link group lzma) doesn't exist
update-alternatives: warning: skip creation of /usr/share/man/man1/lzcmp.1.gz because associated file /usr/share/man/man1/xzcmp.1.gz (of link group lzma) doesn't exist
update-alternatives: warning: skip creation of /usr/share/man/man1/lzgrep.1.gz because associated file /usr/share/man/man1/xzgrep.1.gz (of link group lzma) doesn't exist
update-alternatives: warning: skip creation of /usr/share/man/man1/lzegrep.1.gz because associated file /usr/share/man/man1/xzegrep.1.gz (of link group lzma) doesn't exist
update-alternatives: warning: skip creation of /usr/share/man/man1/lzfgrep.1.gz because associated file /usr/share/man/man1/xzfgrep.1.gz (of link group lzma) doesn't exist
Setting up libquadmath0:amd64 (8.3.0-6) ...
Setting up libmpc3:amd64 (1.1.0-1) ...
Setting up libatomic1:amd64 (8.3.0-6) ...
Setting up patch (2.7.6-3+deb10u1) ...
Setting up libdpkg-perl (1.19.7) ...
Setting up libmpx2:amd64 (8.3.0-6) ...
Setting up libubsan1:amd64 (8.3.0-6) ...
Setting up libisl19:amd64 (0.20-2) ...
Setting up libbinutils:amd64 (2.31.1-16) ...
Setting up cpp-8 (8.3.0-6) ...
Setting up libc-dev-bin (2.28-10) ...
Setting up libcc1-0:amd64 (8.3.0-6) ...
Setting up liblsan0:amd64 (8.3.0-6) ...
Setting up libitm1:amd64 (8.3.0-6) ...
Setting up binutils-x86-64-linux-gnu (2.31.1-16) ...
Setting up libtsan0:amd64 (8.3.0-6) ...
Setting up binutils (2.31.1-16) ...
Setting up dpkg-dev (1.19.7) ...
Setting up libgcc-8-dev:amd64 (8.3.0-6) ...
Setting up cpp (4:8.3.0-1) ...
Setting up libc6-dev:amd64 (2.28-10) ...
Setting up libstdc++-8-dev:amd64 (8.3.0-6) ...
Setting up gcc-8 (8.3.0-6) ...
Setting up gcc (4:8.3.0-1) ...
Setting up g++-8 (8.3.0-6) ...
Setting up g++ (4:8.3.0-1) ...
update-alternatives: using /usr/bin/g++ to provide /usr/bin/c++ (c++) in auto mode
Setting up build-essential (12.6) ...
Processing triggers for libc-bin (2.28-10) ...
Removing intermediate container 4c3edc93c3d5
 ---> 7929c494c4c5
Step 3/11 : WORKDIR /opt/biophi
 ---> Running in af786e51ff0c
Removing intermediate container af786e51ff0c
 ---> c985a3f4b9e9
Step 4/11 : COPY environment.yml .
 ---> e938f6f37d4e
Step 5/11 : COPY Makefile .
 ---> 1b27c5adcbde
Step 6/11 : RUN make env-update ENV_NAME=base
 ---> Running in 14b9502a50b7
conda env update -n base -f environment.yml
Collecting package metadata (repodata.json): ...working... done
Solving environment: ...working... done

Downloading and Extracting Packages
abnumber-0.2.7       | 33 KB     | ########## | 100%
wheel-0.36.2         | 33 KB     | ########## | 100%
libblas-3.9.0        | 12 KB     | ########## | 100%
certifi-2021.5.30    | 141 KB    | ########## | 100%
ruamel_yaml-0.15.80  | 272 KB    | ########## | 100%
libcblas-3.9.0       | 11 KB     | ########## | 100%
idna-2.10            | 52 KB     | ########## | 100%
setuptools-57.4.0    | 934 KB    | ########## | 100%
pandas-1.3.1         | 13.0 MB   | ########## | 100%
liblapack-3.9.0      | 11 KB     | ########## | 100%
pysocks-1.7.1        | 27 KB     | ########## | 100%
python-dateutil-2.8. | 240 KB    | ########## | 100%
pip-21.2.4           | 1.1 MB    | ########## | 100%
libopenblas-0.3.17   | 9.2 MB    | ########## | 100%
biopython-1.79       | 2.6 MB    | ########## | 100%
pycosat-0.6.3        | 107 KB    | ########## | 100%
brotlipy-0.7.0       | 341 KB    | ########## | 100%
tqdm-4.61.2          | 83 KB     | ########## | 100%
numpy-1.21.1         | 6.2 MB    | ########## | 100%
openssl-1.1.1k       | 2.1 MB    | ########## | 100%
pyopenssl-20.0.1     | 49 KB     | ########## | 100%
libgfortran5-11.1.0  | 1.7 MB    | ########## | 100%
cryptography-3.4.7   | 1.1 MB    | ########## | 100%
conda-package-handli | 927 KB    | ########## | 100%
libgfortran-ng-11.1. | 19 KB     | ########## | 100%
six-1.16.0           | 18 KB     | ########## | 100%
urllib3-1.26.6       | 112 KB    | ########## | 100%
conda-4.10.3         | 3.1 MB    | ########## | 100%
hmmer-3.3.2          | 9.6 MB    | ########## | 100%
anarci-2020.04.23    | 1.1 MB    | ########## | 100%
python-3.8.10        | 26.2 MB   | ########## | 100%
chardet-4.0.0        | 199 KB    | ########## | 100%
python_abi-3.8       | 4 KB      | ########## | 100%
pytz-2021.1          | 239 KB    | ########## | 100%
cffi-1.14.6          | 226 KB    | ########## | 100%
requests-2.25.1      | 52 KB     | ########## | 100%
pycparser-2.20       | 94 KB     | ########## | 100%
Preparing transaction: ...working... done
Verifying transaction: ...working... done
Executing transaction: ...working... done
Installing pip dependencies: ...working... Ran pip subprocess with arguments:
['/opt/conda/bin/python', '-m', 'pip', 'install', '-U', '-r', '/opt/biophi/condaenv.tshc824g.requirements.txt']
Pip subprocess output:
Collecting click>=7
  Downloading click-8.0.1-py3-none-any.whl (97 kB)
Requirement already satisfied: pandas in /opt/conda/lib/python3.8/site-packages (from -r /opt/biophi/condaenv.tshc824g.requirements.txt (line 2)) (1.3.1)
Collecting pandas
  Downloading pandas-1.3.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (11.5 MB)
Collecting sqlalchemy
  Downloading SQLAlchemy-1.4.23-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.5 MB)
Collecting flask
  Downloading Flask-2.0.1-py3-none-any.whl (94 kB)
Collecting redis
  Downloading redis-3.5.3-py2.py3-none-any.whl (72 kB)
Collecting celery
  Downloading celery-5.1.2-py3-none-any.whl (401 kB)
Requirement already satisfied: biopython in /opt/conda/lib/python3.8/site-packages (from -r /opt/biophi/condaenv.tshc824g.requirements.txt (line 7)) (1.79)
Collecting pytest
  Downloading pytest-6.2.4-py3-none-any.whl (280 kB)
Requirement already satisfied: requests in /opt/conda/lib/python3.8/site-packages (from -r /opt/biophi/condaenv.tshc824g.requirements.txt (line 9)) (2.25.1)
Collecting requests
  Downloading requests-2.26.0-py2.py3-none-any.whl (62 kB)
Requirement already satisfied: tqdm in /opt/conda/lib/python3.8/site-packages (from -r /opt/biophi/condaenv.tshc824g.requirements.txt (line 10)) (4.61.2)
Collecting tqdm
  Downloading tqdm-4.62.2-py2.py3-none-any.whl (76 kB)
Collecting xlsxwriter
  Downloading XlsxWriter-3.0.1-py3-none-any.whl (148 kB)
Collecting humanize
  Downloading humanize-3.11.0-py3-none-any.whl (90 kB)
Collecting fairseq==0.10.2
  Downloading fairseq-0.10.2-cp38-cp38-manylinux1_x86_64.whl (1.7 MB)
Requirement already satisfied: cffi in /opt/conda/lib/python3.8/site-packages (from fairseq==0.10.2->-r /opt/biophi/condaenv.tshc824g.requirements.txt (line 13)) (1.14.6)
Collecting hydra-core
  Downloading hydra_core-1.1.1-py3-none-any.whl (145 kB)
Requirement already satisfied: numpy in /opt/conda/lib/python3.8/site-packages (from fairseq==0.10.2->-r /opt/biophi/condaenv.tshc824g.requirements.txt (line 13)) (1.21.1)
Collecting sacrebleu>=1.4.12
  Downloading sacrebleu-2.0.0-py3-none-any.whl (90 kB)
Collecting regex
  Downloading regex-2021.8.21-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (763 kB)
Collecting dataclasses
  Downloading dataclasses-0.6-py3-none-any.whl (14 kB)
Collecting torch
  Downloading torch-1.9.0-cp38-cp38-manylinux1_x86_64.whl (831.4 MB)

failed
Pip subprocess error:
/opt/conda/.tmpg04gop_k: line 3:   461 Killed                  /opt/conda/bin/python -m pip install -U -r /opt/biophi/condaenv.tshc824g.requirements.txt


CondaEnvException: Pip failed

make: *** [Makefile:12: env-update] Error 1
The command '/bin/sh -c make env-update ENV_NAME=base' returned a non-zero code: 2
ERROR: Service 'web' failed to build : Build failed
make: *** [Makefile:25: docker-build] Error 1

The error does not appear when repeating the above steps on a machine with 2 vCPUs and 8 GB of RAM, which suggests the pip process was killed for running out of memory while installing torch.

unable to use cdr_grafting via the BioPhi command-line interface

When trying:
biophi cdr_grafting mabs.fa --fasta-only --output humanized.fa

the error appears as:
Error: No such command 'cdr_grafting'

I checked the scripts and found that the CDR grafting method is implemented via the CDRGraftingHumanizationParams class in biophi/humanization/methods/humanization.py.

How should the cdr_grafting functionality be used from the command line?

Humanization Benchmark Data Availability

Hi,

Thanks for open-sourcing this work, it's awesome! This is not so much an issue as a request: could you open-source the humanization benchmark that you use in the paper? It would be great for reproducibility and for the development of future humanization tools!

Thanks!

OASis CLI displays an incorrect example database path

It's a minor thing, but it gave me a headache for a while and I needed to dig through the code to see how the path specified via --oasis-db is passed to SQLite.

The CLI --help page suggests passing the path as sqlite:////Absolute/path/to/oas_human_subject_9mers_2019_11.db, but as can be seen below, the sqlite:/// prefix is already prepended in the code, so the suggested format results in Error: .. The OASis DB path does not exist ....

oas_engine = create_engine('sqlite:///' + os.path.abspath(params.oasis_db_path), echo=False)
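One way to make both spellings work is to normalize the user-supplied path before building the engine URL. A minimal, self-contained sketch (the helper name is hypothetical, this is not BioPhi's actual code):

```python
import os

def build_sqlite_url(db_path: str) -> str:
    """Build a SQLAlchemy SQLite URL from a plain path or a sqlite:/// URL.

    Accepts either '/abs/path/to.db' or 'sqlite:////abs/path/to.db', so the
    CLI help text and the engine construction cannot disagree.
    """
    prefix = 'sqlite:///'
    if db_path.startswith(prefix):
        # Strip an already-present prefix before re-prepending it.
        db_path = db_path[len(prefix):]
    return prefix + os.path.abspath(db_path)

# Both spellings normalize to the same URL:
print(build_sqlite_url('/data/oas.db'))
print(build_sqlite_url('sqlite:////data/oas.db'))
```

Until something like this is in place, passing the bare absolute path (without the sqlite:/// prefix) to --oasis-db works around the error.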

Error: Invalid sequence for "antibody"

Error: Invalid sequence for "Antibody1": "QVQLVQSGGGVVQPGRSLRLSCKASGYTETRYTMHWVRQAPGKGLEWIGYINPSRGYTNYNOKVKDRFTISTDKSKSTAFLQMDSLRPEDTAVYYCARYYDDHYCSSASCFCLDYWGQGTPVTVSS"
I don't know how to deal with this issue. Any help resolving it would be much appreciated, thanks!
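One likely cause (an observation about the pasted sequence, not a confirmed BioPhi diagnosis): it contains the letter O (...TNYNOKVK...), which is not one of the 20 standard one-letter amino-acid codes, so a strict sequence validator will reject the whole chain. A minimal sketch of such a check:

```python
# The 20 standard one-letter amino-acid codes.
VALID_AA = set('ACDEFGHIKLMNPQRSTVWY')

def invalid_residues(seq: str):
    """Return (position, letter) pairs that are not standard amino acids."""
    return [(i, aa) for i, aa in enumerate(seq) if aa not in VALID_AA]

seq = ("QVQLVQSGGGVVQPGRSLRLSCKASGYTETRYTMHWVRQAPGKGLEWIGYINPSRGYTNYNO"
       "KVKDRFTISTDKSKSTAFLQMDSLRPEDTAVYYCARYYDDHYCSSASCFCLDYWGQGTPVTVSS")
print(invalid_residues(seq))  # the 'O' is the only offending character
```

Fixing that character (it may be a typo or OCR error in the source sequence) should let the input pass validation.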

Conda UnsatisfiableError

I am trying to run BioPhi in Anaconda. I am very new to this, so I ran into the following error right away while trying to get BioPhi running.

UnsatisfiableError: The following specifications were found to be incompatible with each other:

Output in format: Requested package -> Available versionsThe following specifications were found to be incompatible with your system:

feature:/win-64::__win==0=0

feature:|@/win-64::__win==0=0

biophi -> click[version='>=7'] -> __unix

biophi -> click[version='>=7'] -> __win

I tried installing click using pip install click, which seems to work, but it doesn't resolve the error message. I am running this on Windows, not on Unix/Linux. (The __unix marker in the error suggests that some of the dependencies are only packaged for Unix-like platforms.)

Use of Sapiens model for antibody representation?

Greetings, and thank you for your work!

I have been searching for a transformer model that has been trained on antibody sequences so I can extract learned representations of my dataset, and this repo has been suggested to me.

After tokenizing a sequence using the dictionary.encode_line method, I call the sentence_encoder to get the encoding of my sequence. In your opinion, for downstream tasks, should I ideally be working with the 'encoder_embedding' values of the output dictionary, or with the 'encoder_out' values?

More importantly, is there a built-in method that I can use to get reconstructed sequences back? I wrote a linear decoder that tries to reconstruct the 'encoder_out' values back to the original tokenized sequences, and although it does a somewhat decent job, it is not good enough for my downstream tasks.

Thank you for your time!
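On the first question: in fairseq's naming, 'encoder_embedding' is typically the uncontextualized input embedding and 'encoder_out' the contextualized final-layer states, so the latter is usually the better choice for downstream tasks, though this is general transformer practice rather than Sapiens-specific guidance. Whichever you pick, a common recipe for a fixed-size per-sequence representation is to mean-pool the per-token vectors while masking out padding. A generic sketch of the pooling idea only (not the BioPhi/fairseq API):

```python
def mean_pool(token_embeddings, pad_mask):
    """Average per-token embedding vectors over non-padding positions.

    token_embeddings: list of L vectors, each a list of dimension D.
    pad_mask: list of L booleans, True where the position is padding.
    """
    kept = [vec for vec, pad in zip(token_embeddings, pad_mask) if not pad]
    dim = len(kept[0])
    return [sum(vec[d] for vec in kept) / len(kept) for d in range(dim)]

emb = [[0, 1, 2], [3, 4, 5], [6, 7, 8], [9, 10, 11]]  # 4 tokens, 3 dims
mask = [False, False, False, True]                    # last token is padding
print(mean_pool(emb, mask))  # [3.0, 4.0, 5.0]
```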

Enable cli for humanization cdr grafting

I would like to run the tool using the cli instead of the web browser. Specifically, I saw that running the humanization using cdr grafting is not an option currently. Is this a feature you would be interested in having?
I'm happy to work on implementing this, I would propose a two step process:

  1. Set up the cdr grafting with as minimal changes as possible
  2. Make an API for the calls that are made. Currently the CLI and the web interface use very similar functions for the same features. These should probably be restructured so that both use the same core methods, while the web interface and the CLI each have their own methods for parsing and (de)serialising input and output. This second point would be a lot of restructuring work, but it would allow the two interfaces to stay in sync more easily.
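The two-step proposal above can be sketched as a shared core plus thin adapters. All names here are illustrative placeholders, not BioPhi's actual API:

```python
# Shared core: pure logic, no knowledge of CLI flags or HTTP requests.
def humanize_core(sequence: str, scheme: str = "kabat") -> dict:
    """Stand-in for the real CDR-grafting call; returns a result record."""
    return {"input": sequence, "scheme": scheme, "humanized": sequence.lower()}

# CLI adapter: parses positional args, calls the core, renders text output.
def cli_humanize(args: list) -> str:
    result = humanize_core(args[0])
    return f">{result['scheme']}\n{result['humanized']}"

# Web adapter: takes a JSON-like dict, calls the same core, returns JSON-like.
def web_humanize(payload: dict) -> dict:
    return humanize_core(payload["sequence"], payload.get("scheme", "kabat"))

print(cli_humanize(["QVQLVQ"]))
print(web_humanize({"sequence": "QVQLVQ"}))
```

With this shape, adding CDR grafting to the CLI is just a new adapter over the existing core, and the two interfaces cannot drift apart in behaviour.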

make docker-build fails due to error in Makefile

After 8eb45f1 was merged, I now get an error when doing a fresh install on a new VM with a fresh repo clone:

$ make docker-build
docker-compose build
ERROR: Service 'web' depends on service 'mongo' which is undefined.
make: *** [Makefile:25: docker-build] Error 1

Add mypy type checks

Hi there.
Thank you very much for contributing to the project.

I noticed that some of the code has type annotations, but not all of it. I'm happy to try to improve this by adding type checking everywhere, as well as adding a CI check that enforces it. Let me know if this is something you would approve of.
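For context, a minimal example of the kind of code mypy can check once annotations are in place (illustrative only, not taken from the BioPhi codebase):

```python
from typing import Optional

def parse_chain_type(name: str) -> Optional[str]:
    """Return 'H' or 'L' for a chain name, or None if unrecognised.

    With the Optional return type declared, mypy can flag callers that
    forget to handle the None case or that pass a non-string argument.
    """
    name = name.upper()
    if name.startswith("H"):
        return "H"
    if name.startswith(("L", "K")):
        return "L"
    return None

print(parse_chain_type("heavy"), parse_chain_type("kappa"), parse_chain_type("x"))
```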
