
DNCON2

Deep convolutional neural networks for protein contact map prediction

Web-server and datasets at http://sysbio.rnet.missouri.edu/dncon2/

Citation

Badri Adhikari, Jie Hou, Jianlin Cheng. "DNCON2: Improved protein contact prediction using two-level deep convolutional neural networks". Bioinformatics, 2017.

Test Environment

64-bit PC - Ubuntu 16.04 LTS

Programs, Scripts, and Databases Dependencies

Installation Notes

  • We tested on Ubuntu because the tool 'FreeContact' is easier to install on a Debian-based system. If you would like to install DNCON2 on some other operating system, first check whether 'FreeContact' can be installed on it. If, for some reason, you do not have an Ubuntu machine and cannot install FreeContact, you can still use DNCON2: with just a few code updates you can skip the FreeContact tool, at the cost of slightly less precise results.
  • Updated versions of the databases and programs may generate better results, but we recommend an initial installation with the versions suggested here; we have not rigorously tested newer versions of the third-party programs or newer databases.
  • These installation steps assume a 64-bit machine, so for some of the programs your download links may differ. Please refer to the appropriate third-party websites.
  • To verify your installation, compare against the scripts and outputs in the dry-run directory, where we provide the input, output, and log files of DNCON2 execution for three sequences - 3e7u, T0866, and T0900.
  • It is possible that your installation produces slightly different confidence values. We noticed that the programs PSICOV, CCMpred, and FreeContact can produce slightly different results depending on the machine architecture, the availability or absence of a GPU, etc.
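Because confidence values can drift slightly across machines, an exact diff against the dry-run outputs may fail even on a correct install. Below is a minimal sketch of a tolerance-based comparison; the five-column RR row layout (confidence in the last column) and the 0.01 tolerance are assumptions to adapt to your own files:

```shell
# Compare the 5th column (confidence) of two RR-style files within a
# tolerance instead of demanding exact equality. The demo data below
# stands in for the real dry-run output and your own output.
printf '1 9 0 8 0.9512\n2 11 0 8 0.8744\n' > /tmp/ref.rr
printf '1 9 0 8 0.9505\n2 11 0 8 0.8750\n' > /tmp/mine.rr
paste /tmp/ref.rr /tmp/mine.rr | awk '
    { d = $5 - $10; if (d < 0) d = -d; if (d > 0.01) bad++ }
    END { if (bad) { print bad " row(s) outside tolerance"; exit 1 }
          print "all confidences within 0.01" }'
```

Rows whose confidence differs by more than the tolerance are counted, and the script exits non-zero if any are found.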

Data Flow

Installation Steps

(A) Download and Unzip DNCON2 package
Create a working directory called 'DNCON2' where all scripts, programs, and databases will reside:

cd ~
mkdir DNCON2

Download the DNCON2 code:

cd ~/DNCON2/
wget http://sysbio.rnet.missouri.edu/bdm_download/dncon2-tool/DNCON2.tar.gz
tar zxvf DNCON2.tar.gz
# Alternatively:
git clone https://github.com/multicom-toolbox/DNCON2.git

(B) Download and Unzip all databases

cd ~/DNCON2/  
mkdir databases  
cd databases/  
wget http://sysbio.rnet.missouri.edu/bdm_download/dncon2-tool/databases/nr90-2012.tar.gz  
tar zxvf nr90-2012.tar.gz  
wget http://sysbio.rnet.missouri.edu/bdm_download/dncon2-tool/databases/uniref.tar.gz  
tar zxvf uniref.tar.gz  
wget http://sysbio.rnet.missouri.edu/bdm_download/dncon2-tool/databases/uniprot20_2016_02.tar.gz  
tar zxvf uniprot20_2016_02.tar.gz  
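The archives above are large, so it is worth confirming they unpacked before moving on. A small sketch; the directory names are assumed from the archive names, so confirm them against what tar actually created:

```shell
# Check that the three database directories exist under ~/DNCON2/databases.
DB="$HOME/DNCON2/databases"
missing=0
for d in nr90-2012 uniref uniprot20_2016_02; do
    if [ -d "$DB/$d" ]; then
        echo "OK       $d"
    else
        echo "MISSING  $d"
        missing=$((missing+1))
    fi
done
echo "$missing database directory(ies) missing"
```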

(C) Install FreeContact, PSICOV, and CCMpred

sudo apt-get install freecontact

Note: If you do not have root permissions, refer to the 'freecontact-install-non-root.txt' for instructions.

cd ~/DNCON2/
mkdir psicov
cd psicov/
wget http://bioinfadmin.cs.ucl.ac.uk/downloads/PSICOV/psicov2.c
wget http://bioinfadmin.cs.ucl.ac.uk/downloads/PSICOV/Makefile
make
cd ~/DNCON2/
sudo apt install git
sudo apt install cmake
git clone --recursive https://github.com/soedinglab/CCMpred.git
cd CCMpred
cmake .
make

[OPTIONAL] Verify FreeContact, PSICOV, and CCMpred Installation

cd ~/DNCON2/
mkdir test-dncon2
./CCMpred/bin/ccmpred ./CCMpred/example/1atzA.aln ~/DNCON2/test-dncon2/ccmpred.cmap
./psicov/psicov ./CCMpred/example/1atzA.aln > ~/DNCON2/test-dncon2/psicov.rr
freecontact < ./CCMpred/example/1atzA.aln > ~/DNCON2/test-dncon2/freecontact.rr
[The freecontact command above may throw a 'Symbol .. has different size ..' warning.]

(D) Install Tensorflow, Keras, and h5py and Update keras.json

(a) Install Tensorflow:

sudo pip install tensorflow

The GPU version is NOT needed. If you face issues, refer to the TensorFlow installation guide at https://www.tensorflow.org/install/install_linux.

(b) Install Keras:

sudo pip install keras

(c) Install the h5py library:

sudo pip install h5py

(d) Add the entry ["image_dim_ordering": "tf",] to your keras.json file at ~/.keras/keras.json. Note that if you have never run Keras before, you may have to execute the Tensorflow verification step once so that your keras.json file is created. After the update, your keras.json should look like the one below:

{
    "epsilon": 1e-07,
    "floatx": "float32",
    "image_dim_ordering":"tf",
    "image_data_format": "channels_last",
    "backend": "tensorflow"
}
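A quick way to confirm the file carries the required entries is to grep for them. The sketch below demonstrates on a temporary copy of the file shown above; point CFG at "$HOME/.keras/keras.json" for the real check:

```shell
# Grep keras.json for the two entries DNCON2 relies on. Demonstrated on a
# temporary copy; set CFG="$HOME/.keras/keras.json" for the real check.
CFG=/tmp/keras-check.json
cat > "$CFG" <<'EOF'
{
    "epsilon": 1e-07,
    "floatx": "float32",
    "image_dim_ordering":"tf",
    "image_data_format": "channels_last",
    "backend": "tensorflow"
}
EOF
if grep -q '"image_dim_ordering"' "$CFG" && grep -q '"backend": "tensorflow"' "$CFG"; then
    echo "keras.json OK"
else
    echo "keras.json is missing required entries"
fi
```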

[OPTIONAL] Verify Tensorflow, Keras, and h5py installation

The script 'predict-rr-from-features.sh' takes a feature file as input and predicts contacts using the trained CNN models. Using an existing feature file (feat-3e7u.txt) and names for the output RR file and the intermediate stage2 feature file, test the installation of Tensorflow, Keras, and h5py with the following command:

cd ~/DNCON2/
./DNCON2/scripts/predict-rr-from-features.sh ./DNCON2/dry-run/output/3e7u/feat-3e7u.txt ./test-dncon2/3e7u.rr ./test-dncon2/feat-stg2.txt

Verify that the contents of your output file '3e7u.rr' match the contents of '~/DNCON2/dry-run/output/3e7u/3e7u.dncon2.rr'.
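Since 'diff -q' exits 0 only when two files are identical, it gives a one-line verdict for this check. Sketched here on throwaway files; substitute ./test-dncon2/3e7u.rr and ~/DNCON2/dry-run/output/3e7u/3e7u.dncon2.rr for the real comparison:

```shell
# 'diff -q' is silent and exits 0 only when the files are identical.
# Throwaway files stand in for 3e7u.rr and 3e7u.dncon2.rr here.
printf '1 9 0 8 0.95\n' > /tmp/expected.rr
printf '1 9 0 8 0.95\n' > /tmp/observed.rr
if diff -q /tmp/expected.rr /tmp/observed.rr >/dev/null; then
    echo "RR files match"
else
    echo "RR files differ"
fi
```

Per the installation notes, confidence values may differ slightly across machines, so a small mismatch does not necessarily indicate a broken install.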

(E) Install Legacy Blast, PSIPRED, and runpsipredandsolv (MetaPSICOV)

(a) Install PSIPRED

cd ~/DNCON2/
wget http://bioinfadmin.cs.ucl.ac.uk/downloads/psipred/old_versions/psipred3.5.tar.gz
tar zxvf psipred3.5.tar.gz

(b) Install Legacy Blast

wget ftp://ftp.ncbi.nlm.nih.gov/blast/executables/legacy/2.2.26/blast-2.2.26-x64-linux.tar.gz
tar zxvf blast-2.2.26-x64-linux.tar.gz

(c) Install MetaPSICOV

mkdir ~/DNCON2/metapsicov
cd  ~/DNCON2/metapsicov/
wget http://bioinfadmin.cs.ucl.ac.uk/downloads/MetaPSICOV/metapsicov.tar.gz
tar zxvf metapsicov.tar.gz
cd src
make
make install

(d) Install 'tcsh'. If you do not have a copy of csh at '/usr/bin/csh', install 'tcsh':

sudo apt-get install tcsh   # the steps below require csh

(e) Update the following paths in '~/DNCON2/metapsicov/runpsipredandsolv'

set dbname = /home/badri/DNCON2/databases/uniref/uniref90pfilt
set ncbidir = /home/badri/DNCON2/blast-2.2.26/bin
set execdir = /home/badri/DNCON2/psipred/bin/
set execdir2 = /home/badri/DNCON2/metapsicov/bin/
set datadir = /home/badri/DNCON2/psipred/data/ 
set datadir2 = /home/badri/DNCON2/metapsicov/data/

[OPTIONAL] Verify 'runpsipredandsolv' installation:

cd ~/DNCON2/
cp ./metapsicov/examples/5ptpA.fasta ~/DNCON2/test-dncon2/
cd ~/DNCON2/test-dncon2/
../metapsicov/runpsipredandsolv 5ptpA.fasta

Check the expected output files '5ptpA.ss2', '5ptpA.horiz', and '5ptpA.solv'.
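A short loop can confirm all three outputs exist and are non-empty. Demonstrated below in a throwaway directory with placeholder files; run the same loop inside ~/DNCON2/test-dncon2/ after the real run:

```shell
# Verify the three runpsipredandsolv outputs exist and are non-empty.
# Placeholder files in a temp dir make this sketch self-contained.
cd "$(mktemp -d)"
for ext in ss2 horiz solv; do printf 'demo\n' > "5ptpA.$ext"; done
ok=0
for ext in ss2 horiz solv; do
    if [ -s "5ptpA.$ext" ]; then
        echo "5ptpA.$ext OK"
        ok=$((ok+1))
    fi
done
echo "$ok of 3 expected files present"
```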

(F) Install SCRATCH Suite

cd ~/DNCON2/
wget http://download.igb.uci.edu/SCRATCH-1D_1.1.tar.gz
tar zxvf SCRATCH-1D_1.1.tar.gz
cd SCRATCH-1D_1.1/
perl install.pl
# Replace the 32-bit blast with the 64-bit version (if needed)
mv ./pkg/blast-2.2.26 ./pkg/blast-2.2.26.original
cp -r ~/DNCON2/blast-2.2.26 ./pkg/   # 64-bit Legacy Blast was installed in step (E)

[OPTIONAL] Verify SCRATCH installation

cd ~/DNCON2/SCRATCH-1D_1.1/
cd doc
../bin/run_SCRATCH-1D_predictors.sh test.fasta test.out 4

(G) Install HHblits and JackHMMER

sudo apt install hhsuite
cd ~/DNCON2/
wget http://eddylab.org/software/hmmer3/3.1b2/hmmer-3.1b2-linux-intel-x86_64.tar.gz
tar zxvf hmmer-3.1b2-linux-intel-x86_64.tar.gz
cd hmmer-3.1b2-linux-intel-x86_64
./configure
make
sudo apt-get install csh
cd ~/DNCON2/
wget ftp://ftp.ncbi.nih.gov/toolbox/ncbi_tools/ncbi.tar.gz
tar zxvf ncbi.tar.gz
csh
./ncbi/make/makedis.csh
exit

(H) Configure DNCON2 scripts (in '~/DNCON2/DNCON2/scripts/' directory)

(a) Update the following variables in the script 'run-ccmpred-freecontact-psicov.pl'

FREECONTACT=> '/usr/bin/freecontact',
PSICOV    => '/home/badri/DNCON2/psicov/psicov',
CCMPRED   => '/home/badri/DNCON2/CCMpred/bin/ccmpred',
HOURLIMIT => 24,
NPROC     => 8

(b) Update the following variables in the script 'generate-alignments.pl'

JACKHMMER   => '/home/badri/DNCON2/hmmer-3.1b2-linux-intel-x86_64/binaries/jackhmmer',
REFORMAT    => abs_path(dirname($0)).'/reformat.pl',
JACKHMMERDB => '/home/badri/DNCON2/databases/uniref/uniref90pfilt',
HHBLITS     => '/usr/bin/hhblits',
HHBLITSDB   => '/home/badri/DNCON2/databases/uniprot20_2016_02/uniprot20_2016_02',
CPU         => 2

(c) Update the following variables in the script 'dncon2-main.pl'

SCRATCH      => '/home/badri/SCRATCH-1D_1.1/bin/run_SCRATCH-1D_predictors.sh',
BLASTPATH    => '/home/badri/ncbi-blast-2.2.25+/bin', 
BLASTNRDB    => '/home/badri/databases/nr90-2012',
PSIPRED      => '/home/badri/metapsicov/runpsipredandsolv',
ALNSTAT      => '/home/badri/metapsicov/bin/alnstats',
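Before launching a full prediction, it can save time to check that every configured third-party path exists and is executable. The list below mirrors the default paths in this guide, with $HOME standing in for the literal /home/badri; edit it to match your own configuration:

```shell
# Check each configured binary path; prints MISSING for anything not found.
# The list mirrors the defaults in this guide -- edit to match your setup.
missing=0
for p in /usr/bin/freecontact \
         "$HOME/DNCON2/psicov/psicov" \
         "$HOME/DNCON2/CCMpred/bin/ccmpred" \
         "$HOME/DNCON2/hmmer-3.1b2-linux-intel-x86_64/binaries/jackhmmer" \
         /usr/bin/hhblits; do
    if [ -x "$p" ]; then
        echo "OK       $p"
    else
        echo "MISSING  $p"
        missing=$((missing+1))
    fi
done
echo "$missing path(s) missing"
```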

(d) Install NCBI Blast+ v2.2.25

cd ~/DNCON2/
wget ftp://ftp.ncbi.nlm.nih.gov/blast/executables/blast+/2.2.25/ncbi-blast-2.2.25+-x64-linux.tar.gz
tar zxvf ncbi-blast-2.2.25+-x64-linux.tar.gz 

(I) Verify DNCON2 scripts

(a) [OPTIONAL] Verify the script 'run-ccmpred-freecontact-psicov.pl'

cd ~/DNCON2/
./DNCON2/scripts/run-ccmpred-freecontact-psicov.pl ./DNCON2/dry-run/output/3e7u/alignments/3e7u.aln ./test-dncon2/temp-out-psicov ./test-dncon2/temp-out-ccmpred ./test-dncon2/temp-out-freecontact

Compare these outputs with the outputs at './DNCON2/dry-run/output/3e7u/'.

(b) [OPTIONAL] Verify the script 'generate-alignments.pl'

cd ~/DNCON2/
./DNCON2/scripts/generate-alignments.pl ./DNCON2/dry-run/input/T0900.fasta ./test-dncon2/temp-T0900-alignments/

Compare these outputs with the outputs at './DNCON2/dry-run/output/T0900/'.

(c) Verify DNCON2 installation by making contact predictions for three sequences - 3e7u, T0866, and T0900

cd ~/DNCON2/DNCON2/dry-run/
./run-3e7u.sh
./run-T0866.sh
./run-T0900.sh

Update the output paths in the script 'evaluate-runs.sh' and execute it to evaluate the precision of the predicted contacts:

./evaluate-runs.sh

Compare the evaluations with the outputs and logs at './DNCON2/dry-run/output' and './DNCON2/dry-run/results.txt'.


Badri Adhikari [email protected] (developer)
Jianlin Cheng [email protected] (PI)

dncon2's People

Contributors

badriadhikari, jianlin-cheng, wtq18

dncon2's Issues

unused dependency: ncbi

The documentation mentions:

cd ~/DNCON2/
wget ftp://ftp.ncbi.nih.gov/toolbox/ncbi_tools/ncbi.tar.gz
tar zxvf ncbi.tar.gz
csh ./ncbi/make/makedis.csh

However, nowhere among the scripts is the NCBI toolkit used. Is this an obsolete dependency, or have I missed something?

feature and labels

Hello, I have two questions and hope to get answers from you.

1. First, is the role of the sequence alignment to extract chunks of subsequences that represent the first sequence?

2. Are those alignments then fed into a covariance matrix that measures the correlations between each of the alignments?

3. From what I understand, the protein contact map describes the distance matrix as a label. For example, if the distance between the first amino acid in the first chain and the first amino acid in the second chain is 200 A, and we set a threshold of 8 A, then the contact map entry for this pair will be "not in contact" ("False", or 0 in binary). Am I right in that understanding?

My Questions
First
1. What is the role of the covariance matrix?
2. What is the role of the protein contact map? Are those the labels of the distance matrix? If so, what is the role of the covariance matrix?
3. What is the input to the neural network model?
   A. What are the features? Are they the distance matrix? If yes, what is the role of the covariance matrix?
   B. What are the labels of these features? Is the protein contact map the label, in 0's and 1's?

Second
1. Could you kindly give me a hint about the steps - which script to use first, which second, and so on? I want to cite your paper, as I was inspired by your great work.

Thanks in advance.

keras and tensorflow incompatibility

When trying to test TensorFlow and Keras for DNCON2 on my 64-bit CentOS 6.7 with Anaconda 3.6, I got the following error, which seems to result from using a different keras/tensorflow version than the one used by DNCON2. Could you please state in the documentation which specific keras and tensorflow versions you used?

$ ./scripts/predict-rr-from-features.sh ./dry-run/output/3e7u-2017-Oct-23/feat-3e7u.txt ./dry-run/output/3e7u-2017-Oct-23/3e7u.rr ./dry-run/output/3e7u-2017-Oct-23/3e7u.feat.stage2.txt

Stage2 file already present.. using it..

All features in final X:
# Sequence Length (log)
# alignment-count (log)
# effective-alignment-count (log)
# Relative 'b' count
# Relative 'H' count
# Relative 'E' count
# AA composition
# Atchley factors
# Secondary Structure
# Solvent accessibility
# PSSM inf feature
# PSSM
# PSSM Sums (divided by 100)
# PSSM sum cosines
# Relative sequence separation
# Sequence separation between 23 and 28
# Sequence separation between 28 and 38
# Sequence separation between 38 and 48
# Sequence separation 48+
# Psipred
# Psisolv
# pref score
# scld lu con pot
# levitt con pot
# braun con pot
# joint entro
# pearson r
# Shannon entropy sum
# ccmpred
# freecontact
# psicov
# pstat_pots
# pstat_mimt
# pstat_mip

Predict stage2..
Using TensorFlow backend.

SCRIPT : ./scripts/cnn-predict-stage2.py
dir_config : ./scripts/../model-config-n-weights
fileX : ./dry-run/output/3e7u-2017-Oct-23/feat-3e7u.txt
fileRR : ./dry-run/output/3e7u-2017-Oct-23/3e7u.rr.tmp

Reading weight file ./scripts/../model-config-n-weights/stage2-1.hdf5 ..
Reading weight file ./scripts/../model-config-n-weights/stage2-2.hdf5 ..
Reading weight file ./scripts/../model-config-n-weights/stage2-3.hdf5 ..
Reading weight file ./scripts/../model-config-n-weights/stage2-4.hdf5 ..
Reading weight file ./scripts/../model-config-n-weights/stage2-5.hdf5 ..
Reading weight file ./scripts/../model-config-n-weights/stage2-6.hdf5 ..
Reading weight file ./scripts/../model-config-n-weights/stage2-7.hdf5 ..
Reading weight file ./scripts/../model-config-n-weights/stage2-8.hdf5 ..
Reading weight file ./scripts/../model-config-n-weights/stage2-9.hdf5 ..
Reading weight file ./scripts/../model-config-n-weights/stage2-10.hdf5 ..
Reading weight file ./scripts/../model-config-n-weights/stage2-11.hdf5 ..
Reading weight file ./scripts/../model-config-n-weights/stage2-12.hdf5 ..
Reading weight file ./scripts/../model-config-n-weights/stage2-13.hdf5 ..
Reading weight file ./scripts/../model-config-n-weights/stage2-14.hdf5 ..
Reading weight file ./scripts/../model-config-n-weights/stage2-15.hdf5 ..
Reading weight file ./scripts/../model-config-n-weights/stage2-16.hdf5 ..
Reading weight file ./scripts/../model-config-n-weights/stage2-17.hdf5 ..
Reading weight file ./scripts/../model-config-n-weights/stage2-18.hdf5 ..
Reading weight file ./scripts/../model-config-n-weights/stage2-19.hdf5 ..
Reading weight file ./scripts/../model-config-n-weights/stage2-20.hdf5 ..

FeatID Avg Med Max Sum Avg[30] Med[30] Max[30] Sum[30]
Feat 0 2.3609 3.6889 3.6889 5902.2 2.9511 3.6889 3.6889 147.5560
Feat 1 3.4548 5.3982 5.3982 8637.1 4.3185 5.3982 5.3982 215.9265
Feat 2 2.2380 3.4968 3.4968 5594.9 2.7975 3.4968 3.4968 139.8729
Feat 3 0.1792 0.2800 0.2800 448.0 0.2240 0.2800 0.2800 11.2000
Feat 4 0.1600 0.2500 0.2500 400.0 0.2000 0.2500 0.2500 10.0000
Feat 5 0.1600 0.2500 0.2500 400.0 0.2000 0.2500 0.2500 10.0000
Feat 6 0.2723 0.1100 1.0000 680.8 0.1920 0.2400 0.2400 9.6000
Feat 7 0.2723 0.1100 1.0000 680.8 0.3404 0.3000 1.0000 17.0200
Feat 8 0.3347 0.2700 1.0000 836.8 0.0480 0.0600 0.0600 2.4000
Feat 9 0.3347 0.2700 1.0000 836.8 0.4184 0.4000 1.0000 20.9200
Feat10 0.3992 0.5000 1.0000 998.0 0.4080 0.5100 0.5100 20.4000
Feat11 0.3992 0.5000 1.0000 998.0 0.4990 0.5250 1.0000 24.9500
Feat12 0.3442 0.3000 1.0000 860.4 0.8000 1.0000 1.0000 40.0000
Feat13 0.3442 0.3000 1.0000 860.4 0.4302 0.4700 1.0000 21.5100
Feat14 0.3754 0.4900 0.8600 938.4 0.4000 0.5000 0.5000 20.0000
Feat15 0.3754 0.4900 0.8600 938.4 0.4692 0.5000 0.8600 23.4600
Feat16 0.1600 0.0000 1.0000 400.0 0.0000 0.0000 0.0000 0.0000
Feat17 0.1600 0.0000 1.0000 400.0 0.2000 0.0000 1.0000 10.0000
Feat18 0.1600 0.0000 1.0000 400.0 0.8000 1.0000 1.0000 40.0000
Feat19 0.1600 0.0000 1.0000 400.0 0.2000 0.0000 1.0000 10.0000
Feat20 0.3200 0.0000 1.0000 800.0 0.0000 0.0000 0.0000 0.0000
Feat21 0.3200 0.0000 1.0000 800.0 0.4000 0.0000 1.0000 20.0000
Feat22 0.4640 0.0000 1.0000 1160.0 0.0000 0.0000 0.0000 0.0000
Feat23 0.4640 0.0000 1.0000 1160.0 0.5800 1.0000 1.0000 29.0000
Feat24 0.0000 0.0000 0.0000 0.0 0.0000 0.0000 0.0000 0.0000
Feat25 0.0000 0.0000 0.0000 0.0 0.0000 0.0000 0.0000 0.0000
Feat26 0.0000 0.0000 0.0000 0.0 0.0000 0.0000 0.0000 0.0000
Feat27 0.0000 0.0000 0.0000 0.0 0.0000 0.0000 0.0000 0.0000
Feat28 0.0000 0.0000 0.0000 0.0 0.0000 0.0000 0.0000 0.0000
Feat29 0.2132 0.1200 0.9700 533.0 0.2548 0.1850 0.7500 12.7400
Feat30 0.0600 0.0000 1.0000 150.0 0.1000 0.0000 1.0000 5.0000
Feat31 0.0600 0.0000 1.0000 150.0 0.0600 0.0000 1.0000 3.0000
Feat32 0.0024 0.0000 1.0000 6.0 0.0000 0.0000 0.0000 0.0000
Feat33 0.0000 0.0000 0.0000 0.0 0.0000 0.0000 0.0000 0.0000
Feat34 0.3500 0.2560 0.9970 875.1 0.3448 0.4310 0.4310 17.2400
Feat35 0.3500 0.2560 0.9970 875.1 0.4375 0.4230 0.9970 21.8770
Feat36 0.1142 0.0050 0.6870 285.6 0.0208 0.0260 0.0260 1.0400
Feat37 0.1142 0.0050 0.6870 285.6 0.1428 0.0085 0.6870 7.1390
Feat38 0.1469 0.0160 0.8660 367.2 0.3816 0.4770 0.4770 19.0800
Feat39 0.1469 0.0160 0.8660 367.2 0.1836 0.0425 0.8660 9.1790
Feat40 0.1917 0.1400 0.9460 479.2 0.1120 0.1400 0.1400 5.6000
Feat41 0.1917 0.1400 0.9460 479.2 0.2396 0.2040 0.9460 11.9810
Feat42 0.1869 0.0880 0.9840 467.2 0.4478 0.4080 0.9840 22.3880
Feat43 0.4130 0.5400 1.0000 1032.4 0.5595 0.6110 0.9050 27.9750
Feat44 0.4008 0.5000 1.0000 1001.9 0.6231 0.7890 0.9740 31.1560
Feat45 0.3594 0.4600 1.0000 898.6 0.4544 0.5060 0.7140 22.7190
Feat46 0.0000 0.0000 0.0000 0.0 0.0000 0.0000 0.0000 0.0000
Feat47 0.3200 0.5000 0.5000 800.0 0.4000 0.5000 0.5000 20.0000
Feat48 0.5143 0.7160 0.9490 1285.8 0.7408 0.9260 0.9260 37.0400
Feat49 0.5143 0.7160 0.9490 1285.8 0.6429 0.7645 0.9490 32.1440
Feat50 0.1682 0.1970 1.0000 420.5 0.2170 0.2110 1.0000 10.8488
Feat51 0.2812 0.0000 8.1615 703.1 0.2985 0.0000 2.9122 14.9262
Feat52 0.2676 0.0000 7.5221 669.1 0.5417 0.0000 7.5221 27.0857
Feat53 0.3157 0.4834 0.5417 789.2 0.4064 0.5062 0.5329 20.3223
Feat54 0.3178 0.4910 0.5724 794.5 0.3966 0.4915 0.5276 19.8292
Feat55 0.3200 0.4931 0.5734 799.9 0.4000 0.4971 0.5331 19.9990

Starting ensemble prediction..

Running prediction using ./scripts/../model-config-n-weights/stage2-1.hdf5 and ./scripts/../model-config-n-weights/model-arch.config

Read model architecture:
layer0 : 16 5 1 relu
layer1 : 16 5 1 relu
layer2 : 16 5 1 relu
layer3 : 16 5 1 relu
layer4 : 16 5 1 relu
layer5 : 16 5 1 relu
layer6 : 1 5 0 sigmoid

/home/zcx/Projects/EVfold/ext/DNCON2/scripts/libcnnpredict.py:116: UserWarning: Update your Conv2D call to the Keras 2 API: Conv2D(16, (5, 5), input_shape=(50, 50, 5..., padding="same")
model.add(Convolution2D(num_kernels, filter_size, filter_size, border_mode='same', input_shape=X[0, :, :, :].shape))
/home/zcx/Projects/EVfold/ext/DNCON2/scripts/libcnnpredict.py:118: UserWarning: Update your Conv2D call to the Keras 2 API: Conv2D(16, (5, 5), padding="same")
model.add(Convolution2D(num_kernels, filter_size, filter_size, border_mode='same'))
/home/zcx/Projects/EVfold/ext/DNCON2/scripts/libcnnpredict.py:118: UserWarning: Update your Conv2D call to the Keras 2 API: Conv2D(1, (5, 5), padding="same")
model.add(Convolution2D(num_kernels, filter_size, filter_size, border_mode='same'))
Traceback (most recent call last):
File "/home/zcx/Program/anaconda/3.6/lib/python3.6/site-packages/tensorflow/python/framework/common_shapes.py", line 654, in _call_cpp_shape_fn_impl
input_tensors_as_shapes, status)
File "/home/zcx/Program/anaconda/3.6/lib/python3.6/contextlib.py", line 88, in exit
next(self.gen)
File "/home/zcx/Program/anaconda/3.6/lib/python3.6/site-packages/tensorflow/python/framework/errors_impl.py", line 466, in raise_exception_on_not_ok_status
pywrap_tensorflow.TF_GetCode(status))
tensorflow.python.framework.errors_impl.InvalidArgumentError: Dimension 0 in both shapes must be equal, but are 5 and 16 for 'Assign' (op: 'Assign') with input shapes: [5,5,56,16], [16,61,5,5].

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "./scripts/cnn-predict-stage2.py", line 48, in
P = make_ensemble_prediction(weight_arch_dict, X)
File "/home/zcx/Projects/EVfold/ext/DNCON2/scripts/libcnnpredict.py", line 172, in make_ensemble_prediction
P0 = make_prediction(read_model_arch(weight_arch_dict[weight]), weight, X)
File "/home/zcx/Projects/EVfold/ext/DNCON2/scripts/libcnnpredict.py", line 127, in make_prediction
model.load_weights(file_weights)
File "/home/zcx/Program/anaconda/3.6/lib/python3.6/site-packages/keras/models.py", line 719, in load_weights
topology.load_weights_from_hdf5_group(f, layers)
File "/home/zcx/Program/anaconda/3.6/lib/python3.6/site-packages/keras/engine/topology.py", line 3095, in load_weights_from_hdf5_group
K.batch_set_value(weight_value_tuples)
File "/home/zcx/Program/anaconda/3.6/lib/python3.6/site-packages/keras/backend/tensorflow_backend.py", line 2188, in batch_set_value
assign_op = x.assign(assign_placeholder)
File "/home/zcx/Program/anaconda/3.6/lib/python3.6/site-packages/tensorflow/python/ops/variables.py", line 527, in assign
return state_ops.assign(self._variable, value, use_locking=use_locking)
File "/home/zcx/Program/anaconda/3.6/lib/python3.6/site-packages/tensorflow/python/ops/state_ops.py", line 274, in assign
validate_shape=validate_shape)
File "/home/zcx/Program/anaconda/3.6/lib/python3.6/site-packages/tensorflow/python/ops/gen_state_ops.py", line 43, in assign
use_locking=use_locking, name=name)
File "/home/zcx/Program/anaconda/3.6/lib/python3.6/site-packages/tensorflow/python/framework/op_def_library.py", line 767, in apply_op
op_def=op_def)
File "/home/zcx/Program/anaconda/3.6/lib/python3.6/site-packages/tensorflow/python/framework/ops.py", line 2632, in create_op
set_shapes_for_outputs(ret)
File "/home/zcx/Program/anaconda/3.6/lib/python3.6/site-packages/tensorflow/python/framework/ops.py", line 1911, in set_shapes_for_outputs
shapes = shape_func(op)
File "/home/zcx/Program/anaconda/3.6/lib/python3.6/site-packages/tensorflow/python/framework/ops.py", line 1861, in call_with_requiring
return call_cpp_shape_fn(op, require_shape_fn=True)
File "/home/zcx/Program/anaconda/3.6/lib/python3.6/site-packages/tensorflow/python/framework/common_shapes.py", line 595, in call_cpp_shape_fn
require_shape_fn)
File "/home/zcx/Program/anaconda/3.6/lib/python3.6/site-packages/tensorflow/python/framework/common_shapes.py", line 659, in _call_cpp_shape_fn_impl
raise ValueError(err.message)
ValueError: Dimension 0 in both shapes must be equal, but are 5 and 16 for 'Assign' (op: 'Assign') with input shapes: [5,5,56,16], [16,61,5,5].
