
chapmanb / bcbb


Incubator for useful bioinformatics code, primarily in Python and R

Home Page: http://bcbio.wordpress.com

Python 57.65% HTML 16.87% Clojure 0.67% Makefile 0.43% Shell 0.16% TeX 9.65% R 0.99% CSS 6.10% JavaScript 5.98% Jupyter Notebook 0.37% SCSS 1.14%

bcbb's People

Contributors

abretaud, brainstorm, chapmanb, cmclean, davycats, dfornika, fubar2, hsiaoyi0504, hugh-zhu, jdidion, jwm, khughitt, kwoklab-user, mspinelli, nlharris, peterjc, rhpvorderman, roryk, sergiuser1, sjhosui, skinner, timgates42, tipabu, vals


bcbb's Issues

barcode_sort_trim.py

I'm not sure if this is a problem in the latest version of barcode_sort_trim.py.

After updating to the pipeline with FastQC, I get an extra 'A' base at the 3' end of read 1.

GFF.write fails when using a single SeqRecord.

In [6]: seqTP53

Out[6]: SeqRecord(seq=Seq('TGGTTCAAGTAATTCTCCTGCCTCAGACTCCAGAGTAGCTGGGATTACAGGCGC...CCC', IUPACAmbiguousDNA()), id='NG_017013.1', name='NG_017013', description='Homo sapiens tumor protein p53 (TP53), RefSeqGene on chromosome 17.', dbxrefs=[])

with open('tp53.gff', 'w') as file:
    GFF.write(seqTP53, file)

ERROR: An unexpected error occurred while tokenizing input
The following traceback may be corrupted or invalid

The error message is: ('EOF in multi-line statement', (8, 0))

AttributeError Traceback (most recent call last)
/home/merc/gitcode/mirna-django/src/scripts/ in ()
1 with open('tp53.gff', 'w') as file:
----> 2 GFF.write(seqTP53, file)
3

/usr/local/lib/python2.7/dist-packages/bcbio-0.1-py2.7.egg/BCBio/GFF/GFFOutput.pyc in write(recs, out_handle, include_fasta)
183 """
184 writer = GFF3Writer()
--> 185 return writer.write(recs, out_handle, include_fasta)

/usr/local/lib/python2.7/dist-packages/bcbio-0.1-py2.7.egg/BCBio/GFF/GFFOutput.pyc in write(self, recs, out_handle, include_fasta)
74 fasta_recs = []
75 for rec in recs:
---> 76 self._write_rec(rec, out_handle)
77 self._write_annotations(rec.annotations, rec.id, out_handle)
78 for sf in rec.features:

/usr/local/lib/python2.7/dist-packages/bcbio-0.1-py2.7.egg/BCBio/GFF/GFFOutput.pyc in _write_rec(self, rec, out_handle)
99 def _write_rec(self, rec, out_handle):
100 # if we have a SeqRecord, write out optional directive

--> 101 if len(rec.seq) > 0:
102 out_handle.write("##sequence-region %s 1 %s\n" % (rec.id, len(rec.seq)))
103

AttributeError: 'str' object has no attribute 'seq'
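Judging from the traceback, GFF3Writer.write loops over its first argument with "for rec in recs", and a bare SeqRecord iterates over its sequence letters, so each "rec" becomes a plain string. A minimal stand-in (FakeSeqRecord is a sketch, not the Biopython class) showing why wrapping the record in a list, i.e. GFF.write([seqTP53], file), should avoid the error:

```python
# FakeSeqRecord mimics the one relevant SeqRecord behavior: iterating it
# yields individual sequence letters. When such an object is passed bare to
# a writer that does "for rec in recs", each "rec" is a str, and rec.seq
# raises AttributeError: 'str' object has no attribute 'seq'.
class FakeSeqRecord:
    def __init__(self, seq, rec_id):
        self.seq = seq
        self.id = rec_id

    def __iter__(self):  # like SeqRecord, iteration yields letters
        return iter(self.seq)

rec = FakeSeqRecord("TGGT", "NG_017013.1")
seen_when_bare = list(rec)      # what the writer sees: one str per base
seen_when_listed = list([rec])  # one whole record object, as intended
```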

Problem installing install_biolinux

Hi,
I have the following issue. In my cluster configuration I jump into a public server and from there to the actual head node of the cluster. The setup does not require a password, since authentication is done with ssh keys. When I try to install bcbio_nextgen it asks me for my password. How can I get around this prompt if I do not have a password, or what should I enter?

Thanks in advance, inti pedroso

$ python bcbio_nextgen_install.py install_directory data_directory
Installing tools...
[localhost] Executing task 'install_biolinux'
INFO: Config start time: 2013-02-05 14:51:19.267698
INFO: This is a Base Flavor - no overrides
DBG [__init__.py]: Minimal Edition 1.5.3
INFO: This is a minimal
INFO: Distribution ubuntu
INFO: Get local environment
INFO: Ubuntu setup
DBG [distribution.py]: Debian-shared setup
DBG [distribution.py]: Source=quantal
DBG [distribution.py]: Checking target distribution ubuntu
[localhost] run: cat /proc/version
[localhost] Login password for 'pedrosoi':

CDS Phase not calculated by GFFOutput

The GFF3 standard (v1.21) requires that the phase be set for lines of type CDS; GFFOutput, however, does not calculate phase, and all phases are reported as 0.

I wrote routines to take a list of SeqFeatures and calculate/set their phases (e.g. sort the list 5-prime to 3-prime, default the first CDS to phase=0 if no phase is set, and then set the phases of all CDSes in the list). CDS SeqFeatures processed in this way show the correct phase in GFF3 files written by GFFOutput.

Would this code be of interest for incorporation into GFFOutput.py?
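For reference, the phase rule can be sketched as follows (assign_phases is a hypothetical helper, not code from GFFOutput, and it assumes plus-strand segments already sorted 5' to 3'). Per the GFF3 spec, each segment's phase is the number of bases to skip to reach the next codon boundary, given the coding bases consumed by the preceding segments:

```python
def assign_phases(cds_coords):
    """Assign GFF3 phases to plus-strand CDS segments sorted 5' to 3'.

    cds_coords: list of (start, end) pairs in 1-based inclusive GFF
    coordinates. The first segment defaults to phase 0.
    """
    phases = []
    consumed = 0  # coding bases contributed by earlier segments
    for start, end in cds_coords:
        phases.append((3 - consumed % 3) % 3)
        consumed += end - start + 1  # inclusive coordinates
    return phases
```

For example, a first segment of length 121 (not a multiple of 3) forces the next segment to phase 2, since two bases must be skipped to reach a codon boundary.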

Test test_3_empty_fastq not passing with GATK 1.3

Hey Brad, I saw you had made a commit for GATK 1.3 compatibility, so I installed it and tried to run the tests. Unfortunately the test test_3_empty_fastq gets stuck on a call to GATK. Specifically

java -Xmx6g -Djava.io.tmpdir=/bubo/home/h10/vale/bcbb/nextgen/tests/tmp/tmppV8I7Y -jar /bubo/home/h10/vale/opt/GenomeAnalysisTK-1.3-14-g348f2db/GenomeAnalysisTK.jar -T CountCovariates -cov ReadGroupCovariate -cov QualityScoreCovariate -cov CycleCovariate -cov DinucCovariate -recalFile /bubo/home/h10/vale/bcbb/nextgen/tests/tx/1_110221_FC_withspace_8-sort-dup.recal -I /bubo/home/h10/vale/bcbb/nextgen/tests/1_110221_FC_withspace_8-sort-dup.bam -R /bubo/home/h10/vale/bcbb/nextgen/tests/data/automated/tool-data/../../genomes/hg19/seq/hg19.fa -l INFO -U -OQ --default_platform illumina --knownSites /bubo/home/h10/vale/bcbb/nextgen/tests/data/automated/tool-data/../../genomes/hg19/variation/dbsnp_132.vcf --phone_home NO_ET -nt 5

will give the error

##### ERROR ------------------------------------------------------------------------------------------
##### ERROR A USER ERROR has occurred (version 1.3-14-g348f2db): 
##### ERROR The invalid arguments or inputs must be corrected before the GATK can proceed
##### ERROR Please do not post this error to the GATK forum
##### ERROR
##### ERROR See the documentation (rerun with -h) for this tool to view allowable command-line arguments.
##### ERROR Visit our wiki for extensive documentation http://www.broadinstitute.org/gsa/wiki
##### ERROR Visit our forum to view answers to commonly asked questions http://getsatisfaction.com/gsa
##### ERROR
##### ERROR MESSAGE: Bad input: Could not find any usable data in the input BAM file(s).
##### ERROR ------------------------------------------------------------------------------------------

The test passes when using GATK 1.2; I think this is due to this point in the changelog:

  • We now throw an exception when there is no data instead of creating an empty csv file.

GFF.parse with target_lines returns contigs in interleaved order when contigs are sorted in a certain order

Steps to reproduce:
Create a file with contigs sorted in a certain order (III -> I):

III protein_coding  CDS 100063  100183  .   +   1    gene_id "CADAFUAG00000321"; transcript_id "CADAFUAT00000321"; exon_number "4"; gene_name "AFUA_3G00460"; transcript_name "AFUA_3G00460"; protein_id "CADAFUAP00000321";
III protein_coding  CDS 1004211 1004214 .   +   0    gene_id "CADAFUAG00000267"; transcript_id "CADAFUAT00000267"; exon_number "1"; gene_name "AFUA_3G03700"; transcript_name "AFUA_3G03700"; protein_id "CADAFUAP00000267";
III protein_coding  CDS 1004428 1004850 .   +   2    gene_id "CADAFUAG00000267"; transcript_id "CADAFUAT00000267"; exon_number "2"; gene_name "AFUA_3G03700"; transcript_name "AFUA_3G03700"; protein_id "CADAFUAP00000267";
I   tRNA    exon    883674  883709  .   +   .    gene_id "CADAFUAG00009730"; transcript_id "CADAFUAT00009730"; exon_number "2"; gene_name "AFUA_5G03266"; transcript_name "AFUA_5G03266"; seqedit "false";
I   tRNA_pseudogene exon    3717600 3717932 .   -   .    gene_id "CADAFUAG00005891"; transcript_id "CADAFUAT00005891"; exon_number "1"; gene_name "AFUA_5G14275"; transcript_name "AFUA_5G14275"; seqedit "false";
I   tRNA_pseudogene exon    3916324 3920790 .   +   .    gene_id "CADAFUAG00006577"; transcript_id "CADAFUAT00006577"; exon_number "1"; gene_name "AFUA_5G15102"; transcript_name "AFUA_5G15102"; seqedit "false";

Run this Python code:

from BCBio.GFF import parse

# target_lines must be greater than 1 and less than the number of features per block
for record in parse('test_file.gff', target_lines=2):
    print 'record id: %10s     number of features %5s' % (record.id, len(record.features))

Observed result:

record id:        III     number of features     2
record id:          I     number of features     1
record id:        III     number of features     1
record id:          I     number of features     2

Expected result:
Contig sort order should not affect output

record id:        III     number of features     1
record id:        III     number of features     1
record id:          I     number of features     2
record id:          I     number of features     1

Note:
Sorting the file in the order (I -> III) gives:

record id:          I     number of features     2
record id:          I     number of features     1
record id:        III     number of features     1
record id:        III     number of features     1
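Until the parser is fixed, one client-side workaround is to merge the interleaved chunks by record id before using them (Rec below is a minimal stand-in for the parsed SeqRecord, not the Biopython class):

```python
from collections import OrderedDict

class Rec:
    """Minimal stand-in for a parsed SeqRecord: just an id and features."""
    def __init__(self, rec_id, features):
        self.id = rec_id
        self.features = list(features)

def merge_by_id(records):
    """Collapse interleaved partial records into one record per contig id,
    preserving the order in which each id first appears."""
    merged = OrderedDict()
    for rec in records:
        if rec.id in merged:
            merged[rec.id].features.extend(rec.features)
        else:
            merged[rec.id] = rec
    return list(merged.values())

# Interleaved chunks like the observed output above:
chunks = [Rec("III", ["f1", "f2"]), Rec("I", ["f3"]),
          Rec("III", ["f4"]), Rec("I", ["f5", "f6"])]
merged = merge_by_id(chunks)
```

This trades the streaming behavior of target_lines for a correct grouping, so it is only appropriate when the merged feature lists fit in memory.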

Parsing of GFF3 attributes fails when a tag starts with a space

The GFF3 spec appears to allow attribute tags to start with a space, but BCBio does not handle this well. Below is a test case where input1.gff3 contains no "foo" tag, while input2.gff3 and input3.gff3 contain the tag "foo" (in input3, preceded by a space). All three inputs should produce the same output, but they don't, as shown here:
$ cat examineGFF3.py

#!/usr/bin/env python

import pprint
from BCBio.GFF import GFFExaminer
import sys

examiner = GFFExaminer()
pprint.pprint(examiner.parent_child_map(sys.stdin))
$ cat input1.gff3

##gff-version 3

contig1 . gene 1544 2057 . - . ID=contig1.1
contig1 . mRNA 1544 2057 . - . ID=mRNA.contig1.1;Parent=contig1.1
$ cat input2.gff3

##gff-version 3

contig1 . gene 1544 2057 . - . ID=contig1.1
contig1 . mRNA 1544 2057 . - . foo=bar;ID=mRNA.contig1.1;Parent=contig1.1
$ cat input3.gff3

##gff-version 3

contig1 . gene 1544 2057 . - . ID=contig1.1
contig1 . mRNA 1544 2057 . - . ID=mRNA.contig1.1;Parent=contig1.1; foo=bar
$ ./examineGFF3.py < input1.gff3
{('', 'gene'): [('', 'mRNA')]}
$ ./examineGFF3.py < input2.gff3
{}
$ ./examineGFF3.py < input3.gff3
Traceback (most recent call last):
File "./examineGFF3.py", line 8, in <module>
pprint.pprint(examiner.parent_child_map(sys.stdin))
File "/Some/Path/env/lib/python2.7/site-packages/BCBio/GFF/GFFParser.py", line 744, in _file_or_handle_inside
out = fn(*args, **kwargs)
File "/Some/Path/env/lib/python2.7/site-packages/BCBio/GFF/GFFParser.py", line 829, in parent_child_map
self._get_local_params())[0]
File "/Some/Path/env/lib/python2.7/site-packages/BCBio/GFF/GFFParser.py", line 169, in _gff_line_map
quals, is_gff2 = _split_keyvals(gff_parts[8])
File "/Some/Path/env/lib/python2.7/site-packages/BCBio/GFF/GFFParser.py", line 93, in _split_keyvals
assert len(item) == 1, item
AssertionError: ['ID', 'mRNA.contig1.1;Parent', 'contig1.1']
$
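A whitespace-tolerant splitter for column 9 would treat " foo=bar" the same as "foo=bar". A sketch of the idea (split_attributes is hypothetical, not the BCBio function, and it ignores %-escaping and multi-value tags):

```python
def split_attributes(column9):
    """Split a GFF3 attribute column into a tag -> value dict,
    stripping whitespace around each tag so ' foo=bar' parses
    the same as 'foo=bar'."""
    quals = {}
    for item in column9.split(";"):
        item = item.strip()
        if not item:  # tolerate trailing semicolons
            continue
        key, _, value = item.partition("=")
        quals[key.strip()] = value
    return quals

attrs = split_attributes("ID=mRNA.contig1.1;Parent=contig1.1; foo=bar")
```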

picard_sam_to_bam.py

Hi Brad,

it seems that it keeps finding CreateSequenceDictionary in /usr/share/java/picard even though I have specified another path in my config file. I have tried running the setup again after modifying the config files, but it still didn't use the path I specified.

Also, I don't seem to have specified the path of hg19.fa for GATK?

Thanks,
Paul

GFFparser only keeps first line

There appears to be a bug in the current GFFParser implementation which causes it to ignore all lines but the first one.

It looks like the issue occurs somewhere in the call to self._lines_to_out_info(line_gen, limit_info, target_lines) in GFFParser.py on line 609.

Before this line is executed, the line generator has all of the lines. The loop is only iterated once, however.

I tested this using the sample GFF3 file from the Broad Institute:

edit_test.fa    .   gene    500 2610    .   +   .   ID=newGene
edit_test.fa    .   mRNA    500 2385    .   +   .   Parent=newGene;Namo=reinhard+did+this;Name=t1%28newGene%29;ID=t1;uri=http%3A//www.yahoo.com
edit_test.fa    .   five_prime_UTR  500 802 .   +   .   Parent=t1
edit_test.fa    .   CDS 803 1012    .   +   .   Parent=t1
etc...

Problem occurs in both Python 2.7.5 and 3.3.2.

issue compiling cortex

Hi,
I am getting an error while compiling cortex. Let me know if you can help; see the end of the log below.
thanks in advance, inti

$ python2.7 bcbio_nextgen_install.py --distribution centos --nosudo ~/APP/bcbio_nextgen/install_directory ~/APP/bcbio_nextgen/data_directory
Initialized empty Git repository in /home/pedrosoi/APP/bcbio_nextgen/tmpbcbio-install/cloudbiolinux/.git/
remote: Counting objects: 5704, done.
remote: Compressing objects: 100% (2337/2337), done.
remote: Total 5704 (delta 3383), reused 5513 (delta 3202)
Receiving objects: 100% (5704/5704), 5.56 MiB | 466 KiB/s, done.
Resolving deltas: 100% (3383/3383), done.
Installing tools...
[localhost] Executing task 'install_biolinux'
INFO: Config start time: 2013-02-06 08:28:50.307292
INFO: This is a Base Flavor - no overrides
DBG [__init__.py]: Minimal Edition 1.5.3
INFO: This is a minimal
INFO: Distribution centos
INFO: Get local environment
INFO: CentOS setup
DBG [distribution.py]: Checking target distribution centos
DBG [distribution.py]: Unknown target distro
DBG [distribution.py]: NixPkgs: Ignored
[localhost] run: echo $HOME
[localhost] out: /home/pedrosoi
[localhost] out:

[localhost] run: uname -m
[localhost] out: x86_64
[localhost] out:

INFO: Now, testing connection to host...
INFO: Connection to host appears to work!
DBG [utils.py]: Expand paths
DBG [fabfile.py]: Target is 'None'
DBG [config.py]: Using config file /home/pedrosoi/APP/bcbio_nextgen/tmpbcbio-install/cloudbiolinux/cloudbio/../contrib/flavor/ngs_pipeline/main.yaml
INFO: Meta-package information from /home/pedrosoi/APP/bcbio_nextgen/tmpbcbio-install/cloudbiolinux/cloudbio/../contrib/flavor/ngs_pipeline/main.yaml

  • Packages: minimal,libraries,python,java,r,bio_nextgen,distributed
  • Libraries:
    [localhost] run: echo $PATH
    [localhost] out: /home/pedrosoi/python/bin:/opt/xcat/bin:/opt/xcat/sbin:/usr/lib64/qt-3.3/bin:/apps/sge/6.2u5/bin/lx24-amd64:/usr/local/bin:/bin:/usr/bin
    [localhost] out:

[localhost] run: echo '# CloudBioLinux PATH updates' >> ~/.bashrc
[localhost] run: echo 'export PATH=$PATH:/home/pedrosoi/APP/bcbio_nextgen/install_directory/bin' >> ~/.bashrc
[localhost] run: echo $TMPDIR
[localhost] out:
[localhost] out:
[localhost] run: echo $HOME
[localhost] out: /home/pedrosoi
[localhost] out:

[localhost] run: wget --no-check-certificate https://raw.github.com/pypa/virtualenv/master/virtualenv.py
[localhost] out: --2013-02-06 08:28:52-- https://raw.github.com/pypa/virtualenv/master/virtualenv.py
[localhost] out: Resolving raw.github.com... 199.27.77.130
[localhost] out: Connecting to raw.github.com|199.27.77.130|:443... connected.
[localhost] out: WARNING: certificate common name “*.a.ssl.fastly.net” doesn’t match requested host name “raw.github.com”.
[localhost] out: HTTP request sent, awaiting response... 200 OK
[localhost] out: Length: 114330 (112K) [text/plain]
[localhost] out: Saving to: “virtualenv.py”
[localhost] out:
[localhost] out:
[localhost] out: 0% [ ] 0 --.-K/s
[localhost] out: 100%[==========================================================================>] 114,330 --.-K/s in 0.01s
[localhost] out:
[localhost] out: 2013-02-06 08:28:53 (9.05 MB/s) - “virtualenv.py” saved [114330/114330]
[localhost] out:
[localhost] out:

[localhost] run: python virtualenv.py /home/pedrosoi/APP/bcbio_nextgen/install_directory
[localhost] out: New python executable in /home/pedrosoi/APP/bcbio_nextgen/install_directory/bin/python
[localhost] out: Installing setuptools............................done.
[localhost] out: Installing pip.....................done.
[localhost] out:

[localhost] run: rm -rf /home/pedrosoi/tmp/cloudbiolinux
INFO: Target=unknown; Edition=Minimal Edition; Flavor=ngs_pipeline
DBG [config.py]: Using config file /home/pedrosoi/APP/bcbio_nextgen/tmpbcbio-install/cloudbiolinux/cloudbio/../config/custom.yaml
INFO: Reading /home/pedrosoi/APP/bcbio_nextgen/tmpbcbio-install/cloudbiolinux/cloudbio/../config/custom.yaml
DBG [shared.py]: Packages to install: pydoop,seal,leiningen,bx-python,rpy,abyss,cortex_var,ray,transabyss,trinity,velvet,macs,bcbio_variation,crisp,gemini,stacks,tassel,varscan,hydra,cufflinks,freebayes,gatk,gatk_queue,picard,sambamba,samtools,shrec,snpeff,tophat,vep,bfast,novoalign,novosort,plink_seq,ucsc_tools,bedtools,dwgsim,fastqc,fastx_toolkit,varianttools,vcftools,bowtie,bowtie2,bwa,gmap,lastz,mosaik,snap,stampy
INFO: Custom install for 'leiningen' start time: 2013-02-06 08:29:10.272973
DBG [fabfile.py]: Import leiningen
[localhost] run: echo $TMPDIR
[localhost] out:
[localhost] out:
[localhost] run: echo $HOME
[localhost] out: /home/pedrosoi
[localhost] out:

[localhost] run: mkdir -p /home/pedrosoi/tmp/cloudbiolinux
[localhost] run: wget --no-check-certificate https://raw.github.com/technomancy/leiningen/stable/bin/lein
[localhost] out: --2013-02-06 08:29:10-- https://raw.github.com/technomancy/leiningen/stable/bin/lein
[localhost] out: Resolving raw.github.com... 199.27.77.130
[localhost] out: Connecting to raw.github.com|199.27.77.130|:443... connected.
[localhost] out: WARNING: certificate common name “*.a.ssl.fastly.net” doesn’t match requested host name “raw.github.com”.
[localhost] out: HTTP request sent, awaiting response... 200 OK
[localhost] out: Length: 10250 (10K) [text/plain]
[localhost] out: Saving to: “lein”
[localhost] out:
[localhost] out:
[localhost] out: 0% [ ] 0 --.-K/s
[localhost] out: 100%[==========================================================================>] 10,250 --.-K/s in 0s
[localhost] out:
[localhost] out: 2013-02-06 08:29:11 (73.1 MB/s) - “lein” saved [10250/10250]
[localhost] out:
[localhost] out:

[localhost] run: chmod a+rwx lein
[localhost] run: mv lein /home/pedrosoi/APP/bcbio_nextgen/install_directory/bin
[localhost] run: /home/pedrosoi/APP/bcbio_nextgen/install_directory/bin/lein
[localhost] out: Downloading Leiningen to /home/pedrosoi/.lein/self-installs/leiningen-2.0.0-standalone.jar now...
[localhost] out: % Total % Received % Xferd Average Speed Time Time Time Current
[localhost] out: Dload Upload Total Spent Left Speed
[localhost] out: (progress lines omitted)
[localhost] out: 100 12.6M 100 12.6M 0 0 518k 0 0:00:24 0:00:24 --:--:-- 418k
[localhost] out: Leiningen is a tool for working with Clojure projects.
[localhost] out:
[localhost] out: Several tasks are available:
[localhost] out: check Check syntax and warn on reflection.
[localhost] out: classpath Write the classpath of the current project to output-file.
[localhost] out: clean Remove all files from project's target-path.
[localhost] out: compile Compile Clojure source into .class files.
[localhost] out: deploy Build jar and deploy to remote repository.
[localhost] out: deps Show details about dependencies.
[localhost] out: do Higher-order task to perform other tasks in succession.
[localhost] out: help Display a list of tasks or help for a given task or subtask.
[localhost] out: install Install current project to the local repository.
[localhost] out: jar Package up all the project's files into a jar file.
[localhost] out: javac Compile Java source files.
[localhost] out: new Generate scaffolding for a new project based on a template.
[localhost] out: plugin DEPRECATED. Please use the :user profile instead.
[localhost] out: pom Write a pom.xml file to disk for Maven interoperability.
[localhost] out: repl Start a repl session either with the current project or standalone.
[localhost] out: retest Run only the test namespaces which failed last time around.
[localhost] out: run Run the project's -main function.
[localhost] out: search Search remote maven repositories for matching jars.
[localhost] out: show-profiles List all available profiles or display one if given an argument.
[localhost] out: test Run the project's tests.
[localhost] out: trampoline Run a task without nesting the project's JVM inside Leiningen's.
[localhost] out: uberjar Package up the project files and all dependencies into a jar file.
[localhost] out: upgrade Upgrade Leiningen to specified version or latest stable.
[localhost] out: version Print version for Leiningen and the current JVM.
[localhost] out: with-profile Apply the given task with the profile(s) specified.
[localhost] out:
[localhost] out: Run lein help $TASK for details.
[localhost] out:
[localhost] out: See also: readme, faq, tutorial, news, sample, profiles, deploying, mixed-source, templates, and copying.
[localhost] out:

[localhost] run: rm -rf /home/pedrosoi/tmp/cloudbiolinux
INFO: Custom install for 'leiningen' end time: 2013-02-06 08:29:40.600416; duration: 0:00:30.327443
INFO: Custom install for 'cortex_var' start time: 2013-02-06 08:29:40.600716
DBG [fabfile.py]: Import cortex_var
[localhost] run: echo $TMPDIR
[localhost] out:
[localhost] out:
[localhost] run: echo $HOME
[localhost] out: /home/pedrosoi
[localhost] out:

[localhost] run: mkdir -p /home/pedrosoi/tmp/cloudbiolinux
[localhost] run: wget --no-check-certificate -O CORTEX_release_v1.0.5.14.tgz 'http://downloads.sourceforge.net/project/cortexassembler/cortex_var/latest/CORTEX_release_v1.0.5.14.tgz'
[localhost] out: --2013-02-06 08:29:41-- http://downloads.sourceforge.net/project/cortexassembler/cortex_var/latest/CORTEX_release_v1.0.5.14.tgz
[localhost] out: Resolving downloads.sourceforge.net... 216.34.181.59
[localhost] out: Connecting to downloads.sourceforge.net|216.34.181.59|:80... connected.
[localhost] out: HTTP request sent, awaiting response... 302 Found
[localhost] out: Location: http://netcologne.dl.sourceforge.net/project/cortexassembler/cortex_var/latest/CORTEX_release_v1.0.5.14.tgz [following]
[localhost] out: --2013-02-06 08:29:41-- http://netcologne.dl.sourceforge.net/project/cortexassembler/cortex_var/latest/CORTEX_release_v1.0.5.14.tgz
[localhost] out: Resolving netcologne.dl.sourceforge.net... 78.35.24.46, 2001:4dd0:1234:6::5f
[localhost] out: Connecting to netcologne.dl.sourceforge.net|78.35.24.46|:80... connected.
[localhost] out: HTTP request sent, awaiting response... 200 OK
[localhost] out: Length: 30656942 (29M) [application/x-gzip]
[localhost] out: Saving to: “CORTEX_release_v1.0.5.14.tgz”
[localhost] out:
[localhost] out:
[localhost] out: 0% [ ] 0 --.-K/s
[localhost] out: (progress lines omitted)
[localhost] out: 99% [=========================================================================> ] 30,381,805 2.01M/s eta 1s
[localhost] out: 100%[==========================================================================>] 30,656,942 2.03M/s in 22s
[localhost] out:
[localhost] out: 2013-02-06 08:30:04 (1.32 MB/s) - “CORTEX_release_v1.0.5.14.tgz” saved [30656942/30656942]
[localhost] out:
[localhost] out:

[localhost] run: tar --pax-option='delete=SCHILY.*,delete=LIBARCHIVE.*' -xzpf CORTEX_release_v1.0.5.14.tgz
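As a hedged aside, the `--pax-option` patterns in the extraction command above are wildcards (`SCHILY.*`, `LIBARCHIVE.*`) that tell GNU tar to drop vendor-specific extended pax headers (common in archives produced on OS X or with libarchive), which otherwise show up as stray `PaxHeader` entries in the extracted tree. A minimal round-trip sketch, using hypothetical file names:

```shell
# Build a small pax-format archive, then extract it with the same flags as
# the CORTEX install step; the extended-header keywords matching SCHILY.* or
# LIBARCHIVE.* are deleted on extraction (GNU tar).
workdir=$(mktemp -d)
cd "$workdir"
mkdir src && echo data > src/file.txt
tar --format=pax -czf archive.tgz src
mkdir extract && cd extract
tar --pax-option='delete=SCHILY.*,delete=LIBARCHIVE.*' -xzpf ../archive.tgz
ls src
```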
[localhost] run: sed -i.bak -r -e 's/-L/full/path/\S*/-L/home/pedrosoi/APP/bcbio_nextgen/install_directory/lib/g' Makefile
[localhost] run: sed -i.bak -r -e 's/^IDIR_GSL=.*$/IDIR_GSL=/home/pedrosoi/APP/bcbio_nextgen/install_directory/include/g' Makefile
[localhost] run: sed -i.bak -r -e 's/^IDIR_GSL_ALSO=.*$/IDIR_GSL_ALSO=/home/pedrosoi/APP/bcbio_nextgen/install_directory/include/gsl/g' Makefile
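The three `sed` runs above patch the Cortex `Makefile` so its GSL library and include paths point at the bcbio_nextgen install directory. Because the replacement text is itself full of `/` characters, an equivalent hand-run version is easier to get right with an alternate `s` delimiter; a minimal sketch (the `|` delimiter and the throwaway `Makefile` are my own illustration, not what the installer runs):

```shell
# Patch IDIR_GSL / IDIR_GSL_ALSO in a sample Makefile, using '|' as the sed
# delimiter so the slashes in the install path need no escaping (GNU sed).
PREFIX=/home/pedrosoi/APP/bcbio_nextgen/install_directory
workdir=$(mktemp -d)
printf 'IDIR_GSL=/old/include\nIDIR_GSL_ALSO=/old/include/gsl\n' > "$workdir/Makefile"
sed -i.bak -r \
    -e "s|^IDIR_GSL=.*$|IDIR_GSL=$PREFIX/include|" \
    -e "s|^IDIR_GSL_ALSO=.*$|IDIR_GSL_ALSO=$PREFIX/include/gsl|" \
    "$workdir/Makefile"
cat "$workdir/Makefile"
```

Note that `^IDIR_GSL=` cannot accidentally match the `IDIR_GSL_ALSO=` line, since the anchor requires `=` immediately after `IDIR_GSL`.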
[localhost] run: make clean
[localhost] out: Making clean in doc
[localhost] out: make[1]: Entering directory `/home/pedrosoi/tmp/cloudbiolinux/CORTEX_release_v1.0.5.14/libs/gsl-1.15/doc'
[localhost] out: make[2]: Entering directory `/home/pedrosoi/tmp/cloudbiolinux/CORTEX_release_v1.0.5.14/libs/gsl-1.15'
[localhost] out: make[2]: Leaving directory `/home/pedrosoi/tmp/cloudbiolinux/CORTEX_release_v1.0.5.14/libs/gsl-1.15'
[localhost] out: test -z "gsl-ref.dvi gsl-ref.pdf gsl-ref.ps gsl-ref.html" \
[localhost] out: || rm -rf gsl-ref.dvi gsl-ref.pdf gsl-ref.ps gsl-ref.html
[localhost] out: rm -rf .libs _libs
[localhost] out: rm -rf gsl-ref.aux gsl-ref.cp gsl-ref.cps gsl-ref.fn gsl-ref.fns \
[localhost] out: gsl-ref.ky gsl-ref.kys gsl-ref.log gsl-ref.pg gsl-ref.pgs \
[localhost] out: gsl-ref.tmp gsl-ref.toc gsl-ref.tp gsl-ref.tps gsl-ref.vr \
[localhost] out: gsl-ref.vrs
[localhost] out: rm -f *.lo
[localhost] out: rm -f vti.tmp
[localhost] out: make[1]: Leaving directory `/home/pedrosoi/tmp/cloudbiolinux/CORTEX_release_v1.0.5.14/libs/gsl-1.15/doc'
[localhost] out: (equivalent "Making clean in ..." output repeats for the remaining gsl-1.15 subdirectories: bspline, wavelet, cdf, deriv, diff, ntuple, monte, multimin, min, multiroots, roots, ode-initval2, ode-initval, histogram, interpolation, integration, sum, siman, statistics, multifit, fit, poly, fft, randist, rng, qrng, dht, specfunc, eigen, linalg, blas, cblas, ieee-utils, sort, multiset, combination, permutation, matrix, vector, block)
[localhost] out: Making clean in cheb
[localhost] out: make[1]: Entering directory /home/pedrosoi/tmp/cloudbiolinux/CORTEX_release_v1.0.5.14/libs/gsl-1.15/cheb' [localhost] out: make[2]: Entering directory/home/pedrosoi/tmp/cloudbiolinux/CORTEX_release_v1.0.5.14/libs/gsl-1.15'
[localhost] out: make[2]: Leaving directory /home/pedrosoi/tmp/cloudbiolinux/CORTEX_release_v1.0.5.14/libs/gsl-1.15' [localhost] out: rm -f test [localhost] out: rm -rf .libs _libs [localhost] out: test -z "libgslcheb.la " || rm -f libgslcheb.la [localhost] out: rm -f "./so_locations" [localhost] out: rm -f *.o [localhost] out: rm -f *.lo [localhost] out: make[1]: Leaving directory/home/pedrosoi/tmp/cloudbiolinux/CORTEX_release_v1.0.5.14/libs/gsl-1.15/cheb'
[localhost] out: Making clean in complex
[localhost] out: make[1]: Entering directory /home/pedrosoi/tmp/cloudbiolinux/CORTEX_release_v1.0.5.14/libs/gsl-1.15/complex' [localhost] out: make[2]: Entering directory/home/pedrosoi/tmp/cloudbiolinux/CORTEX_release_v1.0.5.14/libs/gsl-1.15'
[localhost] out: make[2]: Leaving directory /home/pedrosoi/tmp/cloudbiolinux/CORTEX_release_v1.0.5.14/libs/gsl-1.15' [localhost] out: rm -f test [localhost] out: rm -rf .libs _libs [localhost] out: test -z "libgslcomplex.la " || rm -f libgslcomplex.la [localhost] out: rm -f "./so_locations" [localhost] out: rm -f *.o [localhost] out: rm -f *.lo [localhost] out: make[1]: Leaving directory/home/pedrosoi/tmp/cloudbiolinux/CORTEX_release_v1.0.5.14/libs/gsl-1.15/complex'
[localhost] out: Making clean in const
[localhost] out: make[1]: Entering directory /home/pedrosoi/tmp/cloudbiolinux/CORTEX_release_v1.0.5.14/libs/gsl-1.15/const' [localhost] out: make[2]: Entering directory/home/pedrosoi/tmp/cloudbiolinux/CORTEX_release_v1.0.5.14/libs/gsl-1.15'
[localhost] out: make[2]: Leaving directory /home/pedrosoi/tmp/cloudbiolinux/CORTEX_release_v1.0.5.14/libs/gsl-1.15' [localhost] out: rm -f test [localhost] out: rm -rf .libs _libs [localhost] out: rm -f *.o [localhost] out: rm -f *.lo [localhost] out: make[1]: Leaving directory/home/pedrosoi/tmp/cloudbiolinux/CORTEX_release_v1.0.5.14/libs/gsl-1.15/const'
[localhost] out: Making clean in err
[localhost] out: make[1]: Entering directory /home/pedrosoi/tmp/cloudbiolinux/CORTEX_release_v1.0.5.14/libs/gsl-1.15/err' [localhost] out: make[2]: Entering directory/home/pedrosoi/tmp/cloudbiolinux/CORTEX_release_v1.0.5.14/libs/gsl-1.15'
[localhost] out: make[2]: Leaving directory /home/pedrosoi/tmp/cloudbiolinux/CORTEX_release_v1.0.5.14/libs/gsl-1.15' [localhost] out: rm -f test [localhost] out: rm -rf .libs _libs [localhost] out: test -z "libgslerr.la" || rm -f libgslerr.la [localhost] out: rm -f "./so_locations" [localhost] out: rm -f *.o [localhost] out: rm -f *.lo [localhost] out: make[1]: Leaving directory/home/pedrosoi/tmp/cloudbiolinux/CORTEX_release_v1.0.5.14/libs/gsl-1.15/err'
[localhost] out: Making clean in test
[localhost] out: make[1]: Entering directory /home/pedrosoi/tmp/cloudbiolinux/CORTEX_release_v1.0.5.14/libs/gsl-1.15/test' [localhost] out: make[2]: Entering directory/home/pedrosoi/tmp/cloudbiolinux/CORTEX_release_v1.0.5.14/libs/gsl-1.15'
[localhost] out: make[2]: Leaving directory /home/pedrosoi/tmp/cloudbiolinux/CORTEX_release_v1.0.5.14/libs/gsl-1.15' [localhost] out: rm -rf .libs _libs [localhost] out: test -z "libgsltest.la" || rm -f libgsltest.la [localhost] out: rm -f "./so_locations" [localhost] out: rm -f *.o [localhost] out: rm -f *.lo [localhost] out: make[1]: Leaving directory/home/pedrosoi/tmp/cloudbiolinux/CORTEX_release_v1.0.5.14/libs/gsl-1.15/test'
[localhost] out: Making clean in sys
[localhost] out: make[1]: Entering directory /home/pedrosoi/tmp/cloudbiolinux/CORTEX_release_v1.0.5.14/libs/gsl-1.15/sys' [localhost] out: make[2]: Entering directory/home/pedrosoi/tmp/cloudbiolinux/CORTEX_release_v1.0.5.14/libs/gsl-1.15'
[localhost] out: make[2]: Leaving directory /home/pedrosoi/tmp/cloudbiolinux/CORTEX_release_v1.0.5.14/libs/gsl-1.15' [localhost] out: rm -f test [localhost] out: rm -rf .libs _libs [localhost] out: test -z "libgslsys.la " || rm -f libgslsys.la [localhost] out: rm -f "./so_locations" [localhost] out: rm -f *.o [localhost] out: rm -f *.lo [localhost] out: make[1]: Leaving directory/home/pedrosoi/tmp/cloudbiolinux/CORTEX_release_v1.0.5.14/libs/gsl-1.15/sys'
[localhost] out: Making clean in utils
[localhost] out: make[1]: Entering directory /home/pedrosoi/tmp/cloudbiolinux/CORTEX_release_v1.0.5.14/libs/gsl-1.15/utils' [localhost] out: make[2]: Entering directory/home/pedrosoi/tmp/cloudbiolinux/CORTEX_release_v1.0.5.14/libs/gsl-1.15'
[localhost] out: make[2]: Leaving directory `/home/pedrosoi/tmp/cloudbiolinux/CORTEX_release_v1.0.5.14/libs/gsl-1.15'
[localhost] out: rm -rf .libs _libs
[localhost] out: test -z "libutils.la" || rm -f libutils.la
[localhost] out: rm -f "./so_locations"
[localhost] out: rm -f *.o
[localhost] out: rm -f *.lo
[localhost] ou

GFF parsing stopped working on system upgrade - biopython error

I upgraded my system from Xubuntu 12.10 to 14.04, and GFF parsing stopped working.

I get this error:

>>> from BCBio import GFF
>>> with open('gff_test.gff3') as INFILE:
...     for chromosome_record in GFF.parse(INFILE):
...         print len(chromosome_record.features)
Traceback (most recent call last):
  File "<stdin>", line 2, in <module>
  File "/usr/local/lib/python2.7/dist-packages/bcbio-0.1-py2.7.egg/BCBio/GFF/GFFParser.py", line 709, in parse
    target_lines):
  File "/usr/local/lib/python2.7/dist-packages/bcbio-0.1-py2.7.egg/BCBio/GFF/GFFParser.py", line 304, in parse_in_parts
    cur_dict = self._results_to_features(cur_dict, results)
  File "/usr/local/lib/python2.7/dist-packages/bcbio-0.1-py2.7.egg/BCBio/GFF/GFFParser.py", line 344, in _results_to_features
    results.get('child', []))
  File "/usr/local/lib/python2.7/dist-packages/bcbio-0.1-py2.7.egg/BCBio/GFF/GFFParser.py", line 402, in _add_parent_child_features
    children)
  File "/usr/local/lib/python2.7/dist-packages/bcbio-0.1-py2.7.egg/BCBio/GFF/GFFParser.py", line 447, in _add_children_to_parent
    cur_parent.location_operator = "join"
  File "/usr/local/lib/python2.7/dist-packages/Bio/SeqFeature.py", line 247, in _set_location_operator
    raise ValueError("Only CompoundLocation gets an operator (%r)" % value)
ValueError: Only CompoundLocation gets an operator ('join')

The file I'm using is this extremely simple one (and I get the same error on real files from Phytozome that used to parse fine):

chrA    test    gene    101 700 .   +   .   ID=gene1;Name=gene1;
chrA    test    mRNA    101 700 .   +   .   ID=PAC:gene1_mRNA;Name=gene1.t1.2;pacid=gene1_mRNA;Parent=gene1

I tried reinstalling BCBio (now on version 0.1) and Biopython (1.64), but it didn't help.
Is this something I should submit to biopython rather than here?

gff_to_genbank.py: sub_features is deprecated

Thanks for gff_to_genbank.py. It's quite useful to me.

gff_to_genbank.py foo.gff foo.fa
/usr/local/lib/python2.7/site-packages/Bio/SeqFeature.py:171: BiopythonDeprecationWarning: Rather using f.sub_features, f.location should be a CompoundFeatureLocation
  BiopythonDeprecationWarning)

GFF: parse_simple fails in some cases

To reproduce:

from BCBio import GFF

# http://www.broadinstitute.org/annotation/gebo/help/data/gff3/transcripts.gff3
input_file = 'transcripts.gff3'

list(GFF.parse_simple(open(input_file)))

Error output:

KeyError                                  Traceback (most recent call last)
<ipython-input-10-32ebcd95dc0c> in <module>()
----> 1 list(GFF.parse_simple(open(infile)))

/home/keith/software/bcbb/gff/BCBio/GFF/GFFParser.pyc in parse_simple(gff_files, limit_info)
    721     parser = GFFParser()
    722     for rec in parser.parse_simple(gff_files, limit_info=limit_info):
--> 723         yield rec["child"][0]
    724 
    725 def _file_or_handle(fn):

KeyError: 'child'

I checked the results of parser.parse_simple(gff_files, limit_info=limit_info) and there are some parent entries that have no child key.

E.g. For the above file:

[{'parent': [{'id': 'newGene',
    'is_gff2': False,
    'location': [499, 2610],
    'quals': {'ID': ['newGene']},
    'rec_id': 'edit_test.fa',
    'strand': 1,
    'type': 'gene'}]},
 {'child': [{'id': 't1',
    'is_gff2': False,
    'location': [499, 2385],
    'quals': {'ID': ['t1'],
     'Name': ['t1(newGene)'],
     'Namo': ['reinhard+did+this'],
     'Parent': ['newGene'],
     'uri': ['http://www.yahoo.com']},
    'rec_id': 'edit_test.fa',
    'strand': 1,
    'type': 'mRNA'}]},
   ...
]

If you want to treat the parent and child nodes the same, a simple fix would be:

yield rec.get('child', rec.get('parent'))[0]

Hopefully this time it is an actual issue and not just a misunderstanding on my part :)

If the above solution is appropriate, I would be glad to submit a patch.
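The one-line fallback suggested above can be wrapped a bit more defensively; here is a minimal sketch (first_entry is a hypothetical helper, and the dict shapes mirror the parser output shown above):

```python
def first_entry(rec):
    # Prefer 'child' entries and fall back to 'parent' entries, as suggested
    # above; raise a clear error if the record has neither key.
    entries = rec.get('child') or rec.get('parent')
    if entries is None:
        raise KeyError("record has neither 'child' nor 'parent' entries")
    return entries[0]

parent_rec = {'parent': [{'id': 'newGene', 'type': 'gene'}]}
child_rec = {'child': [{'id': 't1', 'type': 'mRNA'}]}
print(first_entry(parent_rec)['id'])  # newGene
print(first_entry(child_rec)['id'])   # t1
```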

parsing flybase GFF

When I run available_limits on a flybase GFF file, I get
'gff_type': {...
('three_prime_UTR',): 19250,
...}}

Now, I put my limit_info as
limit_info = dict(gff_type = ["three_prime_UTR"])

When I now try to parse the GFF file with this limit_info, I do not get back any records.

Is there something I am missing, or is this a flybase idiosyncrasy ?

An example line for the three_prime_UTR gff_type from flybase would be :
2L FlyBase three_prime_UTR 3871190 3871425 . + . ID=three_prime_UTR_FBgn0031575:10_802;Name=Cep97-u3;Parent=FBtr0077504,FBtr0290255,FBtr0301082;parent_type=mRNA

Non-intuitive "dist" directory for bcbio/picard/__init.py__

Hey Brad,

I found it quite surprising that the interface you created for Picard expects it to be in a "dist" directory.

When I unpacked it directly from sf.net the jars appear in a flat directory, without a "dist" subdir, therefore the script fails trying to find the jars (I fixed it by creating the dist dir and moving the files there, though).

Thanks !
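The workaround described above can be scripted; a hedged sketch using a throwaway directory as a stand-in for the unpacked Picard folder (all paths and jar names here are placeholders):

```shell
# Recreate the layout the interface expects: jars under a dist/ subdirectory.
PICARD_DIR=$(mktemp -d)
touch "$PICARD_DIR/picard-1.0.jar" "$PICARD_DIR/sam-1.0.jar"  # stand-ins for the sf.net jars
mkdir -p "$PICARD_DIR/dist"
mv "$PICARD_DIR"/*.jar "$PICARD_DIR/dist/"
ls "$PICARD_DIR/dist"
```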

FastQC vs SolexaQA

Brad,
It's not really an issue. But I want to know, from your experience, how much time you would save by switching from SolexaQA to FastQC?

Thanks,
Paul

Missing GPG keys on mongodb repositories

Clutters the output unnecessarily:

[ec2-204-236-204-229.compute-1.amazonaws.com] err: W: GPG error: http://downloads.mongodb.org 10.4 Release: The following signatures couldn't be verified because the public key is not available: NO_PUBKEY 9ECBEC467F0CEB10
[ec2-204-236-204-229.compute-1.amazonaws.com] err: W: GPG error: http://cran.stat.ucla.edu lucid/ Release: The following signatures couldn't be verified because the public key is not available: NO_PUBKEY D67FC6EAE2A11821

Total variant and CalculateHsMetrics

Hi,

I really cannot think of the reason, but I always have a problem triggering CalculateHsMetrics.jar with the bait interval and target interval referencing a data file I previously used. The same error message is reported even when I updated the config files and rebooted the machine. Do you have any clues as to where things could go wrong?

Thanks,
Paul

flybase GFF parsing error

In lines 773-775 of gff/BCBio/GFF/GFFParser.py:

    if line.strip() and line.strip()[0] != "#":
        parts = [p.strip() for p in line.split('\t')]
        assert len(parts) == 9, line

a tab is not expected in any of the fields, but the flybase GFF files do have the occasional tab in the names or descriptions of genes in the last field, and then the parser breaks.

Without loss of generality, could this not be modified to -

            if len(parts) > 9: 
                    temp_parts = parts[0:8]
                    last_part = " ".join(parts[8:])
                    temp_parts.append(last_part)
                    parts = temp_parts

?

I have made this change in my local copy, and this seems to work fine.

import error on Mac OS X

Using Homebrew-installed Python 2.7.5. Module was installed using pip install bcbio-gff.

import BCBio.GFF
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/Cellar/python/2.7.5/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/BCBio/GFF/__init__.py", line 3, in <module>
    from GFFParser import GFFParser, DiscoGFFParser, GFFExaminer, parse, parse_simple
  File "/usr/local/Cellar/python/2.7.5/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/BCBio/GFF/GFFParser.py", line 22, in <module>
    import urllib
  File "/usr/local/Cellar/python/2.7.5/Frameworks/Python.framework/Versions/2.7/lib/python2.7/urllib.py", line 26, in <module>
    import socket
  File "/usr/local/Cellar/python/2.7.5/Frameworks/Python.framework/Versions/2.7/lib/python2.7/socket.py", line 47, in <module>
    import _socket
ImportError: dlopen(/usr/local/Cellar/python/2.7.5/Frameworks/Python.framework/Versions/2.7/lib/python2.7/lib-dynload/_socket.so, 2): Symbol not found: __PyInt_AsInt
  Referenced from: /usr/local/Cellar/python/2.7.5/Frameworks/Python.framework/Versions/2.7/lib/python2.7/lib-dynload/_socket.so
  Expected in: flat namespace
 in /usr/local/Cellar/python/2.7.5/Frameworks/Python.framework/Versions/2.7/lib/python2.7/lib-dynload/_socket.so

biopython->numpy interactive (y/n) while deploying pipeline

Even putting numpy >=1.6.1 before biopython in setup.py's install_requires, the following message pops up:

Numerical Python (NumPy) is not installed.

This package is required for many Biopython features.  Please install
it before you install Biopython. You can install Biopython anyway, but
anything dependent on NumPy will not work. If you do this, and later
install NumPy, you should then re-install Biopython.

You can find NumPy at http://numpy.scipy.org

Do you want to continue this installation? (y/N):

Apparently install_requires packages are not installed in order, so no dependency order can be defined that way... are you aware of any "pre_install_requires" or similar in setuptools ? Couldn't find it after quickly checking docs :-/
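Since install_requires gives no ordering guarantee, one defensive option is a pre-flight check in the deploy script before setup.py runs; a minimal sketch (the message text is purely illustrative):

```python
# Check for NumPy up front so Biopython's interactive (y/N) prompt is never reached.
try:
    import numpy  # noqa: F401
    have_numpy = True
except ImportError:
    have_numpy = False

if not have_numpy:
    print("NumPy missing: run 'pip install numpy' before installing Biopython")
print("numpy present:", have_numpy)
```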

Do not follow up on lanes that failed upstream (wet lab)

The first 4 lanes of this run failed in the lab and therefore the resulting fastq files were empty (zero filesize). The threads for all the other lanes continue running and get processed correctly though (non-critical exception handling issue).

time automated_initial_analysis.py ~/config/post_process.yaml dataset dataset/run_info.yaml
barcode    count
total      0
barcode    count
total      0
barcode    count
total      0
barcode    count
total      0
1_run_4 Aligning with bowtie
2_run_1 Aligning with bowtie
4_run_4 Aligning with bowtie
3_run_3 Aligning with bowtie
Traceback (most recent call last):
  File "automated_initial_analysis.py", line 7, in <module>
    execfile(__file__)
  File "automated_initial_analysis.py", line 724, in <module>
    main(*args, **kwargs)
  File "automated_initial_analysis.py", line 56, in main
    run_main(config, config_file, fc_dir, run_info_yaml)
  File "automated_initial_analysis.py", line 82, in run_main
    for i in run_items))
  File "lib/python2.6/multiprocessing/pool.py", line 148, in map
    return self.map_async(func, iterable, chunksize).get()
  File "lib/python2.6/multiprocessing/pool.py", line 422, in get
    raise self._value
OSError: [Errno 2] No such file or directory

test_2_rnaseq - Error: "Aligned record iterator [...] is behind the unmapped reads..."

Hey Brad, recently when I've been running the test suite, in the test AutomatedAnalysisTest.test_2_rnaseq the command

java -Xmx6g -jar /bubo/sw/apps/bioinfo/picard/1.41/MergeBamAlignment.jar UNMAPPED=/bubo/home/h10/vale/bcbb/nextgen/tests/test_automated_output/alignments/1_110907_ERP000591_tophat/1_110907_ERP000591_1_fastq-fastq.bam ALIGNED=/bubo/home/h10/vale/bcbb/nextgen/tests/test_automated_output/alignments/1_110907_ERP000591_tophat/1_110907_ERP000591.sam OUTPUT=/bubo/home/h10/vale/bcbb/nextgen/tests/test_automated_output/alignments/1_110907_ERP000591_tophat/tx/1_110907_ERP000591.bam REFERENCE_SEQUENCE=/bubo/nobackup/uppnex/reference/biodata/genomes/Mmusculus/mm9/seq/mm9.fa TMP_DIR=/bubo/home/h10/vale/bcbb/nextgen/tests/test_automated_output/tmp/tmpD2zhfL PAIRED_RUN=true VALIDATION_STRINGENCY=SILENT

is run and fails with the error

Exception in thread "main" java.lang.IllegalStateException: Aligned record iterator (ERR032227.10000043) is behind the unmapped reads (ERR032227.10000043)
        at net.sf.picard.sam.AbstractAlignmentMerger.mergeAlignment(AbstractAlignmentMerger.java:178)
        at net.sf.picard.sam.SamAlignmentMerger.mergeAlignment(SamAlignmentMerger.java:148)
        at net.sf.picard.sam.MergeBamAlignment.doWork(MergeBamAlignment.java:128)
        at net.sf.picard.cmdline.CommandLineProgram.instanceMain(CommandLineProgram.java:156)
        at net.sf.picard.sam.MergeBamAlignment.main(MergeBamAlignment.java:102)

I can't figure out what this is referring to; searching on the internet I just found the source code for MergeBamAlignment.

Do you know what might be going wrong?

And by the way, which version of Picard are you using and recommend? We are running Picard 1.41 over here; could this be due to us using an old version?

We always appreciate your help!

doing bcl->qseq->fastq->analysis->galaxy in one machine

Hi,

We have a different setup here where the drive with the bcl files is mounted on the analysis machine and we would do everything there. Do you recommend we keep the messaging system in the pipeline? Just want to get some advice.

Thanks,
Paul

bwa quality values

Hi Brad!

I've been using bcbb for exome analysis and have used bwa as the aligner. We have illumina data, so quality_format: Illumina in post_process.yaml. However, as far as I can see, ngsalign/bwa.py doesn't use this information. From bwa aln:

bwa aln

    -I        the input is in the Illumina 1.3+ FASTQ-like format

However, I ran some tests on bwa (0.5.9-r16) with and without the flag, which gave the same mapping results (qualities etc), so I'm wondering if you left this out on purpose? It would seem bwa makes a guess about the quality format, after all?

Cheers,

Per

Genbank Parsing Problem?

Hi Brad, AnnotationSketch is complaining about the parsed file again:

GenomeTools error: CDS feature on line 27 in file "../../mirna-django/src/scripts/tp53.gff3" has the wrong phase 0 (should be 1)

I don't know if the problem is with their GFF3 parser though. Can you tell me what you think?

http://paste.debian.net/159462/

GFF code does not work in Python 3

The current GFF code does not work with Python 3.
gff/BCBio/GFF/__init__.py
needs explicit relative imports under Python 3:
from .GFFParser import GFFParser, DiscoGFFParser, GFFExaminer, parse, parse_simple
from .GFFOutput import GFF3Writer, write

(note the . before GFFParser and GFFOutput)

GFFExaminer() displaying empty dict for UCSC GTF

I tried following http://biopython.org/wiki/GFF_Parsing to parse a UCSC-generated GTF file.

After executing

pprint.pprint(examiner.parent_child_map(handle))

the output was

{}

Similarly,

examiner.available_limits(handle)

produced

{'gff_id': {}, 'gff_source': {}, 'gff_source_type': {}, 'gff_type': {}}

Trying to parse that same file with

from BCBio import GFF
for rec in GFF.parse(handle):
    print rec

produced

ID: chr1
Name: <unknown name>
Description: <unknown description>
Number of features: 2
UnknownSeq(14409, alphabet = Alphabet(), character = '?')

Here are the first 10 lines from the GTF in question

chr1 hg19_knownGene exon 11874 12227 0.000000 + . gene_id "uc001aaa.3"; transcript_id "uc001aaa.3";
chr1 hg19_knownGene exon 12613 12721 0.000000 + . gene_id "uc001aaa.3"; transcript_id "uc001aaa.3";
chr1 hg19_knownGene exon 13221 14409 0.000000 + . gene_id "uc001aaa.3"; transcript_id "uc001aaa.3";
chr1 hg19_knownGene start_codon 12190 12192 0.000000 + . gene_id "uc010nxq.1"; transcript_id "uc010nxq.1";
chr1 hg19_knownGene CDS 12190 12227 0.000000 + 0 gene_id "uc010nxq.1"; transcript_id "uc010nxq.1";
chr1 hg19_knownGene exon 11874 12227 0.000000 + . gene_id "uc010nxq.1"; transcript_id "uc010nxq.1";
chr1 hg19_knownGene CDS 12595 12721 0.000000 + 1 gene_id "uc010nxq.1"; transcript_id "uc010nxq.1";
chr1 hg19_knownGene exon 12595 12721 0.000000 + . gene_id "uc010nxq.1"; transcript_id "uc010nxq.1";
chr1 hg19_knownGene CDS 13403 13636 0.000000 + 0 gene_id "uc010nxq.1"; transcript_id "uc010nxq.1";
chr1 hg19_knownGene stop_codon 13637 13639 0.000000 + . gene_id "uc010nxq.1"; transcript_id "uc010nxq.1";
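One possible cause worth ruling out: if the same open handle is passed to parent_child_map and then to available_limits, the second call reads an already-exhausted file and returns empty results. This is generic Python file behavior, shown here with an in-memory handle:

```python
import io

# A file-like handle can only be read through once; a second reader sees
# nothing unless you seek(0) or reopen the file between calls.
handle = io.StringIO("line1\nline2\n")
first_pass = list(handle)    # reads everything
second_pass = list(handle)   # empty: handle is now at EOF
handle.seek(0)               # rewind (or simply reopen the file)
third_pass = list(handle)
print(len(first_pass), len(second_pass), len(third_pass))  # 2 0 2
```

Separately, note that GTF uses gene_id/transcript_id attributes rather than GFF3-style ID/Parent, which may also explain an empty parent-child map.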

Trailing Illumina 'A' and demultiplexing

Hi Brad,

We are seeing some issues with unexpectedly many reads ending up in the 'unmatched' category after demultiplexing. After digging around a little, we think that this may be related to the trailing 'A' that the Illumina machines add after the barcode.

More specifically, we allow one mismatch and no indels for the demuxing. It seems that the reads that are unexpectedly classified as unmatched have one mismatch in the actual 6-nucleotide barcode and are, in addition, having the trailing 'A' nucleotide miscalled.

Reading the code, it does indeed seem that for Illumina reads, the last 7 nucleotides, including the trailing 'A', of each read are matched when demultiplexing. Can you confirm that this is the case?

Our preference is to match just the 6-mer index sequence, excluding the last nucleotide in the read and it would be nice to have this done by default for Illumina reads, or at least be able to influence this behavior with a configuration option. What do you think?

Thanks
/Pontus
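The preference described above, matching only the 6-mer index and ignoring the trailing 'A', can be sketched like this (matches_barcode is a hypothetical helper, not the pipeline's actual demultiplexing code):

```python
def matches_barcode(read_tail, barcode, max_mismatches=1):
    # Compare only the first len(barcode) bases of the read tail, so the
    # trailing Illumina 'A' never counts toward the mismatch budget.
    index_seq = read_tail[:len(barcode)]
    mismatches = sum(1 for a, b in zip(index_seq, barcode) if a != b)
    return mismatches <= max_mismatches

print(matches_barcode("ACGTCAA", "ACGTCA"))  # True: exact 6-mer, trailing base ignored
print(matches_barcode("ACGTCGT", "ACGTCA"))  # True: one mismatch in 6-mer, miscalled 'A' ignored
print(matches_barcode("ACGACGA", "ACGTCA"))  # False: two mismatches in the 6-mer
```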

num_cores: messages; socket.timeout: timed out

Hey Brad, we are trying to use the distributed version of the pipeline.

We have a couple of test sets that we use to quickly see if the pipeline is working. One that takes the normal pipeline about 3 hours to finish, and another much smaller that takes about 7 minutes (this is with 8 cores).

When running the small test set on the messaging variant all files get generated as they should, and the program exits properly. Note that this small set consists of fastq files which are only 12 lines each, and I'm guessing much of the analysis gets skipped due to a lack of data.

When we run the messaging version of the pipeline for the larger set, the programs work for a while (time varies, but say between 45 minutes and 1 hour 30 minutes), but then one of the jobs crashes with a socket.timeout error (this specific job I believe is some master that coordinates what the other jobs should be doing).

I'll include the output of that job here:

[2012-02-25 02:55:26,856] Found YAML samplesheet, using /proj/a2010002/nobackup/illumina/pipeline_test/archive/000101_SN001_001_AABCD99XX/run_info.yaml instead of Galaxy API
Traceback (most recent call last):
  File "/bubo/home/h10/vale/.virtualenvs/devel/bin/automated_initial_analysis.py", line 7, in <module>
    execfile(__file__)
  File "/bubo/home/h10/vale/bcbb/nextgen/scripts/automated_initial_analysis.py", line 117, in <module>
    main(*args, **kwargs)
  File "/bubo/home/h10/vale/bcbb/nextgen/scripts/automated_initial_analysis.py", line 48, in main
    run_main(config, config_file, fc_dir, work_dir, run_info_yaml)
  File "/bubo/home/h10/vale/bcbb/nextgen/scripts/automated_initial_analysis.py", line 65, in run_main
    lane_items = run_parallel("process_lane", lanes)
  File "/bubo/home/h10/vale/bcbb/nextgen/bcbio/distributed/messaging.py", line 28, in run_parallel
    return runner_fn(fn_name, items)
  File "/bubo/home/h10/vale/bcbb/nextgen/bcbio/distributed/messaging.py", line 67, in _run
    while not result.ready():
  File "/bubo/home/h10/vale/.virtualenvs/devel/lib/python2.7/site-packages/celery/result.py", line 306, in ready
    return all(result.ready() for result in self.results)
  File "/bubo/home/h10/vale/.virtualenvs/devel/lib/python2.7/site-packages/celery/result.py", line 306, in <genexpr>
    return all(result.ready() for result in self.results)
  File "/bubo/home/h10/vale/.virtualenvs/devel/lib/python2.7/site-packages/celery/result.py", line 108, in ready
    return self.status in self.backend.READY_STATES
  File "/bubo/home/h10/vale/.virtualenvs/devel/lib/python2.7/site-packages/celery/result.py", line 196, in status
    return self.state
  File "/bubo/home/h10/vale/.virtualenvs/devel/lib/python2.7/site-packages/celery/result.py", line 191, in state
    return self.backend.get_status(self.task_id)
  File "/bubo/home/h10/vale/.virtualenvs/devel/lib/python2.7/site-packages/celery/backends/base.py", line 237, in get_status
    return self.get_task_meta(task_id)["status"]
  File "/bubo/home/h10/vale/.virtualenvs/devel/lib/python2.7/site-packages/celery/backends/amqp.py", line 128, in get_task_meta
    return self.poll(task_id)
  File "/bubo/home/h10/vale/.virtualenvs/devel/lib/python2.7/site-packages/celery/backends/amqp.py", line 153, in poll
    with self.app.pool.acquire_channel(block=True) as (_, channel):
  File "/sw/comp/python/2.7.1_kalkyl/lib/python2.7/contextlib.py", line 17, in __enter__
    return self.gen.next()
  File "/bubo/home/h10/vale/.virtualenvs/devel/lib/python2.7/site-packages/kombu/connection.py", line 789, in acquire_channel
    yield connection, connection.default_channel
  File "/bubo/home/h10/vale/.virtualenvs/devel/lib/python2.7/site-packages/kombu/connection.py", line 593, in default_channel
    self.connection
  File "/bubo/home/h10/vale/.virtualenvs/devel/lib/python2.7/site-packages/kombu/connection.py", line 586, in connection
    self._connection = self._establish_connection()
  File "/bubo/home/h10/vale/.virtualenvs/devel/lib/python2.7/site-packages/kombu/connection.py", line 546, in _establish_connection
    conn = self.transport.establish_connection()
  File "/bubo/home/h10/vale/.virtualenvs/devel/lib/python2.7/site-packages/kombu/transport/amqplib.py", line 252, in establish_connection
    connect_timeout=conninfo.connect_timeout)
  File "/bubo/home/h10/vale/.virtualenvs/devel/lib/python2.7/site-packages/kombu/transport/amqplib.py", line 62, in __init__
    super(Connection, self).__init__(*args, **kwargs)
  File "/bubo/home/h10/vale/.virtualenvs/devel/lib/python2.7/site-packages/amqplib/client_0_8/connection.py", line 129, in __init__
    self.transport = create_transport(host, connect_timeout, ssl)
  File "/bubo/home/h10/vale/.virtualenvs/devel/lib/python2.7/site-packages/amqplib/client_0_8/transport.py", line 281, in create_transport
    return TCPTransport(host, connect_timeout)
  File "/bubo/home/h10/vale/.virtualenvs/devel/lib/python2.7/site-packages/amqplib/client_0_8/transport.py", line 85, in __init__
    raise socket.error, msg
socket.timeout: timed out
[INFO/MainProcess] process shutting down
[DEBUG/MainProcess] running all "atexit" finalizers with priority >= 0
[DEBUG/MainProcess] running the remaining "atexit" finalizers

Have you encountered any issues with socket.timeout?
Any ideas what we might be doing wrong?

110221_empty_FC12345AAXX missing on s3 bucket ?

Hi Brad, in the test test_empty_fastq(), you are referring to the file 110221_empty_FC12345AAXX, but it doesn't get installed during the _install_test_files() step.

Can you check this, and upload these files to the s3 bucket?

Thanks in advance.
/Valentine

parse simple returns different start location

I need to parse GFF file and I found a little problem.

In my GFF file is (for example) this line:
BAC1_SV_50C14_semf_p2_contig1 ltr LTR_retrotransposon 38 9461 0 . 0 ID=ele00002;

So the location should be [38, 9461], but parse_simple(...) returns location [37, 9461].
This happened in all files I tried.

My code looks like this:
from BCBio import GFF

  in_file = "your_file.gff"
  in_handle = open(in_file)

  for record in GFF.parse_simple(in_handle):
        print record['type']
        print record['id']
        print record['location']

  in_handle.close()

And it prints this:
LTR_retrotransposon
ele00002
[37, 9461]

Can someone explain to me what is wrong, please? I don't know if it is on purpose or if it is a bug.
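This is expected behavior rather than a bug: GFF uses 1-based, inclusive coordinates, while the parser (following Biopython's convention) stores 0-based, half-open ones. The conversion is just:

```python
# GFF line: start=38, end=9461 (1-based, inclusive)
gff_start, gff_end = 38, 9461

# Biopython-style representation: 0-based start, half-open end
py_start, py_end = gff_start - 1, gff_end

print([py_start, py_end])  # [37, 9461], exactly what parse_simple reports
# Both conventions describe the same span of bases:
assert py_end - py_start == gff_end - gff_start + 1
```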

config files

Hi,

I was running automated_initial_analysis.py but found a typo in the config files. I restarted the script manually but the program seems to have cached the previous config file and resulted in the same error message. May I know how I should restart the script correctly?

Thanks,
Paul

tabix access to gff?

Is there a way to query GFF by region with the GFF parser that's installed with pip? I have written a small wrapper for tabix that makes use of the _gff_line_map() function and the subprocess module to return GFF records.

I did notice this: access_gff_index.py

However, it's not included in the pip-installed module unless I am missing something, and my GFF files are all indexed with tabix as well.

I can create a pull request if you think this would be beneficial.

GFF parsing with initial sequence fails

Hi Brad,

I tried to use the GFF parser with initial sequence. I followed the example from the Wiki, and essentially had something like

  def import_sequence_and_gff(self, fasta_fn, gff_fn):
    in_seq_file = fasta_fn
    in_seq_handle = open(fasta_fn)
    seq_dict = SeqIO.to_dict(SeqIO.parse(in_seq_handle, "fasta"))
    in_seq_handle.close()

    in_file = gff_fn
    in_handle = open(in_file)
    for rec in GFF.parse(in_handle, base_dict=seq_dict):
      print rec
    in_handle.close()

But I get this error message

ValueError: Only CompoundLocation gets an operator ('join')

Looking through the stack, it looks like GFF created a SeqFeature with a plain FeatureLocation and tried to set the 'join' operator on it, and Biopython only allows CompoundLocation to have operators.

I am not familiar with the details of BioPython and GFF parser, is this a known problem? Is there a work-around?

Thanks,
Benjie

over url-encoding in attribute fields

In trying to add a Name=value field to my data, and have GFFOutput.py write it, I find that the value field is being fully URL encoded, which is different from the gff3 specification.
In my case, it means attributes like:
NAME=jgi.p|Schco3|1037802
end up urlencoded like this:
NAME=jgi.p%7CSchco3%7C1037802
which causes problems with our downstream data use. I believe these should not be escaped according to the gff3 standard. The gff3 standard v 1.21 says:

URL escaping rules are used for tags or values containing the following characters: ",=;". Spaces are allowed in this field, but tabs must be replaced with the %09 URL escape.  -- http://www.sequenceontology.org/gff3.shtml 

So the rule seems to be:

  1. attribute keys or values should be fully URL-escaped when they contain ",=;"
  2. TAB characters in a key or value should always be escaped as %09, but a TAB alone does not trigger full URL encoding of that key or value

The attribute key and value in NAME=jgi.p|Schco3|1037802 do not contain ",=;", hence they should not be escaped.

Do you agree? Would you like a patch to GFFOutput.py that provides a routine following those rules for escaping values?
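A routine following those proposed rules might look like this. This is a sketch of the suggested behavior, not the current GFFOutput.py code; `escape_attribute` is a hypothetical name, and it uses Python 3's `urllib.parse.quote`:

```python
from urllib.parse import quote

def escape_attribute(value):
    """Escape a GFF3 attribute key or value per the proposed rules:
    full URL escaping only when ',', '=' or ';' is present;
    otherwise only tabs are replaced with %09."""
    if any(c in value for c in ",=;"):
        # Full URL escaping (this also covers tabs as %09).
        return quote(value, safe="")
    # No reserved characters: leave the value alone apart from tabs.
    return value.replace("\t", "%09")
```

Under these rules the pipe characters in jgi.p|Schco3|1037802 pass through untouched, while a value containing "=" or ";" is still escaped as before.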

Start position in GFF

Hi Brad,

Why is the start location (*.location) after parsing a GFF file one less than the original?

If a GFF file has the following coordinates,

five_prime_UTR 3860074 3861033

The parser outputs 3860073 and 3861033 as the start and end coordinates. I read through the SO ontology and other documentation; what I found is that the start position uses a 1-based coordinate system and I suppose it is inclusive.

Pardon me if this is a silly question.

Thanks,
Srikar.
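For reference, the off-by-one reflects Biopython's internal convention: locations are 0-based with an exclusive end, so a GFF feature at 1-based inclusive 3860074..3861033 becomes (3860073, 3861033), and lengths come out the same either way. A quick sketch of the conversion (illustrative helper names, not part of the parser):

```python
def gff_to_python(start_1based, end_inclusive):
    """GFF (1-based, inclusive) -> Python/Biopython (0-based, end-exclusive)."""
    return start_1based - 1, end_inclusive

def python_to_gff(start_0based, end_exclusive):
    """Inverse conversion back to GFF coordinates."""
    return start_0based + 1, end_exclusive

start, end = gff_to_python(3860074, 3861033)
length = end - start  # equals 3861033 - 3860074 + 1 in GFF terms
```

The 0-based, end-exclusive form makes `end - start` the feature length and lets Python slicing (`seq[start:end]`) extract the feature directly.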

Test suite has started taking a very long time to run

Hi Brad, we have started to see really long run times for the test suite; have the test run times increased for you as well?
Or are we doing something odd? Running all the tests on a node with 8 cores takes about 4 hours.

Something new I have noted in the stdout from nosetests -s -v test_automated_analysis.py is thousands of lines of the form

INFO  17:41:53,026 TraversalEngine -  chr5:133867088        5.39e+07    2.4 h        2.7 m     74.0%         3.3 h    51.4 m 

which suggest some step keeps running for several hours.

Trans-spliced genes cause ValueError: Did not find remapped ID location

Hi, Brad. I've annotated a trans-spliced gene according to this recommendation (look for "Trans-spliced transcript"). The gene rps12 is composed of two gene features, two mRNA features with the same ID and each with two parents (the two genes), three exons, three CDS features and one intron (the cis-spliced intron). This situation causes the following error:

❯❯❯ gff_to_genbank.py pg29-plastid-manual.gff pg29-plastid-manual.fa
…
ValueError: Did not find remapped ID location: gene84, [[112441, 113241]], [9558, 9672]

So, any hope to support trans-splicing? Thanks!

The annotation looks like this:

1   manual  gene    9559    9672    .   +   .   ID=gene83;Name=rps12|lcl|NC_021456.1_cdsid_YP_008082803.1_8-gene;exception=trans-splicing
1   manual  gene    112442  113241  .   +   .   ID=gene84;Name=rps12|lcl|NC_021456.1_cdsid_YP_008082803.1_8-gene;exception=trans-splicing
1   manual  mRNA    9559    9672    .   +   .   ID=mRNA43;Parent=gene83,gene84;Name=rps12|lcl|NC_021456.1_cdsid_YP_008082803.1_8;exception=trans-splicing
1   manual  mRNA    112442  113241  .   +   .   ID=mRNA43;Parent=gene83,gene84;Name=rps12|lcl|NC_021456.1_cdsid_YP_008082803.1_8;exception=trans-splicing
1   manual  exon    9559    9672    .   +   .   Parent=mRNA43
1   manual  CDS 9559    9672    .   +   0   Parent=mRNA43
1   manual  exon    112442  112673  .   +   .   Parent=mRNA43
1   manual  CDS 112442  112673  .   +   0   Parent=mRNA43
1   manual  intron  112674  113215  .   +   .   Parent=mRNA43
1   manual  exon    113216  113241  .   +   .   Parent=mRNA43
1   manual  CDS 113216  113241  .   +   2   Parent=mRNA43

_tabularize_metrics float parse error

Just caught this one in the logs...

Traceback (most recent call last):
  File ".virtualenvs/devel/bin/align_summary_report.py", line 7, in <module>
    execfile(__file__)
  File "bcbb/nextgen/scripts/align_summary_report.py", line 292, in <module>
    main(*args, **kwargs)
  File "bcbb/nextgen/scripts/align_summary_report.py", line 72, in main
    run_pdflatex(out_file, params)
  File "/lib/python2.6/contextlib.py", line 34, in __exit__
    self.gen.throw(type, value, traceback)
  File "/bubo/home/h5/roman/opt/bcbb/nextgen/bcbio/utils.py", line 83, in curdir_tmpdir
    yield tmp_dir
  File "bcbb/nextgen/scripts/align_summary_report.py", line 49, in main
    align_bam, ref_file, is_paired, bait_file, target_file)
  File "bcbb/nextgen/bcbio/broad/metrics.py", line 236, in report
    vrn_vals)
  File "nextgen/bcbio/broad/metrics.py", line 33, in get_summary_metrics
    hybrid_vals, vrn_vals)
  File "nextgen/bcbio/broad/metrics.py", line 83, in _tabularize_metrics
    std_dev = "+/- %.1f" % float(std) if (std and std != "?") else ""
ValueError: invalid literal for float(): 40,463208
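The failing value, 40,463208, is a float written with a comma as the decimal separator, which `float()` rejects; it looks like a locale issue in the metrics output being parsed. A defensive parse along these lines would avoid the crash (a sketch; `parse_metric_float` is a hypothetical helper, not the current metrics.py code):

```python
def parse_metric_float(val):
    """Parse a metric value that may use a comma decimal separator.

    Returns None for missing/placeholder values ("", "?", None).
    """
    if val in (None, "", "?"):
        return None
    # Normalize a locale-style decimal comma before converting.
    return float(str(val).replace(",", "."))

def format_std_dev(std):
    """Format a standard deviation the way _tabularize_metrics does."""
    parsed = parse_metric_float(std)
    return "+/- %.1f" % parsed if parsed is not None else ""
```

Note the comma-to-dot swap assumes the value never uses commas as thousands separators; a stricter fix would be to set a consistent locale for the tool producing the metrics.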

Version of GATK (for test_empty_fastq())

Hey Brad, we're trying to get the test suite to pass and are currently looking at nosetests test_automated_analysis:AutomatedAnalysisTest.test_empty_fastq.

It gives the error

ERROR ------------------------------------------------------------------------------------------
ERROR A USER ERROR has occurred (version 1.0.5365):
ERROR The invalid arguments or inputs must be corrected before the GATK can proceed
ERROR Please do not post this error to the GATK forum
ERROR
ERROR See the documentation (rerun with -h) for this tool to view allowable command-line arguments.
ERROR Visit our wiki for extensive documentation http://www.broadinstitute.org/gsa/wiki
ERROR Visit our forum to view answers to commonly asked questions http://getsatisfaction.com/gsa
ERROR
ERROR MESSAGE: Your input file has a malformed header: We never saw the required CHROM header line (starting with one #) for the input VCF file
ERROR ------------------------------------------------------------------------------------------

when processing the file genomes/hg19/variation/dbsnp_132.vcf

So I was wondering what version of GATK you are using.

Thanks, Valentine.

adaptor_trim.py

Hi Brad,

I am running a test of your adaptor-trimming code (http://coderscrowd.com/app/codes/view/224), and the unit test is OK. I don't see, though, why this code would not work for long reads. In the description you mention that the code is designed for short-read sequencing; could you please enlighten me on this?

Thanks

Rad

merging of demuxed fastq files and project-based analyses

Hi Brad,

more of a question than an issue. I noticed you've added code (bcbio.pipeline.sample.merge_sample) to merge samples across lanes. I've been using save_diskspace=true in order to remove SAM files, but I noticed this also removes the demultiplexed files, right? I just want to make sure, because it affects our data delivery routines, as outlined below.

In our setup, we have situations when we run several projects on one lane, which we distinguish with an extra "description" tag in run_info, so in principle each barcode could have a description with a different project name. We then partition fastq files in a lane based on the description tag when delivering data to customers.

On a similar note, when I do analyses for customers, I've been doing them on a project-by-project basis (it makes more sense to me), and have therefore written helper scripts (project_*, see EDIT: https://github.com/percyfal/bcbb/tree/develop/nextgen/scripts) for this purpose. project_analysis_pipeline.sh is almost a copy of automated_initial_analysis.py, but starts off with demultiplexed files. Have you had this functionality in mind (or is it even already there)?

Cheers,

Per

bam_to_wiggle

Hi Brad,

Thanks very much for this neat little script.
There is no provision for strand information, though? I presume pysam's pileup does not provide strand, so that makes things a tad complicated?

Vineeth
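Strand-specific coverage could in principle be computed by splitting reads by orientation before accumulating depth. A minimal sketch of the idea, working over plain (start, end, is_reverse) tuples; with pysam you would derive is_reverse from each aligned read's flag, but none of this is the actual bam_to_wiggle code:

```python
from collections import defaultdict

def strand_coverage(reads):
    """Accumulate per-position depth separately for each strand.

    reads: iterable of (start, end, is_reverse) tuples,
           0-based with an exclusive end.
    Returns {"+": {pos: depth}, "-": {pos: depth}}.
    """
    cov = {"+": defaultdict(int), "-": defaultdict(int)}
    for start, end, is_reverse in reads:
        strand = "-" if is_reverse else "+"
        for pos in range(start, end):
            cov[strand][pos] += 1
    return cov
```

The two dictionaries could then be written out as a pair of wiggle tracks (plus and minus), which is how strand-aware genome browsers usually expect the data.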

Tests not passing with v0.2 & messaging

Hi Brad,

I'm having problems getting the tests to pass for my (customized) version of bcbb, which is based on your 0.2 version. The error message I get is that the required files cannot be found, e.g.:

ValueError: Did not find correct files for /path/to/bcbb/nextgen/tests/data/automated/../110106_FC70BUKAAXX 1 FC70BUKAAXX []

Doing an ls of this path gives:

ls /path/to/bcbb/nextgen/tests/data/automated/../110106_FC70BUKAAXX
1_110106_FC70BUKAAXX_1_fastq.txt
1_110106_FC70BUKAAXX_2_fastq.txt
2_110106_FC70BUKAAXX_1_fastq.txt
2_110106_FC70BUKAAXX_2_fastq.txt
3_110106_FC70BUKAAXX_fastq.txt

This is only an issue when setting "num_cores: messaging"; the tests pass fine when using e.g. "num_cores: 8" on the same code.

Does this ring a bell? Can you think of a reason why the files are found OK in multicore mode but not in distributed mode?

Thanks
/Pontus

No such file or directory

While I was running data_fabfile.py, I got the following error message:
hg19.txt'00249.fadom.faa7_random.far (return code 1) while executing 'cat ./chr1.fa
Do you have any clue about what went wrong?

Thanks,
Paul

error while installing lein

I am trying to install the software and I am getting the following error. I have pasted the complete output in case you want to see it.

thanks in advance
into

$ python2.7 bcbio_nextgen_install.py --distribution centos --nosudo /home/pedrosoi/APP/bcbio_nextgen/install_directory /home/pedrosoi/APP/bcbio_nextgen/data_directory
Installing tools...
[localhost] Executing task 'install_biolinux'
INFO: Config start time: 2013-02-05 22:41:40.486109
INFO: This is a Base Flavor - no overrides
DBG [init.py]: Minimal Edition 1.5.3
INFO: This is a minimal
INFO: Distribution centos
INFO: Get local environment
INFO: CentOS setup
DBG [distribution.py]: Checking target distribution centos
DBG [distribution.py]: Unknown target distro
DBG [distribution.py]: NixPkgs: Ignored
[localhost] run: echo $HOME
[localhost] out: /home/pedrosoi
[localhost] out:

[localhost] run: uname -m
[localhost] out: x86_64
[localhost] out:

INFO: Now, testing connection to host...
INFO: Connection to host appears to work!
DBG [utils.py]: Expand paths
DBG [fabfile.py]: Target is 'None'
DBG [config.py]: Using config file /home/pedrosoi/APP/bcbio_nextgen/tmpbcbio-install/cloudbiolinux/cloudbio/../contrib/flavor/ngs_pipeline/main.yaml
INFO: Meta-package information from /home/pedrosoi/APP/bcbio_nextgen/tmpbcbio-install/cloudbiolinux/cloudbio/../contrib/flavor/ngs_pipeline/main.yaml

  • Packages: minimal,libraries,python,java,r,bio_nextgen,distributed
  • Libraries:
    INFO: Target=unknown; Edition=Minimal Edition; Flavor=ngs_pipeline
    DBG [config.py]: Using config file /home/pedrosoi/APP/bcbio_nextgen/tmpbcbio-install/cloudbiolinux/cloudbio/../config/custom.yaml
    INFO: Reading /home/pedrosoi/APP/bcbio_nextgen/tmpbcbio-install/cloudbiolinux/cloudbio/../config/custom.yaml
    DBG [shared.py]: Packages to install: pydoop,seal,leiningen,bx-python,rpy,abyss,cortex_var,ray,transabyss,trinity,velvet,macs,bcbio_variation,crisp,gemini,stacks,tassel,varscan,hydra,cufflinks,freebayes,gatk,gatk_queue,picard,sambamba,samtools,shrec,snpeff,tophat,vep,bfast,novoalign,novosort,plink_seq,ucsc_tools,bedtools,dwgsim,fastqc,fastx_toolkit,varianttools,vcftools,bowtie,bowtie2,bwa,gmap,lastz,mosaik,snap,stampy
    INFO: Custom install for 'leiningen' start time: 2013-02-05 22:41:41.460383
    INFO: Custom install for 'leiningen' start time: 2013-02-05 22:41:41.460383
    DBG [fabfile.py]: Import leiningen
    DBG [fabfile.py]: Import leiningen
    [localhost] run: echo $TMPDIR
    [localhost] out:
    [localhost] out:
    [localhost] run: echo $HOME
    [localhost] out: /home/pedrosoi
    [localhost] out:

[localhost] run: wget --no-check-certificate https://raw.github.com/technomancy/leiningen/stable/bin/lein
[localhost] out: --2013-02-05 22:41:42-- https://raw.github.com/technomancy/leiningen/stable/bin/lein
[localhost] out: Resolving raw.github.com... 199.27.77.130
[localhost] out: Connecting to raw.github.com|199.27.77.130|:443... connected.
[localhost] out: WARNING: certificate common name “*.a.ssl.fastly.net” doesn’t match requested host name “raw.github.com”.
[localhost] out: HTTP request sent, awaiting response... 200 OK
[localhost] out: Length: 10250 (10K) [text/plain]
[localhost] out: Saving to: “lein.1”
[localhost] out:
[localhost] out:
[localhost] out: 0% [ ] 0 --.-K/s
[localhost] out: 100%[==========================================================================>] 10,250 --.-K/s in 0s
[localhost] out:
[localhost] out: 2013-02-05 22:41:42 (49.4 MB/s) - “lein.1” saved [10250/10250]
[localhost] out:
[localhost] out:

[localhost] run: chmod a+rwx lein
[localhost] run: mv lein /home/pedrosoi/APP/bcbio_nextgen/install_directory/bin
[localhost] run: lein
[localhost] out: /bin/bash: lein: command not found
[localhost] out:

Fatal error: run() received nonzero return code 127 while executing!

Requested: lein
Executed: /bin/bash -l -c "cd /home/pedrosoi/tmp/cloudbiolinux && lein"

Aborting.
Disconnecting from localhost... done.
Traceback (most recent call last):
  File "bcbio_nextgen_install.py", line 159, in <module>
    main(parser.parse_args())
  File "bcbio_nextgen_install.py", line 37, in main
    install_tools(cbl["tool_fabfile"], fabricrc)
  File "bcbio_nextgen_install.py", line 71, in install_tools
    "install_biolinux:flavor=ngs_pipeline"])
  File "/home/pedrosoi/python/lib/python2.7/subprocess.py", line 511, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['fab', '-f', '/home/pedrosoi/APP/bcbio_nextgen/tmpbcbio-install/cloudbiolinux/fabfile.py', '-H', 'localhost', '-c', '/home/pedrosoi/APP/bcbio_nextgen/tmpbcbio-install/fabricrc.txt', 'install_biolinux:flavor=ngs_pipeline']' returned non-zero exit status 1
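The immediate failure in the log is that lein was moved into install_directory/bin, which is not on the PATH, so the follow-up `lein` invocation hits "command not found" (exit code 127). A self-contained sketch of the fix, using a dummy `lein` stand-in in a temp directory; with the real install you would prepend your actual install_directory/bin instead:

```shell
# Simulate an install dir containing lein (stand-in script for illustration).
installdir=$(mktemp -d)
printf '#!/bin/sh\necho ok\n' > "$installdir/lein"
chmod +x "$installdir/lein"

# The fix: put the install bin directory on PATH before invoking lein.
export PATH="$installdir:$PATH"
command -v lein   # now resolves instead of "command not found"
```

Adding the export line to ~/.bashrc (with the real install_directory/bin path) would make the fix persistent across shells.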

GFFParser on python 3.3.2

Hi,

I tried to use GFFParser on Python 3.3.2, but I got an error.

Python 3.3.2 (default, May 21 2013, 11:50:47)
[GCC 4.2.1 Compatible Apple Clang 4.1 ((tags/Apple/clang-421.11.66))] on darwin
Type "help", "copyright", "credits" or "license" for more information.

>>> from BCBio import GFF
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/opt/local/Library/Frameworks/Python.framework/Versions/3.3/lib/python3.3/site-packages/BCBio/GFF/__init__.py", line 3, in <module>
    from GFFParser import GFFParser, DiscoGFFParser, GFFExaminer, parse, parse_simple
ImportError: No module named 'GFFParser'

I installed it with pip: "/Users/Alvin/Library/Python/3.3/bin/pip install bcbio-gff".

I am new to Python, so I am not sure whether it is caused by an improper installation or by the different Python version.

Thanks,
Alvin
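The ImportError comes from a Python-2-style implicit relative import in the package's __init__.py: on Python 3, `from GFFParser import ...` no longer finds a sibling module inside the package, while the explicit `from .GFFParser import ...` form does. A self-contained demonstration of the difference using a throwaway package (the names `pkg` and `mod` are made up for illustration, not BCBio itself):

```python
import importlib
import os
import sys
import tempfile

sys.dont_write_bytecode = True  # keep the demo free of stale bytecode caches

# Build a throwaway package mimicking the BCBio/GFF layout:
# pkg/__init__.py imports a sibling module pkg/mod.py.
tmp = tempfile.mkdtemp()
os.mkdir(os.path.join(tmp, "pkg"))
with open(os.path.join(tmp, "pkg", "mod.py"), "w") as fh:
    fh.write("VALUE = 42\n")
sys.path.insert(0, tmp)

def write_init(line):
    """Rewrite pkg/__init__.py and clear import caches for a fresh import."""
    with open(os.path.join(tmp, "pkg", "__init__.py"), "w") as fh:
        fh.write(line + "\n")
    for name in ("pkg", "pkg.mod"):
        sys.modules.pop(name, None)
    importlib.invalidate_caches()

# Python-2-style implicit relative import: fails on Python 3.
write_init("from mod import VALUE")
try:
    import pkg
    implicit_works = True
except ImportError:
    implicit_works = False

# Explicit relative import: works on Python 3 (and Python 2.6+).
write_init("from .mod import VALUE")
import pkg
```

So the likely fix is in the library itself (switching to explicit relative imports) rather than in your installation.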
