baidu / lac

Baidu NLP: word segmentation, part-of-speech tagging, named entity recognition, word importance

License: Apache License 2.0

CMake 5.72% C 5.64% Python 38.03% C++ 44.45% Java 6.16%
word-segmentation part-of-speech-tagger named-entity-recognition chinese-word-segmentation chinese-nlp lexical-analysis python java

lac's Introduction

Tool Overview

LAC (Lexical Analysis of Chinese) is a joint lexical analysis tool developed by Baidu's Natural Language Processing Department. It performs Chinese word segmentation, part-of-speech tagging, and named entity recognition. The tool has the following features and advantages:

  • Accurate: a deep learning model jointly learns word segmentation, POS tagging, named entity recognition, and word importance. The overall segmentation F1 exceeds 0.91, POS tagging F1 exceeds 0.94, and named entity recognition F1 exceeds 0.85, leading the industry.
  • Fast: with a slimmed-down set of model parameters and the performance optimizations of the Paddle inference library, single-threaded CPU throughput reaches 800 QPS, leading the industry in efficiency.
  • Customizable: a simple, controllable intervention mechanism lets a user dictionary override the model via exact matching. Dictionary entries may be long multi-word spans, making interventions more precise.
  • Easy to integrate: one-command installation, plus Python, Java, and C++ APIs with usage examples for quick invocation and integration.
  • Mobile-ready: a custom, ultra-lightweight model of only 2 MB reaches 200 QPS single-threaded on mainstream budget phones, meeting the needs of most mobile applications and leading the industry at this model size.

Installation and Usage

This section mainly covers installation and usage in Python; for other languages, see the corresponding c++ and java directories.

Installation

The code is compatible with both Python 2 and Python 3.

  • Fully automatic install: pip install lac

  • Semi-automatic install: download from http://pypi.python.org/pypi/lac/, extract the archive, then run python setup.py install

  • After installation, run lac, lac --segonly, or lac --rank on the command line to start the service and try it out quickly.

    On networks in mainland China, installing from the Baidu mirror is faster: pip install lac -i https://mirror.baidu.com/pypi/simple

Features and Usage

Word Segmentation

  • Code example:
from LAC import LAC

# Load the segmentation model
lac = LAC(mode='seg')

# Single-sample input: a Unicode string
text = u"LAC是个优秀的分词工具"
seg_result = lac.run(text)

# Batch input: a list of sentences; average throughput is higher
texts = [u"LAC是个优秀的分词工具", u"百度是一家高科技公司"]
seg_result = lac.run(texts)
  • Output:
[Single sample]: seg_result = [LAC, 是, 个, 优秀, 的, 分词, 工具]
[Batch]: seg_result = [[LAC, 是, 个, 优秀, 的, 分词, 工具], [百度, 是, 一家, 高科技, 公司]]

Part-of-Speech Tagging and Named Entity Recognition

  • Code example:
from LAC import LAC

# Load the full lexical analysis model
lac = LAC(mode='lac')

# Single-sample input: a Unicode string
text = u"LAC是个优秀的分词工具"
lac_result = lac.run(text)

# Batch input: a list of sentences; average throughput is higher
texts = [u"LAC是个优秀的分词工具", u"百度是一家高科技公司"]
lac_result = lac.run(texts)
  • Output:

For each sentence, the output contains its segmentation result word_list and the tag of each word tags_list, in the format (word_list, tags_list):

[Single sample]: lac_result = ([百度, 是, 一家, 高科技, 公司], [ORG, v, m, n, n])
[Batch]: lac_result = [
                    ([百度, 是, 一家, 高科技, 公司], [ORG, v, m, n, n]),
                    ([LAC, 是, 个, 优秀, 的, 分词, 工具], [nz, v, q, a, u, n, n])
                ]

The POS and named-entity tag sets are listed in the table below; the four most commonly used entity categories are marked in uppercase:

Tag   Meaning                Tag   Meaning                 Tag   Meaning                Tag   Meaning
n     common noun            f     locative noun           s     place noun             nw    work title
nz    other proper noun      v     common verb             vd    verb used as adverb    vn    verb used as noun
a     adjective              ad    adjective as adverb     an    adjective as noun      d     adverb
m     numeral                q     measure word            r     pronoun                p     preposition
c     conjunction            u     particle                xc    other function word    w     punctuation
PER   person name            LOC   place name              ORG   organization name      TIME  time expression
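
Since the four most common entity categories are uppercase, the entities in a lac-mode result can be picked out simply by checking whether a word's tag is one of PER, LOC, ORG, or TIME. The snippet below is a minimal sketch based only on the (word_list, tags_list) output format documented above; the helper name extract_entities is illustrative and not part of the LAC API.

from LAC import LAC

lac = LAC(mode='lac')

# Illustrative helper (not part of the LAC API): keep only words whose tag
# is one of the uppercase entity categories PER, LOC, ORG or TIME.
def extract_entities(sentence):
    words, tags = lac.run(sentence)
    return [(w, t) for w, t in zip(words, tags) if t in ('PER', 'LOC', 'ORG', 'TIME')]

print(extract_entities(u"百度是一家高科技公司"))
# Expected, based on the example above: [('百度', 'ORG')]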

Word Importance

  • Code example:
from LAC import LAC

# Load the word importance model
lac = LAC(mode='rank')

# Single-sample input: a Unicode string
text = u"LAC是个优秀的分词工具"
rank_result = lac.run(text)

# Batch input: a list of sentences; average throughput is higher
texts = [u"LAC是个优秀的分词工具", u"百度是一家高科技公司"]
rank_result = lac.run(texts)
  • Output:
[Single sample]: rank_result = [['LAC', '是', '个', '优秀', '的', '分词', '工具'],
                        [nz, v, q, a, u, n, n], [3, 0, 0, 2, 0, 3, 1]]
[Batch]: rank_result = [
                    (['LAC', '是', '个', '优秀', '的', '分词', '工具'],
                     [nz, v, q, a, u, n, n], [3, 0, 0, 2, 0, 3, 1]),
                    (['百度', '是', '一家', '高科技', '公司'],
                     [ORG, v, m, n, n], [3, 0, 2, 3, 1])
                ]

The word importance labels are listed in the table below; a 4-level scale is used:

Label  Meaning                                   Typical POS tags
0      redundant words in the query              p, w, xc ...
1      weakly qualifying words in the query      r, c, u ...
2      strongly qualifying words in the query    n, s, v ...
3      core words of the query                   nz, nw, LOC ...
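
As an illustration of how these labels can be used, the sketch below keeps only the strongly qualifying and core words (rank 2 or 3) of a query. It relies only on the three-part rank output documented above; the helper name core_words is hypothetical, not part of the LAC API.

from LAC import LAC

lac = LAC(mode='rank')

# Illustrative helper (not part of the LAC API): keep words whose
# importance rank is 2 (strongly qualifying) or 3 (core).
def core_words(query):
    words, tags, ranks = lac.run(query)
    return [w for w, r in zip(words, ranks) if r >= 2]

print(core_words(u"LAC是个优秀的分词工具"))
# Expected, based on the example above: ['LAC', '优秀', '分词']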

Customization

On top of the model's output, LAC also lets users configure customized segmentation and entity-type results. When the model's prediction matches an item in the dictionary, the customized result replaces the original one. To make matching more precise, an item may be a long span made up of several words.

This feature is enabled by loading a dictionary file. Each line of the file is one customized item, consisting of one word or several consecutive words; each word may be followed by '/' and a tag, and if the '/' tag is omitted the model's default tag is used. The more words an item contains, the more precise the intervention.

  • Example dictionary file

    This is only an example, showing the results under various kinds of requirements. A wildcard-based dictionary mode will be released later; stay tuned.

春天/SEASON
花/n 开/v
秋天的风
落 阳
  • Code example
from LAC import LAC
lac = LAC()

# Load the customization dictionary; sep is the separator used in the dictionary file.
# When sep is None (the default), spaces and tabs '\t' are used.
lac.load_customization('custom.txt', sep=None)

# Result after applying the dictionary
custom_result = lac.run(u"春天的花开秋天的风以及冬天的落阳")
  • For the input "春天的花开秋天的风以及冬天的落阳", the original output is:
春天/TIME 的/u 花开/v 秋天/TIME 的/u 风/n 以及/c 冬天/TIME 的/u 落阳/n
  • After loading the dictionary file above, the output becomes:
春天/SEASON 的/u 花/n 开/v 秋天的风/n 以及/c 冬天/TIME 的/u 落/n 阳/n
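
For completeness, the sketch below ties these steps together: it writes the example dictionary to a file and then runs the customized analysis. It is only a sketch of the workflow described above; the file name custom.txt is an assumption, chosen to match the load_customization call shown earlier.

# -*- coding: utf-8 -*-
import io
from LAC import LAC

# Write the example customization dictionary shown above to a file
# (the file name 'custom.txt' is assumed for this sketch).
with io.open('custom.txt', 'w', encoding='utf-8') as f:
    f.write(u"春天/SEASON\n花/n 开/v\n秋天的风\n落 阳\n")

lac = LAC()
lac.load_customization('custom.txt', sep=None)
print(lac.run(u"春天的花开秋天的风以及冬天的落阳"))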

Incremental Training

We also provide an interface for incremental training, so that users can continue training on their own data. The data must first be converted into the model's input format, and all data files must be UTF-8 encoded:

1. Segmentation training
  • Sample data

    As in most open-source segmentation datasets, a space marks the word boundary, as shown below:

LAC 是 个 优秀 的 分词 工具 。
百度 是 一家 高科技 公司 。
春天 的 花开 秋天 的 风 以及 冬天 的 落阳 。
  • Code example
from LAC import LAC

# Use the segmentation model
lac = LAC(mode='seg')

# Training and test sets share the same format
train_file = "./data/seg_train.tsv"
test_file = "./data/seg_test.tsv"
lac.train(model_save_dir='./my_seg_model/', train_data=train_file, test_data=test_file)

# Use your own trained model
my_lac = LAC(model_path='my_seg_model')
2. Lexical analysis training
  • Sample data

    On top of the segmentation data, each word is annotated with its POS or entity category in the form "/type". Note that incremental training for lexical analysis currently only supports data using the same tag set as ours. Support for new tag sets will be released later; stay tuned.

LAC/nz 是/v 个/q 优秀/a 的/u 分词/n 工具/n 。/w
百度/ORG 是/v 一家/m 高科技/n 公司/n 。/w
春天/TIME 的/u 花开/v 秋天/TIME 的/u 风/n 以及/c 冬天/TIME 的/u 落阳/n 。/w
  • Code example
from LAC import LAC

# Use the default lexical analysis model
lac = LAC()

# Training and test sets share the same format
train_file = "./data/lac_train.tsv"
test_file = "./data/lac_test.tsv"
lac.train(model_save_dir='./my_lac_model/', train_data=train_file, test_data=test_file)

# Use your own trained model
my_lac = LAC(model_path='my_lac_model')
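
Once trained, the saved model is used exactly like the default one; a minimal sketch, assuming the model directory my_lac_model from the example above:

from LAC import LAC

# Load the incrementally trained model and run it like the default model
my_lac = LAC(model_path='my_lac_model')
print(my_lac.run(u"百度是一家高科技公司"))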

File Structure

.
├── python                      # Scripts for calling LAC from Python
├── c++                         # Code for calling LAC from C++
├── java                        # Code for calling LAC from Java
├── Android                     # Android usage example
├── README.md                   # This file
└── CMakeList.txt               # Build script for the C++ and Java bindings

Citing LAC

If you use LAC in your academic work, please add the citation below. We are delighted that LAC can be of help to your research.

@article{jiao2018LAC,
	title={Chinese Lexical Analysis with Deep Bi-GRU-CRF Network},
	author={Jiao, Zhenyu and Sun, Shuqi and Sun, Ke},
	journal={arXiv preprint arXiv:1807.01882},
	year={2018},
	url={https://arxiv.org/abs/1807.01882}
}

Contributing

We welcome developers to contribute to LAC. If you have developed a new feature or found a bug, feel free to submit a pull request or an issue on GitHub.

lac's People

Contributors

bond-h, halfish, jshower, moberq, nickname1230, sdshq, siuuuu7, sunhf


lac's Issues

About Docker

I'm on Ubuntu 16.04 and do not plan to use Docker. Can I skip step 2 of the installation, as well as the first command of step 3 (docker run -it -v $pwd:/paddle -w /paddle paddle:dev /bin/bash), and simply create the build folder directly?

Error when running train.py

Traceback (most recent call last):
File "train.py", line 338, in
train(args)
File "train.py", line 295, in train
fluid.io.save_inference_model(temp_save_model, ['word', 'target'], [num_infer_chunks, num_label_chunks, num_correct_chunks], save_exe)
File "/usr/local/python3.6.6/lib/python3.6/site-packages/paddle/fluid/io.py", line 954, in save_inference_model
var, 1., name="save_infer_model/scale_{}".format(i))
File "/usr/local/python3.6.6/lib/python3.6/site-packages/paddle/fluid/layers/nn.py", line 8848, in scale
name=name, dtype=x.dtype, persistable=False)
File "/usr/local/python3.6.6/lib/python3.6/site-packages/paddle/fluid/layer_helper.py", line 366, in create_variable
return self.main_program.current_block().create_var(*args, **kwargs)
File "/usr/local/python3.6.6/lib/python3.6/site-packages/paddle/fluid/framework.py", line 1198, in create_var
var = Variable(block=self, *args, **kwargs)
File "/usr/local/python3.6.6/lib/python3.6/site-packages/paddle/fluid/framework.py", line 350, in init
dtype))
ValueError: Variable save_infer_model/scale_0 has been created before. The previous data type is VarType.FP32; the new data type is VarType.INT64. They are not matched.

Error when building the Docker image

In file included from /usr/include/python2.7/Python.h:33:0,from linkcheck/HtmlParser/htmlsax.h:23,from linkcheck/HtmlParser/htmllex.c:1:
/usr/include/stdio.h:33:21: fatal error: stddef.h: No such file or directory compilation terminated. error: command 'x86_64-linux-gnu-gcc' failed with exit status 1

Command "/usr/bin/python -u -c "import setuptools, tokenize;file='/tmp/pip-install-7PA4uO/LinkChecker/setup.py';f=getattr(tokenize, 'open', open)(file);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, file, 'exec'))" install --record /tmp/pip-record-PtCwzn/install-record.txt --single-version-externally-managed --compile" failed with error code 1 in /tmp/pip-install-7PA4uO/LinkChecker/
The command '/bin/sh -c pip install -r /root/requirements.txt' returned a non-zero code: 1
Screenshot: qq 20181106163413

lac_demo fails to run; it reports: W0808 10:51:53.736791 15022 init.cc:85] 'CUDA' is not supported, Please re-compile with WITH_GPU option

~/git/Paddle/lac$ ./output/demo/lac_demo ./conf 20 8
Loaded q2b dic -- num = 172
Loaded strong punc -- num = 5
Loaded word dic -- num(with oov) = 20940
Loaded tag dic -- num = 57
WARNING: Logging before InitGoogleLogging() is written to STDERR
W0808 10:51:53.736791 15022 init.cc:85] 'CUDA' is not supported, Please re-compile with WITH_GPU option
W0808 10:51:53.736852 15022 init.cc:101] 'CUDA' is not supported, Please re-compile with WITH_GPU option
Loaded customization dic -- num = 0
create lac handle successfully
create lac buff successfully
create lac buff successfully
create lac buff successfully
create lac buff successfully
create lac buff successfully
create lac buff successfully
create lac buff successfully
create lac buff successfully
It then gets stuck here and cannot proceed.

CUDA 9.0 is installed; I don't understand why it is not supported.

UnicodeEncodeError on Chinese

Hi, my Python version is Python 2.7.17 |Anaconda custom (64-bit)| (default, Oct 21 2019, 19:04:46).
After pip install lac, I ran lac and entered Chinese text to test it, and got a UnicodeEncodeError:

$ lac
百度开源词法分析工具LAC2.0
Traceback (most recent call last):
  File "/home/lishoujun/anaconda2/bin/lac", line 8, in <module>
    sys.exit(main())
  File "/home/lishoujun/anaconda2/lib/python2.7/site-packages/LAC/cmdline.py", line 60, in main
    for word, tag in zip(words, tags)))
UnicodeEncodeError: 'latin-1' codec can't encode characters in position 0-1: ordinal not in range(256)

TensorFlow version

Could a TensorFlow version be provided, as with AnyQ?

train.py converges very slowly

Training with the project's own data converges very slowly. The training log and the command used are shown below; batch_id has already reached 1600.
Is the slow convergence due to too little data, or to something else?
Or is training simply supposed to be this slow?
Command

python python/train.py --corpus_type_list news title --corpus_proportion_list 0.5 0.5

Log excerpt

batch_id:1684, avg_cost:53.753895
batch_id:1685, avg_cost:54.191296
news corpus finish a pass of training
title corpus finish a pass of training
batch_id:1686, avg_cost:53.003643
batch_id:1687, avg_cost:54.90624
news corpus finish a pass of training
title corpus finish a pass of training
batch_id:1688, avg_cost:54.78626
batch_id:1689, avg_cost:53.08998

Is there a memory leak in the source code?

void lac_buff_destroy(void* lac_handle, void* lac_buff) {
    if (lac_handle == NULL && lac_buff != NULL) {
        ((Lac*) lac_handle)->destroy_buff(lac_buff);
    }
}

Error during make of lac: fatal error: xxhash.h cannot be found

Everything before this installed normally, but during make of lac it reports that xxhash.h cannot be found.
I searched, and xxhash.h does exist under fluid's build directory; the error is raised from Paddle's own code.

I am using Paddle v1.4; lac is on the for_paddle_1.1 branch;

In file included from /home/lac/src/lac_glb.h:22:0,
from /home/lac/src/customization_tagger.h:20,
from /home/lac/src/customization_tagger.cpp:15:
/home/Paddle-release-1.4/build/fluid_install_dir/paddle/fluid/framework/scope.h:
18:20: fatal error: xxhash.h: No such file or directory
#include <xxhash.h>
^
compilation terminated.
make[2]: *** [CMakeFiles/lac.dir/src/customization_tagger.cpp.o] Error 1
make[1]: *** [CMakeFiles/lac.dir/all] Error 2
make: *** [all] Error 2

Python API

Could a simple Python API be provided, to make offline experiments more convenient?

How can it be sped up?

When using LAC for NER, the speed is not ideal. Is there any way to speed it up? This matters a lot: when there are many queries, it is particularly time-consuming.

Cannot import the paddle module

When I run the command python python/train.py -h, I get the following error:
Traceback (most recent call last):
File "python/train.py", line 12, in <module>
import paddle
ImportError: No module named paddle

How can this be fixed?

fail to run lac_demo

Hi,
I have executed the demo file as ./output/demo/lac_demo conf 200 1.
But I got the following error messages:

Loaded q2b dic -- num = 172
Loaded strong punc -- num = 5
Loaded word dic -- num(with oov) = 20940
Loaded tag dic -- num = 57
terminate called after throwing an instance of 'paddle::platform::EnforceNotMet'
what(): Fail to parse program_desc from binary string. at [/home/craiditx/Paddle/paddle/fluid/framework/program_desc.cc:88]
PaddlePaddle Call Stacks:
0 0x4184d3p paddle::platform::EnforceNotMet::EnforceNotMet(std::__exception_ptr::exception_ptr, char const*, int) + 355
1 0x7f2001f29827p paddle::framework::ProgramDesc::ProgramDesc(std::__cxx11::basic_string<char, std::char_traits, std::allocator > const&) + 823
2 0x7f2001eca13fp paddle::inference::Load(paddle::framework::Executor*, paddle::framework::Scope*, std::__cxx11::basic_string<char, std::char_traits, std::allocator > const&) + 287
3 0x415606p lac::MainTagger::init_model(std::__cxx11::basic_string<char, std::char_traits, std::allocator > const&) + 262
4 0x4160b2p lac::MainTagger::create(char const*) + 530
5 0x40c3d1p lac::Lac::create(char const*) + 945
6 0x40951bp lac_create + 11
7 0x40899bp init_dict(char const*) + 11
8 0x409191p test_main(int, char**) + 97
9 0x408769p main + 9
10 0x7f2000e7b830p __libc_start_main + 240
11 0x4088b9p _start + 41

Would you please give me some advice to solve that? Thank you!

/bin/ld: cannot find -lmkldnn when make lac

Hi,

I have encountered an error:
/bin/ld: cannot find -lmkldnn
when I try to make the lac project,
but I have already installed mkldnn with pip, and the library file libmkldnn.so.0 exists in the /usr/lib directory. What is the reason for this, and what should I do to complete the installation?

Thank you.

gcc version problem.

With gcc 4.8.4, the make process fails.
CMakeLists.txt line 11 should perhaps be changed to if (GCC_VERSION VERSION_LESS 4.8).

Request to release a model trained on a very large corpus

Hello, and thank you for your contribution in providing such a good framework.

Google has publicly released BERT models trained on very large corpora.
Could Paddle also release a model trained on a very large corpus?

Thank you!

build paddle v0.14.0 failed

ub16hp@UB16HP:~/ub16_prj/Paddle$ git checkout v0.14.0
HEAD is now at 163b5e5... Merge pull request #11805 from guoshengCS/cherry-pick-beam-search

ub16hp@UB16HP:~/ub16_prj/Paddle$ cd build/

ub16hp@UB16HP:/ub16_prj/Paddle/build$ cmake -DCMAKE_BUILD_TYPE=Release -DWITH_MKLDNN=OFF -DWITH_GPU=OFF -DWITH_FLUID_ONLY=ON ..
-- Found Paddle host system: ubuntu, version: 16.04.4
-- Found Paddle host system's CPU: 8 cores
-- The CXX compiler identification is GNU 5.4.0
-- The C compiler identification is GNU 5.4.0
-- Check for working CXX compiler: /usr/bin/c++
-- Check for working CXX compiler: /usr/bin/c++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Check for working C compiler: /usr/bin/cc
-- Check for working C compiler: /usr/bin/cc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Detecting C compile features
-- Detecting C compile features - done
-- CXX compiler: /usr/bin/c++, version: GNU 5.4.0
-- C compiler: /usr/bin/cc, version: GNU 5.4.0
-- Looking for pthread.h
-- Looking for pthread.h - found
-- Looking for pthread_create
-- Looking for pthread_create - not found
-- Looking for pthread_create in pthreads
-- Looking for pthread_create in pthreads - not found
-- Looking for pthread_create in pthread
-- Looking for pthread_create in pthread - found
-- Found Threads: TRUE
-- Found Git: /usr/bin/git (found version "2.7.4")
-- Performing Test MMX_FOUND
-- Performing Test MMX_FOUND - Success
-- Performing Test SSE2_FOUND
-- Performing Test SSE2_FOUND - Success
-- Performing Test SSE3_FOUND
-- Performing Test SSE3_FOUND - Success
-- Performing Test AVX_FOUND
-- Performing Test AVX_FOUND - Success
-- Performing Test AVX2_FOUND
-- Performing Test AVX2_FOUND - Success
-- use pre defined download url
-- MKLML_VER: mklml_lnx_2018.0.3.20180406, MKLML_URL: http://paddlepaddledeps.cdn.bcebos.com/mklml_lnx_2018.0.3.20180406.tgz
-- Protobuf protoc executable: /home/ub16hp/ub16_prj/Paddle/build/third_party/install/protobuf/bin/protoc
-- Protobuf library: /home/ub16hp/ub16_prj/Paddle/build/third_party/install/protobuf/lib/libprotobuf.a
-- Protobuf version: 3.1
-- Found PythonInterp: /usr/bin/python2.7 (found suitable version "2.7.12", minimum required is "2.7")
-- Found PythonLibs: /usr/lib/x86_64-linux-gnu/libpython2.7.so (found suitable version "2.7.12", minimum required is "2.7")
-- Found PY_pip: /usr/local/lib/python2.7/dist-packages/pip
-- Found PY_numpy: /usr/local/lib/python2.7/dist-packages/numpy
-- Found PY_wheel: /usr/local/lib/python2.7/dist-packages/wheel
-- Found PY_google.protobuf: /usr/local/lib/python2.7/dist-packages/google/protobuf
-- Found NumPy: /usr/local/lib/python2.7/dist-packages/numpy/core/include
-- Found cblas and lapack in MKLML (include: /home/ub16hp/ub16_prj/Paddle/build/third_party/install/mklml/include, library: /home/ub16hp/ub16_prj/Paddle/build/third_party/install/mklml/lib/libmklml_intel.so)
-- BLAS library: /home/ub16hp/ub16_prj/Paddle/build/third_party/install/mklml/lib/libmklml_intel.so
-- Found SWIG: /usr/bin/swig3.0 (found version "3.0.8")
-- warp-ctc library: /home/ub16hp/ub16_prj/Paddle/build/third_party/install/warpctc/lib/libwarpctc.so
-- use pre defined download url
-- BOOST_TAR: boost_1_41_0, BOOST_URL: http://paddlepaddledeps.cdn.bcebos.com/boost_1_41_0.tar.gz
-- Enable Intel OpenMP with /home/ub16hp/ub16_prj/Paddle/build/third_party/install/mklml/lib/libiomp5.so
-- Looking for UINT64_MAX
-- Looking for UINT64_MAX - found
-- Looking for sys/types.h
-- Looking for sys/types.h - found
-- Looking for stdint.h
-- Looking for stdint.h - found
-- Looking for stddef.h
-- Looking for stddef.h - found
-- Check size of pthread_spinlock_t
-- Check size of pthread_spinlock_t - done
-- Check size of pthread_barrier_t
-- Check size of pthread_barrier_t - done
-- Performing Test C_COMPILER_SUPPORT_FLAG__fPIC
-- Performing Test C_COMPILER_SUPPORT_FLAG__fPIC - Success
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__fPIC
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__fPIC - Success
-- Performing Test C_COMPILER_SUPPORT_FLAG__fno_omit_frame_pointer
-- Performing Test C_COMPILER_SUPPORT_FLAG__fno_omit_frame_pointer - Success
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__fno_omit_frame_pointer
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__fno_omit_frame_pointer - Success
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wall
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wall - Success
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__Wall
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__Wall - Success
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wextra
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wextra - Success
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__Wextra
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__Wextra - Success
-- Performing Test C_COMPILER_SUPPORT_FLAG__Werror
-- Performing Test C_COMPILER_SUPPORT_FLAG__Werror - Success
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__Werror
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__Werror - Success
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wnon_virtual_dtor
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wnon_virtual_dtor - Failed
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__Wnon_virtual_dtor
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__Wnon_virtual_dtor - Success
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wdelete_non_virtual_dtor
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wdelete_non_virtual_dtor - Failed
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__Wdelete_non_virtual_dtor
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__Wdelete_non_virtual_dtor - Success
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wno_unused_parameter
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wno_unused_parameter - Success
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__Wno_unused_parameter
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__Wno_unused_parameter - Success
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wno_unused_function
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wno_unused_function - Success
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__Wno_unused_function
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__Wno_unused_function - Success
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wno_error_literal_suffix
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wno_error_literal_suffix - Success
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__Wno_error_literal_suffix
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__Wno_error_literal_suffix - Success
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wno_error_sign_compare
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wno_error_sign_compare - Success
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__Wno_error_sign_compare
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__Wno_error_sign_compare - Success
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wno_error_unused_local_typedefs
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wno_error_unused_local_typedefs - Success
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__Wno_error_unused_local_typedefs
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__Wno_error_unused_local_typedefs - Success
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wno_error_parentheses_equality
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wno_error_parentheses_equality - Failed
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__Wno_error_parentheses_equality
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__Wno_error_parentheses_equality - Failed
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wno_error_ignored_attributes
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wno_error_ignored_attributes - Failed
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__Wno_error_ignored_attributes
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__Wno_error_ignored_attributes - Failed
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wno_error_terminate
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wno_error_terminate - Failed
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__Wno_error_terminate
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__Wno_error_terminate - Failed
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wno_error_unused_function
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wno_error_unused_function - Success
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wno_error_array_bounds
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wno_error_array_bounds - Success
-- Paddle version is 0.14.0
-- Configuring done
-- Generating done
-- Build files have been written to: /home/ub16hp/ub16_prj/Paddle/build
ub16hp@UB16HP:
/ub16_prj/Paddle/build$ cmake -DCMAKE_BUILD_TYPE=Release -DWITH_MKLDNN=OFF -DWITH_GPU=ON -DWITH_FLUID_ONLY=ON ..
-- Found Paddle host system: ubuntu, version: 16.04.4
-- Found Paddle host system's CPU: 8 cores
-- CXX compiler: /usr/bin/c++, version: GNU 5.4.0
-- C compiler: /usr/bin/cc, version: GNU 5.4.0
-- MKLML_VER: mklml_lnx_2018.0.3.20180406, MKLML_URL: http://paddlepaddledeps.cdn.bcebos.com/mklml_lnx_2018.0.3.20180406.tgz
-- Protobuf protoc executable: /home/ub16hp/ub16_prj/Paddle/build/third_party/install/protobuf/bin/protoc
-- Protobuf library: /home/ub16hp/ub16_prj/Paddle/build/third_party/install/protobuf/lib/libprotobuf.a
-- Protobuf version: 3.1
-- Found cblas and lapack in MKLML (include: /home/ub16hp/ub16_prj/Paddle/build/third_party/install/mklml/include, library: /home/ub16hp/ub16_prj/Paddle/build/third_party/install/mklml/lib/libmklml_intel.so)
-- BLAS library: /home/ub16hp/ub16_prj/Paddle/build/third_party/install/mklml/lib/libmklml_intel.so
-- warp-ctc library: /home/ub16hp/ub16_prj/Paddle/build/third_party/install/warpctc/lib/libwarpctc.so
-- BOOST_TAR: boost_1_41_0, BOOST_URL: http://paddlepaddledeps.cdn.bcebos.com/boost_1_41_0.tar.gz
-- Current cuDNN header is /usr/include/cudnn.h. Current cuDNN version is v5.
-- Found CUDA: /usr/local/cuda-8.0 (found version "8.0")
-- Enable Intel OpenMP with /home/ub16hp/ub16_prj/Paddle/build/third_party/install/mklml/lib/libiomp5.so
-- Paddle version is 0.14.0
-- CUDA detected: 8.0
-- Added CUDA NVCC flags for: sm_30 sm_35 sm_50 sm_52 sm_60 sm_61
-- Configuring done
-- Generating done
-- Build files have been written to: /home/ub16hp/ub16_prj/Paddle/build
ub16hp@UB16HP:~/ub16_prj/Paddle/build$ make -j8
Scanning dependencies of target extern_threadpool
Scanning dependencies of target extern_eigen3
Scanning dependencies of target extern_mklml
Scanning dependencies of target extern_lib_any
Scanning dependencies of target extern_zlib
Scanning dependencies of target extern_boost
Scanning dependencies of target extern_gflags
Scanning dependencies of target extern_pybind
[ 0%] Creating directories for 'extern_threadpool'
[ 0%] Creating directories for 'extern_mklml'
[ 0%] Creating directories for 'extern_gflags'
[ 0%] Creating directories for 'extern_pybind'
[ 0%] Creating directories for 'extern_lib_any'
[ 0%] Creating directories for 'extern_boost'
[ 0%] Creating directories for 'extern_zlib'
[ 0%] Creating directories for 'extern_eigen3'
[ 0%] Performing download step (git clone) for 'extern_zlib'
[ 1%] Performing download step for 'extern_boost'
[ 1%] Performing download step (git clone) for 'extern_lib_any'
[ 1%] Performing download step (git clone) for 'extern_gflags'
[ 2%] Performing download step (git clone) for 'extern_pybind'
[ 2%] Performing download step for 'extern_mklml'
[ 2%] Performing download step (git clone) for 'extern_threadpool'
[ 2%] Performing download step (git clone) for 'extern_eigen3'
Cloning into 'extern_zlib'...
Cloning into 'extern_pybind'...
Cloning into 'extern_gflags'...
Cloning into 'extern_lib_any'...
Cloning into 'extern_threadpool'...
Cloning into 'extern_eigen3'...
Note: checking out '9a42ec1329f259a5f4881a291db1dcb8f2ad9040'.

You are in 'detached HEAD' state. You can look around, make experimental
changes and commit them, and you can discard any commits you make in this
state without impacting any branches by performing another checkout.

If you want to create a new branch to retain commits you create, you may
do so (now or later) by using -b with the checkout command again. Example:

git checkout -b

HEAD is now at 9a42ec1... changed typedef to using
[ 2%] No patch step for 'extern_threadpool'
Note: checking out '15595d8324be9e8a9a80d9ae442fdd12bd66df5d'.

You are in 'detached HEAD' state. You can look around, make experimental
changes and commit them, and you can discard any commits you make in this
state without impacting any branches by performing another checkout.

If you want to create a new branch to retain commits you create, you may
do so (now or later) by using -b with the checkout command again. Example:

git checkout -b

HEAD is now at 15595d8... Merge pull request #1 from PaddlePaddle/no_noexcept_in_function_pointers
[ 2%] No update step for 'extern_threadpool'
[ 2%] No configure step for 'extern_threadpool'
[ 2%] No patch step for 'extern_lib_any'
[ 3%] No build step for 'extern_threadpool'
[ 3%] No update step for 'extern_lib_any'
[ 3%] No install step for 'extern_threadpool'
[ 3%] No configure step for 'extern_lib_any'
[ 3%] No test step for 'extern_threadpool'
[ 3%] No build step for 'extern_lib_any'
[ 3%] Completed 'extern_threadpool'
[ 3%] No install step for 'extern_lib_any'
[ 3%] Built target extern_threadpool
Scanning dependencies of target extern_warpctc
[ 3%] No test step for 'extern_lib_any'
[ 3%] Creating directories for 'extern_warpctc'
[ 3%] Completed 'extern_lib_any'
[ 3%] Built target extern_lib_any
[ 3%] Performing download step (git clone) for 'extern_warpctc'
Scanning dependencies of target extern_snappy
[ 3%] Creating directories for 'extern_snappy'
Cloning into 'extern_warpctc'...
[ 3%] Performing download step (git clone) for 'extern_snappy'
Cloning into 'extern_snappy'...
Note: checking out '77592648e3f3be87d6c7123eb81cbad75f9aef5a'.

You are in 'detached HEAD' state. You can look around, make experimental
changes and commit them, and you can discard any commits you make in this
state without impacting any branches by performing another checkout.

If you want to create a new branch to retain commits you create, you may
do so (now or later) by using -b with the checkout command again. Example:

git checkout -b

HEAD is now at 7759264... repair wrong namespace problem
Submodule 'doc' (https://github.com/gflags/gflags.git) registered for path 'doc'
Cloning into 'doc'...
Note: checking out 'v2.1.1'.

You are in 'detached HEAD' state. You can look around, make experimental
changes and commit them, and you can discard any commits you make in this
state without impacting any branches by performing another checkout.

If you want to create a new branch to retain commits you create, you may
do so (now or later) by using -b with the checkout command again. Example:

git checkout -b

HEAD is now at 1df91d3... v2.1.1 release version bump
Submodule 'tools/clang' (https://github.com/wjakob/clang-cindex-python3) registered for path 'tools/clang'
Cloning into 'tools/clang'...
Note: checking out 'v1.2.8'.

You are in 'detached HEAD' state. You can look around, make experimental
changes and commit them, and you can discard any commits you make in this
state without impacting any branches by performing another checkout.

If you want to create a new branch to retain commits you create, you may
do so (now or later) by using -b with the checkout command again. Example:

git checkout -b

HEAD is now at 5089329... zlib 1.2.8
[ 4%] No patch step for 'extern_zlib'
[ 4%] No update step for 'extern_zlib'
[ 4%] Performing configure step for 'extern_zlib'
-- extern_zlib configure command succeeded. See also /home/ub16hp/ub16_prj/Paddle/build/third_party/zlib/src/extern_zlib-stamp/extern_zlib-configure-.log
[ 4%] Performing build step for 'extern_zlib'
Scanning dependencies of target zlib
[ 2%] Building C object CMakeFiles/zlib.dir/adler32.o
[ 5%] Building C object CMakeFiles/zlib.dir/compress.o
[ 7%] Building C object CMakeFiles/zlib.dir/crc32.o
[ 10%] Building C object CMakeFiles/zlib.dir/deflate.o
[ 12%] Building C object CMakeFiles/zlib.dir/gzclose.o
[ 15%] Building C object CMakeFiles/zlib.dir/gzlib.o
[ 17%] Building C object CMakeFiles/zlib.dir/gzread.o
[ 20%] Building C object CMakeFiles/zlib.dir/gzwrite.o
[ 22%] Building C object CMakeFiles/zlib.dir/inflate.o
[ 25%] Building C object CMakeFiles/zlib.dir/infback.o
[ 27%] Building C object CMakeFiles/zlib.dir/inftrees.o
[ 30%] Building C object CMakeFiles/zlib.dir/inffast.o
[ 32%] Building C object CMakeFiles/zlib.dir/trees.o
[ 35%] Building C object CMakeFiles/zlib.dir/uncompr.o
[ 37%] Building C object CMakeFiles/zlib.dir/zutil.o
[ 40%] Linking C shared library libz.so
[ 40%] Built target zlib
Scanning dependencies of target zlibstatic
[ 42%] Building C object CMakeFiles/zlibstatic.dir/adler32.o
[ 45%] Building C object CMakeFiles/zlibstatic.dir/compress.o
[ 47%] Building C object CMakeFiles/zlibstatic.dir/crc32.o
[ 50%] Building C object CMakeFiles/zlibstatic.dir/deflate.o
CMakeFiles/extern_mklml.dir/build.make:88: recipe for target 'third_party/mklml/src/extern_mklml-stamp/extern_mklml-download' failed
make[2]: *** [third_party/mklml/src/extern_mklml-stamp/extern_mklml-download] Error 8
CMakeFiles/extern_boost.dir/build.make:88: recipe for target 'third_party/boost/src/extern_boost-stamp/extern_boost-download' failed
make[2]: *** [third_party/boost/src/extern_boost-stamp/extern_boost-download] Error 8
CMakeFiles/Makefile2:941: recipe for target 'CMakeFiles/extern_mklml.dir/all' failed
make[1]: *** [CMakeFiles/extern_mklml.dir/all] Error 2
make[1]: *** Waiting for unfinished jobs....
CMakeFiles/Makefile2:830: recipe for target 'CMakeFiles/extern_boost.dir/all' failed
make[1]: *** [CMakeFiles/extern_boost.dir/all] Error 2
[ 52%] Building C object CMakeFiles/zlibstatic.dir/gzclose.o
Scanning dependencies of target example64
[ 55%] Building C object CMakeFiles/example64.dir/test/example.o
[ 57%] Building C object CMakeFiles/zlibstatic.dir/gzlib.o
Scanning dependencies of target minigzip64
[ 60%] Building C object CMakeFiles/minigzip64.dir/test/minigzip.o
[ 62%] Linking C executable example64
[ 62%] Built target example64
[ 65%] Building C object CMakeFiles/zlibstatic.dir/gzread.o
[ 67%] Linking C executable minigzip64
[ 67%] Built target minigzip64
[ 70%] Building C object CMakeFiles/zlibstatic.dir/gzwrite.o
[ 72%] Building C object CMakeFiles/zlibstatic.dir/inflate.o
[ 75%] Building C object CMakeFiles/zlibstatic.dir/infback.o
[ 77%] Building C object CMakeFiles/zlibstatic.dir/inftrees.o
Scanning dependencies of target minigzip
[ 80%] Building C object CMakeFiles/minigzip.dir/test/minigzip.o
Scanning dependencies of target example
[ 82%] Building C object CMakeFiles/example.dir/test/example.o
[ 85%] Linking C executable minigzip
[ 85%] Built target minigzip
[ 87%] Building C object CMakeFiles/zlibstatic.dir/inffast.o
[ 90%] Linking C executable example
[ 92%] Building C object CMakeFiles/zlibstatic.dir/trees.o
[ 92%] Built target example
[ 95%] Building C object CMakeFiles/zlibstatic.dir/uncompr.o
[ 97%] Building C object CMakeFiles/zlibstatic.dir/zutil.o
[100%] Linking C static library libz.a
[100%] Built target zlibstatic
[ 4%] Performing install step for 'extern_zlib'
[ 40%] Built target zlib
[ 80%] Built target zlibstatic
[ 85%] Built target example64
[ 90%] Built target minigzip
[ 95%] Built target example
[100%] Built target minigzip64
Install the project...
-- Install configuration: "Release"
-- Installing: /home/ub16hp/ub16_prj/Paddle/build/third_party/install/zlib/lib/libz.so.1.2.8
-- Installing: /home/ub16hp/ub16_prj/Paddle/build/third_party/install/zlib/lib/libz.so.1
-- Installing: /home/ub16hp/ub16_prj/Paddle/build/third_party/install/zlib/lib/libz.so
-- Installing: /home/ub16hp/ub16_prj/Paddle/build/third_party/install/zlib/lib/libz.a
-- Installing: /home/ub16hp/ub16_prj/Paddle/build/third_party/install/zlib/include/zconf.h
-- Installing: /home/ub16hp/ub16_prj/Paddle/build/third_party/install/zlib/include/zlib.h
-- Installing: /home/ub16hp/ub16_prj/Paddle/build/third_party/install/zlib/share/man/man3/zlib.3
-- Installing: /home/ub16hp/ub16_prj/Paddle/build/third_party/install/zlib/share/pkgconfig/zlib.pc
[ 4%] Completed 'extern_zlib'
[ 4%] Built target extern_zlib
Already on 'master'
Your branch is up-to-date with 'origin/master'.
[ 4%] No patch step for 'extern_warpctc'
[ 4%] No update step for 'extern_warpctc'
[ 4%] Performing configure step for 'extern_warpctc'
-- extern_warpctc configure command succeeded. See also /home/ub16hp/ub16_prj/Paddle/build/third_party/warpctc/src/extern_warpctc-stamp/extern_warpctc-configure-
.log
[ 4%] Performing build step for 'extern_warpctc'
[ 66%] Building NVCC (Device) object CMakeFiles/warpctc.dir/src/warpctc_generated_ctc_entrypoint.cu.o
[ 66%] Building NVCC (Device) object CMakeFiles/warpctc.dir/src/warpctc_generated_reduce.cu.o
Note: checking out '1.1.7'.

You are in 'detached HEAD' state. You can look around, make experimental
changes and commit them, and you can discard any commits you make in this
state without impacting any branches by performing another checkout.

If you want to create a new branch to retain commits you create, you may
do so (now or later) by using -b with the checkout command again. Example:

git checkout -b

HEAD is now at b02bfa7... Tag open source release 1.1.7.
[ 4%] No update step for 'extern_snappy'
[ 4%] No patch step for 'extern_snappy'
[ 4%] Performing configure step for 'extern_snappy'
loading initial cache file /home/ub16hp/ub16_prj/Paddle/build/third_party/snappy/tmp/extern_snappy-cache-Release.cmake
-- The C compiler identification is GNU 5.4.0
-- The CXX compiler identification is GNU 5.4.0
-- Check for working C compiler: /usr/bin/cc
-- Check for working C compiler: /usr/bin/cc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Detecting C compile features
-- Detecting C compile features - done
-- Check for working CXX compiler: /usr/bin/c++
-- Check for working CXX compiler: /usr/bin/c++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Check if the system is big endian
-- Searching 16 bit integer
-- Looking for sys/types.h
-- Looking for sys/types.h - found
-- Looking for stdint.h
-- Looking for stdint.h - found
-- Looking for stddef.h
-- Looking for stddef.h - found
-- Check size of unsigned short
-- Check size of unsigned short - done
-- Using unsigned short
-- Check if the system is big endian - little endian
-- Looking for byteswap.h
-- Looking for byteswap.h - found
-- Looking for sys/endian.h
-- Looking for sys/endian.h - not found
-- Looking for sys/mman.h
-- Looking for sys/mman.h - found
-- Looking for sys/resource.h
-- Looking for sys/resource.h - found
-- Looking for sys/time.h
-- Looking for sys/time.h - found
-- Looking for sys/uio.h
-- Looking for sys/uio.h - found
-- Looking for unistd.h
-- Looking for unistd.h - found
-- Looking for windows.h
-- Looking for windows.h - not found
-- Looking for zlibVersion in z
-- Looking for zlibVersion in z - found
-- Looking for lzo1x_1_15_compress in lzo2
-- Looking for lzo1x_1_15_compress in lzo2 - not found
-- Performing Test HAVE_BUILTIN_EXPECT
-- Performing Test HAVE_BUILTIN_EXPECT - Success
-- Performing Test HAVE_BUILTIN_CTZ
-- Performing Test HAVE_BUILTIN_CTZ - Success
-- Looking for mmap
-- Looking for mmap - found
-- Looking for sysconf
-- Looking for sysconf - found
-- Configuring done
-- Generating done
CMake Warning:
Manually-specified variables were not used by the project:

BUILD_TESTING

-- Build files have been written to: /home/ub16hp/ub16_prj/Paddle/build/third_party/snappy/src/extern_snappy-build
[ 4%] Performing build step for 'extern_snappy'
Scanning dependencies of target snappy
[ 20%] Building CXX object CMakeFiles/snappy.dir/snappy-c.cc.o
[ 40%] Building CXX object CMakeFiles/snappy.dir/snappy-sinksource.cc.o
[ 60%] Building CXX object CMakeFiles/snappy.dir/snappy-stubs-internal.cc.o
[ 80%] Building CXX object CMakeFiles/snappy.dir/snappy.cc.o
[100%] Linking CXX static library libsnappy.a
[100%] Built target snappy
[ 4%] Performing install step for 'extern_snappy'
[100%] Built target snappy
Install the project...
-- Install configuration: "Release"
-- Installing: /home/ub16hp/ub16_prj/Paddle/build/third_party/install/snappy/lib/libsnappy.a
-- Installing: /home/ub16hp/ub16_prj/Paddle/build/third_party/install/snappy/include/snappy-c.h
-- Installing: /home/ub16hp/ub16_prj/Paddle/build/third_party/install/snappy/include/snappy-sinksource.h
-- Installing: /home/ub16hp/ub16_prj/Paddle/build/third_party/install/snappy/include/snappy.h
-- Installing: /home/ub16hp/ub16_prj/Paddle/build/third_party/install/snappy/include/snappy-stubs-public.h
-- Installing: /home/ub16hp/ub16_prj/Paddle/build/third_party/install/snappy/lib/cmake/Snappy/SnappyTargets.cmake
-- Installing: /home/ub16hp/ub16_prj/Paddle/build/third_party/install/snappy/lib/cmake/Snappy/SnappyTargets-release.cmake
-- Installing: /home/ub16hp/ub16_prj/Paddle/build/third_party/install/snappy/lib/cmake/Snappy/SnappyConfig.cmake
-- Installing: /home/ub16hp/ub16_prj/Paddle/build/third_party/install/snappy/lib/cmake/Snappy/SnappyConfigVersion.cmake
[ 4%] Completed 'extern_snappy'
[ 4%] Built target extern_snappy
Submodule path 'doc': checked out '971dd2a4fadac9cdab174c523c22df79efd63aa5'
[ 4%] No update step for 'extern_gflags'
[ 4%] No patch step for 'extern_gflags'
[ 4%] Performing configure step for 'extern_gflags'
-- extern_gflags configure command succeeded. See also /home/ub16hp/ub16_prj/Paddle/build/third_party/gflags/src/extern_gflags-stamp/extern_gflags-configure-*.log
[ 4%] Performing build step for 'extern_gflags'
Scanning dependencies of target gflags_static
Scanning dependencies of target gflags_nothreads_static
[ 12%] Building CXX object CMakeFiles/gflags_nothreads_static.dir/src/gflags.cc.o
[ 25%] Building CXX object CMakeFiles/gflags_static.dir/src/gflags_reporting.cc.o
[ 37%] Building CXX object CMakeFiles/gflags_nothreads_static.dir/src/gflags_completions.cc.o
[ 50%] Building CXX object CMakeFiles/gflags_static.dir/src/gflags.cc.o
[ 62%] Building CXX object CMakeFiles/gflags_nothreads_static.dir/src/gflags_reporting.cc.o
[ 75%] Building CXX object CMakeFiles/gflags_static.dir/src/gflags_completions.cc.o
[ 87%] Linking CXX static library lib/libgflags.a
[ 87%] Built target gflags_static
[100%] Linking CXX static library lib/libgflags_nothreads.a
[100%] Built target gflags_nothreads_static
[ 5%] Performing install step for 'extern_gflags'
[100%] Built target gflags_static
[ 87%] Built target gflags_nothreads_static
Install the project...
-- Install configuration: "Release"
-- Installing: /home/ub16hp/ub16_prj/Paddle/build/third_party/install/gflags/lib/libgflags.a
-- Installing: /home/ub16hp/ub16_prj/Paddle/build/third_party/install/gflags/lib/libgflags_nothreads.a
-- Installing: /home/ub16hp/ub16_prj/Paddle/build/third_party/install/gflags/include/gflags/gflags.h
-- Installing: /home/ub16hp/ub16_prj/Paddle/build/third_party/install/gflags/include/gflags/gflags_declare.h
-- Installing: /home/ub16hp/ub16_prj/Paddle/build/third_party/install/gflags/include/gflags/gflags_completions.h
-- Installing: /home/ub16hp/ub16_prj/Paddle/build/third_party/install/gflags/include/gflags/gflags_gflags.h
-- Installing: /home/ub16hp/ub16_prj/Paddle/build/third_party/install/gflags/lib/cmake/gflags/gflags-config.cmake
-- Installing: /home/ub16hp/ub16_prj/Paddle/build/third_party/install/gflags/lib/cmake/gflags/gflags-config-version.cmake
-- Installing: /home/ub16hp/ub16_prj/Paddle/build/third_party/install/gflags/lib/cmake/gflags/gflags-targets.cmake
-- Installing: /home/ub16hp/ub16_prj/Paddle/build/third_party/install/gflags/lib/cmake/gflags/gflags-targets-release.cmake
-- Installing: /home/ub16hp/ub16_prj/Paddle/build/third_party/install/gflags/bin/gflags_completions.sh
-- Installing: /home/ub16hp/ub16_prj/Paddle/build/third_party/install/gflags/lib/pkgconfig/gflags.pc
-- Installing: /home/ub16hp/.cmake/packages/gflags/566c457f02710a6448b1d514b8bba414
[ 5%] Completed 'extern_gflags'
[ 5%] Built target extern_gflags
Submodule path 'tools/clang': checked out '254c7a91e3c6aa254e113197604dafb443f4d429'
[ 5%] No patch step for 'extern_pybind'
[ 5%] No update step for 'extern_pybind'
[ 5%] No configure step for 'extern_pybind'
[ 5%] No build step for 'extern_pybind'
[ 5%] No install step for 'extern_pybind'
[ 5%] No test step for 'extern_pybind'
[ 5%] Completed 'extern_pybind'
[ 5%] Built target extern_pybind
Scanning dependencies of target warpctc
[100%] Linking CXX shared library libwarpctc.so
[100%] Built target warpctc
[ 5%] Performing install step for 'extern_warpctc'
-- cuda found TRUE
-- Torch found
-- Building shared library with GPU support
-- NVCC_ARCH_FLAGS -gencode arch=compute_30,code=sm_30 -O2 -gencode arch=compute_35,code=sm_35 -gencode arch=compute_50,code=sm_50 -gencode arch=compute_52,code=sm_52 -gencode arch=compute_60,code=sm_60 -gencode arch=compute_61,code=sm_61 -gencode arch=compute_62,code=sm_62 --std=c++11 -Xcompiler -fopenmp
-- Configuring done
-- Generating done
-- Build files have been written to: /home/ub16hp/ub16_prj/Paddle/build/third_party/warpctc/src/extern_warpctc-build
[ 33%] Linking CXX shared library libwarpctc.so
[100%] Built target warpctc
Install the project...
-- Install configuration: "Release"
-- Installing: /home/ub16hp/ub16_prj/Paddle/build/third_party/install/warpctc/lib/libwarpctc.so
-- Installing: /home/ub16hp/ub16_prj/Paddle/build/third_party/install/warpctc/include/ctc.h
[ 5%] Completed 'extern_warpctc'
[ 5%] Built target extern_warpctc
Note: checking out '917060c364181f33a735dc023818d5a54f60e54c'.

You are in 'detached HEAD' state. You can look around, make experimental
changes and commit them, and you can discard any commits you make in this
state without impacting any branches by performing another checkout.

If you want to create a new branch to retain commits you create, you may
do so (now or later) by using -b with the checkout command again. Example:

git checkout -b

HEAD is now at 917060c... Add static assertion for fixed sizes Ref<>
[ 5%] No patch step for 'extern_eigen3'
[ 5%] No update step for 'extern_eigen3'
[ 5%] No configure step for 'extern_eigen3'
[ 5%] No build step for 'extern_eigen3'
[ 5%] No install step for 'extern_eigen3'
[ 5%] No test step for 'extern_eigen3'
[ 5%] Completed 'extern_eigen3'
[ 5%] Built target extern_eigen3
Makefile:149: recipe for target 'all' failed
make: *** [all] Error 2
ub16hp@UB16HP:~/ub16_prj/Paddle/build$

How should the following error during Paddle compilation be resolved?

You are in 'detached HEAD' state. You can look around, make experimental
changes and commit them, and you can discard any commits you make in this
state without impacting any branches by performing another checkout.

If you want to create a new branch to retain commits you create, you may
do so (now or later) by using -b with the checkout command again. Example:

git checkout -b

HEAD is now at d9875c657 swig-3.0.10 release
[ 1%] No patch step for 'swig'
[ 1%] No update step for 'swig'
[ 2%] Performing configure step for 'swig'

  • test -d Tools/config
  • aclocal -I Tools/config
    ./autogen.sh: 11: ./autogen.sh: aclocal: not found
    CMakeFiles/swig.dir/build.make:106: recipe for target 'third_party/swig/src/swig-stamp/swig-configure' failed
    make[2]: *** [third_party/swig/src/swig-stamp/swig-configure] Error 127
    CMakeFiles/Makefile2:173: recipe for target 'CMakeFiles/swig.dir/all' failed
    make[1]: *** [CMakeFiles/swig.dir/all] Error 2
    make[1]: *** Waiting for unfinished jobs....

Version compatibility problem

In commit 47a7d21e, the header path was changed to #include "paddle/fluid/platform/init.h", but that change does not exist yet in Paddle v0.14.0.

Word segmentation

Can a local dictionary be loaded during segmentation?

How is the segmentation feature exposed?

The model's output consists of POS tagging and named entity recognition results, not segmentation results.
For example, for the original sentence:
**人民政府宣布与美国建交
LAC outputs:
**人民政府/ORG 宣布/v 与/p 美国/LOC 建交/v
while the segmentation should be:
** 人民 政府 宣布 与 美国 建交
So how is the segmentation feature exposed?

According to the description of the results returned by the Baidu AI platform's lexical analysis API, the API does provide segmentation. In that case, "**人民政府" is treated as one item, composed of the basic words "** 人民 政府".

Is incremental training possible?

Hello:
The open-source LAC model already works well on general sentences, but not as well in specific domains, so I would like to do incremental training on top of the open-source model. Incremental training requires saving the model with the save_persistables interface, but LAC uses the save_inference_model interface, so it cannot be trained incrementally directly. Is there any way to do incremental training?

where is the inference_lib_dist and fluid_inference_lib?

Quoted text:
make -j <num_cpu_cores> inference_lib_dist # parallel compilation speeds things up; <num_cpu_cores> is the number of compile threads
Question:
How do I produce inference_lib_dist, and in what form?
Quoted text:
cmake -DPADDLE_ROOT=/path/to/fluid_inference_lib ..
Question:
If I use a Paddle Docker image, what should I do about fluid_inference_lib? Where is the file?

Thanks!

About the dictionary

The dictionary of the lexical analysis in the models repo differs from this project's, and this one is smaller; for example, it does not even contain the character "原", which seems very common. Is there any rationale behind how the vocabulary was optimized?

"Target pattern contains no '%'" error

[ 8%] Built target profiler_py_proto_init
paddle/fluid/platform/CMakeFiles/profiler_py_proto.dir/build.make:60: *** target pattern contains no '%'. Stop.
make[1]: *** [paddle/fluid/platform/CMakeFiles/profiler_py_proto.dir/all] Error 2
make: *** [all] Error 2

Question about compiling LAC

Problem description:
The LAC build steps in the original documentation are:

cd build
# /path/to/fluid_inference_lib is the output path of the Fluid inference library built in step 5 of the previous section
# The LAC demo program, and any program that links the LAC static library, need this path as the dynamic-library search path
# If this path changes after the build, the LD_LIBRARY_PATH environment variable must be set manually
cmake -DPADDLE_ROOT=/path/to/fluid_inference_lib ..
make
make install # build artifacts go under ../output

Question: in my case no output folder was generated. My steps were:

cd bulid
cmake -DPADDLE_ROOT=../bulid/fluid_install_dir/ ..

Afterwards no new files appeared inside the build directory.
I then ran make and make install in the current directory and in the parent directory, and searched with find, but no output folder was generated.

Could you explain this step in more detail? Thank you very much.

Can it only be compiled on Linux?

The main Paddle program can now be installed with pip on Windows; can this project be compiled on Windows?

Error when compiling lac

Error message:
Scanning dependencies of target lac
[ 12%] Building CXX object CMakeFiles/lac.dir/src/lac_util.cpp.o
[ 25%] Building CXX object CMakeFiles/lac.dir/src/ilac.cpp.o
In file included from /paddle/build/lac/src/lac.h:17:0,
from /paddle/build/lac/src/ilac.cpp:16:
/paddle/build/lac/src/main_tagger.h:21:40: fatal error: paddle/fluid/platform/init.h: No such file or directory
compilation terminated.
CMakeFiles/lac.dir/build.make:86: recipe for target 'CMakeFiles/lac.dir/src/ilac.cpp.o' failed
make[2]: *** [CMakeFiles/lac.dir/src/ilac.cpp.o] Error 1
CMakeFiles/Makefile2:67: recipe for target 'CMakeFiles/lac.dir/all' failed
make[1]: *** [CMakeFiles/lac.dir/all] Error 2
Makefile:127: recipe for target 'all' failed
make: *** [all] Error 2

seg mode and lac mode give different segmentation results

I used "朝着坏了的灯泡舞着别离" as input to both LAC(mode='seg') and LAC(mode='lac') and got the following two results:
['朝着', '坏', '了', '的', '灯泡', '舞', '着', '别离']
[['朝着', '坏', '了', '的', '灯泡舞', '着', '别离'], ['p', 'a', 'u', 'u', 'n', 'u', 'vn']]
The segmentation results differ. Is this normal? If so, which one should be taken as authoritative?
