1. References
https://zhuanlan.zhihu.com/p/50582974
https://github.com/hanxiao/bert-as-service/blob/master/client/README.md
2. Download the pre-trained Chinese BERT model
Model list: https://github.com/google-research/bert#pre-trained-models
Chinese model download link: https://storage.googleapis.com/bert_models/2018_11_03/chinese_L-12_H-768_A-12.zip
3. Set up the service
3.1 Unzip the archive downloaded above
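For example (a sketch; the target directory ./models is an arbitrary choice):
unzip chinese_L-12_H-768_A-12.zip -d ./models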
3.2 Set up the environment
Install the dependencies:
pip install numpy
pip install -U bert-serving-server[http]
pip install bert-serving-client
pip install "tensorflow>=1.10.0"
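3.3 Start the server
The calls in section 4 assume a running server on top of the unzipped model. A minimal sketch, assuming the model sits in ./models/chinese_L-12_H-768_A-12; the worker count is an arbitrary choice, and -http_port 8125 enables the HTTP endpoint used by the curl example in 4.2:
bert-serving-start -model_dir ./models/chinese_L-12_H-768_A-12 -num_worker=2 -http_port 8125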
4. Calling the service locally / remotely
4.1 Direct local call:
from bert_serving.client import BertClient
bc = BertClient()  # connects to a server running on this machine by default
bc.encode(['我 喜欢 你们', '我 喜 欢 你 们'])
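encode returns one fixed-length vector per input string. A quick sanity check (a sketch, assuming the server from 3.3 is running on the same machine with default pooling; 768 is the hidden size of the chinese_L-12_H-768_A-12 model):
from bert_serving.client import BertClient
bc = BertClient()
vecs = bc.encode(['我 喜欢 你们', '我 喜 欢 你 们'])  # numpy array, one row per sentence
print(vecs.shape)  # expected: (2, 768)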
4.2 Remote HTTP request
POST request:
curl -X POST http://**.*.*.68:8125/encode -H 'content-type: application/json' -d '{"id": 123,"texts": ["hello world"], "is_tokenized": false}'
Response:
{
"id":123,
"result":[[-0.00980051327496767,0.05821939557790756,-0.06836936622858047,
-0.4723478853702545,0.48761454224586487,-1.4105712175369263,
...
...
,-0.10073700547218323,-0.17246723175048828]],
"status":200
}
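The same endpoint can also be called from Python instead of curl (a sketch; replace the masked IP with the real address of the GPU server, and 8125 must match the -http_port the server was started with):
import requests

resp = requests.post(
    'http://**.*.*.68:8125/encode',
    json={'id': 123, 'texts': ['hello world'], 'is_tokenized': False},
)
data = resp.json()
print(data['status'])      # 200 on success
vec = data['result'][0]    # one 768-dim vector per input text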
4.3 Deploy the BERT service on a GPU server (**.*.*.68) and call it from a separate CPU server (**.*.*.67):
step 1: before calling, install the client on the CPU server (**.*.*.67):
pip install bert-serving-client
step 2: client demo
# on another CPU machine
from bert_serving.client import BertClient
bc = BertClient(ip='xx.xx.xx.xx') # IP address of the GPU machine (**.*.*.68 in this setup)
bc.encode(['First do it', 'then do it right', 'then do it better'])
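If the server was started with non-default ports, pass them to the client as well (a sketch; port and port_out mirror the server's -port/-port_out flags, which default to 5555 and 5556):
bc = BertClient(ip='xx.xx.xx.xx', port=5555, port_out=5556)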