1. Product Sales through Dialogue (対話における商品の営業)
Motoki Sato (佐藤 元紀)
Dialogue-based e-commerce (対話EC)
NAIST (M1)
Preferred Networks Summer Internship, 2016
(Mentors: Unno-san, Fukuda-san)
2. Introduction
Background
• Recently, chat-bots have come to be used in many fields.
• Chat-bots will be used to sell products online.
My internship theme
• Explain why a product is recommended to the user.
• Generate sentences that explain the attractiveness of products.
[Figure] Product feature sentence (商品の特徴文) → inference? → Reason sentence for recommending it to the user (ユーザにおすすめな理由文)
3. Data
• Find Travel (a curation web site in the travel domain): http://find-travel.jp/
• Its articles cover many attractive spots in Japan.
• Spot data: 67,477 entries (spots, hotels, cafes)
• Each spot has a name, a description, information, and an image URL.
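The spot record described above can be sketched as a small data class. The field names and the example values here are illustrative assumptions, not the actual Find Travel schema.

```python
from dataclasses import dataclass

# One spot entry as described on the Data slide: name, description,
# information, and an image URL. Values below are made-up examples.
@dataclass
class Spot:
    name: str
    description: str
    information: str
    image_url: str

spot = Spot(
    name="東京タワー",
    description="東京のシンボルとして親しまれている展望タワーです。",
    information="営業時間 9:00-23:00",
    image_url="http://find-travel.jp/images/example.jpg",  # hypothetical path
)
print(spot.name)
```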
4. Data Processing
• Split the article text into sentences.
• Extract "reasoning sentences" that contain the word なので or ので ("so / because") — a heuristic.
– Number of extracted sentences: 144,032
– Sentence pattern: Fact (spot feature sentence) → なので → User value
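The extraction step above can be sketched as follows. This is a minimal illustration of the heuristic, assuming a simple splitter on the Japanese full stop 。; the actual splitter and the sample article text are not from the slides. Note that matching ので also covers なので, since the former is a substring of the latter.

```python
import re

def split_sentences(text: str) -> list[str]:
    # Split after each Japanese full stop, dropping empty pieces.
    return [s for s in re.split(r"(?<=。)", text) if s.strip()]

def extract_reason_sentences(text: str) -> list[str]:
    # Keep sentences containing ので ("so / because"); this also matches なので.
    return [s for s in split_sentences(text) if "ので" in s]

# Hypothetical article text: "The night view here is beautiful, so it is
# recommended for dates. It is a 5-minute walk from the station."
article = "ここは夜景がきれいなので、デートにおすすめです。駅から徒歩5分です。"
print(extract_reason_sentences(article))
```

Only the first sentence survives the filter; the second, a plain fact with no ので, is discarded.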
5. Model
• Sequence-to-sequence model with attention [Cho et al., 2014; Bahdanau et al., 2014]
• We train two different networks: 1. Normal and 2. Reverse.
• Hyperparameters: hidden units 400; network: 1-layer bi-LSTM; batch size 100; optimizer: Adam.
• 1. Normal: input = Fact, output = User value
• 2. Reverse: the same pair with input and output swapped (input = User value, output = Fact)
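One decoder step of the additive attention mechanism cited above [Bahdanau et al., 2014] can be sketched with NumPy. This is a toy illustration, not the internship code: the dimensions are tiny and the weight matrices `W1`, `W2`, `v` are random stand-ins for learned parameters.

```python
import numpy as np

def additive_attention(dec_state, enc_states, W1, W2, v):
    # score_i = v^T tanh(W1 h_i + W2 s): one score per encoder position.
    scores = np.tanh(enc_states @ W1.T + dec_state @ W2.T) @ v
    # Softmax over source positions gives the alignment weights.
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # Context vector: weighted sum of encoder states.
    context = weights @ enc_states
    return weights, context

rng = np.random.default_rng(0)
H, S, T = 8, 8, 5  # encoder dim, decoder dim, source length (toy sizes)
enc_states = rng.normal(size=(T, H))   # bi-LSTM encoder outputs (stand-in)
dec_state = rng.normal(size=(S,))      # current decoder hidden state
W1 = rng.normal(size=(H, H))
W2 = rng.normal(size=(H, S))
v = rng.normal(size=(H,))

w, c = additive_attention(dec_state, enc_states, W1, W2, v)
print(w.shape, c.shape)
```

At generation time the decoder would recompute these weights at every output token, which is what produces the alignment visualizations mentioned in the conclusions.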
9. Conclusions
• We built a neural sequence-to-sequence model that explains a product in a sentence.
• The attention alignments work well.
Future Work
• Attention over a database or knowledge base, as in Neural Enquirer [Pengcheng Yin, 2016] (QA): Pengcheng Yin, Zhengdong Lu, Hang Li, Ben Kao. "Neural Enquirer: Learning to Query Tables with Natural Language." IJCAI 2016.
• Spot search using reinforcement learning (from user feedback signals).