21. Example 1
A proof of A -> (A -> B) -> B
Coq < Lemma S1 : forall A B :Prop, A ->(A->B) ->B.
1 subgoal
============================
forall A B : Prop, A -> (A -> B) -> B
S1 < intros.
1 subgoal
A : Prop
B : Prop
H : A
H0 : A -> B
============================
B
intros removes the forall from the goal
and moves the premise of each → into the hypothesis list.
intros applies several intro steps at once.
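The session above stops at the goal B. The final tactics are not shown in the slides, but a minimal way to finish is to apply H0 (reducing the goal from B to A) and then close with H:

```coq
(* A sketch of the completed proof; the steps after intros
   are an assumption, not shown in the original session. *)
Lemma S1 : forall A B : Prop, A -> (A -> B) -> B.
Proof.
  intros A B H H0.   (* H : A, H0 : A -> B; goal is B *)
  apply H0.          (* goal becomes A *)
  exact H.           (* H proves A *)
Qed.
```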
[Figure: the context Γ gains the new hypotheses; the goal is replaced by the new goal]
Machine proof
Characteristics of neural network technology
42. Geoffrey E. Hinton
Geoffrey E. Hinton Yann LeCun,
Yoshua Bengio Andrew Ng
http://www.cs.toronto.edu/~hinton/ http://yann.lecun.com/
http://goo.gl/fOjkIC http://www.andrewng.org/
53. Alex Krizhevsky, Ilya Sutskever, Geoffrey E. Hinton
"ImageNet Classification with Deep Convolutional Neural Networks"
http://goo.gl/HCZ65W
Alex Krizhevsky
60. Power of Data
Google's approach to language (2006)
From Ben Jai, “What’s Google Doing”
http://life.math.ntu.edu.tw/sites/
life.math.ntu.edu.tw/files/
Fall-2006-Campus-Talk-TW.pdf
2006
73. Infinity: Recursion
I ate a banana
I know I ate a banana
I think I know I ate a banana
I declare I think I know I ate a banana
What do you declare I think I know I ate?
This is the banana I declare I think I know I ate
There are more bananas I declare I think I know I ate than bananas that still grow on the tree
This is the cat that caught the rat that ate the cheese that...
http://bit.ly/1uc4Taj
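The unbounded clause embedding illustrated above can be sketched as a tiny recursive function. The verb list and base sentence are illustrative choices, not from the slides:

```python
# A toy illustration of linguistic recursion: clauses embed clauses
# with no fixed bound on depth.
def embed(base, verbs):
    """Recursively wrap `base` in one "I <verb> ..." clause per verb."""
    if not verbs:
        return base
    return "I " + verbs[0] + " " + embed(base, verbs[1:])

print(embed("I ate a banana", []))                 # I ate a banana
print(embed("I ate a banana", ["know"]))           # I know I ate a banana
print(embed("I ate a banana", ["think", "know"]))  # I think I know I ate a banana
print(embed("I ate a banana", ["declare", "think", "know"]))
```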
80. Timothy T. Rogers, James L. McClelland
"Parallel Distributed Processing at 25:
Further Explorations in the Microstructure of Cognition"
http://goo.gl/e4CgJ8
Rumelhart passed away in 2011.
2014
87. “An example of what recurrent neural nets can now do” (trained on Wikipedia)
The meaning of life is the tradition of the
ancient human reproduction: it is less
favorable to the good boy for when to remove
her bigger. In the show’s agreement
unanimously resurfaced. The wild pasteured
with consistent street forests were
incorporated by the 15th century BE. In 1996
the primary rapford undergoes an effort that
the reserve conditioning, written into Jewish
cities, sleepers to incorporate the .St Eurasia
that activates the population.
http://goo.gl/vHRHSn
It looks like English, but the meaning makes no sense at all.
88. “An example of what recurrent neural nets can now do” (trained on the New York Times)
while he was giving attention to the second
advantage of school building a 2-for-2 stool
killed by the Cultures saddled with a halfsuit
defending the Bharatiya Fernall ’s office . Ms .
Claire Parters will also have a history temple
for him to raise jobs until naked Prodiena to
paint baseball partners , provided people to
ride both of Manhattan in 1978 , but what was
largely directed to China in 1946 , focusing on
the trademark period is the sailboat yesterday
and comments on whom they obtain overheard
within the 120th anniversary , where ......
http://goo.gl/vHRHSn
It looks like English, but the meaning makes no sense at all.
89. "The Unreasonable Effectiveness
of Recurrent Neural Networks”
Andrej Karpathy
http://karpathy.github.io/2015/05/21/rnn-effectiveness/
The surprising power of RNNs
90. Andrej Karpathy
"The Unreasonable Effectiveness of Recurrent Neural Networks”
http://karpathy.github.io/2015/05/21/rnn-effectiveness/
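The samples above come from character-level recurrent networks of the kind Karpathy's post describes. A minimal sketch of the mechanism, with random (untrained) weights, so the output is gibberish; training the weights is what makes samples look like English. Sizes and the vocabulary are illustrative:

```python
import numpy as np

# Minimal character-level RNN sampler: at each step, update the hidden
# state from the previous character, softmax over the vocabulary, draw
# a character, and feed it back in as the next input.
rng = np.random.default_rng(0)
vocab = list("abcdefghijklmnopqrstuvwxyz ")
V, H = len(vocab), 32                      # vocabulary and hidden sizes

Wxh = rng.normal(0, 0.1, (H, V))           # input -> hidden
Whh = rng.normal(0, 0.1, (H, H))           # hidden -> hidden (recurrence)
Why = rng.normal(0, 0.1, (V, H))           # hidden -> output logits

def sample(n, seed_ix=0):
    """Generate n characters, feeding each sample back as the next input."""
    h = np.zeros(H)
    x = np.zeros(V); x[seed_ix] = 1.0
    out = []
    for _ in range(n):
        h = np.tanh(Wxh @ x + Whh @ h)     # recurrent state update
        p = np.exp(Why @ h); p /= p.sum()  # softmax over characters
        ix = rng.choice(V, p=p)
        out.append(vocab[ix])
        x = np.zeros(V); x[ix] = 1.0       # feed the sample back in
    return "".join(out)

print(sample(40))
```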
94. Recursive Deep Models for Semantic
Compositionality Over a Sentiment
Treebank
Richard Socher et al.
http://nlp.stanford.edu/~socherr/EMNLP2013_RNTN.pdf
Grammar with a neural network!
96. Parsing the sentence “This film doesn't care about
cleverness, wit or any other kind of intelligent humor”
Socher, et al. http://goo.gl/bPAQ68
Representing grammatical structure with a neural network!
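The core idea in Socher et al. is to compute a vector for every node of the parse tree from the vectors of its children. A simplified sketch of that composition, using the plain recursive-net rule v = tanh(W [left; right]) rather than the paper's full tensor (RNTN) layer; dimensions, weights, and the example parse are illustrative:

```python
import numpy as np

# Recursive composition over a binary parse tree: leaves are word
# vectors, each internal node combines its two children with one
# shared weight matrix W.
rng = np.random.default_rng(1)
d = 4                                       # word-vector dimension
W = rng.normal(0, 0.1, (d, 2 * d))          # composition matrix

def compose(tree, vecs):
    """tree: a word (leaf) or a (left, right) pair; vecs: word -> vector."""
    if isinstance(tree, str):
        return vecs[tree]
    left, right = tree
    child = np.concatenate([compose(left, vecs), compose(right, vecs)])
    return np.tanh(W @ child)

vecs = {w: rng.normal(0, 0.1, d) for w in ["this", "film", "is", "clever"]}
# Parse of "this film is clever": ((this film) (is clever))
v = compose((("this", "film"), ("is", "clever")), vecs)
print(v.shape)   # every node, including the root, gets a d-dim vector
```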
97. “Show, Attend and Tell:
Neural Image Caption Generation
with Visual Attention”
Kelvin Xu, Bengio et al.
Generating captions for images
http://arxiv.org/pdf/1502.03044v2.pdf
Kelvin Xu
https://github.com/kelvinxu
98. “Show, Attend and Tell: Neural Image Caption Generation with Visual Attention”
http://arxiv.org/pdf/1502.03044v2.pdf
“A woman is throwing a frisbee in a park” “A young child is sitting on a bed holding a teddy bear”
“A group of people shopping at an outdoor market”
“There are lots of vegetables at the fruit stand”
From images to text
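The "Attend" in the paper's title is soft attention over image locations: while emitting each word, the decoder scores every location of the CNN feature map, softmaxes the scores into weights, and uses the weighted sum of features as context. A minimal sketch of that one step; the dot-product scoring here is a simplification (the paper learns a small scoring network), and all sizes are illustrative:

```python
import numpy as np

# One soft-attention step: score each image location against the
# decoder state, normalize with a softmax, and pool the features.
rng = np.random.default_rng(2)
L, D = 196, 8                         # 14x14 feature-map locations, feature size
features = rng.normal(size=(L, D))    # one feature vector per image location
h = rng.normal(size=D)                # current decoder hidden state

scores = features @ h                 # relevance of each location (simplified)
alpha = np.exp(scores - scores.max())
alpha /= alpha.sum()                  # attention weights, sum to 1
context = alpha @ features            # (D,) weighted sum of location features

print(alpha.sum(), context.shape)
```

The weights alpha are also what the paper visualizes: the bright regions in its figures are the locations with the largest alpha while a given word is generated.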