Knowledge base completion presentation

Integrating Knowledge Bases with Neural Networks - by Nick Powell:

Knowledge bases are used as the underpinning for reasoning systems. This talk describes experiences using deep learning to facilitate knowledge base completion. With an existing knowledge base as a training set, we trained a neural network as a binary classifier to find likely relationships and then insert them back into the graph. We'll describe lessons learned and next steps.


Transcript

  1. Integrating Knowledge Bases with Neural Networks (Nick Powell, GRAKN.AI)
  2. What are we working with?
     - Knowledge Base layer: (1) the facts that we know; (2) an inference engine
     - Predictive layer: a neural network for binary classification
  3. What are we working with?
  4. What makes graph databases good at modeling these knowledge bases? [Diagram: example graph with nodes A-H]
  5. What makes graph databases good at modeling these knowledge bases? [Diagram: the same graph with additional edges derived by inference rules]
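The inference-rule idea in the slides above can be sketched in plain Python. This is illustrative only: GRAKN expresses such rules declaratively in its own query language, and the `part-of` relation and its transitivity rule here are made-up examples, not from the talk.

```python
# Minimal sketch of rule-based inference over a triplet store.
# The relation name and the transitivity rule are illustrative.

def apply_transitive_rule(triples, relation):
    """Derive (a, r, c) whenever (a, r, b) and (b, r, c) already hold."""
    facts = set(triples)
    changed = True
    while changed:  # iterate until no new fact is derived (fixed point)
        changed = False
        for (a, r1, b) in list(facts):
            if r1 != relation:
                continue
            for (b2, r2, c) in list(facts):
                if r2 == relation and b2 == b and (a, relation, c) not in facts:
                    facts.add((a, relation, c))
                    changed = True
    return facts

kb = {("A", "part-of", "B"), ("B", "part-of", "C")}
inferred = apply_transitive_rule(kb, "part-of")
# the rule derives ("A", "part-of", "C") in addition to the two stored facts
```

This is the sense in which the inference engine "adds edges" to the graph without them being stored explicitly.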
  6. What are we working with? https://nlp.stanford.edu/~socherr/SocherChenManningNg_NIPS2013.pdf
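The linked paper (Socher et al., NIPS 2013) introduces the Neural Tensor Network used as the predictive layer. A minimal NumPy sketch of its scoring function follows; the dimensions and random initialization are illustrative, not the talk's actual setup.

```python
import numpy as np

# Sketch of the Neural Tensor Network (NTN) scoring function:
#   g(e1, R, e2) = u_R^T tanh( e1^T W_R^[1:k] e2 + V_R [e1; e2] + b_R )
# Each relation R has its own tensor W, matrix V, bias b, and vector u.

def ntn_score(e1, e2, W, V, b, u):
    k = W.shape[0]
    bilinear = np.array([e1 @ W[i] @ e2 for i in range(k)])  # k tensor slices
    linear = V @ np.concatenate([e1, e2])                    # standard layer
    return float(u @ np.tanh(bilinear + linear + b))

d, k = 4, 3                        # entity dimension, number of slices
rng = np.random.default_rng(0)
e1, e2 = rng.normal(size=d), rng.normal(size=d)  # entity embeddings
W = rng.normal(size=(k, d, d))    # one d x d bilinear slice per output unit
V = rng.normal(size=(k, 2 * d))
b, u = rng.normal(size=k), rng.normal(size=k)

score = ntn_score(e1, e2, W, V, b, u)  # higher score = more plausible triplet
```

Thresholding this score is what turns the network into the binary classifier mentioned on slide 2.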
  7. If we can predict the dotted-line relationships, we add to our knowledge!
  8. Goals:
     - Maintain Grakn as a versatile and robust knowledge base even as additional (possibly false) relationships are added to it.
     - See if the accuracy of the neural net classifier is improved with Grakn inferences.
  9. Algorithmic Flow (slides 9-13 repeat this list as a build-up):
     1. Build the project's ontology and rule set in GRAKN (define the inference rules, and provide a structure to the knowledge base).
     2. Train the neural tensor network, and calculate an initial accuracy on the test set.
     3. Use the results of the network to scan for likely triplets (these are not taken from the training/test data, but are constructed anew).
     4. Insert the n most likely triplets into the GRAKN knowledge base and, using the inference rules, loop through the test set again, calculating an updated accuracy.
     5. Repeat steps 3 and 4 several times.
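The insert-and-re-evaluate loop from the Algorithmic Flow slides can be sketched as follows. The scoring model and accuracy metric here are toy stand-ins; in the talk these roles are played by the neural tensor network and the GRAKN test set with inference rules active.

```python
# Sketch of the completion loop: repeatedly insert the n highest-scoring
# candidate triplets and re-measure accuracy. Toy stand-ins throughout.

def completion_loop(kb, score, candidates, evaluate, n=2, rounds=3):
    history = [evaluate(kb)]                  # initial accuracy (step 2)
    remaining = list(candidates)
    for _ in range(rounds):                   # repeat steps 3 and 4
        remaining.sort(key=score, reverse=True)
        kb.update(remaining[:n])              # insert n most likely triplets
        remaining = remaining[n:]
        history.append(evaluate(kb))          # updated accuracy
    return history

# Toy demo: fixed confidence scores; accuracy = fraction of "true"
# triplets present in the knowledge base.
true_facts = {("a", "r", "b"), ("b", "r", "c"), ("a", "r", "c")}
candidates = [("a", "r", "b"), ("b", "r", "c"), ("a", "r", "c"), ("c", "r", "a")]
conf = dict(zip(candidates, [0.9, 0.8, 0.7, 0.2]))

kb = set()
hist = completion_loop(kb, conf.get, candidates,
                       lambda kb: len(kb & true_facts) / len(true_facts))
# hist tracks accuracy per round, as in the "0 rounds / 1 round / 20 rounds" charts
```

The point of tracking `history` is exactly what the Findings slide measures: whether accuracy rises as likely triplets (correct and incorrect alike) accumulate in the knowledge base.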
  14. [Diagram: Neural Network and GRAKN Inference Engine]
  15. Findings: The default inference rules were not extensive enough to cover the whole dataset. However, the knowledge base was consistently able to absorb more correct information than incorrect information - we can be very confident that this improves on the accuracy of the neural net alone.
  16. [Charts: classifier accuracy after 0 rounds, 1 round, and 20 rounds of triplet insertion]
  17. Further applications?
     - Using GRAKN inferences to give clues about ground truths. This could be done before the neural network is trained, perhaps to intelligently initialize network weights.
     - Creating inference rules by training neural networks - similar to this project, but much more difficult (and maybe rewarding!)
     - ...and more!
