under construction: A Gentle Introduction to Deep Learning for Graphs

tutorial introduction to Deep Learning for Graphs

what is deep learning for graphs?

abstract

The paper takes a top-down view of the problem, introducing a generalized formulation of graph representation learning based on a local and iterative approach to structured information processing.

  • what does the adjective "top-down" mean here?
  • what is a "generalized formulation"?
  • what does the whole sentence mean?
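
To make "local and iterative" concrete: each node repeatedly recomputes its state from its own state and the states of its direct neighbours, and repeating that local step spreads information across the graph. A minimal sketch of my own (not code from the paper), assuming a toy graph stored as an adjacency list:

```python
import numpy as np

# Toy graph as an adjacency list (node -> neighbours); states are 1-d vectors.
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
h = {v: np.array([float(v)]) for v in adj}

def local_update(h, adj):
    """One "local" step: every node recomputes its state from its own
    state and the states of its direct neighbours only."""
    return {v: h[v] + sum(h[u] for u in adj[v]) for v in adj}

# "Iterative": repeating the local step lets information travel further,
# so after k steps a node's state depends on its k-hop neighbourhood.
for _ in range(2):
    h = local_update(h, adj)
print(h)
```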

    It introduces the basic building blocks that can be combined to design novel and effective neural models for graphs.

  • in what way can they be combined to design neural models?
  • what are neural models for graphs?

    The methodological exposition is complemented by a discussion of interesting research challenges and applications in the field.

  • what is the methodological exposition?
  • how is it complemented by that discussion?

introduction

background

  1. graphs are a powerful representation
  2. this requires deep learning models that can process graphs in an adaptive fashion.
  3. history: a long-standing and consolidated history, rooted in the early nineties with seminal works on Recursive Neural Networks (RecNN) for tree-structured data...
  4. ???

This paper builds on this historical perspective to provide a gentle introduction to the field of neural networks for graphs, also referred to as deep learning for graphs in modern terminology.

purpose of this paper

.....

guidance

Section 2

We first provide a generalized formulation of the problem of representation learning in graphs, introducing and motivating the architecture roadmap that we will be following throughout the rest of the paper.
We will focus, in particular, on approaches that deal with local and iterative processing of information.
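
As a rough illustration of such a layered, local formulation (my own sketch, not the paper's notation; the weight matrices below are arbitrary random placeholders):

```python
import numpy as np

rng = np.random.default_rng(0)
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)   # adjacency matrix of a 4-node path
H = rng.normal(size=(4, 8))                 # node states: 4 nodes x 8 features

def local_layer(H, A, W_self, W_neigh):
    """Generic local update: mix each node's own state with a sum of its
    neighbours' states (A @ H), then apply a nonlinearity."""
    return np.tanh(H @ W_self + A @ H @ W_neigh)

# Iterating (stacking) the local update: after l layers a node's state
# reflects its l-hop context.
for _ in range(3):
    H = local_layer(H, A,
                    rng.normal(size=(8, 8)),
                    rng.normal(size=(8, 8)))
```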

Section 3

Introduces the basic building blocks that can be assembled and combined to create modern deep learning architectures for graphs.
In this context, we will introduce the concepts of graph convolutions as local neighborhood aggregation functions, the use of attention, sampling and pooling operators defined over graphs, and we will conclude with a discussion on aggregation functions that compute whole-structure embeddings.
???
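
A hedged sketch of two of these building blocks, a neighbourhood-aggregation "convolution" and a whole-structure readout; the names graph_conv and readout and the specific choices (sum aggregation with self-loops, mean readout) are mine, one possible instantiation rather than the paper's definition:

```python
import numpy as np

def graph_conv(H, A, W):
    """A "graph convolution" as local neighbourhood aggregation: each node
    sums its own and its neighbours' states, then applies a learned transform."""
    return np.maximum(0.0, (A + np.eye(A.shape[0])) @ H @ W)  # ReLU

def readout(H):
    """Aggregation over all nodes, giving a whole-structure embedding
    (a simple mean here; sums or more elaborate poolings are also used)."""
    return H.mean(axis=0)

rng = np.random.default_rng(1)
A = np.array([[0, 1, 1],
              [1, 0, 0],
              [1, 0, 0]], dtype=float)      # a 3-node star graph
H = rng.normal(size=(3, 4))                 # initial node states
H = graph_conv(H, A, rng.normal(size=(4, 4)))
graph_embedding = readout(H)                # one vector for the whole graph
```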

Section 4

the main learning tasks undertaken in graph representation learning, together with the associated cost functions and a characterization of the related inductive biases.
???
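
A small illustration of where the cost function attaches for node-level versus graph-level tasks; the labels, shapes and cross-entropy helper below are made up for the example:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def cross_entropy(probs, label):
    return -np.log(probs[label] + 1e-12)

rng = np.random.default_rng(2)
H = rng.normal(size=(5, 3))                  # 5 node embeddings, 3 classes

# Node-level task: one prediction, and one loss term, per node.
node_labels = [0, 2, 1, 1, 0]
node_loss = np.mean([cross_entropy(softmax(h), y)
                     for h, y in zip(H, node_labels)])

# Graph-level task: aggregate the nodes first (readout), then a single
# prediction and loss for the whole graph.
graph_label = 1
graph_loss = cross_entropy(softmax(H.mean(axis=0)), graph_label)
```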
---
The final part of the paper surveys other related approaches and tasks (Section 5), and it discusses interesting research challenges (Section 6) and applications (Section 7). We conclude the paper with some final considerations and hints for future research directions.
