
Detailed Record

Author (Chinese): 曾若淳
Author (English): Tzeng, Ruo-Chun
Title (Chinese): 在多尺度個人網路中加入無尺度先驗以幫助圖的分類問題
Title (English): From Ego-Network To Multi-Level Graph Representations with Scale-Free Priors
Advisor (Chinese): 吳尚鴻
Advisor (English): Wu, Shan-Hung
Committee (Chinese): 李育杰、孫民、陳煥宗、帥宏翰
Committee (English): Lee, Yuh-Jye; Sun, Min; Chen, Hwann-Tzong; Shuai, Hong-Han
Degree: Master
University: National Tsing Hua University
Department: Department of Computer Science
Student ID: 104062703
Publication Year (ROC): 106 (2017)
Academic Year of Graduation: 106
Language: English
Number of Pages: 27
Keywords (Chinese): 圖的嵌入模型、卷積神經網路、無尺度網路
Keywords (English): Graph Embedding; Convolutional Neural Networks; Scale-Free Networks
Statistics:
  • Recommendations: 0
  • Views: 272
  • Rating: *****
  • Downloads: 41
  • Bookmarks: 0
Abstract (Chinese, translated):
Although current graph embedding models can produce vectors that perform well on graph-related tasks, whether a model can simultaneously learn other task-relevant information while learning to produce graph embeddings is a question that has rarely been explored.
In this thesis, we consider the possibility of simultaneously detecting, during embedding, the "critical structures" that decisively influence a supervised task. We propose Ego-CNN, a local-to-global embedding model. The main idea is to exploit the properties of convolutional networks (CNNs): local embedding vectors are expanded, layer by layer through Ego-Convolution layers, into vectors that cover the entire graph, where each vector represents a view of the whole graph centered at a different node. Once combined with a supervised task model, Ego-CNN can detect the critical structures within.
Our results show that: (1) Ego-CNN performs no worse than the current best embedding models; (2) Ego-CNN can use existing CNN visualization techniques to display the detected structures; (3) Ego-CNN is computationally efficient, and incorporating the scale-free property common in real-world social networks can further improve the model's training efficiency.
Abstract (English):
While existing graph embedding models can generate useful embedding vectors for graph-related tasks, what other valuable information can be jointly learned by a graph embedding model is less discussed. In this paper, we consider the possibility of detecting critical structures with a graph embedding model. We propose Ego-CNN to embed graphs; it works in a local-to-global manner, taking advantage of CNNs by gradually expanding the detectable local regions on the graph as the network depth increases. Critical structures can be detected when Ego-CNN is combined with a supervised task model. We show that Ego-CNN (1) is competitive with state-of-the-art graph embedding models, (2) works nicely with CNN visualization techniques to show the detected structures, and (3) is efficient and can incorporate scale-free priors, which commonly occur in social network datasets, to further improve training efficiency.
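The local-to-global expansion the abstract describes can be illustrated with a minimal sketch. Everything below (function name, tensor shapes, the truncate-to-k-neighbors rule) is an assumption for illustration, not the thesis's actual implementation: each node's new embedding is computed from its own embedding and those of up to k neighbors, so stacking layers lets each node "see" one hop further per layer.

```python
import numpy as np

def ego_convolution(adj, H, W, b, k):
    """Illustrative sketch of one ego-network convolution layer
    (hypothetical design, not the thesis implementation).

    adj: (n, n) 0/1 adjacency matrix
    H:   (n, d) node embeddings from the previous layer
    W:   (k + 1, d, d_out) filter weights (slot 0 for the node itself,
         slots 1..k for its neighbors)
    b:   (d_out,) bias
    """
    n, d = H.shape
    d_out = W.shape[2]
    H_new = np.zeros((n, d_out))
    for i in range(n):
        neighbors = np.flatnonzero(adj[i])[:k]           # truncate ego-network to k neighbors
        slots = [H[i]] + [H[j] for j in neighbors]
        slots += [np.zeros(d)] * (k + 1 - len(slots))    # zero-pad small ego-networks
        z = sum(s @ W[t] for t, s in enumerate(slots)) + b
        H_new[i] = np.maximum(z, 0.0)                    # ReLU
    return H_new
```

Stacking two such layers, `H2 = ego_convolution(adj, ego_convolution(adj, H0, W1, b1, k), W2, b2, k)`, makes each node's embedding depend on its 2-hop neighborhood, which is the gradual receptive-field expansion claimed in the abstract.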
1. INTRODUCTION ..............................................................6
2. RELATED WORK .............................................................10
   Graph Kernels ............................................................10
   Graphical Models .........................................................11
   Convolution-based Methods ................................................11
3. EGO-CNN ..................................................................14
   Effective Receptive Field on Ambient Graph ...............................16
4. ADVANTAGES OF EGO-CNN ....................................................18
   Detecting Critical Structures ............................................18
   Efficiency and the Scale-Free Prior ......................................18
5. EXPERIMENTS ..............................................................20
   Graph Classification .....................................................20
   Scale-Free Regularizer ...................................................21
   Visualization of Critical Structures .....................................22
6. CONCLUSIONS ..............................................................25
REFERENCES ..................................................................26
James Atwood and Don Towsley. Diffusion-convolutional neural networks. In Proceedings of NIPS, 2016.
Joan Bruna, Wojciech Zaremba, Arthur Szlam, and Yann LeCun. Spectral networks and locally connected networks on graphs. In Proceedings of ICLR, 2013.
Stephen A. Cook. The complexity of theorem-proving procedures. In Proceedings of the Third Annual ACM Symposium on Theory of Computing. ACM, 1971.
Hanjun Dai, Bo Dai, and Le Song. Discriminative embeddings of latent variable models for structured data. In Proceedings of ICML, 2016.
Asim Kumar Debnath, Rosa L. Lopez de Compadre, Gargi Debnath, Alan J. Shusterman, and Corwin Hansch. Structure-activity relationship of mutagenic aromatic and heteroaromatic nitro compounds. Correlation with molecular orbital energies and hydrophobicity. Journal of Medicinal Chemistry, 34(2):786–797, 1991.
Michaël Defferrard, Xavier Bresson, and Pierre Vandergheynst. Convolutional neural networks on graphs with fast localized spectral filtering. In Proceedings of NIPS, 2016.
David K. Duvenaud, Dougal Maclaurin, Jorge Iparraguirre, Rafael Bombarell, Timothy Hirzel, Alán Aspuru-Guzik, and Ryan P. Adams. Convolutional networks on graphs for learning molecular fingerprints. In Proceedings of NIPS, 2015.
Laurent Itti, Christof Koch, and Ernst Niebur. A model of saliency-based visual attention for rapid scene analysis. IEEE Transactions on Pattern Analysis and Machine Intelligence, 20(11):1254–1259, 1998.
Kristian Kersting, Nils M. Kriege, Christopher Morris, Petra Mutzel, and Marion Neumann. Benchmark data sets for graph kernels, 2016. URL http://graphkernels.cs.tu-dortmund.de.
Thomas N. Kipf and Max Welling. Semi-supervised classification with graph convolutional networks. In Proceedings of ICLR, 2017.
Risi Kondor and Horace Pan. The multiscale Laplacian graph kernel. In Proceedings of NIPS, 2016.
Tomas Mikolov, Kai Chen, Greg Corrado, and Jeffrey Dean. Efficient estimation of word representations in vector space. 2013.
Annamalai Narayanan, Mahinthan Chandramohan, Lihui Chen, Yang Liu, and Santhoshkumar Saminathan. subgraph2vec: Learning distributed representations of rooted sub-graphs from large graphs. In Workshop on Mining and Learning with Graphs, 2016.
Mathias Niepert, Mohamed Ahmed, and Konstantin Kutzkov. Learning convolutional neural networks for graphs. In Proceedings of ICML, 2016.
Nino Shervashidze, Pascal Schweitzer, Erik Jan van Leeuwen, Kurt Mehlhorn, and Karsten M. Borgwardt. Weisfeiler-Lehman graph kernels. JMLR, 12(Sep):2539–2561, 2011.
Pinar Yanardag and S. V. N. Vishwanathan. Deep graph kernels. In Proceedings of SIGKDD. ACM, 2015.
Matthew D. Zeiler, Graham W. Taylor, and Rob Fergus. Adaptive deconvolutional networks for mid and high level feature learning. In Proceedings of ICCV. IEEE, 2011.