Graph-Bert
A copy of the latest paper is also released at our group homepage: http://www.ifmlab.org/files/paper/graph_bert.pdf
The paper is available at arXiv: https://arxiv.org/abs/2001.05140
(1) SEGEN: Sample-Ensemble Genetic Evolutional Network Model https://arxiv.org/abs/1803.08631
(2) GResNet: Graph Residual Network for Reviving Deep GNNs from Suspended Animation https://arxiv.org/abs/1909.05729
A list of the latest research papers on Graph-Bert can be found at the following page: https://github.com/jwzhanggy/graph_bert_work
@article{zhang2020graph,
  title={Graph-Bert: Only Attention is Needed for Learning Graph Representations},
  author={Zhang, Jiawei and Zhang, Haopeng and Xia, Congying and Sun, Li},
  journal={arXiv preprint arXiv:2001.05140},
  year={2020}
}
After downloading the code, you can run
python3 script_3_fine_tuning.py
directly for node classification. We suggest running the code with PyCharm and Python 3.
(1) The Graph-Bert model takes (a) the node WL code, (b) the intimacy-based subgraph batch, and (c) the node hop distance as prior inputs. These can be computed with script_1_preprocess.py.
(2) Pre-training of Graph-Bert based on node attribute reconstruction and graph structure recovery is provided by script_2_pre_train.py.
(3) script_3_fine_tuning.py is the entry point for running the model on node classification and graph clustering.
(4) script_4_evaluation_plots.py is used for drawing plots and evaluating results.
In each script, you can change "if 0" to "if 1" to turn a script block on, and change it back to turn the block off.
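Preprocessing step (a) above refers to Weisfeiler-Lehman node coloring. As a hedged sketch of the idea only (this is not the repo's MethodWLNodeColoring implementation, and the function name is hypothetical), each node's color is iteratively re-hashed together with its sorted neighbor colors:

```python
import hashlib

def wl_node_coloring(nodes, edges, max_iter=2):
    """Toy Weisfeiler-Lehman coloring: nodes with the same structural
    role converge to the same integer color code."""
    neighbors = {n: set() for n in nodes}
    for u, v in edges:
        neighbors[u].add(v)
        neighbors[v].add(u)
    colors = {n: 1 for n in nodes}  # start with a uniform color
    for _ in range(max_iter):
        new_colors = {}
        for n in nodes:
            # Hash the node's own color plus its sorted neighbor colors.
            signature = str(colors[n]) + "_" + ",".join(
                sorted(str(colors[m]) for m in neighbors[n]))
            new_colors[n] = hashlib.md5(signature.encode()).hexdigest()
        # Relabel hash strings with compact integer codes.
        mapping = {c: i for i, c in enumerate(sorted(set(new_colors.values())))}
        colors = {n: mapping[c] for n, c in new_colors.items()}
    return colors

# Path graph 0-1-2-3: the two endpoints are structurally alike,
# as are the two interior nodes.
colors = wl_node_coloring([0, 1, 2, 3], [(0, 1), (1, 2), (2, 3)])
```

In the actual pipeline the resulting WL codes serve as structural role embeddings fed into the model alongside the raw node attributes.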
(1) pytorch (https://anaconda.org/pytorch/pytorch)
(2) sklearn (https://anaconda.org/anaconda/scikit-learn)
(3) transformers (https://anaconda.org/conda-forge/transformers)
(4) networkx (https://anaconda.org/anaconda/networkx)
A simpler template of the code is also available at http://www.ifmlab.org/files/template/IFM_Lab_Program_Template_Python3.zip
(1) data.py (for data loading and basic data organization operators, defines abstract method load() )
(2) method.py (for complex operations on the data, defines abstract method run() )
(3) result.py (for saving/loading results from files, defines abstract method load() and save() )
(4) evaluate.py (for result evaluation, defines abstract method evaluate() )
(5) setting.py (for experiment settings, defines abstract method load_run_save_evaluate() )
The base classes of these five parts are defined in ./code/base_class/; they are all abstract classes that define the templates and architecture of the code.
The inherited classes are provided in ./code; they inherit from the base classes and implement the abstract methods.
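As an illustration of this template pattern (class names simplified here; the actual base classes in ./code/base_class/ carry more state), the five abstract methods listed above might be declared as:

```python
from abc import ABC, abstractmethod

class dataset(ABC):
    """Template for data loading (cf. data.py)."""
    @abstractmethod
    def load(self): ...

class method(ABC):
    """Template for complex operations on the data (cf. method.py)."""
    @abstractmethod
    def run(self): ...

class result(ABC):
    """Template for saving/loading results (cf. result.py)."""
    @abstractmethod
    def load(self): ...
    @abstractmethod
    def save(self): ...

class evaluate(ABC):
    """Template for result evaluation (cf. evaluate.py)."""
    @abstractmethod
    def evaluate(self): ...

class setting(ABC):
    """Template for experiment settings (cf. setting.py)."""
    @abstractmethod
    def load_run_save_evaluate(self): ...

# A concrete subclass must implement the abstract method;
# instantiating the base class directly raises TypeError.
class ToyDataset(dataset):
    def load(self):
        return [1, 2, 3]
```

Because the bases are abstract, forgetting to implement a required method fails loudly at instantiation time rather than deep inside an experiment run.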
(1) DatasetLoader.py (for dataset loading)
(1) MethodWLNodeColoring.py (for WL code computing)
(2) MethodGraphBatching.py (for subgraph batching)
(3) MethodHopDistance.py (for hop distance computing)
(1) MethodBertComp.py (for graph-bert basic components)
(2) MethodGraphBert.py (the graph bert model)
(1) MethodGraphBertNodeClassification.py
(2) MethodGraphBertGraphClustering.py
(3) MethodGraphBertNodeConstruct.py
(4) MethodGraphBertGraphRecovery.py
(1) ResultSaving.py (for saving results to file)
(1) EvaluateAcc.py (accuracy metric)
(2) EvaluateClustering.py (a collection of clustering metrics)
(1) Settings.py (defines the interactions and data exchange among the above classes)
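To show how a settings object can chain the components, here is a hypothetical sketch (the real Settings.py interface and method signatures may differ; the Toy* classes are stand-ins invented for this example):

```python
class Settings:
    """Chain dataset -> method -> result -> evaluate (hypothetical API)."""
    def __init__(self, dataset, method, result, evaluate):
        self.dataset, self.method = dataset, method
        self.result, self.evaluate = result, evaluate

    def load_run_save_evaluate(self):
        data = self.dataset.load()              # 1. load the data
        learned = self.method.run(data)         # 2. run the method on it
        self.result.save(learned)               # 3. persist the output
        return self.evaluate.evaluate(learned)  # 4. score the output

# Toy stand-ins to show the flow end to end.
class ToyData:
    def load(self):
        return [1, 2, 3]

class ToyMethod:
    def run(self, data):
        return sum(data)

class ToyResult:
    def save(self, value):
        self.value = value

class ToyEval:
    def evaluate(self, value):
        return value

score = Settings(ToyData(), ToyMethod(), ToyResult(), ToyEval()).load_run_save_evaluate()
```

The point of the design is that any component can be swapped (e.g. a different method or evaluation metric) without touching the pipeline driver.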