
Computer Science > Machine Learning

Title: Adversarial Curriculum Graph Contrastive Learning with Pair-wise Augmentation

Abstract: Graph contrastive learning (GCL) has emerged as a pivotal technique in graph representation learning. The effectiveness of GCL depends heavily on the quality of the generated positive and negative samples, which is intrinsically determined by their similarity to the original data. However, precisely controlling this similarity during sample generation is difficult and often hinders the discovery of representative graph patterns. To address this challenge, we propose Adversarial Curriculum Graph Contrastive Learning (ACGCL), a framework that exploits pair-wise augmentation to generate graph-level positive and negative samples with controllable similarity, together with subgraph contrastive learning to extract effective graph patterns from them. Within ACGCL, we devise a novel adversarial curriculum training method that enables progressive learning by gradually increasing the difficulty of distinguishing the generated samples. Notably, this approach overcomes the sparsity issue of conventional curriculum learning strategies by adaptively concentrating on more challenging training data. Finally, we evaluate ACGCL through extensive experiments on six well-known benchmark datasets, where it consistently outperforms a set of state-of-the-art baselines.
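The two ingredients named in the abstract — a contrastive objective over positive/negative samples, and a curriculum that ramps up sample difficulty over training — can be illustrated with a minimal sketch. This is not the authors' ACGCL implementation; the InfoNCE-style loss and the linear pacing function below are generic, hypothetical stand-ins for the paper's graph-level contrastive loss and adversarial curriculum schedule.

```python
import numpy as np

def info_nce_loss(anchor, positive, negatives, temperature=0.5):
    """Generic InfoNCE-style contrastive loss for one anchor embedding.

    anchor, positive: 1-D embedding vectors.
    negatives: 2-D array with one negative embedding per row.
    The loss is small when the anchor is much closer (in cosine
    similarity) to the positive than to any negative.
    """
    def cos(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

    pos_sim = np.exp(cos(anchor, positive) / temperature)
    neg_sims = np.exp([cos(anchor, n) / temperature for n in negatives])
    return -np.log(pos_sim / (pos_sim + neg_sims.sum()))

def curriculum_weight(difficulty, epoch, total_epochs):
    """Hypothetical linear pacing function (difficulty in [0, 1]).

    Samples whose difficulty is below the current threshold get full
    weight; harder ones are down-weighted, so training focuses on easy
    pairs first and progressively admits harder ones -- the generic
    curriculum-learning idea the paper builds on.
    """
    threshold = (epoch + 1) / total_epochs  # grows toward 1.0
    return 1.0 if difficulty <= threshold else threshold / difficulty
```

In this sketch, per-sample losses would be multiplied by `curriculum_weight` each epoch; ACGCL itself instead generates samples of controlled difficulty adversarially, which the paper argues avoids the data-sparsity problem of simply filtering a fixed pool.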

