Metamath Proof Graphs (10k)

This repository provides a PyTorch Geometric dataset designed for the TAG-DS TopoBench challenge.
It contains 20,000 graphs total: 10,000 theorem-only DAGs and 10,000 full proof DAGs drawn from the first 10k theorems in the Metamath [1] database.

Contents

  • data.pt
    A preprocessed PyG dataset containing:
    • data — global collated storage of all nodes, edges, and labels
    • slices — pointers for reconstructing individual graphs
    • train_idx, val_idx, test_idx — fixed graph-level splits

Dataset Structure

1. Theorem Graphs (indices 0–9,999)

Each theorem is represented as a small DAG consisting only of:

  • its hypothesis nodes
  • its conclusion node

No intermediate proof steps are included: these graphs encode the statement only, not the derivation.

2. Proof Graphs (indices 10,000–19,999)

For each of the same theorems, the full proof DAG is included, containing:

  • hypothesis nodes
  • intermediate proof steps
  • the same conclusion node

Thus each theorem appears twice:

  1. once as a theorem-only graph
  2. once as its complete proof DAG

This pairing enables:

  • learning from theorem statements
  • evaluating on masked proof conclusions
  • maintaining a consistent label space across both halves
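
For illustration, the two halves can be paired by index. The snippet below is a minimal sketch under the assumption, suggested by the index ranges above, that the proof DAG of theorem i is stored at index i + 10,000; the helper is hypothetical, not part of the dataset.

NUM_THEOREMS = 10_000

def paired_indices(theorem_idx):
    # Hypothetical helper: map a theorem number to its two dataset indices,
    # assuming the proof DAG of theorem i sits at index i + 10_000.
    assert 0 <= theorem_idx < NUM_THEOREMS
    return theorem_idx, theorem_idx + NUM_THEOREMS

# e.g. theorem 42 -> statement-only graph at 42, full proof DAG at 10_042
statement_idx, proof_idx = paired_indices(42)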

Additional Details

  • Total graphs: 20,000
  • Node embeddings: 768-dimensional CodeBERT vectors
  • Graph type: directed acyclic graphs (DAGs)
  • Label space: 3,557 justification labels; all labels with fewer than 5 training occurrences are collapsed into UNK
  • Conclusion masking: the conclusion node’s embedding is zeroed out, and the model must infer its label from the graph structure and the remaining nodes (see the sketch after this list)
  • Monotonicity constraint: in Metamath, a proof may only use theorems with index ≤ that of the current theorem, so later theorems never appear in earlier graphs
  • Theorem-only graphs are included in training as prior knowledge for downstream proof prediction.
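
As a rough illustration of the conclusion-masking convention above, the masked node can be located by searching for the all-zero embedding row. The helper below is a minimal sketch, not part of the dataset; it assumes the graphs expose their node embeddings under the standard PyG attribute name x.

def masked_node_index(graph):
    # Hypothetical helper: assumes `graph.x` holds the 768-dim CodeBERT
    # embeddings and the masked conclusion node is the only node whose
    # embedding row is all zeros.
    zero_rows = (graph.x.abs().sum(dim=1) == 0).nonzero(as_tuple=True)[0]
    return zero_rows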

Basic Usage

import torch

# weights_only=False is needed because data.pt stores pickled PyG objects,
# not just plain tensors.
obj = torch.load("data.pt", weights_only=False)

data      = obj["data"]       # collated storage of all nodes, edges, and labels
slices    = obj["slices"]     # per-graph slice pointers into `data`
train_idx = obj["train_idx"]  # fixed graph-level splits
val_idx   = obj["val_idx"]
test_idx  = obj["test_idx"]
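
Continuing from the snippet above, the collated storage can be wrapped in a PyG InMemoryDataset so that individual graphs can be indexed or passed to a DataLoader. This is a minimal sketch, assuming data.pt was saved with PyG's standard InMemoryDataset collate convention; the class name MetamathGraphs is only illustrative.

from torch_geometric.data import InMemoryDataset

class MetamathGraphs(InMemoryDataset):
    # Minimal in-memory wrapper: no download/process steps are defined,
    # since data.pt already holds the collated tensors and slice pointers.
    def __init__(self, data, slices):
        super().__init__(root=None)
        self.data, self.slices = data, slices

dataset = MetamathGraphs(data, slices)

train_set = dataset[train_idx]  # select the fixed graph-level training split
graph = dataset[0]              # a single theorem-only graph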

Acknowledgements

Thanks to the Erdős Institute for providing the project-based, collaborative environment where key components of the preprocessing pipeline were first developed.


References

[1] Metamath Official Site — https://us.metamath.org/index.html
