
GPFM

Institution: HKUST
License: Non-Commercial
Status: Peer-reviewed

Model Information

Architecture: ViT-L/14
Parameters: 303M
Training Method: DINOv2 (distillation)
Training Data: 72K+ WSIs
Release Date: 2024-07
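As a sanity check on the 303M figure, the parameter count of a standard ViT-L/14 backbone (embed dim 1024, 24 transformer blocks, MLP ratio 4) can be tallied analytically. This is a hedged sketch: the 224-pixel input size and exact token layout are assumptions, and GPFM's head or any register tokens may shift the total slightly.

```python
# Rough parameter count for a ViT-L/14 backbone, to sanity-check
# the 303M figure above. Image size 224 and the token layout are
# assumptions; GPFM's exact configuration may differ slightly.

def vit_param_count(embed_dim=1024, depth=24, mlp_ratio=4,
                    patch=14, img=224, in_chans=3):
    n_patches = (img // patch) ** 2
    params = in_chans * patch * patch * embed_dim + embed_dim  # patch embedding
    params += embed_dim                                        # [CLS] token
    params += (n_patches + 1) * embed_dim                      # positional embedding
    per_block = (
        2 * embed_dim                                  # LayerNorm 1
        + embed_dim * 3 * embed_dim + 3 * embed_dim    # QKV projection
        + embed_dim * embed_dim + embed_dim            # attention output proj
        + 2 * embed_dim                                # LayerNorm 2
        + embed_dim * mlp_ratio * embed_dim + mlp_ratio * embed_dim  # MLP fc1
        + mlp_ratio * embed_dim * embed_dim + embed_dim              # MLP fc2
    )
    params += depth * per_block
    params += 2 * embed_dim                            # final LayerNorm
    return params

total_params = vit_param_count()
print(f"{total_params / 1e6:.1f}M")  # ~303.2M
```

The transformer blocks dominate (about 12.6M parameters each, roughly 302M across 24 blocks), which is why ViT-L variants cluster around 303M regardless of minor head differences.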

Benchmark Rankings (rank / models evaluated)

PathBench: 7/21
Stanford: 9/22
HEST: 12/25
Plismbench: 10/15