
fokan/medgemma-4b-it-int8

An INT8 dynamically quantized version of google/medgemma-4b-it.

  • Quantization: dynamic INT8 on Linear layers (PyTorch); see the sketch below
  • Ideal for CPU inference
  • Roughly 4× smaller than the original model
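The quantization described above corresponds to PyTorch's standard dynamic-quantization path. The following is a minimal sketch of how such a model could be produced, not the exact script used for this repository: the loader class (medgemma-4b-it is multimodal, so AutoModelForImageTextToText may be more appropriate than AutoModelForCausalLM) and the output filename are assumptions.

```python
# Sketch: produce an INT8 dynamic-quantized copy of medgemma-4b-it with PyTorch.
# Assumptions: AutoModelForCausalLM as the loader and the output filename below.
import torch
from transformers import AutoModelForCausalLM

# Load the original fp32 weights on CPU.
model = AutoModelForCausalLM.from_pretrained(
    "google/medgemma-4b-it",
    torch_dtype=torch.float32,
)
model.eval()

# Dynamic quantization: nn.Linear weights are converted to int8 ahead of time,
# while activations are quantized on the fly at inference (CPU backends only).
quantized = torch.ao.quantization.quantize_dynamic(
    model,
    {torch.nn.Linear},
    dtype=torch.qint8,
)

# Persist the quantized weights (hypothetical filename).
torch.save(quantized.state_dict(), "medgemma-4b-it-int8.pt")
```

Dynamic quantization only targets the weights of Linear layers and runs on CPU backends, which is why the model is recommended for CPU inference; storing those weights as 8-bit integers instead of 32-bit floats accounts for the roughly 4× size reduction.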