SupraLabs

non-profit

AI & ML interests

Train, fine-tune, and explore small models that deliver good results, making small AI models accessible to everyone


Organization Card

Welcome to SupraLabs!

🀝 Who we are

We are @AxionLab-official and @LH-Tech-AI and we're creating small open-source models for everyone.

🎯 What we do

We train, fine-tune, and explore small models that deliver good results, making small AI models accessible to everyone!

🚫 What we do NOT do

We do not make low-quality or sloppy models, and we do not release models that are only half open-source: everything we release is completely open-source for you!

πŸ€– Models

  • Supra Mini 0.1M - Trained on a Kaggle 2xT4, 100k parameters, compared to models 10x its size
  • Supra Mini v2 0.1M - the second version of the Supra Mini series.
  • Supra Mini v3 0.5M - the third version of the Supra Mini series.
  • Supra Mini v4 2M - the fourth version of the Supra Mini series. Improved. More powerful. With context understanding.
  • MicroSupra 1k - Trained on GTX 750 Ti 4GB, a scaling laws experiment.
  • More coming soon, check back later!
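If you're curious where sizes like "0.1M" or "2M" come from, they can be estimated directly from a model's architecture. Here's a minimal sketch for a GPT-style decoder; the dimensions below are illustrative guesses, not the actual Supra Mini configs:

```python
def gpt_param_count(vocab_size, d_model, n_layers, d_ff=None):
    """Approximate parameter count of a decoder-only transformer
    with tied input/output embeddings (biases and norms ignored)."""
    d_ff = d_ff or 4 * d_model                # common default: 4x hidden size
    embed = vocab_size * d_model              # token embeddings (tied with output head)
    attn = 4 * d_model * d_model              # Q, K, V, and output projections
    ffn = 2 * d_model * d_ff                  # up- and down-projection
    return embed + n_layers * (attn + ffn)

# A hypothetical config that lands near the 0.1M range:
print(gpt_param_count(vocab_size=512, d_model=64, n_layers=2))  # -> 131072
```

This is only a back-of-the-envelope estimate (it skips biases, layer norms, and positional embeddings), but it shows why tiny vocabularies and small hidden sizes are what make sub-million-parameter models possible.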

πŸ† Competing with other creators

We are competing with @CompactAI-O and @LH-Tech-AI (we know it's funny to compete against your own founder, but anyway πŸ€£πŸ˜‚).
See all our and their tiny models here: https://lh-tech.de/ai/compare-tiny-models.html

πŸ—οΈ Future roadmap

  • Supra-10M - Base, Chat, Reasoning - Trained on RTX 5060 Ti 16GB, with Nvidia technologies and CUDA
  • Supra-1M - Base, Chat, Reasoning - Trained on GTX 750 Ti 4GB, with Nvidia technologies and optimizations

πŸ’» Hardware

  • RTX 5060 Ti 16GB (LH-Tech AI)
  • GTX 750 Ti 4GB (AxionLab)

πŸ“’ Blog

https://huggingface.co/spaces/SupraLabs/Blog

🫢 Feedback and Support

Feedback and support are very welcome, and feel free to ask to join our organization if you want. :-)
