Search Results
OpenAI’s GPT-4o Mini isn’t much better than rival LLMs • The Register
Optimum
Optimum is an extension of Transformers that provides a set of performance optimization tools for training and running models on targeted hardware with maximum efficiency. It is also a repository of small, mini, and tiny models.
google/bert_uncased_L-4_H-256_A-4 · Hugging Face
Repository of BERT models of all sizes, including small ones. Start with this model for testing.
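A minimal sketch of what "start with this model for testing" could look like, assuming the `transformers` and `torch` packages are installed; the checkpoint name encodes its size (L-4 layers, H-256 hidden, A-4 attention heads):

```python
# Hedged sketch: load the tiny BERT checkpoint from the Hugging Face Hub
# and run a single forward pass as a smoke test.
import torch
from transformers import AutoModel, AutoTokenizer

name = "google/bert_uncased_L-4_H-256_A-4"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModel.from_pretrained(name)

inputs = tokenizer("A tiny model for smoke tests.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# The last hidden dimension matches the H-256 in the checkpoint name.
print(outputs.last_hidden_state.shape[-1])  # 256
```

Because the whole model is only a few dozen megabytes, it downloads and runs quickly even on a CPU-only machine.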
Are there any tiny (1-3b) models finetuned for coding available in GGUF format? : LocalLLaMA
bigcode (BigCode)
Research community developing various code models, small and big. Models may not be instruction-tuned.
WizardLM (WizardLM)
deepseek-ai (DeepSeek)
They have the 1.3B version! This may be the best one to start with for Newspeak. Training should work even on Hugging Face.
stabilityai (Stability AI) - Stable Diffusion running on Hugging Face
Chat models. Not open source, but instruction-tuned and relatively small (3B). The 3B instruct model may be the best to try on Newspeak.
Dwitter - javascript demos in 140 characters
charlesnicholson/nanoprintf: The smallest public printf implementation for its feature set.
SectorLISP Now Fits in One Sector
These people have a Lisp that fits into 512 bytes (one boot sector).
GitHub - mufeedvh/binserve: A blazingly fast static web server in a single binary you can set up with zero code - written in Rust.
MVP.css - Minimalist stylesheet for HTML elements
GitHub - yanatan16/nanoajax: An ajax library you need a microscope to see