BrainboxAI
Small, private, specialized AI models built for companies that cannot afford to send their data elsewhere.
Most AI models today are built for everyone. That makes them adequate at everything and great at nothing. We build the opposite: small models, trained on your domain, that run entirely on your own hardware.
What we ship
Fine-tuned foundation models. We take open-source models (Gemma, Llama, Qwen) and specialize them for narrow, high-value domains where precision matters more than breadth.
Open datasets. The training data we use is published openly under Apache 2.0, so researchers and builders can reproduce, extend, and improve on our work.
Applied research on Hebrew. Modern Hebrew is under-served by most LLMs. We work to change that, through better tokenization, curated corpora, and evaluation benchmarks tailored to the language.
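One concrete reason Hebrew is under-served: byte-level tokenizers trained mostly on English start from UTF-8 bytes, and every Hebrew letter costs two bytes where an ASCII letter costs one. A minimal sketch of that gap (an illustration only, not our tokenizer):

```python
# Why byte-level tokenization penalizes Hebrew: Hebrew letters occupy
# 2 bytes in UTF-8 while ASCII letters occupy 1, so a byte-level BPE
# with little Hebrew merge coverage starts from roughly twice as many
# symbols per character.

def utf8_bytes_per_char(text: str) -> float:
    """Average UTF-8 bytes per character of `text`."""
    return len(text.encode("utf-8")) / len(text)

english = "hello world"  # ASCII: 1 byte per character
hebrew = "שלום עולם"     # Hebrew letters: 2 bytes each, spaces: 1

print(utf8_bytes_per_char(english))  # 1.0
print(utf8_bytes_per_char(hebrew))   # ~1.9, nearly double
```

Better merge coverage from curated Hebrew corpora closes most of that gap, which is why tokenization is a research target in its own right.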
Flagship releases
| Name | Task | Base | Size | Training Data |
|---|---|---|---|---|
| law-il-E2B | Israeli legal reasoning | Gemma-4 E2B | 2B | 17,613 examples |
| code-il-E4B | Private coding assistant | Gemma-4 E4B | 4B | 40,000 examples |
| cyber-analyst-4B | Bilingual SOC analyst | Gemma-4 E4B | 4B | 1.16M + 107K delta |
Open datasets
- legal-training-il - 17,613 Israeli legal examples
- code-training-il - 40,000 test-filtered Python and TypeScript examples
- brainboxai_cyber_train - 1.16M cybersecurity examples
- brainboxai_cyber_delta - 107K-example correction delta
Why small?
Small models run on a single GPU, sometimes on a laptop. They don't leak your data to third parties. They're cheaper to serve, faster to iterate on, and easier to audit.
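The single-GPU claim is easy to sanity-check with back-of-the-envelope arithmetic (a sketch covering weights only; real footprints also include activations and KV cache):

```python
def weight_memory_gib(params_billion: float, bytes_per_param: float) -> float:
    """Approximate memory for model weights alone, in GiB."""
    return params_billion * 1e9 * bytes_per_param / 2**30

# A 4B-parameter model:
print(round(weight_memory_gib(4, 2), 1))    # fp16/bf16 -> 7.5 GiB
print(round(weight_memory_gib(4, 0.5), 1))  # 4-bit quantized -> 1.9 GiB
```

At fp16 a 4B model fits comfortably on a single consumer GPU, and a 4-bit quantization fits in laptop memory.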
They're not competing with GPT-5 or Claude. They're doing something different: one job, perfectly, in private.
Founded 2025 - Rehovot, Israel