Salamander 24B v1

This is Checkpoint 82, a new della merge combining several Mistral Small 2501, 2506, and 2509 models, with Fallen Mistral 2503 also sprinkled in.

No refusals were observed in initial tests; the model should not require abliteration or jailbreaks.

In the della config below, weight scales each model's contribution, density is the fraction of delta parameters retained, and epsilon sets the spread of the magnitude-based drop probabilities around that density; lambda rescales the merged deltas.

architecture: MistralForCausalLM
models:
  ## BASE ##
  - model: B:\24B\Darkhn--Magistral-2509-24B-Text-Only
  ## 2501 ##
  - model: B:\24B\!models--ReadyArt--4.2.0-Broken-Tutu-24b
    parameters:
      weight: 0.1
      density: 0.9
      epsilon: 0.09
  - model: B:\24B\!models--ReadyArt--Broken-Tutu-24B-Transgression-v2.0
    parameters:
      weight: 0.1
      density: 0.9
      epsilon: 0.09 
  - model: B:\24B\PrivateMerge29 # This merge is no longer available on HF
    parameters:
      weight: 0.1
      density: 0.9
      epsilon: 0.09
  - model: B:\24B\!models--Nabbers1999--MS-24B-Bathory-GRPO
    parameters:
      weight: 0.1
      density: 0.9
      epsilon: 0.09
  - model: B:\24B\!models--dphn--Dolphin-Mistral-24B-Venice-Edition
    parameters:
      weight: 0.1
      density: 0.9
      epsilon: 0.09
  - model: B:\24B\!models--TroyDoesAI--BlackSheep-24B
    parameters:
      weight: 0.1
      density: 0.9
      epsilon: 0.09
  - model: B:\24B\!models--aixonlab--Eurydice-24b-v3.5
    parameters:
      weight: 0.1
      density: 0.9
      epsilon: 0.09
  - model: B:\24B\!models--Undi95--MistralThinker-v1.1
    parameters:
      weight: 0.1
      density: 0.9
      epsilon: 0.09 
  ## 2503 ##
  - model: B:\24B\!BeaverAI_Fallen-Mistral-Small-3.1-24B-v1e_textonly
    parameters:
      weight: 0.2
      density: 0.9
      epsilon: 0.09
  ## 2506 ##
  - model: B:\24B\!models--zerofata--MS3.2-PaintedFantasy-v2-24B
    parameters:
      weight: 0.1
      density: 0.9
      epsilon: 0.09
  - model: B:\24B\!models--TheDrummer--Cydonia-24B-v4.3
    parameters:
      weight: 0.2
      density: 0.9
      epsilon: 0.09
  - model: B:\24B\!models--TheDrummer--Rivermind-24B-v1
    parameters:
      weight: 0.1
      density: 0.9
      epsilon: 0.09
  - model: B:\24B\!models--trashpanda-org--MS3.2-24B-Mullein-v2
    parameters:
      weight: 0.1
      density: 0.9
      epsilon: 0.09
  - model: B:\24B\!models--allura-forge--ms32-final-TEXTONLY
    parameters:
      weight: 0.1
      density: 0.9
      epsilon: 0.09
  - model: B:\24B\!models--CrucibleLab--M3.2-24B-Loki-V1.3
    parameters:
      weight: 0.1
      density: 0.9
      epsilon: 0.09
  - model: B:\24B\!models--Darkhn--M3.2-24B-Animus-V7.1
    parameters:
      weight: 0.1
      density: 0.9
      epsilon: 0.09
  - model: B:\24B\MuXodious--Hearthfire-24B-absolute-heresy
    parameters:
      weight: 0.1
      density: 0.9
      epsilon: 0.09
  - model: B:\24B\!models--ReadyArt--Dark-Nexus-24B-v2.0
    parameters:
      weight: 0.1
      density: 0.9
      epsilon: 0.09    
  ## 2509 ##
  - model: B:\24B\!models--TheDrummer--Precog-24B-v1
    parameters:
      weight: 0.2
      density: 0.9
      epsilon: 0.09
  - model: B:\24B\!models--TheDrummer--Magidonia-24B-v4.3
    parameters:
      weight: 0.2
      density: 0.9
      epsilon: 0.09
  - model: B:\24B\llmfan46--MS3.2-PaintedFantasy-v4.1-24B-ultra-uncensored-heretic-v1
    parameters:
      weight: 0.2
      density: 0.9
      epsilon: 0.09
  - model: B:\24B\!models--zerofata--MS3.2-PaintedFantasy-v3-24B
    parameters:
      weight: 0.2
      density: 0.9
      epsilon: 0.09
  - model: B:\24B\MuXodious--Tiamat-24B-Magistral-PaperWitch-heresy\textonly
    parameters:
      weight: 0.2
      density: 0.9
      epsilon: 0.09
merge_method: della
base_model: B:\24B\Darkhn--Magistral-2509-24B-Text-Only
parameters:
  lambda: 1.0
  normalize: false
  int8_mask: false
  rescale: true
tokenizer:
  source: union
dtype: float32
out_dtype: bfloat16
name: C82
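
One way to materialize this config is mergekit's Python entry point (the mergekit-yaml CLI works equally well). A minimal sketch, assuming mergekit is installed and the config above is saved locally; the filename and output path are illustrative:

# Run the della merge above via mergekit's Python API.
# Assumes: pip install mergekit, and the local B:\24B\ checkpoints exist.
import yaml
import torch
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("salamander-c82.yml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path="./Salamander-24B-v1",      # illustrative output directory
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # merge on GPU when available
        lazy_unpickle=True,              # stream shards to reduce RAM pressure
    ),
)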

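Once merged (or pulled from the Hub), the checkpoint loads like any other Mistral Small 24B model. A minimal inference sketch with transformers; the prompt and sampling settings are illustrative, not tuned recommendations:

# Load Naphula/Salamander-24B-v1 and generate from a chat-formatted prompt.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Naphula/Salamander-24B-v1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # weights ship as BF16 (out_dtype above)
    device_map="auto",
)

messages = [{"role": "user", "content": "Write a short scene set in a volcano shrine."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
output = model.generate(input_ids, max_new_tokens=256, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))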