SyMuPe: EncDec baseline

EncDec-base is a Transformer-based Encoder-Decoder baseline for expressive piano performance rendering.

Introduced in the paper: SyMuPe: Affective and Controllable Symbolic Music Performance.

Architecture

  • Type: Transformer Encoder and Decoder
  • Objective: Causal Language Modeling (CLM)
  • Inputs:
    • Score features (y): Pitch, Position, PositionShift, Duration
    • Performance features (x): Velocity, TimeShift, TimeDuration, TimeDurationSustain
    • Conditioning (c_s): score-level Velocity and Tempo tokens that control dynamics and tempo.
  • Outputs: Categorical distributions for performance tokens.
  • Training: 300,000 iterations on the PERiScoPe v1.0 dataset, as described in the paper.
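To make the feature groups above concrete, here is a minimal sketch of how one score note and its performed counterpart could be represented as token tuples. The field names follow the model card; the dataclasses, value ranges, and quantization bins are illustrative assumptions, not the actual symupe tokenizer.

```python
from dataclasses import dataclass

# Hypothetical token groups mirroring the feature lists above.
# Vocabularies and bin sizes are assumptions for illustration only.

@dataclass
class ScoreToken:               # y: one token group per score note
    pitch: int                  # MIDI pitch, 0-127
    position: int               # quantized onset position within the bar
    position_shift: int         # quantized gap to the previous onset
    duration: int               # quantized notated duration

@dataclass
class PerfToken:                # x: predicted autoregressively by the decoder
    velocity: int               # MIDI velocity, 0-127
    time_shift: int             # quantized inter-onset interval
    time_duration: int          # quantized sounding duration
    time_duration_sustain: int  # duration as extended by the sustain pedal

# A two-note C-E fragment: the encoder reads the score tokens, and the
# decoder emits one performance token group per score note.
score = [ScoreToken(60, 0, 0, 4), ScoreToken(64, 4, 4, 4)]
perf = [PerfToken(72, 0, 10, 14), PerfToken(68, 9, 11, 11)]
assert len(score) == len(perf)  # aligned one-to-one per note
```

The one-to-one alignment between score and performance token groups is what lets the CLM objective condition each performance prediction on the encoded score.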

Quick Start

Before using this model, ensure you have the symupe library installed:

pip install -U symupe

Use the following code to render performances:

import torch
from symupe import AutoGenerator

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Build Generator by loading the model and tokenizer directly from the Hub
generator = AutoGenerator.from_pretrained("SyMuPe/EncDec-base", device=device)
# model, tokenizer = generator.model, generator.tokenizer

# Perform score MIDI (tokenization is handled inside)
gen_results = generator.perform_score(
    "score.mid",
    use_score_context=True,
    num_samples=8,
    seed=23,
)
# gen_results[i] is PerformanceRenderingResult(...) containing:
# - score_midi, score_seq, gen_seq, perf_seq, perf_midi, perf_midi_sus

# Save performed MIDI files
generator.save_performances(gen_results, out_dir="samples/encdec")

License

The model weights are distributed under the CC-BY-NC-SA 4.0 license.

Citation

If you use the model, please cite the paper:

@inproceedings{borovik2025symupe,
  title = {{SyMuPe: Affective and Controllable Symbolic Music Performance}},
  author = {Borovik, Ilya and Gavrilev, Dmitrii and Viro, Vladimir},
  year = {2025},
  booktitle = {Proceedings of the 33rd ACM International Conference on Multimedia},
  pages = {10699--10708},
  doi = {10.1145/3746027.3755871}
}