---
license: apache-2.0
pipeline_tag: text-generation
language:
- en
- zh
---

# SongComposer

[💻Github Repo](https://github.com/pjlab-songcomposer/songcomposer) [📖Paper](https://arxiv.org/abs/2402.17645)
**SongComposer** is a large language model (LLM) based on [InternLM2](https://github.com/InternLM/InternLM) for lyric and melody composition in song generation.

We release the SongComposer series in two versions:

- SongComposer_pretrain: the pretrained SongComposer, initialized from InternLM2, which acquires basic knowledge of lyrics and melody.
- SongComposer_sft: the finetuned SongComposer for *instruction-following song generation*, including lyric-to-melody, melody-to-lyric, song continuation, and text-to-song.

### Import from Transformers

To load the SongComposer_sft model using Transformers, use the following code:

```python
from transformers import AutoTokenizer, AutoModel

ckpt_path = "Mar2Ding/songcomposer_sft"
tokenizer = AutoTokenizer.from_pretrained(ckpt_path, trust_remote_code=True)
model = AutoModel.from_pretrained(ckpt_path, trust_remote_code=True).cuda().half()
prompt = 'Create a song on brave and sacrificing with a rapid pace.'
model.inference(prompt, tokenizer)
```

### Open Source License

The code is licensed under Apache-2.0. The model weights are fully open for academic research and also allow free commercial use.
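The four instruction-following tasks differ only in the natural-language prompt passed to `model.inference`. A minimal sketch of how such prompts could be organized is below; the template wordings and the `build_prompt` helper are illustrative assumptions, not an official API of the model.

```python
# Hypothetical prompt templates for SongComposer_sft's four tasks.
# The exact phrasings here are assumptions for illustration only;
# only the "text to song" style prompt appears in the model card itself.
TASK_TEMPLATES = {
    "lyric2melody": "Compose a melody for the following lyrics: {content}",
    "melody2lyric": "Write lyrics for the following melody: {content}",
    "continuation": "Continue the following song: {content}",
    "text2song": "Create a song on {content}",
}

def build_prompt(task: str, content: str) -> str:
    """Return an instruction prompt string for the given SongComposer task."""
    return TASK_TEMPLATES[task].format(content=content)

# The resulting string would then be passed as `prompt` to model.inference.
print(build_prompt("text2song", "brave and sacrificing with a rapid pace."))
```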