For those who keep getting a PEFT error on Google Colab:
I had ChatGPT help me fix the error. Here is the code that resolved everything and got the model running successfully on Google Colab:
!pip install transformers
!pip install peft
import torch
from accelerate import Accelerator
from diffusers import AnimateDiffPipeline, LCMScheduler, MotionAdapter
from diffusers.utils import export_to_gif
# Initialize the Accelerator
accelerator = Accelerator()
# Load your models and weights
adapter = MotionAdapter.from_pretrained("wangfuyun/AnimateLCM", torch_dtype=torch.float16)
pipe = AnimateDiffPipeline.from_pretrained("emilianJR/epiCRealism", motion_adapter=adapter, torch_dtype=torch.float16)
pipe.scheduler = LCMScheduler.from_config(pipe.scheduler.config, beta_schedule="linear")
pipe.load_lora_weights("wangfuyun/AnimateLCM", weight_name="sd15_lora_beta.safetensors", adapter_name="lcm-lora")
pipe.set_adapters(["lcm-lora"], [0.8])
# Optimize for GPU memory
pipe.enable_vae_slicing()
pipe.enable_model_cpu_offload()
# Generate the output
output = pipe(
    prompt="A wizard working on magic potions in his wizard hut, 4k, high resolution",
    negative_prompt="bad quality, worse quality, low resolution",
    num_frames=32,
    guidance_scale=2.0,
    num_inference_steps=10,
    generator=torch.Generator("cpu").manual_seed(0),
)
# Export the frames to a GIF
frames = output.frames[0]
gif_path = export_to_gif(frames, "animatelcm.gif")
# Download or display your GIF as needed
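For example, here is one way to preview or download the result from Colab (a minimal sketch; it reuses the gif_path returned by export_to_gif above and relies on the standard google.colab and IPython helpers):
from google.colab import files
from IPython.display import Image, display
# Preview the animated GIF inline in the notebook output
display(Image(data=open(gif_path, "rb").read(), format="gif"))
# Or download the file to your local machine
files.download(gif_path)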
I also had to explicitly install the diffusers library, so add !pip install diffusers to the code as well.
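With that, the install cell at the top of the notebook ends up looking something like this (the accelerate line is my addition, only needed if it is not already available in your Colab runtime):
!pip install diffusers
!pip install transformers
!pip install peft
!pip install accelerate  # only if accelerate is not already in the runtime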