sd_xl_dpo_lora_v1 · MJ52 Ultra0.9tBakedPrunes · DPO (Direct Preference Optimization) LoRA for XL and 1.5 - OpenRail++
Size: 672 x 1024
AI Model: DPO (Direct Preference Optimization) LoRA for XL and 1.5 - OpenRail++
Created: January 05, 2024, 4:48 AM
Picture ID: 65b95e9ff72540489120ff7f