sd_xl_dpo_lora_v1 MJ52 Ultra0.9tBakedPrunes
Size: 672 × 1024 px
AI Model: DPO (Direct Preference Optimization) LoRA for XL and 1.5 - OpenRail++
Created: January 05, 2024, 4:48 AM
Picture ID: KAnZ-o0BpuE-PQ9eDdWI
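Since the listing only names the model and image size, here is a minimal sketch of how a DPO LoRA like this might be applied on top of an SDXL base checkpoint using Hugging Face diffusers. The local file path, LoRA scale, and prompt are assumptions for illustration, not details taken from this page.

```python
import torch
from diffusers import StableDiffusionXLPipeline

# Load the SDXL base pipeline in half precision on GPU.
pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",
    torch_dtype=torch.float16,
).to("cuda")

# Load the DPO LoRA weights. The path is hypothetical; point it at
# wherever the sd_xl_dpo_lora_v1 safetensors file actually lives.
pipe.load_lora_weights("sd_xl_dpo_lora_v1.safetensors")

# Fuse the LoRA into the base weights. A scale below 1.0 is a common
# choice for preference-tuned LoRAs; 0.9 here is an assumption.
pipe.fuse_lora(lora_scale=0.9)

# Generate at the dimensions listed for this picture (672 × 1024).
image = pipe(
    prompt="a detailed photograph",  # placeholder prompt
    width=672,
    height=1024,
    num_inference_steps=30,
).images[0]
image.save("output.png")
```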