enfugue
Size: 512 × 768 px
AI Model: DPO (Direct Preference Optimization) LoRA for XL and 1.5 - OpenRail++
Created: December 24, 2023, 10:51 PM
Picture ID: 65b95e8af7254048911f24c7