DPO (Direct Preference Optimization) LoRA for XL and 1.5 - OpenRail++
Size: 1024 × 1024 px
AI Model: DPO (Direct Preference Optimization) LoRA for XL and 1.5 - OpenRail++
Date created: December 24, 2023, 10:50 PM
Picture ID: 65b95e8af7254048911f24ba