Issues: lllyasviel/Omost

Issues list

lamas
#118 opened Sep 16, 2024 by gustvta
Can FLUX be supported?
#116 opened Aug 25, 2024 by wuliang19869312
It would be great if Flux were supported!
#115 opened Aug 23, 2024 by e813519
How to train one's own LLM (Mixtral)?
#111 opened Jul 29, 2024 by tdlhyj
Train a Llama 3.1 model
#110 opened Jul 26, 2024 by revolvedai
So where did this go?
#107 opened Jul 13, 2024 by Lustwaffel
Windows version
#106 opened Jul 10, 2024 by Archviz360
Error when running on Linux
#105 opened Jul 4, 2024 by a937983423
Loop and Repeat Output
#103 opened Jun 26, 2024 by Liuqh12
Problem with GTX 1080Ti
#101 opened Jun 23, 2024 by quwassar
Does not work on an RTX 4060 Ti
#99 opened Jun 21, 2024 by canytam-krystal
CPU Torch version reports an error
#96 opened Jun 14, 2024 by skyqvn
Any plan to support IP-Adapter?
#94 opened Jun 13, 2024 by 1093842024
Support SD3
#93 opened Jun 13, 2024 by Green-li