Server error when adding the OpenAI API to Open WebUI

I have Open WebUI on ZimaOS linked to another container serving ollama-nvidia. I want to add the OpenAI API to get a bigger choice of models, but after I added the variables I ended up with a broken container. I didn't even get a chance to test it. Has anyone else had this problem?

Which app are you using, and do you have a screenshot of the error?

Hi there, no, I didn't get a shot of it. I'm running Open WebUI from the Zima store, as opposed to the one from BigBear, which has different variables in its settings pane (I wonder if that could be the problem). I have ollama-nvidia linked to the back end, supposedly to optimise for GPU usage. Then in the Open WebUI interface I have also entered my OpenAI API key, so in theory I can switch between open-source and paid models. I have only installed these apps the way anyone would. Could there be a conflict between the two different ways ollama-nvidia and OpenAI make their API calls?
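
In case it helps with debugging, I was thinking of testing each backend separately with a rough script like the one below before blaming a conflict. The ollama-nvidia hostname and the default Ollama port 11434 are just my assumptions about this setup; the two requests are the standard Ollama model-list and OpenAI model-list calls.

```python
import os
import requests

# Rough connectivity check for both backends. Adjust OLLAMA_BASE_URL and
# OPENAI_API_KEY to your own setup; the defaults below are assumptions.
OLLAMA_URL = os.environ.get("OLLAMA_BASE_URL", "http://ollama-nvidia:11434")
OPENAI_KEY = os.environ.get("OPENAI_API_KEY", "")

# Ollama: list locally available models.
try:
    r = requests.get(f"{OLLAMA_URL}/api/tags", timeout=10)
    print("Ollama:", r.status_code, [m["name"] for m in r.json().get("models", [])])
except requests.RequestException as e:
    print("Ollama unreachable:", e)

# OpenAI: list models the key can access (also validates the key).
try:
    r = requests.get(
        "https://api.openai.com/v1/models",
        headers={"Authorization": f"Bearer {OPENAI_KEY}"},
        timeout=10,
    )
    print("OpenAI:", r.status_code)
except requests.RequestException as e:
    print("OpenAI unreachable:", e)
```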

For in-app usage questions like this, we recommend looking at the Open WebUI community or discussing it in our Discord community.


ok