
Using bia-bob with Azure-hosted OpenAI model #236

@sebi06

Description


Hi @haesleinhuepf

I tried to test your really cool "bia-bob" a bit internally, but so far I just cannot get it to work, and I think it might be due to our internal environment.

  • when using our internal endpoint & key I can list all our models (GPT models and others), for example gpt-4o-2024-08-06:
...
# this code works fine
import os

from openai import AzureOpenAI

client = AzureOpenAI(
    api_version=os.environ["AZURE_OPENAI_API_VERSION"],
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
)
...

Later in my notebook I tried bob.initialize(endpoint="azure", model="gpt-4o-2024-08-06"), and also passing the endpoint URL directly, but I always get errors. I tried to modify "_utilities.py" because I think our internal endpoint confuses the model dispatch, since this branch:

...
elif "gpt-" in model:
    full_response = generate_response_from_openai(
        model,
        system_prompt,
        user_prompt,
        chat_history,
        image,
        vision_model=Context.vision_model,
        vision_system_prompt=vision_system_prompt,
    )

gets called because we are using GPT models, but ours are not hosted by OpenAI. This then leads to: NotFoundError: Error code: 404 - {'error': {'code': '404', 'message': 'Resource not found'}}

Our endpoint looks like: https://xyz.openai.azure.com/
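Since the dispatch in "_utilities.py" keys on the model name ("gpt-" in model), one workaround could be to key on the configured endpoint instead. A minimal sketch of such a check, assuming the Azure endpoint pattern shown above (is_azure_openai_endpoint is a hypothetical helper, not part of bia-bob):

```python
from urllib.parse import urlparse


def is_azure_openai_endpoint(endpoint: str) -> bool:
    """Heuristically decide whether an endpoint URL points at
    Azure-hosted OpenAI rather than api.openai.com.

    Azure OpenAI endpoints follow the pattern
    https://<resource-name>.openai.azure.com/.
    """
    host = urlparse(endpoint).hostname or ""
    return host.endswith(".openai.azure.com")
```

A dispatch based on this check would route gpt-4o-2024-08-06 to an Azure client whenever the endpoint is an Azure one, regardless of the model name.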

Do you have a hint what I need to modify to make this work?
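For reference, here is a minimal sketch of what an Azure-aware variant of generate_response_from_openai might look like, built from the working client code above. The names generate_response_from_azure_openai and build_messages are hypothetical, and the model argument would have to be the Azure *deployment name*, which can differ from the underlying model name:

```python
import os


def build_messages(system_prompt, user_prompt, chat_history=None):
    """Assemble the messages list for a chat-completions call:
    system prompt first, then prior history, then the new user turn."""
    messages = [{"role": "system", "content": system_prompt}]
    messages.extend(chat_history or [])
    messages.append({"role": "user", "content": user_prompt})
    return messages


def generate_response_from_azure_openai(model, system_prompt, user_prompt,
                                        chat_history=None):
    """Send a prompt to an Azure-hosted deployment.

    `model` is the deployment name configured in the Azure portal.
    The same environment variables as in the working snippet above
    are assumed to be set.
    """
    # imported lazily so build_messages stays usable without the SDK
    from openai import AzureOpenAI

    client = AzureOpenAI(
        api_version=os.environ["AZURE_OPENAI_API_VERSION"],
        azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
        api_key=os.environ["AZURE_OPENAI_API_KEY"],
    )
    response = client.chat.completions.create(
        model=model,
        messages=build_messages(system_prompt, user_prompt, chat_history),
    )
    return response.choices[0].message.content
```

This is only a sketch of one possible patch, not bia-bob's actual API; it ignores the vision/image path that the real function handles.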
