Does the AI Professional plugin work with my own endpoint?

Has anyone tested using the AI Professional plugin (appstore.rws.com/.../239) to connect to their own Azure OpenAI endpoint to provide GPT-4 suggestions?

I tried to configure my Azure OpenAI instance, but couldn't get it working: the plugin can't read the model list when I enter my Azure OpenAI endpoint.

[Screenshot: Trados Studio settings window showing an 'Add Model Connection' form with fields for Name, Provider (set to AzureOpenAI), Endpoint URL, API Key, and Model, plus a 'Test Connection' button.]
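
For context, here is a minimal sketch (Python, using the openai package) of the kind of chat completions call the plugin would need to make against an Azure OpenAI deployment; the resource name, deployment name, key and api-version are all placeholders, and running something like this directly is a way to confirm the resource itself responds outside the plugin:

    from openai import AzureOpenAI  # pip install "openai>=1.0"

    # All values below are placeholders for your own resource and deployment.
    client = AzureOpenAI(
        azure_endpoint="https://YOUR_RESOURCE_NAME.openai.azure.com",
        api_key="YOUR_API_KEY",
        api_version="2024-03-01-preview",
    )

    # In Azure OpenAI, 'model' is the deployment name, not the base model id.
    completion = client.chat.completions.create(
        model="YOUR_DEPLOYMENT_NAME",
        messages=[{"role": "user", "content": "Say hello"}],
    )
    print(completion.choices[0].message.content)

If a direct call like this succeeds but the plugin still can't read the model list, the problem presumably sits in the plugin configuration rather than with the Azure resource itself.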



Reply
  • Hi, sorry to hear you're having problems connecting to your Azure OpenAI account.

    Finally we found that this plugin only supports api-version=2023-07-01-preview or older.

    Can you point me to the documentation stating that the plugin only supports api-version 2023-07-01-preview or older?


    The plugin should not be affected if you use a deprecated API version (e.g. 2023-07-01-preview). However, I would still recommend using the latest API version when adding the endpoint in the plugin settings.
    See the following page for an overview of the supported API releases, including deprecated versions: https://learn.microsoft.com/en-us/azure/ai-services/openai/api-version-deprecation

    Here is an example of the full endpoint format supported in the plugin when connecting to Azure OpenAI; change the api-version as needed. I've used the current latest version in these examples, and there's a quick connectivity check sketched after the list.

    • chat completions (gpt-4, gpt-3.5-turbo etc...)
      https://YOUR_RESOURCE_NAME.openai.azure.com/openai/deployments/YOUR_DEPLOYMENT_NAME/chat/completions?api-version=2024-03-01-preview
    • legacy completions (gpt-3.5-turbo-instruct)
      https://YOUR_RESOURCE_NAME.openai.azure.com/openai/deployments/YOUR_DEPLOYMENT_NAME/completions?api-version=2024-03-01-preview
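
    If it helps with troubleshooting, here is a minimal sketch (Python, using the requests library) of a direct call to the same chat completions endpoint shape shown above; the resource name, deployment name, key and api-version are all placeholders to replace with your own values:

      import requests

      # Placeholders: substitute your own resource, deployment, key and api-version.
      RESOURCE = "YOUR_RESOURCE_NAME"
      DEPLOYMENT = "YOUR_DEPLOYMENT_NAME"
      API_KEY = "YOUR_API_KEY"
      API_VERSION = "2024-03-01-preview"  # use the same value you enter in the plugin

      url = (
          f"https://{RESOURCE}.openai.azure.com/openai/deployments/"
          f"{DEPLOYMENT}/chat/completions?api-version={API_VERSION}"
      )

      # Azure OpenAI authenticates with the 'api-key' header rather than a Bearer token.
      response = requests.post(
          url,
          headers={"api-key": API_KEY, "Content-Type": "application/json"},
          json={"messages": [{"role": "user", "content": "ping"}], "max_tokens": 5},
          timeout=30,
      )
      print(response.status_code)
      print(response.json())

    A 200 response here confirms that the endpoint, deployment name, key and api-version are valid, which narrows any remaining problem down to the plugin configuration itself.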
