How to use Azure OpenAI in Trados Studio 2024

Dear all,

Does anybody know which Azure OpenAI option we can use to run some minor tests of the LLM/Azure integration with machine translation in Trados 2024?

There are numerous options on the Azure price list (Azure OpenAI Service – Preise | Microsoft Azure, country: Germany), and we don't know which one or ones are suitable.

Furthermore, is there anybody who has already set up an Azure account and can quickly guide us through the process?

Thank you for your help!

Sabrina

  • Hi  , I'll try to help you as best I can. Although I haven't set this up in Azure myself, there is a lot of documentation from MS around this, which I've used to put together the minimal walk-through below.

    Recommended models (balance of quality/speed/cost)

    • Default for testing and most production: gpt-4o-mini (OpenAI or Azure OpenAI). Lowest cost, very fast, strong translation quality.
    • When quality really matters: gpt-4o. Better handling of nuance/formatting; higher cost and slightly slower.
    • Not recommended for MT: o1/o3 (reasoning-focused, slow/expensive) and 3.5 family (older, weaker quality/formatting).

    Useful Links

    Quick Azure OpenAI setup (minimal-cost)

    1. Access and subscription
    2. Create the resource
      • Azure Portal > Create resource > Azure AI Services > Azure OpenAI.
      • Choose an EU region that has your target model (often West Europe or North Europe; Germany regions may not have all models).
    3. Deploy a model
      • Go to Azure AI Foundry (https://ai.azure.com) or the resource in the Portal.
      • Create a Deployment for gpt-4o-mini (or gpt-4o). Give it a short deployment name, e.g., gpt4o-mini.
    4. Get connection details
      • From the Azure OpenAI resource: copy Endpoint (looks like https://YOURRESOURCE.openai.azure.com), and one of the Keys.
      • Note the API version your deployment supports (for chat/completions; a recent 2024 date is fine).
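Before wiring anything into Studio, you can sanity-check the deployment directly. A minimal sketch using only the Python standard library; the resource name, deployment name, key and API version below are placeholders you must replace. Note that Azure routes requests by deployment name in the URL path, with the API version as a query parameter:

```python
import json
import urllib.request

# Placeholder values -- substitute your own resource name, deployment
# name, key, and a chat-completions API version your region supports.
ENDPOINT = "https://YOURRESOURCE.openai.azure.com"
DEPLOYMENT = "gpt4o-mini"
API_VERSION = "2024-06-01"
API_KEY = "<your-key>"

def build_request(endpoint, deployment, api_version):
    """Assemble the Azure OpenAI chat-completions URL and a tiny test payload.

    Azure puts the deployment name in the URL path and the API version
    in the query string; there is no "model" field in the body.
    """
    url = (f"{endpoint}/openai/deployments/{deployment}"
           f"/chat/completions?api-version={api_version}")
    payload = {
        "messages": [
            {"role": "system", "content": "Translate the user text into German."},
            {"role": "user", "content": "The file has been saved."},
        ],
        "temperature": 0.1,
    }
    return url, payload

url, payload = build_request(ENDPOINT, DEPLOYMENT, API_VERSION)
if API_KEY != "<your-key>":  # only send once real credentials are filled in
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json", "api-key": API_KEY},
    )
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["choices"][0]["message"]["content"])
```

If this returns a German sentence, the resource, deployment and key are all correct, and any remaining trouble is on the Studio side.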

    Configure the OpenAI Provider in Studio 2024

    • Install the OpenAI Provider from the RWS AppStore.
    • Studio > Project Settings > Language Pairs > Translation Memory and Automated Translation > Add > OpenAI Provider.
    • Choose a backend:
      • OpenAI API: paste your OpenAI API key; Base URL https://api.openai.com/v1; Model gpt-4o-mini or gpt-4o.
      • Azure OpenAI: select Azure; paste Endpoint (root URL only), API key, Deployment name (the one you created), and API version.
    • Suggested settings for translation:
      • Temperature: 0.0–0.2 for consistent, non-creative output.
      • Ensure “chat/completions” is used.
    • Put the provider near the top of your MT list so it’s queried during interactive translation and pre-translate.
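To make the backend choice above concrete, here is an illustrative sketch of how the two backends differ at the HTTP level (the Studio plugin builds these requests internally; the function names and values here are just for illustration). This is why the Azure backend needs a deployment name rather than a model name:

```python
def openai_request(model, api_key):
    """OpenAI API: the model name goes in the JSON body, auth is a Bearer token."""
    return {
        "url": "https://api.openai.com/v1/chat/completions",
        "headers": {"Authorization": f"Bearer {api_key}"},
        "body": {"model": model, "temperature": 0.2, "messages": []},
    }

def azure_request(endpoint, deployment, api_version, api_key):
    """Azure OpenAI: the deployment name goes in the URL, auth is an api-key header."""
    return {
        "url": (f"{endpoint}/openai/deployments/{deployment}"
                f"/chat/completions?api-version={api_version}"),
        "headers": {"api-key": api_key},
        "body": {"temperature": 0.2, "messages": []},  # no "model" field
    }
```

So when the provider asks for a "model", that only applies to the OpenAI API backend; for Azure, the deployment name you created in step 3 takes its place.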

    Patrick Andrew Hartnett | Developer Experience | Team Lead | RWS Group

  • Dear All,

    I would appreciate some input from colleagues who have this actually working in Trados Studio 2024 SR1.

    I have followed the recommended setup exactly as described:

    • Azure OpenAI access enabled

    • Azure OpenAI resource created (West Europe)

    • Model deployed (gpt-4o-mini)

    • Endpoint and API key copied from the Azure OpenAI resource

    However, in Trados Studio 2024 SR1 (18.1.3.x), the built-in OpenAI Provider for Trados only exposes a connection form with:

    • OpenAI-style model selection (gpt-3.5, gpt-5, etc.)

    • Mandatory Reasoning effort

    • Model type “Chat Completions (Reasoning)”

    • No field for Azure deployment name

    As a result, Trados sends OpenAI-native parameters (e.g. reasoning_effort) that are not accepted by Azure OpenAI, leading to 400 errors such as “Unrecognised request: reasoning_effort”.
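To illustrate the mismatch outside Studio: a non-reasoning Azure deployment such as gpt-4o-mini rejects the request body as soon as it contains OpenAI-only keys like reasoning_effort, and only succeeds once those keys are removed. A hypothetical sketch (sanitize_for_azure is my own illustration, not part of any plugin):

```python
# Parameters the OpenAI-native API accepts but a non-reasoning Azure
# deployment (e.g. gpt-4o-mini) rejects with HTTP 400.
UNSUPPORTED_FOR_AZURE = {"reasoning_effort"}

def sanitize_for_azure(payload):
    """Return a copy of the request body with OpenAI-only keys removed."""
    return {k: v for k, v in payload.items() if k not in UNSUPPORTED_FOR_AZURE}

request_body = {
    "messages": [{"role": "user", "content": "Translate: Hello"}],
    "temperature": 0.1,
    "reasoning_effort": "medium",  # added by the plugin; Azure rejects this
}
clean = sanitize_for_azure(request_body)  # body Azure would accept
```

Since the plugin always adds reasoning_effort and offers no way to suppress it, there seems to be no way to reach the Azure deployment from this UI.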

    The documentation suggests that Azure OpenAI can be configured by entering Endpoint, API key and Deployment name, but that Azure-deployment-specific UI does not appear in my Studio 2024 SR1 installation.

    My question to the community:
    Has anyone successfully connected Azure OpenAI deployments (e.g. gpt-4o-mini) directly to Trados Studio 2024 SR1 using the built-in OpenAI Provider?
    If so:

    • Which exact Studio build / plugin version are you using?

    • Do you see a deployment-based Azure OpenAI configuration screen (without reasoning parameters)?

    Any confirmation, workaround, or clarification would be greatly appreciated.
