Connection failed code -2146233088 (OpenAI provider for Trados Studio)

Dear all,

We deployed an Azure OpenAI model in our MS Azure infrastructure; according to our IT team, it should be up and running.

When setting up the connection details in the Trados Studio plugin I get the following error message (see screenshot below).

It doesn't seem to be an API error (https://platform.openai.com/docs/guides/error-codes/api-errors).

The Studio log file does not give more details:

OpenAI.Provider.API.AIProvider: 2025-08-20 17:16:49.4640 Info Method: "TestConfiguration", Message: "Fehler beim Senden der Anforderung." [German: "An error occurred while sending the request."]
OpenAI.Provider.API.AIProvider: 2025-08-20 17:17:01.3201 Info Method: "TestConfiguration", Message: "Fehler beim Senden der Anforderung." [German: "An error occurred while sending the request."]

I'm using Trados Studio 2024 SR1. 

Any suggestions?

Regards,

Bruno

Screenshot showing the Trados Studio 'Connections' window with a selected connection named 'gpt-3.5-turbo' and its endpoint URL.



  • Probability score: 0.6  

    Question improvement suggestion: Provide more details about the connection settings you are using and any specific configurations made in Azure.  

    In the meantime, based on the information you have provided, here is a suggested answer to your question:  

    Hello Bruno,

    It seems you're encountering a connection issue with the Azure OpenAI model in Trados Studio. Here are some steps you can take to troubleshoot the problem:

    **Step 1:** Verify Connection Settings  

    Double-check the connection details you entered in the Trados Studio plugin. Ensure that the API endpoint, authentication method, and any other required parameters are correctly configured.

    **Step 2:** Check Azure Configuration  

    Confirm with your IT team that the Azure OpenAI model is indeed running and accessible. You may want to test the connection using a different tool or method to ensure that the service is operational.

    **Step 3:** Review Firewall and Network Settings  

    Ensure that there are no firewalls or network configurations blocking the connection from Trados Studio to the Azure service. You might need to whitelist certain URLs or ports.

    **Step 4:** Examine Logs for More Details  

    While you mentioned that the Studio log file does not provide more details, it might be helpful to look for any additional logs in Azure that could give insights into the connection attempts.

    **Step 5:** Reset Trados Studio Settings  

If the issue persists, consider resetting your Trados Studio settings. You can do this by renaming the version-specific settings folder, e.g. `C:\Users\%USERNAME%\AppData\Roaming\Trados\Trados Studio\Studio18` for Trados Studio 2024 (the `Studio16` folder belongs to the older Trados Studio 2021). After renaming, restart Trados Studio and re-enter your connection details.

    **Step 6:** Update Trados Studio  

    Ensure that you are using the latest version of Trados Studio 2024 SR1. Check for any updates that might address known issues with the plugin.

    If you continue to experience difficulties, please provide more details about your connection settings and any specific configurations made in Azure, as this will help in diagnosing the issue further.

    I hope this answer addressed your question. If not, please reject the answer, add the question improvement suggestion information, and click the RWS Community AI checkbox on the right to try again.  

    RWS Community AI

  • None of the above helps; Trados Studio was already updated to the latest version, and the Azure configurations are correct. What "details about your connection settings" should I provide (assuming I can share them with you / in this forum)?

    Thank you,

    Bruno

  • Probability score: 0.7  

    Question improvement suggestion: Please specify the exact connection settings you are using, including the API endpoint, authentication method, and any other relevant parameters.  

    In the meantime, based on the information you have provided, here is a suggested answer to your question:  

    Hello Bruno,

    I understand that you're still facing challenges with the connection to the Azure OpenAI model in Trados Studio. Let's try to narrow down the issue further. Here are some steps and details that could help us troubleshoot more effectively:

    **Step 1:** Provide Connection Settings  

    Please share the specific connection settings you are using in the Trados Studio plugin. This includes:

    - The API endpoint URL

    - The authentication method (e.g., API key, OAuth)

    - Any headers or parameters you are including in the request

    **Step 2:** Confirm Azure Configuration  

    Since you mentioned that the Azure configurations are correct, it might be helpful to double-check the following:

    - Ensure that the Azure OpenAI service is set to allow requests from your Trados Studio instance.

    - Verify that the correct permissions are granted for the API key or authentication method you are using.

    **Step 3:** Test Connectivity  

    If possible, use a tool like Postman or curl to test the connection to the Azure OpenAI API using the same settings. This can help determine if the issue lies within Trados Studio or the Azure service itself.
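    As a minimal sketch of such a curl test (the resource name, deployment name, and API version below are placeholders; substitute the values from your own Azure deployment):

```shell
# Placeholders: replace with your own Azure resource, deployment, and API version.
RESOURCE="my-resource"        # Azure OpenAI resource name (placeholder)
DEPLOYMENT="gpt-35-turbo"     # deployment name from the Azure portal (placeholder)
API_VERSION="2024-02-01"      # api-version accepted by the service

# Azure OpenAI chat-completions endpoints follow this URL shape:
URL="https://${RESOURCE}.openai.azure.com/openai/deployments/${DEPLOYMENT}/chat/completions?api-version=${API_VERSION}"
echo "$URL"

# Uncomment to send a real request (requires AZURE_OPENAI_API_KEY to be set):
# curl "$URL" \
#   -H "Content-Type: application/json" \
#   -H "api-key: $AZURE_OPENAI_API_KEY" \
#   -d '{"messages":[{"role":"user","content":"Say hello"}]}'
```

    If curl succeeds with the same endpoint, key, and deployment that fail in the plugin, the problem is on the Trados Studio side rather than in Azure.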

    **Step 4:** Review Logs Again  

    While you mentioned that the Studio log file does not provide more details, it might be worth checking the Azure logs for any incoming request attempts or errors that could give us more insight.

    **Step 5:** Check for Updates Again  

    Even though you have updated to the latest version, it’s always good to double-check if there are any patches or updates specifically related to the OpenAI plugin that might have been released recently.

    If you can provide the connection settings and any additional context, it will greatly assist in diagnosing the issue further.

    I hope this answer addressed your question. If not, please reject the answer, add the question improvement suggestion information, and click the RWS Community AI checkbox on the right to try again.  

    RWS Community AI

  • Hi,

    Please send us the logs for the OpenAI provider from this location:

    C:\Users\[USERNAME]\AppData\Roaming\Trados AppStore\OpenAI Provider for Trados Studio\Logs

    We'll be able to find more details from the logs.

    Regards,

    Oana

    Oana Nagy | RWS Group

    _____________
    Design your own training!

    You've done the courses and still need to go a little further, or still not clear? 
    Tell us what you need in our Community Solutions Hub

  • Dear Oana, where can I send you the log? Pasting it here triggers your spam filter.

    Regards,

    Bruno

  • Hi Oana, maybe with this link I will be able to send it.
    Before 2025-08-20 17:03:54 I was probably using a wrong URL, so please focus on the log entries after that time.

    Regards,

    Bruno

    /cfs-file/__key/communityserver-discussions-components-files/160/OpenAI.Provider.Logs.20250820.txt

  • Hi Oana, we've now also tested with curl; using the same connection details, the request completes successfully.

    Regards,

    Bruno

  • Hi, I did some more testing with our IT. Could it be that the API version of our Azure deployment is not compatible with the version supported by the Trados Studio plugin? If so, which version do we need to select? Attached are the log files from when I tried to select another model version.

    Regards,

    Bruno

    OpenAI.Provider.ApplicationInstance: 2025-08-22 08:35:25.3369 Info Method: "LoadTelemetryService", : {"PlugInName":"OpenAI Provider for Trados Studio", "Version":"1.2.10.0", "Description":"OpenAI Provider for Trados Studio", "Author":"Trados AppStore Team", "RequiredProduct":{"name":"TradosStudio", "minversion":"18.1", "maxversion":"18.1.9"}, "Include":{"Files":["OpenAI.Provider.API.dll","OpenAI.Provider.dll.config","Trados.Telemetry.dll","Trados.Telemetry.dll.config","Trados.Terminology.API.dll","SegmentComparer.dll","Polly.dll","Polly.Core.dll"]}}  
    OpenAI.Provider.ApplicationInstance: 2025-08-22 08:57:17.0328 Info Method: "LoadTelemetryService", : {"PlugInName":"OpenAI Provider for Trados Studio", "Version":"1.2.10.0", "Description":"OpenAI Provider for Trados Studio", "Author":"Trados AppStore Team", "RequiredProduct":{"name":"TradosStudio", "minversion":"18.1", "maxversion":"18.1.9"}, "Include":{"Files":["OpenAI.Provider.API.dll","OpenAI.Provider.dll.config","Trados.Telemetry.dll","Trados.Telemetry.dll.config","Trados.Terminology.API.dll","SegmentComparer.dll","Polly.dll","Polly.Core.dll"]}}  
    OpenAI.Provider.API.AIProvider: 2025-08-22 10:37:56.6827 Info Method: "TestConfiguration", Message: "An error occurred while sending the request."  
    OpenAI.Provider.API.AIProvider: 2025-08-22 13:04:48.7807 Info Method: "TestConfiguration", Response Status: BadRequest, Payload: "{\"error\":{\"code\":\"BadRequest\",\"message\":\"Model o4-mini is enabled only for api versions 2024-12-01-preview and later\"}"  
    OpenAI.Provider.API.AIProvider: 2025-08-22 13:04:48.8018 Info Method: "TestConfiguration", Message: "Unexpected end when deserializing object. Path 'error', line 1, position 118."  
    OpenAI.Provider.API.AIProvider: 2025-08-22 13:13:41.7001 Info Method: "TestConfiguration", Message: "An error occurred while sending the request."  
    OpenAI.Provider.API.AIProvider: 2025-08-22 13:17:59.2798 Info Method: "TestConfiguration", Message: "An error occurred while sending the request."  
    OpenAI.Provider.API.AIProvider: 2025-08-22 13:19:09.3338 Info Method: "TestConfiguration", Message: "An error occurred while sending the request."  
    OpenAI.Provider.API.AIProvider: 2025-08-22 13:19:56.5591 Info Method: "TestConfiguration", Response Status: BadRequest, Payload: "{\"error\":{\"code\":\"BadRequest\",\"message\":\"Model o4-mini is enabled only for api versions 2024-12-01-preview and later\"}"  
    OpenAI.Provider.API.AIProvider: 2025-08-22 13:19:56.5591 Info Method: "TestConfiguration", Message: "Unexpected end when deserializing object. Path 'error', line 1, position 118."  
    OpenAI.Provider.API.AIProvider: 2025-08-22 13:20:00.6946 Info Method: "TestConfiguration", Response Status: BadRequest, Payload: "{\"error\":{\"code\":\"BadRequest\",\"message\":\"Model o4-mini is enabled only for api versions 2024-12-01-preview and later\"}"  
    OpenAI.Provider.API.AIProvider: 2025-08-22 13:20:00.6981 Info Method: "TestConfiguration", Message: "Unexpected end when deserializing object. Path 'error', line 1, position 118."  
    OpenAI.Provider.API.AIProvider: 2025-08-22 13:24:48.5436 Info Method: "TestConfiguration", Message: "An error occurred while sending the request."  
    OpenAI.Provider.API.AIProvider: 2025-08-22 13:25:11.4953 Info Method: "TestConfiguration", Message: "An error occurred while sending the request."  
    OpenAI.Provider.API.AIProvider: 2025-08-22 13:26:12.5116 Info Method: "TestConfiguration", Message: "An error occurred while sending the request."  
    OpenAI.Provider.API.AIProvider: 2025-08-22 13:27:08.2213 Info Method: "TestConfiguration", Message: "An error occurred while sending the request."  
    OpenAI.Provider.API.AIProvider: 2025-08-22 13:27:19.6733 Info Method: "TestConfiguration", Message: "An error occurred while sending the request."  
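    The BadRequest entries above ("Model o4-mini is enabled only for api versions 2024-12-01-preview and later") point at the `api-version` query parameter in the endpoint URL. As a rough illustration (the helper below is hypothetical, not part of the plugin), Azure OpenAI version strings are date-prefixed, so a too-old version can be detected like this:

```python
from urllib.parse import urlparse, parse_qs

# Minimum version required by o4-mini, per the BadRequest message in the log.
MIN_VERSION = "2024-12-01"

def api_version_ok(endpoint: str, minimum: str = MIN_VERSION) -> bool:
    """Check whether the api-version in an Azure OpenAI endpoint URL
    meets a minimum. Version strings are YYYY-MM-DD with an optional
    '-preview' suffix, so comparing the date prefix lexicographically
    is sufficient."""
    query = parse_qs(urlparse(endpoint).query)
    version = query.get("api-version", [""])[0]
    return version[:10] >= minimum

# Hypothetical endpoint with an older api-version, like the one rejected above:
old = "https://my-resource.openai.azure.com/openai/deployments/o4-mini/chat/completions?api-version=2024-02-01"
new = "https://my-resource.openai.azure.com/openai/deployments/o4-mini/chat/completions?api-version=2024-12-01-preview"
print(api_version_ok(old))  # False: 2024-02-01 predates the required version
print(api_version_ok(new))  # True
```

    If the plugin hard-codes an older `api-version` in the requests it builds, no client-side setting will satisfy a model that requires 2024-12-01-preview or later, which would explain why curl with an explicit newer version works while the plugin fails.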
    

  • Hi,

    Thank you for the logs and details. 

    We'll need to investigate and check the limitations/support for Azure, so I'll log it with Development.

    Once I have a fix, I'll update the post.

    Thank you,

    Oana

    Oana Nagy | RWS Group


  • Hi, can you confirm with your team that you have a model named o4-mini in your Azure deployments?
    Note: you can also provide the updated API version in the endpoint: learn.microsoft.com/.../api-version-lifecycle

    Patrick Andrew Hartnett | Developer Experience | Team Lead | RWS Group

  • Hi,

    Yes, I can confirm that we deployed the o4-mini model, as shown in the endpoint in the screenshot (and we can access it via curl).

    Regards,

    Bruno
