Can LLM suggestions take my TMs into account?

Hi,

My company will soon test the OpenAI provider in Trados. We're configuring the Azure endpoints we'll use for these tests, and we're now looking at how to integrate our resources into the suggestions. I know there is a checkbox we can tick to include the terminology from our termbases in the suggestions. However, it's not clear to me whether there is a similar option where the suggestion would take our translation memories into account, so that it suggests segments in the "voice" of our company. Is there such an option, or do we have to configure our LLM on our side to do it?

Thank you,

Charles

  • Hi  , Yes, it is possible to have the AI-generated suggestions consider your existing TMs to reflect your company's voice and style. While the OpenAI provider primarily generates translations using AI, you can enhance its output by incorporating your TMs into the prompt sent to the AI. This approach instructs the AI to align its suggestions with your established translations.

    How to Integrate TMs into AI Suggestions

    Here's how you can achieve this:

    1. Enable the Option to Include Existing Translations:

    • OpenAI Provider Settings: In the OpenAI provider settings within Trados Studio, there is an option to include the existing translation from your TM in the prompt sent to the AI. Make sure this option is enabled.
    • Purpose: By including the existing translation, you're providing the AI with a reference to your preferred terminology and style, which it can use to generate more consistent suggestions.

    2. Customize the AI Prompt:

    • Use Key Terms in Your Prompt: When writing prompts, you can use key terms such as `Source` and `Translation` to guide the AI.
    • Example Prompt:
      ```
      Source: [Your source text]
      Translation: [Your existing translation from the TM]
      User Prompt: Improve this translation while maintaining our company's style and terminology.
      ```
    • Explanation: This prompt tells the AI to consider both the source text and your existing translation, refining it as needed while adhering to your company's voice.

    3. Incorporate Terminology from Termbases:

    • Enable Terminology Integration: As you mentioned, you can check the option to include terminology from your termbases. This ensures that specific terms are used consistently in the AI's suggestions.
    • Provide Terms in the Prompt: You can include a list of terms and their preferred translations in your prompts to reinforce their usage.

    Example Usage

    Here's an example of how you might structure your prompt. Depending on the options you select in the settings, the Terms, the Translation, or both will be included with your prompt when asking the AI to generate a new translation.

    Source: "Our innovative technology accelerates growth and maximizes efficiency."
    Translation: "La nostra tecnologia innovativa accelera la crescita e massimizza l'efficienza."

    Terms:
    - innovative technology: tecnologia innovativa
    - accelerates growth: accelera la crescita
    - maximizes efficiency: massimizza l'efficienza

    User Prompt: "Refine this translation using the specified terms and maintain our company's tone."

  • Thank you for the answer! I never thought about providing terms in the prompt to reinforce their usage.

    However, if I understand correctly, this will only improve a translation if there is a match between the source segment and the TM. Is there no way to "train" the AI on our TMs so that, when it is prompted with a new sentence that has no TM match, it still tries to generate a translation in the voice of my company?

  • Hi  , currently, to use existing translations as context for AI-generated translations, you need to pre-translate the document or have existing translations for the segments. This allows the OpenAI provider to utilize the pre-translated content as context when refining translations or generating new ones.

    We're reviewing this process for future feature enhancements. Our goal is to automate the workflow by enabling the system to automatically search for existing translations from the Translation Memory (TM) and incorporate them as context in the prompts sent to the AI. This means you wouldn't need to manually pre-translate the document—the system would handle it, facilitating a smoother and more efficient translation process.
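
    Purely as an illustration of that kind of workflow (this is a naive sketch, not how Trados or the provider implements it), a TM lookup feeding context into the prompt could look something like this, using a simple similarity search over exported TM segments:

    ```
    # Naive sketch: find the closest TM match for a new segment and pass it to
    # the AI as context. A real TM engine uses proper fuzzy matching and scoring.
    from difflib import SequenceMatcher

    tm = [  # (source, target) pairs exported from the TM
        ("Our innovative technology accelerates growth and maximizes efficiency.",
         "La nostra tecnologia innovativa accelera la crescita e massimizza l'efficienza."),
        ("Our platform reduces costs and improves quality.",
         "La nostra piattaforma riduce i costi e migliora la qualità."),
    ]

    def best_tm_match(segment, threshold=0.7):
        """Return the most similar TM entry above the threshold, or None."""
        scored = [(SequenceMatcher(None, segment, src).ratio(), src, tgt) for src, tgt in tm]
        score, src, tgt = max(scored)
        return (src, tgt) if score >= threshold else None

    new_segment = "Our innovative technology reduces costs and improves quality."
    match = best_tm_match(new_segment)
    context = f"Similar existing translation:\nSource: {match[0]}\nTranslation: {match[1]}\n\n" if match else ""
    prompt = context + "Translate in the same style and terminology:\n" + new_segment
    print(prompt)
    ```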

    Regarding the question about training or fine-tuning the LLM: this is already possible via the OpenAI API/UI. However, it does come with an additional cost, first to perform the fine-tuning operation itself and then in the form of a higher price per token when using the fine-tuned model.
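
    For reference, fine-tuning on TM content outside of Trados roughly follows the pattern below. This is a sketch against the standard OpenAI fine-tuning API; the file name, model name and system prompt are placeholders, and Azure OpenAI has its own equivalent fine-tuning workflow.

    ```
    # Rough sketch: turn TM segments into chat-format JSONL and start a
    # fine-tuning job via the OpenAI API (placeholders throughout).
    import json
    from openai import OpenAI

    tm_pairs = [
        ("Our innovative technology accelerates growth and maximizes efficiency.",
         "La nostra tecnologia innovativa accelera la crescita e massimizza l'efficienza."),
        # ... more (source, target) segments exported from the TM
    ]

    # 1. Write the training data in the chat format expected for fine-tuning.
    with open("tm_training.jsonl", "w", encoding="utf-8") as f:
        for source, target in tm_pairs:
            record = {"messages": [
                {"role": "system", "content": "Translate from English to Italian in our company's style."},
                {"role": "user", "content": source},
                {"role": "assistant", "content": target},
            ]}
            f.write(json.dumps(record, ensure_ascii=False) + "\n")

    # 2. Upload the file and start the fine-tuning job.
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    training_file = client.files.create(file=open("tm_training.jsonl", "rb"), purpose="fine-tune")
    job = client.fine_tuning.jobs.create(training_file=training_file.id, model="gpt-4o-mini-2024-07-18")
    print(job.id)
    ```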

    We are currently reviewing how we could work with pre-trained LLMs (from Hugging Face) locally to achieve similar results, but that is a different topic...
