Think I asked this somewhere recently, but can't remember. What is the maximum TM size in MB/GB and/or TUs in Studio 2017? In DVX3, e.g., it is 2 GB per individual file, due to constraints of the underlying database system.
Michael
Unknown said: PS: I didn't remember/know that SDLTMs are SQL-based. That's great news.
Actually SQLite... my bad. But I reckon 40,000,000 TUs is going to be too much for the application around it. Maybe you could get a TM provider developed for your TM directly using the API. Probably not too hard, and then you could look it up and pretranslate from it directly.
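To illustrate the SQLite point, here is a minimal sketch that opens an .sdltm file with Python's standard sqlite3 module. The file name and the "translation_units" table name are assumptions (table names can vary between Studio versions), so the script lists the actual tables first and only counts TUs if the expected table is present.

    import sqlite3

    def inspect_sdltm(path):
        # An .sdltm file is a plain SQLite database, so the standard library can open it.
        conn = sqlite3.connect(path)
        cur = conn.cursor()
        # sqlite_master exists in every SQLite database and describes the schema.
        cur.execute("SELECT name FROM sqlite_master WHERE type = 'table'")
        tables = [row[0] for row in cur.fetchall()]
        print("Tables:", tables)
        # 'translation_units' is an assumed table name; only count if it is there.
        if "translation_units" in tables:
            cur.execute("SELECT COUNT(*) FROM translation_units")
            print("Translation units:", cur.fetchone()[0])
        conn.close()

    inspect_sdltm("BigTM.sdltm")  # hypothetical file name

Querying the file this way sidesteps Studio entirely, which is roughly what a custom TM provider built on the API would do in a more structured way.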
Paul Filkin | RWS Group
________________________
Design your own training!
You've done the courses and still need to go a little further, or still not clear?
Tell us what you need in our Community Solutions Hub
Hi Michael,
I PM'ed you about this yesterday, thinking I couldn't reply to this topic, but I was wrong. So I'll share my answer with anyone who wants to read it:
Last week I was doing a project with mixed content. I used a single large TM of 5.98 GB together with some 'smaller' TMs totalling about 3 GB (some of which had fragment alignment) in Studio 2017. So that's an impressive 9 GB of TM in total. Fragment alignment is a different matter: I don't even dare to dream about upLIFTing that 6 GB TM. I'm using upLIFTed TMs of about 2 GB each.
With large TMs you need to plan your setup really well. For some pages of the document I would use this full 9 GB in real time, so not just for pretranslation. I got mixed performance results, sometimes 3 seconds, sometimes 20 (both for TM matching and concordance searches). When I got to a section where only one topic was relevant, I'd deselect some TMs in the project settings. I'd never needed so many TMs before, but this time it was a text with mixed content and I had to make sure I was referencing the right court rulings, terminology, etc.
I now realize that your question is on a slightly different topic, but at least my answer is somewhat related :)
Hi,
I came across a source for some large TMs last night, so I had a little play upgrading a TMX containing 9.5 million TUs... thought you might be interested:
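If anyone wants to sanity-check a TU count like that before attempting an upgrade, here is a minimal sketch, assuming a standard TMX file where each translation unit is a <tu> element; the file name is hypothetical. It streams the file rather than loading millions of TUs into memory.

    import xml.etree.ElementTree as ET

    def count_tus(tmx_path):
        # Stream the TMX instead of parsing it all at once, since the file can be huge.
        count = 0
        for _, elem in ET.iterparse(tmx_path, events=("end",)):
            if elem.tag == "tu":
                count += 1
            elem.clear()  # free the element so memory use stays flat
        return count

    print(count_tus("bigfile.tmx"))  # hypothetical file name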
Paul Filkin | RWS Group
________________________
Design your own training!
You've done the courses and still need to go a little further, or still not clear?
Tell us what you need in our Community Solutions Hub