Hello, all
My team is working on a Lam Research project, and its working files are usually very big.
As a result, Studio shuts down frequently because of the file size.
We raised a ticket with the IT team about this, but they said the issue cannot be solved for now.
Answer from the IT team: When processing large files or projects (40 MB or more) in Studio, the PC may freeze or Studio may shut down frequently.
In this case, we recommend cutting the large files into several smaller files and processing them in Studio.
But here are some further issues:
1. We've split the package into files with a smaller word count, but the file size is still big.
We can split the package into very small files, but it takes too much time to split and merge them.
Also, it is hard to track every small piece, and the quality risk increases.
2. We've split the package into files smaller than 40 MB (a rough size-check sketch follows after this list), but sometimes they still do not work well on the vendor's machine.
They reported Studio shutdown issues and tried to reject the project.
3. This project includes many cross-file repetitions and fuzzy matches, and they need a context check.
But we cannot open all the files together, so it is hard to check consistency properly.
The Batch Find and Replace add-in does not work in Studio 2019, and transistor cannot be used because of the tags.
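
This does not fix the crashes, but for the tracking problem in points 1 and 2, a small helper script could at least list every split piece and flag anything at or over the 40 MB limit that the IT team mentioned. Below is a rough Python sketch; the folder path and the .sdlxliff extension filter are only example assumptions, not our real setup.

import os

LIMIT_MB = 40  # the size limit mentioned by the IT team
ROOT = r"C:\Projects\split_pieces"  # example path only, adjust to the real layout

oversized = []
for dirpath, _dirnames, filenames in os.walk(ROOT):
    for name in filenames:
        if not name.lower().endswith(".sdlxliff"):  # assumed file type of the split pieces
            continue
        path = os.path.join(dirpath, name)
        size_mb = os.path.getsize(path) / (1024 * 1024)
        flag = "  <-- over limit" if size_mb >= LIMIT_MB else ""
        print(f"{size_mb:8.1f} MB  {path}{flag}")
        if size_mb >= LIMIT_MB:
            oversized.append(path)

print(f"\n{len(oversized)} file(s) at or above {LIMIT_MB} MB")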
Do you have any advice on how to reduce these negative impacts?
If so, please let me know.
Thanks in advance.