API call to save Document to Local Storage?

I'm working on some scripts to streamline how my team interacts with our SDL Publication Manager / Knowledge Center (v12.0.2417.0).

I've gotten the Content Manager API calls for DocumentObj working to get a file out of the repository and decode the file content. However, what I'd like to do next is save that file into the user's Local Storage folder. Ideally, I'd like an API call that will do this so I can avoid having to find the local storage location and re-create the correct local storage meta-data file format.

Does an API call exist for this? If not, is there documentation that would help me to pull the local storage directory out of the user's Publication Manager settings?

  • I'm a bit confused by your intention. I feel two separate things are being mixed here.
    If by the user's local storage folder you mean an operating-system-specific location, e.g. C:\Users\username, then this is not connected to the Content Manager API, since that API is hosted on the server side, often across multiple servers.

    Just in case, check out the ISHRemote PowerShell module, which offers PowerShell cmdlets over the Content Manager API in a very convenient way.
  • I'm referring to the Local Storage path set via Publication Manager -> Tools -> Options -> Local Storage. It looks like it is stored locally on the client machine as <localStoragePath> in %LOCALAPPDATA%\SDL\InfoShare Client\12.0\Trisoft.InfoShare.config.

    On a local system with SDL Publication Manager and Authoring Bridge (we're using Oxygen as the XML author), if I check out and open a map via Publication Manager or Authoring Bridge, the SDL tool will automatically retrieve all of the objects referenced by that map and save a copy of the object's XML data and the ISHMetadata as a file in the localStoragePath.

    What I want to do is replicate the same functionality in my script. If it isn't in the Content Manager API, perhaps accessing the same library which Publication Manager and/or Authoring Bridge are using would be OK as well. Unfortunately, that doesn't appear to be documented so it would be much more difficult. I just don't want to have to re-create the behavior from scratch if I can avoid it.

    I'm looking at ISHRemote now. I haven't done a thorough dive yet but it doesn't look like ISHRemote has a function for this either. It can retrieve the data for me but not store it locally.

    The goal here is to streamline the interface for our SMEs so they can edit their files more easily without having to poke around in the SDL Repo Browser and check out each file individually.
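For scripting, the <localStoragePath> value can be read straight from that config file rather than hardcoded. A minimal Python sketch, assuming the element name and file location quoted above; verify both against your own installation:

```python
import os
import xml.etree.ElementTree as ET

def read_local_storage_path(config_file):
    """Return the <localStoragePath> value from a Trisoft.InfoShare.config file.

    Assumption: the element is named localStoragePath and appears somewhere
    in the config document; check your own file to confirm.
    """
    tree = ET.parse(config_file)
    # Search anywhere below the root, since the exact nesting may vary.
    elem = tree.getroot().find(".//localStoragePath")
    if elem is None or not elem.text:
        raise ValueError("no <localStoragePath> element found")
    # Expand any environment variables the value might contain.
    return os.path.expandvars(elem.text.strip())

if __name__ == "__main__":
    config = os.path.join(
        os.environ.get("LOCALAPPDATA", ""),
        "SDL", "InfoShare Client", "12.0", "Trisoft.InfoShare.config",
    )
    if os.path.exists(config):
        print(read_local_storage_path(config))
```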
  • Thank you for the clarification. The server-side API does not (and cannot) own client-side settings.

    Client Tools is a large application with many layers implemented in different assemblies. I would not advise creating dependencies on those assemblies, as this is not officially supported and you will have a hard time figuring out the dependencies on DLLs, logging, and configuration values yourself.

    First of all, you need to make sure you are working with the same path that holds the XML and metadata files. You can hardcode it in your scripts or simply read the value from the file you mentioned. The location of the client tools can easily be calculated.

    It's not clear to me what you are building for your SMEs. How would an SME avoid the SDL Repository Browser, for example? If the SME makes changes in the local storage path, who is going to push them to the server (check-in)? Is it another UI, or will they open the XML files directly?

    A few thoughts to provide insight and help the rest of the discussion:

    If you are planning to use the local storage as a working folder, please keep in mind that that location represents functional state for the editors and Publication Manager. If you, for example, change the files or push them to the database, you will be interfering with their logic.

    If you plan to build a UI, please consider using the Authoring Bridge. The Authoring Bridge is what allows the editors to consume the functionality and WinForms of Publication Manager. As such, it is nothing more than an SDK/gateway into the forms and the UI you see in Publication Manager or Oxygen. That includes the SDL Repository Browser that you mentioned you want to avoid.
  • Alex,

    The short version is, my SMEs are just starting with the system now and are complaining (a lot!) about two things:

    - They can't do a Find/Replace All on referenced topics in a map (Oxygen can find all references of a search query, but can't replace).
    - They have to manually check out (SDL Knowledge Center -> Check Out -> "Yes" on the popup window) each file they edit, which quickly gets tiresome when they have to make a large number of changes.

    As far as I've been able to find, there isn't a way to solve these problems with the built-in Knowledge Center features.

    In our particular application only one person (the relevant SME) ever works on a given publication, so I've been trying to create a "Checkout All" script that will check out all topics referenced in a particular DITA map. I have the checkout logic working, but I can't get it to play nicely with Oxygen + Authoring Bridge. For example, if I open a topic in Oxygen after it is checked out via the script, Oxygen/Authoring Bridge still treats it as read-only, and Check In is greyed out.

    I understand that neither of these requests necessarily fits the standard DITA reuse ideas or the collaboration features that are the goal of SDL. However, in this narrow use case, where I have SMEs who need to directly edit documents, I think it is a reasonable request (and necessary to get their cooperation...).
  • I understand. Without this being my area of expertise, I know that Publication Manager and Oxygen use the file system to maintain a single shared state between those processes. They also use file system hooks (FileSystemWatcher in .NET) to notify each other that an action was taken. For example, when Oxygen takes an action, it changes the file system in a manner that Publication Manager will pick up to adjust its own UI. So my hypothesis is that you are not doing everything that PM or Oxygen do on the file system.

    Again, with the understanding that interfering with this process is not advised, I would recommend you take a look at the metadata files next to the XML. Also, perform a simple action such as a checkout and do a delta on the file system: take a snapshot before and after and compare them. I believe there is a PowerShell cmdlet that does this as well, but in this case I would use a tool such as WinMerge or Beyond Compare to better visualize the differences. You can also create a .NET application that registers FileSystemWatcher hooks; on each event, output to the console what happened, and optionally copy the files to keep snapshots before and after each event. All of this will help you draw a picture of what happens on the file system during a simple UI action such as Check Out. Once you solve the puzzle, mimicking the same behavior should achieve what you need.

    I hope this helps you understand the feasibility and the risks. If you decide to move forward, I hope it sets you in the right direction.
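The snapshot-and-delta approach described above can be sketched with the Python standard library. This is a hedged sketch: it assumes comparing files by content hash is enough to spot what a Check Out touches in the local storage folder.

```python
import hashlib
import os

def snapshot(root):
    """Map every file under root (recursively) to an MD5 of its contents."""
    state = {}
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as f:
                state[os.path.relpath(path, root)] = hashlib.md5(f.read()).hexdigest()
    return state

def diff(before, after):
    """Return (added, removed, changed) relative paths between two snapshots."""
    added = sorted(set(after) - set(before))
    removed = sorted(set(before) - set(after))
    changed = sorted(p for p in set(before) & set(after) if before[p] != after[p])
    return added, removed, changed
```

Usage: take a snapshot of the local storage folder, perform a single Check Out in the UI, take a second snapshot, and print `diff(before, after)` to see exactly which content and metadata files the client tools created or modified.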

  • Hi Brant,
    Not sure if this is useful, but I understand that every now and then you prefer another way of working compared to the mainstream approach suggested by the menus and training material.
    A possible option to overcome the problem is to try the following:
    1. Clear the local storage via Tools -> Local Storage: select all (Ctrl+A) and press Delete.
    2. Find a way to search for all the objects you need in Publication Manager.
    3. In the search result you can select all topics and choose Check Out With.... If you select a tool like Notepad, it will just open something in Notepad, and the side effect is that the objects are checked out and available in the local storage.
    4. From that point on you can do a Find/Replace over multiple files in Oxygen. It is also possible to copy the files to another location first and later copy the updated files back to the local storage folder.
    5. Via Tools -> Local Storage it is possible to check in all files in one go.

    To search for a word across the topics referenced by a map, you could create a helper publication object and add the map to it. Then use Ctrl+F and search within the scope of the publication.

    About the API: the Web Services API provides an abstraction layer for applications to operate on the core capabilities of the KC system. From this perspective, Publication Manager and Authoring Bridge are applications that operate solely via the API. If you want to automate operations around the local storage, you would need an API for functionality offered by the client tools, but such an API is not available. Using the API to manipulate objects in the local storage is not straightforward and may prove to be unmanageable in the end.

    If the workarounds using the tool and local storage do not meet your needs, I can imagine you would use the API or ISHRemote to search and get/checkout/update/checkin content from the repository and manage it in a separate folder. In Oxygen you will need to select an alternative DITA catalog file to avoid error messages around @cid, @ishcondition, and other proprietary attributes when you open files. The catalog file is located in %LOCALAPPDATA%\SDL\InfoShare Client\xxxxxxx\Config\DocTypes\catalog.xml.

    I hope my suggestions give you some additional handles and ideas to optimize the process where your SMEs currently get stuck.
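The copy-out/copy-back round trip from step 4 above can be sketched in Python. This is a sketch under assumptions: the .xml extension for content files and a flat local storage layout are guesses; check what your local storage actually contains, and leave the companion metadata files untouched so the client tools keep a consistent state.

```python
import os
import shutil

# Assumed extension for content files; adjust to what your local storage holds.
CONTENT_EXT = ".xml"

def copy_out(local_storage, working_dir):
    """Copy only the content files out of local storage into a working folder,
    leaving the metadata files behind."""
    os.makedirs(working_dir, exist_ok=True)
    for name in os.listdir(local_storage):
        if name.lower().endswith(CONTENT_EXT):
            shutil.copy2(os.path.join(local_storage, name),
                         os.path.join(working_dir, name))

def copy_back(working_dir, local_storage):
    """Copy edited content files back, but only over files that already exist
    in local storage, so no stray files are introduced."""
    for name in os.listdir(working_dir):
        dest = os.path.join(local_storage, name)
        if name.lower().endswith(CONTENT_EXT) and os.path.exists(dest):
            shutil.copy2(os.path.join(working_dir, name), dest)
```

The edit cycle would then be: check out via the UI as in step 3, `copy_out`, run the bulk Find/Replace in the working folder, `copy_back`, and check everything in via Tools -> Local Storage as in step 5.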