Unable to download objects using IshRemote

Hi,

I'm unable to get this script to work. Would someone be able to take a peek at it? :D 

The gist is: I'm trying to download the latest version of all map objects in my repository that were modified after 2019, so I can do some analysis on our doc set. Let me know if any more info is needed.

```powershell
$ishSession = New-IshSession -WsBaseUrl "https://*******/ISHWS" -IshUserName "******" -IshPassword "******"
# Define MetadataFields for status and FMASTERTYPE
$statusFields = Set-IshMetadataFilterField -IshSession $ishSession -Name "FSTATUS" -Level "Lng" -ValueType "Element" -FilterOperator "Equal" -Value "VSTATUSRELEASED"
$cardtypeFields = Set-IshMetadataFilterField -IshSession $ishSession -Name "FMASTERTYPE" -Level "Logical" -ValueType "Value" -FilterOperator "Equal" -Value "Map"
$modifiedOn = Set-IshMetadataFilterField -Level Lng -Name MODIFIED-ON -FilterOperator GreaterThanOrEqual -Value "01/01/2019"
$docLang = Set-IshMetadataFilterField -Level Lng -Name DOC-LANGUAGE -FilterOperator Equal -Value en
# Merge metadata fields into an array
$metadataFields = $statusFields, $cardtypeFields, $modifiedOn, $docLang
# Get unique logical objects
$logicalIds = Find-IshDocumentObj -IshSession $ishSession -MetadataFilter $metadataFields | Select-Object -Property IshRef -Unique
# Define the directory where files will be downloaded
$directoryPath = "D:\InfoShare\Data\Publish\ExportedMaps"
# Create the directory if it does not exist
if (!(Test-Path -Path $directoryPath)) {
    New-Item -ItemType Directory -Path $directoryPath
}
# …
```

thanks,

Charlie

  • Hi Charlie

    Sometimes you get stuck in a loop, leading to errors like yours that seem unexplainable. So I hope you don't mind that I took your one-sentence goal and started from scratch using the cmdlets available in ISHRemote (v7+) and plain PowerShell constructs.

    Best wishes,
    Dave

    ```powershell
    $ishSession = New-IshSession -WsBaseUrl "https://*******/ISHWS" -IshUserName "******" -IshPassword "******"
    $masterTypeValue = "VMASTERTYPEMAP" # Get-IshLovValue -LovId DMASTERTYPE
    $statusValue = "VSTATUSRELEASED" # Get-IshLovValue -LovId DSTATUS
    # Define the directory where files will be downloaded, create it if it does not exist
    $folderPath = "C:\TEMP\20240515.ISHRemoteDownloadMaps"
    if (!(Test-Path -Path $folderPath)) {
        New-Item -ItemType Directory -Path $folderPath
    }
    # Define metadata filter criteria
    $filterMetadata = Set-IshMetadataFilterField -Level Logical -Name FMASTERTYPE -ValueType Element -FilterOperator Equal -Value $masterTypeValue |
                      Set-IshMetadataFilterField -Level Lng -Name FSTATUS -ValueType Element -FilterOperator Equal -Value $statusValue |
                      Set-IshMetadataFilterField -Level Lng -Name MODIFIED-ON -FilterOperator GreaterThanOrEqual -Value "01/01/2019" |
                      Set-IshMetadataFilterField -Level Lng -Name DOC-LANGUAGE -FilterOperator Equal -Value "en"
    $downloadCount = 0
    # Using the PowerShell pipeline operator "|":
    # recursively walk the DITA Map folders (preferred over one big Find call),
    # per folder retrieve the latest version of the Content Objects matching the filter criteria,
    # where every matching IshObject, identified by loop variable "$_", is processed
    # …
    ```
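
    For readers landing here later: the pipeline that those last comments describe could look roughly like the sketch below. This is a hedged reconstruction, not the original truncated code. The cmdlets (Get-IshFolder, Get-IshFolderContent, Get-IshDocumentObjData) exist in ISHRemote, but the exact parameters shown (-FolderTypeFilter, -VersionFilter) and the "General" starting folder are assumptions to verify against your ISHRemote version.

    ```powershell
    # Hedged sketch: recurse the folder tree, keep only DITA Map objects matching
    # the filter, and download the latest version of each into $folderPath
    Get-IshFolder -IshSession $ishSession -FolderPath "General" -Recurse -FolderTypeFilter @("ISHMasterDoc") |
        Get-IshFolderContent -IshSession $ishSession -VersionFilter "Latest" -MetadataFilter $filterMetadata |
        ForEach-Object {
            # Get-IshDocumentObjData writes the file to disk and returns a FileInfo object
            $fileInfo = Get-IshDocumentObjData -IshSession $ishSession -IshObject $_ -FolderPath $folderPath
            $downloadCount++
            Write-Host ("Downloaded fileName[" + $fileInfo.BaseName + "]")
        }
    Write-Host ("Download count: " + $downloadCount)
    ```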

  • Hi Dave,

    Thanks for the reply! Glad to see you're still around - it's been many years since we met :) 

    That script works perfectly, thank you. I do have some additional requirements I hope you can help with.

    • For each file I download, I need to automate the creation of a .met file with the same name that contains the username of the last person who edited it and the date it was edited.
    • In the output folder, I need to create folders that correspond to the fourth level of the repository folder structure, e.g. General/Cloud Content/DGC/ProductName. And when the script is recursing under the ProductName folder, I need all the content to be downloaded into that folder.

    Can you tell me if these things are conceptually possible? I've been trying to use ChatGPT to help with this, but it keeps generating code that's conceptually flawed IMO.

    thanks,
    Charlie

  • Sure has been a while :)

    I'll hint at how I would solve your two challenges. I hope you don't mind that I'm not your personal AI Assistant ;-)

    Regarding extra metadata, ISHRemote by default already retrieves some human-readable metadata (not system fields). This means that the loop variable "$_" already holds data, so you could replace line 26 with

    ```powershell
    Write-Host ("Downloaded fileName[" + $fileInfo.BaseName + "] fishlastmodifiedby[" + $_.fishlastmodifiedby + "] fishlastmodifiedon[" + $_.fishlastmodifiedon + "]")
    ```
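
    Building on that, the .met sidecar Charlie asked about could be written right next to that line, reusing $fileInfo and the loop variable "$_". The one-pair-per-line .met layout below is purely hypothetical (ISHRemote does not define a .met format); adjust it to whatever your analysis tooling expects:

    ```powershell
    # Hypothetical .met sidecar: same base name as the download, "name=value" per line
    $metPath = Join-Path -Path $fileInfo.DirectoryName -ChildPath ($fileInfo.BaseName + ".met")
    @(
        "lastmodifiedby=$($_.fishlastmodifiedby)"
        "lastmodifiedon=$($_.fishlastmodifiedon)"
    ) | Out-File -FilePath $metPath -Encoding utf8
    ```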

    Regarding aggregating per the level-4 "ProductName" folders: the solution is a loop around the above code sample. Not the fastest code, but it reads relatively easily.

    ```powershell
    # Simply execute this line of code; from here you could build your file system folder structure
    Get-IshFolder -FolderPath "General" -Recurse -Depth 4 | Get-IshFolderLocation
    # And you will see a structure appear like
    # \General\Graphics
    # \General\Import
    # \General\Libraries
    # \General\Masters
    # \General\Mobile Phones Demo
    # \General\Publications
    ```
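
    One hedged way to turn those locations into local folders and download into them is sketched below. The local root path is an assumption, and the per-folder download is only indicated by a placeholder comment; Get-IshFolder, Get-IshFolderLocation, and the -IshFolder parameter are real ISHRemote names, but verify the details against your version:

    ```powershell
    # Mirror each repository folder as a local folder, then download into it
    $localRoot = "C:\TEMP\ExportByProduct"   # assumed local root folder
    Get-IshFolder -IshSession $ishSession -FolderPath "General" -Recurse -Depth 4 |
        ForEach-Object {
            # Returns a path string like "\General\Cloud Content\..."
            $repoLocation = Get-IshFolderLocation -IshSession $ishSession -IshFolder $_
            # Strip the leading backslash so Join-Path treats it as a relative path
            $localPath = Join-Path -Path $localRoot -ChildPath $repoLocation.TrimStart('\')
            if (!(Test-Path -Path $localPath)) {
                New-Item -ItemType Directory -Path $localPath | Out-Null
            }
            # ...per-folder download (Get-IshFolderContent | Get-IshDocumentObjData -FolderPath $localPath) goes here...
        }
    ```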

    Note that the Get-IshFolder line in the above example uses "General" as the starting folder; you could dynamically change the starting folder of the recursion to your wanted deeper "General\Cloud Content\DGC\ProductName" (note the backslashes instead of slashes).

    Hope this helps,
    Dave
