How to replicate data between different sites?

Hi All,

I'm wondering whether it is possible to implement some form of data replication between sites.

For instance, something like 'proxies' at different sites (around the globe) where the CMS data would be placed and synced daily. The idea is to have the data available at all sites, mainly to help those where bandwidth is less favorable, and to avoid delays/timeouts when data (documents) is transferred between the main server and the client during business hours.

Our goal is basically to create a lightweight process for managing data between sites within the CMS (SDL Knowledge Center 2016).

Is this possible, or is there any other solution/option/alternative we can implement to achieve this goal with Knowledge Center 2016?

Appreciate any comments.

Thanks,
Jose

  • Hi Jose,

    This is not currently supported.

    Syncing multiple databases once a day is likely to result in many conflicts that need manual resolution (or that simply overwrite previous changes). These sync operations would likely become enormously complex.

    If you are considering a simple internet proxy that caches content, you might try a quick test with Fiddler to see whether a binary downloaded multiple times appears as the same request; binaries are probably the largest files you download. Then test whether a modified version of the binary shows up as a different request.
    If unmodified binaries produce the same request and modified binaries produce a different one, it may be worth discussing a caching proxy with your network team.
    Note that I don't know of anyone who has tried this.
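    A rough Python sketch of that same check, if you prefer scripting it over Fiddler. The URL is a placeholder (substitute a real binary URL from your environment), and the exact headers a caching proxy keys on depend on your setup, so treat this as a starting point rather than a definitive test:

        import requests

        # Hypothetical URL of a binary served by Knowledge Center --
        # substitute a real object URL from your environment.
        URL = "https://cms.example.com/binary/12345"

        def fetch(url):
            resp = requests.get(url, timeout=60)
            resp.raise_for_status()
            return resp

        first = fetch(URL)
        second = fetch(URL)

        # Identical content plus stable cache validators gives a caching
        # proxy something to key on; a modified binary should change them.
        print("content identical:", first.content == second.content)
        for header in ("ETag", "Last-Modified", "Cache-Control"):
            print(header + ":", first.headers.get(header), "|", second.headers.get(header))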

    Another approach would be to find your longest operations (binary downloads, sync operations, etc.), benchmark them in your various locations, and then focus on how you can optimize those operations both in Knowledge Center and across your internal network.
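    A minimal timing sketch for one such operation, assuming a hypothetical binary URL; run it from each location and compare the numbers:

        import time
        import requests

        # Hypothetical URL of one of your slower binary downloads --
        # substitute a real one from your environment.
        URL = "https://cms.example.com/binary/12345"
        RUNS = 5

        timings = []
        for _ in range(RUNS):
            start = time.perf_counter()
            resp = requests.get(URL, timeout=300)
            resp.raise_for_status()
            timings.append(time.perf_counter() - start)

        size_mb = len(resp.content) / 1e6
        print(f"{size_mb:.1f} MB: best {min(timings):.2f}s, worst {max(timings):.2f}s over {RUNS} runs")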

    As part of that process, benchmarking general network performance in each location might be helpful. We once discovered that a remote office had faster internet on their phones than in the office.
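    For a rough general-network number that is independent of Knowledge Center itself, even repeated TCP connect times to the server can be revealing (host and port below are placeholders):

        import socket
        import time

        # Hypothetical host/port of your Knowledge Center server.
        HOST, PORT = "cms.example.com", 443

        samples = []
        for _ in range(10):
            start = time.perf_counter()
            with socket.create_connection((HOST, PORT), timeout=10):
                samples.append((time.perf_counter() - start) * 1000)
        print(f"TCP connect to {HOST}:{PORT}: min {min(samples):.1f} ms, max {max(samples):.1f} ms")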

    Jeff