3 Replies Latest reply on Oct 5, 2017 10:09 AM by phoffmann

    File transfer

    TonyOgles Apprentice

      We are currently creating a task to copy Office 365 to multiple locations, and the package size is very large. Once the files have been transferred successfully to one system at a location, will the next system at that location that runs the task pull the files from that system, or will it pull them from the server again? If not, is there a setting that would make that possible? Hopefully that's not too vague; any ideas or suggestions are appreciated.

        • 1. Re: File transfer
          phoffmann SupportEmployee

          OK so ... I'll try to explain how things work & give you a couple of useful tips afterwards. I'm assuming you're on LDMS 9.6 / LDMS 2016 / IEM 2017 or so (the basic software distribution methods we use have been around for a long time; it's just the details that differ if we go back further than that). Listing what version you're on would generally be helpful (as a note for the future).

           

          Short version -- It depends on how you configure things (I'll show you), but most likely "yes, we'll do peer download".

           

          Longer version: (Screenshots are from LDMS 2016 for reference)

          So it depends on what download methods you select / permit when you roll out your package, based upon the "Distribution And Patch" agent setting you pick.

           

          Refer to this screenshot (with highlights) for reference:

           

           

          Attempt Peer download (red highlight):

          This turns on the ability for clients to holler around their broadcast domain (if you don't know what that is, think of it in simpler terms as "a subnet" [not entirely accurate, but it'll do]): "Hey -- I'm looking for File X. Does anyone have it?". Other clients can then respond with either "I have the full file" or "I have 35% of the file" (and such), to minimise download over the WAN.
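
          To make that "holler around the network" idea concrete, here's a minimal Python sketch of the concept. To be clear, this is NOT our actual wire protocol -- the port number and message format are invented purely for illustration:

          import json
          import socket

          DISCOVERY_PORT = 33355  # invented port, for illustration only

          def ask_peers(file_url, file_hash, timeout=2.0):
              """Broadcast 'who has this file?' and collect the replies."""
              query = json.dumps({"url": file_url, "hash": file_hash}).encode()
              sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
              sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
              sock.settimeout(timeout)
              sock.sendto(query, ("255.255.255.255", DISCOVERY_PORT))
              offers = []
              try:
                  while True:
                      reply, peer = sock.recvfrom(4096)
                      # Peers answer with how much of the file they hold,
                      # e.g. {"percent": 100} ("I have the full file")
                      # or {"percent": 35} ("I have 35% of the file").
                      offers.append((peer[0], json.loads(reply)["percent"]))
              except socket.timeout:
                  pass  # nobody else is answering
              return sorted(offers, key=lambda o: -o[1])  # best offer first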

           

          Attempt Preferred Server (yellow highlight):

          If you have preferred servers set up / configured on the Core, this can make your software distribution life much easier.

           

          At its heart, how this works is that it changes an example download string of -- \\{MyHomeServer}\MyShare\MyFile.exe -- over to -- \\{MyLocalServer}\MyShare\MyFile.exe -- for every file that needs to be downloaded. If the file exists on the preferred server -- great. If it doesn't exist on the preferred server, we fail over to the source if need be.
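
          Conceptually, the substitution is nothing more than swapping the server portion of the path; the share and file names stay identical (which is why the package needs to be replicated with the same folder structure on the preferred server). A tiny Python sketch of that idea, using the example paths from above:

          import os

          def rewrite_for_preferred(unc_path, preferred_server):
              # \\MyHomeServer\MyShare\MyFile.exe -> \\MyLocalServer\MyShare\MyFile.exe
              parts = unc_path.lstrip("\\").split("\\")
              parts[0] = preferred_server          # swap only the server name
              return "\\\\" + "\\".join(parts)

          def resolve(unc_path, preferred_server):
              """Try the preferred server first; fail over to the source path."""
              candidate = rewrite_for_preferred(unc_path, preferred_server)
              return candidate if os.path.exists(candidate) else unc_path

          print(rewrite_for_preferred(r"\\MyHomeServer\MyShare\MyFile.exe", "MyLocalServer"))
          # -> \\MyLocalServer\MyShare\MyFile.exe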

           

          Allow Source (purple highlight):

          Enabling/disabling this grants you fine control over whether you wish to allow or disallow clients being able to "reach home" to the Core (or wherever you host your packages centrally).

           

          In your particular scenario, where you've pre-spread the file out, this might be a useful control mechanism. However, if things go wrong (say the pre-staged file is corrupted), clients will not be able to complete the task, as they weren't allowed to download a healthy copy from the source.
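
          Putting the three toggles together, you can think of the agent's decision as a simple fallback chain. This is a conceptual Python sketch, not actual agent code -- the function names are placeholders:

          def fetch_from_peer(url, digest):       # stub: ask the broadcast domain
              return False

          def fetch_from_preferred(url, digest):  # stub: rewritten preferred-server path
              return False

          def fetch_from_source(url, digest):     # stub: the Core / original source
              return True

          def download(url, digest, allow_peer, allow_preferred, allow_source):
              if allow_peer and fetch_from_peer(url, digest):
                  return True
              if allow_preferred and fetch_from_preferred(url, digest):
                  return True
              if allow_source and fetch_from_source(url, digest):
                  return True
              return False  # every permitted stage failed -> the task fails

          # With "Allow Source" disabled and no healthy local copy, this returns False:
          print(download("http://Core/Share/File.exe", "abc123", True, True, False))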

           

          Use (Self-organised) Multicast (cyan highlight):

          This sets clients up for a self-organised multicast session. The "self organised" side of things means that it's the clients that set up the multicast session among themselves.

           

          One client serves as the download agent, pulling from wherever the package is hosted, and then multicasts it around the local network. As such, this is a subnet-aware download (one, and only one, client at a time will be downloading the package). This has in-built resilience, so if "the download agent" gets shut down, another client picks up the slack & becomes "the new download agent".
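
          If you've never looked at multicast plumbing, here's a bare-bones Python illustration of the send/receive sides. The group address and port are invented, and the election/resilience logic that picks (and replaces) "the download agent" is deliberately left out:

          import socket
          import struct

          GROUP, PORT = "239.1.2.3", 5007  # invented multicast group/port

          def send_chunk(chunk):
              """The elected download agent pushes each chunk to the group."""
              sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
              sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)  # stay on the subnet
              sock.sendto(chunk, (GROUP, PORT))

          def receive_chunks():
              """Every other client just joins the group and listens."""
              sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
              sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
              sock.bind(("", PORT))
              mreq = struct.pack("4sl", socket.inet_aton(GROUP), socket.INADDR_ANY)
              sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
              while True:
                  chunk, _ = sock.recvfrom(65507)
                  yield chunk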

           

           

          TIPS & Education:

          • In my experience, when transferring large packages with lots of files, it's pretty much always better to ZIP them up first (and then decompress them locally on the client) -- see the packaging sketch after these tips. Office especially is a candidate for this.
            • Reasons include that it's generally easier / more efficient to copy over ONE large file rather than 10,000s of them.
            • Also, the likelihood of something going wrong with ONE of those 10,000 files is much greater than the likelihood of something going wrong with the single large container.

           

          • Files stay in a client's cache for a specific amount of time. This duration (and the location of the cache, if you deviate from the default) is configurable in the CLIENT CONNECTIVITY agent setting. The default is usually 7 days.
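
          As promised above, here's a minimal packaging sketch using plain Python and the standard zipfile module. The paths are examples only -- adjust to your environment:

          import zipfile
          from pathlib import Path

          SOURCE = Path(r"C:\Packages\Office365")       # example source folder
          ARCHIVE = Path(r"C:\Packages\Office365.zip")  # the ONE file you distribute

          # On the packaging machine: build the archive once.
          with zipfile.ZipFile(ARCHIVE, "w", zipfile.ZIP_DEFLATED) as zf:
              for f in SOURCE.rglob("*"):
                  zf.write(f, f.relative_to(SOURCE))

          # On the client: decompress locally before running setup.
          with zipfile.ZipFile(ARCHIVE) as zf:
              zf.extractall(r"C:\Temp\Office365")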

           

          Right - that should answer your question(s) and give you something to read?

           

          Generally, I'd suggest that you read through some of the articles explaining software distribution in this section (including the troubleshooting articles, as you're going to have "something" go wrong at some point) here -- Software Distribution.

           

          Hope that helps.

          • 2. Re: File transfer
            TonyOgles Apprentice

            When it looks for the file, does it look only within the same package, or if another package has the files it needs, will it pull from that as well?

            • 3. Re: File transfer
              phoffmann SupportEmployee

              Specifically, the request looks essentially like this (HTTP or UNC path, respectively):

               

              "I am looking for this file -- http://ServerName/ShareName/Filename.exe -- with a Hash of 123456790ABCDEF -- does anyone have the file in full or in part?"

               

              So if a client happens to have "Filename.exe" but it's from "Http://Server571/SomeOtherShare/SomethingElse", then that client is not going to advertise it, as it's not part of the same package (again -- another reason why "a big ZIP containing all the files" is a better idea).
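
              In code terms, you can think of each peer's cache as being keyed by the full source URL plus the content hash, so a same-named file from a different package never matches. A Python sketch for illustration -- the hash algorithm (SHA-256 here) is just an assumption, as my example above only shows a generic hash:

              import hashlib

              def file_hash(path):
                  """Hash a file in 1 MB blocks (SHA-256 assumed for this sketch)."""
                  h = hashlib.sha256()
                  with open(path, "rb") as f:
                      for block in iter(lambda: f.read(1 << 20), b""):
                          h.update(block)
                  return h.hexdigest()

              # A peer's cache, keyed by the FULL source URL plus the hash:
              cache = {
                  ("http://ServerName/ShareName/Filename.exe", "123456790abcdef"): r"C:\Cache\Filename.exe",
              }

              def have_file(url, digest):
                  # "Filename.exe" from Http://Server571/SomeOtherShare/ won't match,
                  # because the key includes where the file came from.
                  return (url, digest) in cache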

               

              Does that make sense?

               

              We (in essence) duplicate the file structure(s) of wherever we download files from, to guarantee uniqueness. Otherwise, you'd run into duplicate file-name problems.