7 Replies Latest reply on Dec 7, 2018 2:01 PM by phoffmann

    Data Translation Services

    meltdowner Apprentice

      I have no idea how this feature works or what it's for.  I read https://help.ivanti.com/ld/help/en_US/LDDA/10.0/Content/DA/ldda-t-data-translation-services.htm


      "Data Translation Services (DTS) is a Data Analytics tool for Ivanti® Management Suite that scans your organization's devices for the inventory data you most care about, such as software licensing, warranties, and so on. Once the data is scanned into the inventory database, you can customize, aggregate, and organize it in reports to make informed and practical decisions about hardware and software purchases and needs."


      Ok, sounds great.  So how do I use it?


      When I go into Data Translation Services to find details about how these are being generated, it's all empty folder structures.  I did "green light" the thing.


      Does this affect client user/local device CPU processing?  Meaning, will the user see a performance hit from enabling it?



      My whole goal here is to populate things like the "owner email" fields in the database.  These have never been populated for us and we have always been deriving the data from logon user names.

        • 1. Re: Data Translation Services
          phoffmann SupportEmployee

          So # 1 - please don't just "hit the green light" ... that enables processing of real-time Data Analytics rules which - done carelessly - can hammer your Core quite a bit (especially if it's been struggling with I/O to begin with).


          # 2 - There's quite a bit of information to be had in the dedicated Data Analytics section of the community. Including...


          # 3 - I'd STRONGLY recommend you start playing with Data Analytics rules on a test / dev box first. You can have a *LOT* of power with DTS ... but as such, "use it responsibly" (and it's much better / easier to reset a dev Core to a snapshot than it is to clean up a live server).


          The dev Core should also be used to test any real-time processing you want to develop -- even with the best VB confidence, there are usually a few things that go wrong ... and you ONLY want "gold / acceptable" stuff put on the live server.


          DTS is *VERY* powerful and can pick up a whole bunch of data-enrichment. Just treat it with the respect it deserves ... .

          • 2. Re: Data Translation Services
            meltdowner Apprentice

            Hey Pat,


            Thanks for the reply but I'm still wondering how this can be useful for me.  Can you give me an example of how this is used for some clients out there?  A common use.

            • 3. Re: Data Translation Services
              phoffmann SupportEmployee

              To append - MOST of the DTS rules do NOT affect the clients much.


              There might be a bit of additional stuff (say, the SQL Server instance data gathering), but most of the processing (such as normalisation) happens on the CORE. That's where the mistake of carelessly turning on real-time processing rules can bring a Core to its knees (especially if it was already struggling to keep up).


              Depending on what's enabled / configured, there can be QUITE an overhead in processing inventory scans. Essentially, the real-time processing picks up inventory scans BEFORE they get processed, trawls through them and "does the stuff" (such as normalising vendor names, creating a new entry for an unpadded IP address, etc.) ... and spits out a modified inventory scan file for the Inventory service to hoover up.
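
              To picture that pipeline: here's a minimal Python sketch of the *concept* (real DTS rules are VB-script running on the Core, and the real scan-file format is more involved -- the "Key = Value" lines, field names, and vendor table below are all assumptions for illustration):

              ```python
              # Conceptual sketch: intercept a scan (here, simple "Key = Value"
              # text lines), rewrite fields, and emit a modified scan for the
              # Inventory service to pick up. Field names are assumed, not real.

              VENDOR_ALIASES = {  # assumed normalisation table
                  "hewlett packard": "Hewlett-Packard",
                  "hp inc.": "Hewlett-Packard",
                  "hp": "Hewlett-Packard",
              }

              def preprocess_scan(lines):
                  out = []
                  for line in lines:
                      key, sep, value = line.partition(" = ")
                      if not sep:
                          out.append(line)  # not a "Key = Value" line; pass through
                          continue
                      if key.strip() == "System - Manufacturer":
                          # normalise vendor spelling variants
                          value = VENDOR_ALIASES.get(value.strip().lower(), value)
                      out.append(f"{key} = {value}")
                      if key.strip() == "Network - TCPIP - Address":
                          # example of an ADDED field: unpadded IP alongside the padded one
                          unpadded = ".".join(str(int(o)) for o in value.strip().split("."))
                          out.append(f"Network - TCPIP - Unpadded Address = {unpadded}")
                  return out
              ```

              The point is that the scan is modified (and enriched) *before* the Inventory service ever sees it -- which is also why a pile of carelessly enabled real-time rules adds overhead to every single scan coming in.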


              Does that explain how things work "at the basic level" sufficiently?

              1 of 1 people found this helpful
              • 4. Re: Data Translation Services
                meltdowner Apprentice

                Hmmm.  It does help.


                I'm stuck on use cases, though.


                Thanks Pat, I will use the links you provided and do my own research from here.  Interesting.

                • 5. Re: Data Translation Services
                  phoffmann SupportEmployee

                  Sure thing, a few examples coming up.


                  • Using "Map Gateway to location", you can specify (for instance) "Gateway == Site London" ... so any device coming in with that gateway IP address would get a "Location = London" field added.

                    The (network) Gateway information is one of the best / most consistent ways to at least place devices to a certain site.
                  • De-coding FQDNs / device names into sensible bits of data (if there's a set format to them). So, for instance, "MyDevice-123456-ABC.SomeDomain.SomeBranch.Com" could break down into (and be ADDED to the inventory / decoded dynamically) ...
                    • Device Name == "MyDevice"
                    • "123456" ==> "The company asset-tag for this device is 123456".
                    • "ABC" == "This device belongs to Marketing-group ABC"
                  • ... and many more. Essentially, you can have a look at the existing rules (check the CALCULATION and NORMALISE groups for some examples). This includes things such as "calculate the TOTAL disk space / total % of available disk space", as a quality of life improvement.

                  • Read out the (by default, 0-padded) IP address(es) and create additional / separate entries of UNPADDED IP addresses. So each padded address gets a second copy with the leading zeroes in each octet stripped ... which MAY make life easier for certain reporting needs.
                  • Normalise manufacturer names ... so those "20 variations of writing HP / Hewlett-Packard" would all be consistent.
                  • Highlight "Licensable software" (in a separate branch of inventory). This is a popular, "reverse SLM" sort of thing. While SLM shows you "here's what you HAVE / are USING...", the "Licensable software" branch covers the "here's what you should make sure you have LICENSES for..." angle (as you may, for instance, have 50 copies of SQL Server flying around that you didn't know about).

                  ... Does that help as an insight into where / how this can be useful?

                  1 of 1 people found this helpful
                  • 6. Re: Data Translation Services
                    meltdowner Apprentice

                    Yes Sir!


                    It's like string manipulation scripting before the data is thrown into inventory.


                    Am I able to create my own inventory records using this?


                    Am I able to disable EVERYTHING in data translation services that's there by default, and use only a few new "fields" which I create?

                    • 7. Re: Data Translation Services
                      phoffmann SupportEmployee

                      You can disable / de-activate everything that's there by default - yes. That's quite simple. Though a lot of things are quite useful (and included for a reason).


                      You can create/add your own rules (VB-scripting based), which is usually how the string-decoding stuff (for example) is handled.


                      You don't need DTS to create your own inventory records. Inventory scans are "just text files", so you can always copy & edit them (VERY useful on dev/test cores when you want to test this or that).


                      Unsure how to generate an output scan file? Check here -- How to generate an inventory output file from client machine for LANDESK support to do further troubleshooting?


                      ... changing the DEVICEID (and MAC-addresses) field(s) will allow you to create unique / distinct entries.
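
                      As a rough illustration of that trick (Python sketch only; the field names and the flat "Key = Value" layout here are assumptions -- check a real scan file from your own estate for the exact keys, and remember the MAC-address fields would need the same treatment):

                      ```python
                      import uuid

                      def clone_scan(scan_text, new_name):
                          """Clone a scan's text with a fresh DEVICEID so the Inventory
                          service treats it as a brand-new, distinct device."""
                          new_id = "{" + str(uuid.uuid4()).upper() + "}"
                          out = []
                          for line in scan_text.splitlines():
                              key, sep, value = line.partition(" = ")
                              if sep and key.strip() == "Device ID":    # assumed field name
                                  line = f"{key} = {new_id}"
                              elif sep and key.strip() == "Device Name":  # assumed field name
                                  line = f"{key} = {new_name}"
                              out.append(line)
                          return "\n".join(out)
                      ```

                      Handy on a dev Core for generating a batch of fake-but-plausible devices to throw your new rules at before anything touches the live server.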


                      Again - I'd suggest that you start playing on a dev core first, and use/switch your accepted rules over to the live Core (be they "real time" rules, or "just run 1x / day" or whatever schedule you want).


                      That should help you out here.