3 tips to improve your DA Imports


    Verified Product Versions

    LANDESK Management Suite 9.5
    LANDESK Management Suite 9.6
    LANDESK Management Suite 2016.x

    As Data Analytics has matured within the Management Suite product, several of its features have been optimized to work best under certain conditions. One of those is scheduling imports to run at specific times of day instead of relying on real-time scans to import data from remote endpoints. In this post, I'll cover three ways you can make your imports cleaner and better tuned to your environment.

     

    Tip 1: Use VBScript to add custom formatting to certain fields.

    Most of the tools available in Data Translation Services allow you to use VBScript. This is super helpful if you're trying to normalize certain pieces of data or run your own calculations on it. This is easily the most powerful part of DA, so brush up on some VBScript if you'd like to really get into it with DA rules.
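    For example, here's a minimal sketch of the kind of normalization script you might drop into a rule's VBScript field. The function name and the location format are invented for illustration, and exactly how your DA version passes the raw value into the script varies, so treat this as a starting point:

        ' Hypothetical helper: normalize free-form location strings so that
        ' "  new york / HQ " and "New York/HQ" import as the same value.
        Function NormalizeLocation(raw)
            Dim s
            s = Trim(raw)              ' strip leading/trailing whitespace
            s = Replace(s, " / ", "/") ' collapse spaced separators
            s = UCase(s)               ' force a single consistent case
            NormalizeLocation = s
        End Function

    With that in place, NormalizeLocation("  new york / HQ ") returns "NEW YORK/HQ", so both variants land in inventory as one value.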

    Tip 2: Use Static values to easily find data you're looking for.

    This is an underused feature in DTS. A lot of times people will set up imports, attempt to run them, and then wonder whether the import worked and which devices it ran on. If you use this simple tip, you can always find the devices the rule ran on, even if the rule wasn't supposed to run on a given device. This is great for debugging your process while you figure out how you want to properly import your data. The idea is that you'd set up a static value and put in something unique, or even the name of the import you're creating. Then you can set up a query to target that static value and see a list of all the devices that have it. It's an easy thing to do, and I think it's a must if you're setting up any kind of import for the first time.
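    As a concrete (and hypothetical) example, an import named Location Sync might carry a static field like this, which a query can then match on. The attribute path depends on where you mapped the field in your inventory tree:

        Static value:    Import = "DA Import - Location Sync"
        Query criterion: Computer."Location Sync"."Import" = "DA Import - Location Sync"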

    Tip 3: Use LDMS queries as a target for your DTS configurations.

    As mentioned in the last tip, you can use queries to find static information that you've imported into inventory. However, don't forget that you can also use queries as targets for your imports. This allows you to be much more specific about which existing devices in your inventory an import applies to. A lot of DA pros use this, and I'd recommend getting familiar with it.

     

    Scenario

    To put all of this into context, here's a use case that shows it in action. Let's pretend we have an environment where information needs to be synced from a third-party application that contains specific device information, like location info or some other user data. Ideally, this information only needs to land in Management Suite if it has changed or is out of date, but if I simply schedule an import, it will re-import the whole data set over and over. With the tips detailed above, there are some fun things we can do in Data Translation Services to set up a system where I know the data is up to date and I'm not importing large amounts of data more often than necessary.

     

    Note: I've attached a proof of concept to this post. You can import the rules into a lab and play around with them if you'd like, or you can follow along and build it yourself.

     

    Step 1: Add a new field to your import that records the date and time of the last import.

    You can add a date field pretty easily with some VBScript; I'll show how I set it up below. The idea is that you'd put the date in the same group as the rest of the data you're importing, and then you can use it as a key for further queries.

     

    For this example I've set up a CSV file to import from; the same rules and options apply to a SQL import.

    Here’s what my CSV looks like:
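    A rough sketch of the file's shape, with hypothetical column names and values:

        Device ID,Location,Department,Owner
        {A1B2C3D4-0000-0000-0000-000000000001},Building 7,Finance,jsmith
        {A1B2C3D4-0000-0000-0000-000000000002},Building 9,Sales,mjones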

    The first column is my Device ID for an existing device in my core. Since this is just an example, I used a simple way to match data to a specific device in my inventory, but in a real environment you'd probably end up using something like a serial number or asset tag.

     

    Here’s how I built the import:

    The Import field is a static field that I've added to this import; it contains the name of the import. This allows me to run queries and quickly find any device that has this kind of data.

    Here’s how I’ve set up the date field. Essentially, I marked it as a static value and then used the VBScript field to overwrite that with the current date.
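    The script itself can be as simple as returning the current date and time. A minimal sketch, assuming your DA version expects the script to return the field's value (the function name here is invented):

        ' Sketch: stamp the record with the moment the import ran.
        ' Now is the built-in VBScript function for the current local
        ' date and time; CStr makes the string conversion explicit.
        Function LastImportDate()
            LastImportDate = CStr(Now)
        End Function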

    Now when I run the import I get the following data imported into an existing device in my inventory:

     

    Now that I have data in here, I'd like to set up a way to refresh it regularly, but only when it's needed. To do that, I figured I'd use a calculation rule and then use a query to filter the devices down to only the machines that need data synced at a particular time.

     

    Step 2: Calculate if the devices need to be synced.

    In this step, I added a calculate rule with some additional VBScript to determine whether a sync is needed.
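    Here's a rough sketch of the comparison logic. The attribute names (Last Import Date, Sync Needed) follow the fields used in this walkthrough, but exactly how the calculate rule hands the stored value to the script depends on your DA version, so treat that part as an assumption:

        ' Sketch: flag the device when 6 or more hours have passed since
        ' the last import. lastImport stands in for the stored
        ' Last Import Date value that the calculate rule provides.
        Function SyncNeeded(lastImport)
            If Not IsDate(lastImport) Then
                SyncNeeded = "Yes"   ' no valid date yet, so sync
            ElseIf DateDiff("h", CDate(lastImport), Now) >= 6 Then
                SyncNeeded = "Yes"   ' data is 6+ hours old
            Else
                SyncNeeded = "No"    ' still fresh, skip this device
            End If
        End Function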

     

    After creating this, I ran it and had it update my inventory for my existing device. (Note: this rule can be activated and run in real time. This will keep the query that we’ll create later up to date with devices that have met the 6-hour threshold.)

    Here’s what my inventory looks like now:


     

    Step 3: Create a Query.

    Here’s what my query looks like:
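    In essence, the query is just a filter on the calculated field. Something along these lines, with the attribute path depending on where you mapped the data in your inventory tree (the group name here is hypothetical):

        Computer."Third Party Data"."Sync Needed" = "Yes"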

     

    Step 4: Create a User Group and drop in the rules.

    Now we can go ahead and create a group, call it whatever you'd like, and add the two rules to it. This will make it easy to get at them later.

    (Right-click on User Group and select Add Group...)

     

    Note that I’ve set the Calculate Sync rule active at this point.

     

    Step 5: Schedule the import.

    In our fifth and final step, I realized that I also needed to make sure I updated the Sync Needed column, so that a device doesn't need another sync until it hits the 6-hour mark we specified in our calculation step. So I added this static column to the import:
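    In other words, the import now writes the flag back itself. A sketch of the two housekeeping fields it carries, with hypothetical names matching the ones above:

        Sync Needed      (static value) = "No"   ' reset by the import itself
        Last Import Date (VBScript)     = Now    ' restarts the 6-hour clock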

    Then I added a target to the import rule and selected the query that I created earlier. (Right-click on the rule and select Set Target... to get this window.)

    And now I have a target selected:

    Now that a target is set up, all that's left is to right-click the rule and schedule it to run once a day. As long as inventory is coming in from your devices, they will eventually reach the sync-needed state and get picked up by your import.

    Conclusion

    Once you've completed this, you'll have a process that updates this information roughly every day, as long as inventory is coming in for your devices. You can adjust the numbers and values as you see fit, and you can use this kind of technique anywhere within the import tools in DA. If the endpoint you're collecting from contains audit history dates, it can get even more interesting: you can run an import only when there is actually new data to get. Data Translation Services is a super powerful tool, and hopefully these tricks help you accomplish goals you might have in your organization.