Data Analytics 101

Version 3

    Verified Product Versions

    LANDESK Management Suite 9.6
    LANDESK Management Suite 2016.x
    LANDESK Endpoint Manager 2017.x

    Introduction

     

    This document is an introduction to Data Analytics. In this guide we'll be going over each part of Data Analytics at a basic level, as well as some best practices.

     

    Data Analytics Components

     

    Asset Control

     

    Asset Control is a section of Data Analytics dedicated to devices that can't be managed, or shouldn't be managed (i.e. no agent installed). These are usually routers, switches, monitors, computer parts, etc. However, Asset Control may also be used as a repository for decommissioned assets that are no longer in use, but that you still need to report on and that may still have a historical inventory.

     

    There are a few different ways to populate Asset Control:

     

    Data Translation Services rule

    Discovery Services scan

    Archiving an asset using the right click menu

     

    It's important to keep in mind the following things:

     

    You can move an asset from "Management Suite" (i.e. managed with full inventory, assumed to have an agent, etc) to Asset Control, but you cannot move a device from Asset Control to Management Suite. If you archive an asset, and then need it back in Management Suite, your only option is to re-install the agent and wait for an inventory scan from the agent.

     

    Asset Control is a section that is completely cut off from the rest of Management Suite. The data is stored in its own tables in the database, and it has its own functions. You shouldn't expect functions to cross over.

     

    Asset Control relies heavily on an attribute called "Class". This attribute doesn't exist in Management Suite, but is roughly equivalent to "Type" (Type also exists in Asset Control). If you import devices into Asset Control with a custom DTS rule, you should try to map something to Device.Class.

     

    In Management Suite, the "root" attribute is called Computer. In Asset Control, it's called Device.

     

    Console Extender

     

    Console Extender allows you to add custom functions to the console's right-click menu. Console Extender is essentially a pre-filled "Run" process, so anything you plan to put into Console Extender, you should first try in a normal "Run" prompt.

     

    Console Extender is useful for remote functions you may perform regularly, like opening the C$ share of an endpoint, or opening that machine's services. The main benefit of Console Extender is that you can "inject" inventory attributes into the command, so commands run dynamically per machine. An example of this is the built-in Ping function, which pulls the device's IP address from inventory and runs ping -t against that value. It's important to note that Console Extender runs under the Windows user that launched the console, not the credentials provided to log in to the console. If you log in to Windows as user A, launch the console, and then log in to the console as user B, Console Extender functions will still run as user A, the user that actually launched the console.
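
    The attribute-injection idea above can be sketched as a simple template substitution. This is a conceptual illustration only: the placeholder syntax, attribute names, and function are all hypothetical, not Console Extender's actual implementation.

    ```python
    # Hypothetical sketch of Console Extender-style attribute injection:
    # placeholders in a command template are replaced with the device's
    # inventory values, so one saved command works per machine.

    def expand_command(template: str, inventory: dict) -> str:
        """Replace %Attribute% placeholders with values from the device's inventory."""
        command = template
        for attribute, value in inventory.items():
            command = command.replace(f"%{attribute}%", str(value))
        return command

    # Illustrative inventory values for one device:
    inventory = {"Network - TCPIP - Address": "10.0.0.15", "Device Name": "WS-0042"}
    print(expand_command("ping -t %Network - TCPIP - Address%", inventory))
    # -> ping -t 10.0.0.15
    ```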

     

    Console Extender entries are per console, and stored in the machine's registry. That means if you create an entry on console A and want console B to have that option, you either need to add it to console B manually, or export the registry key from console A, import it into console B, and restart console B.

     

    The console extender regkey is: HKEY_LOCAL_MACHINE\SOFTWARE\Managed Planet\Console Extender\Plugins

    Data Translation Services

    Data Translation Services consists of rules that modify, move, and create inventory data. It's important to keep in mind that DTS does not remove data; it can only transform or create data. This is a fundamental concept of DTS.

     

    Most DTS rules have 2 main attributes to consider, the Source Attribute and Destination Attribute. One of the most powerful things about DTS is that the destination attribute does not need to be an existing attribute. You can decide what the attribute is, like "Computer.Random.Some Random Attribute". DTS will create any missing attributes needed to store a value in the Destination attribute.

     

    DTS has several rule categories, too many to cover here. The most commonly used rule categories are explained below.

     

    Barcode Web Form/Web Group: A barcode web form is a web form that can take field inputs and use those to create a new device, or update an existing one based on a common attribute. More info can be found here: How To: Import Assets Using Barcode Web Forms

     

    Calculate Data: This type of rule takes a source attribute and, based on that value, calculates a new value that is then stored in a destination attribute. This rule type makes use of VBScript to perform custom calculations.
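
    Conceptually, a Calculate Data rule is "read source attribute, compute, write destination attribute". The sketch below illustrates that flow; the real rule expresses its logic in VBScript inside the DA console, and the attribute names here are made up.

    ```python
    # Illustrative sketch of a Calculate Data-style rule: read a source
    # attribute, derive a new value, store it in a destination attribute.
    # Attribute names are hypothetical; real rules use VBScript in the console.
    from datetime import date

    def calculate_age_years(device: dict) -> dict:
        purchase = date.fromisoformat(device["Computer.System.Purchase Date"])
        today = date(2024, 1, 1)  # fixed date so the example is reproducible
        # Store the derived value in a (possibly new) destination attribute:
        device["Computer.System.Age (Years)"] = (today - purchase).days // 365
        return device

    device = {"Computer.System.Purchase Date": "2020-06-15"}
    print(calculate_age_years(device)["Computer.System.Age (Years)"])  # 3
    ```

    Note that, as described earlier, the destination attribute does not need to exist beforehand; DTS creates it when the rule first writes a value.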

     

    Import Data: This type of rule is a general purpose data import. It can connect to various data sources such as a database, excel sheet, xml, etc, and map that data to attributes in a new or existing asset.
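
    The column-to-attribute mapping an Import Data rule performs can be sketched as follows. The CSV columns, attribute names, and the Device.Class mapping are illustrative assumptions; the actual rule is configured in the DA console, not written in code.

    ```python
    # Minimal sketch of what an Import Data-style rule does conceptually:
    # read rows from an external source (a CSV here) and map each column
    # to a device attribute. All names below are hypothetical.
    import csv
    import io

    COLUMN_MAP = {
        "Hostname": "Device.Name",
        "SerialNo": "Device.Serial Number",
        "Category": "Device.Class",  # mapping Class matters for Asset Control imports
    }

    def import_rows(csv_text: str) -> list:
        devices = []
        for row in csv.DictReader(io.StringIO(csv_text)):
            devices.append({COLUMN_MAP[col]: val for col, val in row.items() if col in COLUMN_MAP})
        return devices

    sample = "Hostname,SerialNo,Category\nSW-01,ABC123,Switch\n"
    print(import_rows(sample))
    # -> [{'Device.Name': 'SW-01', 'Device.Serial Number': 'ABC123', 'Device.Class': 'Switch'}]
    ```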

     

    Licensed Software: These rules are rarely ever touched, but are very important, so they're being covered. These rules are essentially "Product Definitions" for software licensing in data analytics. They determine if software is installed, and how to preform license allocations. These run nightly to keep software usage and licensing up to date.

     

    Map Data: These rules copy a value from a source attribute to a destination attribute, often modifying the value in the process. A good example use case is the Standardize Serial Number rule, which copies the value of multiple source attributes into a single, easy-to-report-on destination. This addresses issues arising from the serial number potentially being stored in any of a dozen different attributes.
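
    The Standardize Serial Number idea boils down to coalescing several possible source attributes into one destination. A rough sketch, with illustrative attribute names (the real rule's source list and destination differ):

    ```python
    # Sketch of a serial-number standardization: the serial may live in any of
    # several source attributes, so copy the first non-empty one into a single
    # destination attribute that is easy to report on. Names are hypothetical.
    SOURCE_ATTRIBUTES = [
        "Computer.BIOS.Serial Number",
        "Computer.System.Serial Number",
        "Computer.Chassis.Serial Number",
    ]

    def standardize_serial(device: dict) -> dict:
        for attr in SOURCE_ATTRIBUTES:
            value = device.get(attr, "").strip()
            if value:
                # Normalize while mapping, e.g. trim whitespace and upper-case:
                device["Computer.DA Standard.Serial Number"] = value.upper()
                break
        return device

    device = {"Computer.BIOS.Serial Number": "", "Computer.System.Serial Number": " c02xk1234 "}
    print(standardize_serial(device)["Computer.DA Standard.Serial Number"])  # C02XK1234
    ```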

     

    Software License Import: These rules connect to a data source that contains software license information, and imports those licenses for use in DA and SLM software license calculations. More info can be found here: How To: Import Software Licenses With A Software License Import Rule

     

    Discovery Services

     

    Discovery Services is a powerful network discovery tool in Data Analytics. It primarily consists of configurations for SNMP and WMI scanning.

     

    SNMP Scanning:

     

    This method scans for devices on specified subnets, using the SNMP configuration you provide. It supports up to SNMPv3, with security up to authPriv using SHA and AES-256.

     

    By default we provide the iso.org.dod.internet MIB tree to scan for. However, scanning some appliances may require importing a MIB file from your device's vendor to obtain all possible information. A good example of this is Lexmark printers, some of which use OIDs under 1.3.6.1.4.1.641.5 for ink and toner levels, which is not part of the default MIB provided in DA. For this, you would need to obtain the .mib file from Lexmark or an online MIB repository and import it into DA.
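
    SNMP OIDs form a tree, and vendor-specific data lives under the enterprises subtree (1.3.6.1.4.1), which is why vendor MIBs must be imported separately. A small sketch of checking whether an OID falls under a given prefix, using the Lexmark example from the text (the specific toner-level OID shown is hypothetical):

    ```python
    # Check whether an SNMP OID falls under a given subtree prefix by
    # comparing the dotted components position by position.

    def oid_under(oid: str, prefix: str) -> bool:
        oid_parts, prefix_parts = oid.split("."), prefix.split(".")
        return oid_parts[:len(prefix_parts)] == prefix_parts

    # A hypothetical toner-level OID under Lexmark's private subtree:
    print(oid_under("1.3.6.1.4.1.641.5.2.1", "1.3.6.1.4.1.641.5"))  # True
    # The same OID is outside the standard MIB-2 branch (1.3.6.1.2.1):
    print(oid_under("1.3.6.1.4.1.641.5.2.1", "1.3.6.1.2.1"))        # False
    ```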

     

    Devices scanned by SNMP can be stored in Asset Control with all obtained SNMP data. This can be a robust reporting option, as SNMP scans, with the proper MIBs, can gather a lot of information from a device. Printers are a good example, with many being able to report ink and toner levels, hardware health, uptime, current pages in the tray, error history, available features, model and vendor info, etc.

     

    WMI Scanning:

     

    WMI scanning uses remote WMI calls to pull in asset information. This is primarily limited to Windows machines given the use of WMI, but it can often pull in even more information than SNMP, and can serve as a lightweight replacement for the agentless scanner service.

     

    Discovery Services can also deploy SNMP and WMI scanning configurations to "Discovery Services agents". These agents sit on top of an existing EPM agent and can run the configurations on remote subnets that aren't accessible from the Core directly, or that would take the Core too long to scan due to routing, latency, etc.

    Executive Report Pack

     

    The Executive Report Pack, often called ERP, is a powerful reporting option that takes advantage of features in DA like Asset Control. ERP reports are web based, and can be sent as a link to others, or saved in a variety of formats like HTML, PDF, etc.

     

    ERP Reports have the ability to report on Management Suite data, Asset Control data, or both at once, providing a bridge between the two sections.

     

    ERP Reports can also connect to multiple cores and report off of the combined data, providing a unified reporting option.

    Rapid Deployment

     

    Rapid Deployment is a process that is used to automatically deploy agents to discovered unmanaged devices. The process is made up of 3 main steps:

     

    Scan

    Filter

    Deploy

     

    The scan is run by Unmanaged Device Discovery at whatever interval is set in Rapid Deployment. You can then configure a query that filters the results, allowing you to exclude certain subnets, operating systems, specific hostnames, etc.

     

    Rapid deployment configurations are set per Agent Configuration, so you could have one running against a Windows Workstation agent configuration that uses a query to only target machines running a workstation edition of Windows, and then vice versa for a Windows Server agent configuration.
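
    The filter step for a workstation configuration can be sketched as a predicate over discovered devices. This is purely conceptual: real filters are LDMS queries built in the console, and the field names, OS strings, and excluded subnet below are made-up examples.

    ```python
    # Conceptual sketch of the Rapid Deployment filter step: discovered
    # devices pass through a query before an agent deploy is scheduled.
    # All field names and criteria here are illustrative.
    EXCLUDED_SUBNETS = {"10.0.99"}  # e.g. a lab subnet that should stay unmanaged

    def should_deploy(device: dict) -> bool:
        is_workstation = any(v in device.get("os", "") for v in ("Windows 10", "Windows 11"))
        subnet = device.get("ip", "").rsplit(".", 1)[0]
        return is_workstation and subnet not in EXCLUDED_SUBNETS

    discovered = [
        {"name": "WS-01", "os": "Windows 11 Pro", "ip": "10.0.10.5"},
        {"name": "SRV-01", "os": "Windows Server 2022", "ip": "10.0.10.9"},
        {"name": "LAB-01", "os": "Windows 10 Pro", "ip": "10.0.99.3"},
    ]
    targets = [d["name"] for d in discovered if should_deploy(d)]
    print(targets)  # ['WS-01']
    ```

    A second configuration targeting servers would simply use the inverse query, as described above.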

     

    Real Time Scan Processing

     

    This isn't necessarily a separate component, as it's part of Data Translation Services, but it's worth covering separately. Real Time Scan Processing is an option that runs selected DTS rules against every scan file as it comes in. This has some benefits:

     

    Data manipulation occurs immediately

    You can schedule fewer rules

    You can avoid storing unwanted data that would only be transformed later

     

    However, using Real Time Scan Processing causes higher CPU usage, and can cause higher RAM usage. It also adds extra time to scan processing, and some rules are not a good fit to run as active rules.

     

    Best Practices

     

    Below are some best practices to keep in mind when using Data Analytics for various functions:

     

    • Set the Managed Planet Core Scan Processor service to run as a Windows user of some kind that has local admin access to the Core. The main reasons for this are:
      • If running any DTS rules that download data, it's easier to set the INETCACHE to clear automatically for a user than it is for LocalSystem
      • If running any DTS rules that connect to an external resource, that user will be making the web request, etc. If your environment uses proxies or firewalls, it's likely LocalSystem will fail to pass authentication.
      • If running any DTS rules that connect to a file share or other local network resource, the user the service runs under will need to have access. This is often easier with a user account than a machine account.
    • The following DTS rule types should not be made active for any reason:
      • Anything that imports a file (ex, Software License Import from a CSV)
      • Anything that contacts a web resource (ex, Lenovo Warranty Import)
      • Anything in this document
    • The following DTS rule types are not recommended to be made Active, and performance should be monitored closely after any are made active. They should only be added one at a time.
      • Calculate Data - These are CPU intensive
      • LDAP Import/Export - These can cause excess strain on your domain controller, and extra wait time for scans

     

    Additional Information

     

    DA Landing Page

    Basic Troubleshooting Guide - Data Translation Services - Real Time Processing

    Basic Troubleshooting Guide - Data Translation Services - B2B Connectors

    Basic troubleshooting guide for Data Analytics - Web import rules

    What rules should be Scheduled and not included in the DTS Active group for Data Analytics?