5 Replies Latest reply on Feb 14, 2018 2:37 AM by phoffmann

    Modeled Data showing zero

    rictersmith Specialist

      I have a custom registry key that I am successfully pulling in as custom data from our Windows devices. The registry key is a string value which contains only integers from -3 to 300. By default, the unmodeled data is being captured as an NVARCHAR. This is causing issues when trying to create reports and queries that use greater-than/less-than logic.
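
      To make the comparison problem concrete, here is a minimal T-SQL sketch (hypothetical table and column names, not the actual inventory schema): with the values in an NVARCHAR column the comparison is lexicographic, so '30' compares as greater than '100' and '-3' doesn't behave like a negative number at all.

      ```sql
      -- Hypothetical demo table; the real unmodelled-data table and columns differ.
      CREATE TABLE CustomRegValue (DeviceName NVARCHAR(64), RegValue NVARCHAR(16));
      INSERT INTO CustomRegValue VALUES ('PC01', '-3'), ('PC02', '30'), ('PC03', '200');

      -- String comparison is lexicographic: returns PC02 ('30') AND PC03 ('200').
      SELECT DeviceName FROM CustomRegValue WHERE RegValue > '100';

      -- Numeric comparison (what an INT-modelled column gives you): returns only PC03.
      SELECT DeviceName FROM CustomRegValue WHERE TRY_CAST(RegValue AS INT) > 100;
      ```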

       

      I attempted to model the data and change the data type to INT. I left everything else at its defaults and added the data to a new table name. Upon completion, the table in the database and all the inventory data showed every value as 0, essentially wiping out all of my captured data. I have since used the database doctor to clear out the attribute, setting it back to unmodeled data. New scans are starting to repopulate the data as expected, but I am still left with the problem and need to get this set as an INT data type.

        • 1. Re: Modeled Data showing zero
          RickS SupportEmployee

          Rick,

          Please refer to this document: Basic Data Modeling for Management Suite, and the sample I provided via email.

          • 2. Re: Modeled Data showing zero
            rictersmith Specialist

            Thanks. I will give this a try, see what happens, and report back in case others are having this issue. Since Model Attributes is broken when trying to do this through the Manage Software List area, we have requested a case be opened to fix the product so it works as designed, instead of having to use XML files, run CoreDBUtil, and work through a basic 23-page document on how to do this correctly. I've still got several of these to do, geared toward adding ROI on the product, which keeps getting eaten up by functionality that isn't working and painful workarounds.

             

            Rick

            • 3. Re: Modeled Data showing zero
              phoffmann SupportEmployee

              Actually, building modelled tables via XML is what you WANT to do.

               

              1 - It's 100% controllable (none of this "experimenting in live" shenanigans).

              2 - It's 100% repeatable (SUPER useful for those multi-core environments / "oh noes, everything burned down" situations), without human error being a factor.

              3 - It's easy to implement once the XML is written (building the XML definition is usually what trips folks up).

              4 - ... if things work out, I may host a session @ next year's Interchange on (a) how to do it, and (b) have a detailed article at some point next year on how to do it from 0 to finish. That's one of my hopes (and pet projects) at least. We'll see how this goes / whether my suggestion for the class finds favour with the powers that be.

               

              There are a LOT of benefits to using the XMLs for it - the chief one being the ability to include indices, and so on.
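
              For instance, here's a rough T-SQL sketch of the sort of thing the XML definition can bake in for you (the table, column and index names below are made up for illustration; the XML syntax itself isn't shown here):

              ```sql
              -- Hypothetical modelled table; the XML definition would create the
              -- equivalent structure for you, rather than you hand-running DDL.
              CREATE TABLE CustomRegData (
                  Computer_Idn INT NOT NULL,   -- link back to the device record
                  RegValue     INT NULL        -- modelled as INT, NULLs allowed
              );

              -- The kind of index you can include from the start, so range queries
              -- and reports on RegValue don't have to scan the whole table.
              CREATE NONCLUSTERED INDEX IX_CustomRegData_RegValue
                  ON CustomRegData (Computer_Idn, RegValue);
              ```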

               

              Trust me - it's the best way to do it. I've had a lot of experience building modelled tables for all manner of folks & situations over the years.

              • 4. Re: Modeled Data showing zero
                rictersmith Specialist

                So there are a few issues here at play with my original issue and question:

                 1. Ivanti needs to fix the data modeling to persist and carry over the existing information; if it's in the product, you've got to own the issue. This has been validated as broken with our TAM, and suggesting customers don't use built-in features is dumb. The GUI simplifies IT management; the XML over-complicates it from an end-user perspective, which is probably why there is such a low adoption rate and a steep learning curve, even for internal Ivanti employees. Glad to see there is an alternative, aka 'the original method', but we paid for a working product aimed at simplifying IT.
                 2. Not often included in the instructions, but well known by those that do this constantly, is that you have to restart the service. The instructions actually seem to call for a full server reboot in some documents. As part of modeling the data, it would make sense to prompt the user and offer to restart the service automatically at the end of the process. Building this into the product makes it less error-prone and more foolproof (and I would not have wasted several hours, because the original instructions I was sent didn't say to reboot or restart the services after the change).
                 3. So why all zeros? Besides the fact that it didn't copy the data over, it seems that with SQL an integer column must have a value, so 0 is the default value inserted. I think -1 is often used to represent NULL, but this also explained why the same process did not write all zeros and there was just empty data for a column that was NVARCHAR.

                 

                 So, after 2 months of trying to get a simple XML to convert the one field to an INT for reporting (I appreciate those from Ivanti who tried to jump in and assist with an XML, but I point out the 2 months just to reiterate how much more complicated this process was than it should have been), I ended up getting the data modeling to work by modeling the data through the GUI, allowing the bad zeros to appear, restarting the service, and then waiting for the correct values to show up. Reporting now works great!

                 

                Rick

                • 5. Re: Modeled Data showing zero
                  phoffmann SupportEmployee

                  A couple of observations here. (Not taking offence at anything here - just hoping to enlighten a little bit.)

                   

                   

                  • Ivanti needs to fix the data modeling to persist and carry over the existing information(...)

                    This will take a lot of writing up on why what you're asking for is a "rather complicated" affair at best and - in actual fact - quite likely undesirable. The quick main point here is that *ALL* unmodelled data is stored as a string -- and you may in fact want/need certain data to be handled as differently formatted strings / datetimes, etc.

                    Happy to discuss / explain this in detail if you can make it to Interchange. You can have your TAM point me out easily enough if you're planning on making it (I think he's coming? Otherwise, he can describe me to you easily enough / we'll find a way to meet up somehow).

                   

                  • Not often included in the instructions, but well known by those that do this constantly, is that you have to restart the service. The instructions actually seem to call for a full server reboot in some documents. As part of modeling the data, it would make sense to prompt the user and offer to restart the service automatically at the end of the process. Building this into the product makes it less error-prone and more foolproof (and I would not have wasted several hours, because the original instructions I was sent didn't say to reboot or restart the services after the change).

                    >> Are you intending to come to Interchange this year (be it Dallas or Madrid)? If yes, I'd suggest signing up to the class I'll be presenting on just this ("how to model custom data") and associated topics (how to copy / move currently unmodelled custom data, etc).

                    Even if you're not going to be present, around Interchange time, I should have (*fingers crossed*) a whole "nose to tail" (with multiple examples & so on) type white paper finished & have that be published on community.

                    The great news is that PROPER data modelling (i.e. using the XML route) is neither COMPLICATED as such (it really isn't, as long as a few rules I'll be pointing out are followed), nor is it really difficult. It DOES require paying attention to detail (in some cases, fiendishly so), but the good news is that there's no requirement for any kind of PhD or whatnot.

                    It's one of my intentions for this year (both at Interchange for those who can attend & for the community in general with the article when it's all done) to de-mystify this whole process for everyone. Has been on my "to do" pile for a while, but there's only so much time for various pet projects & various other things to do / write up, etc. This year, it's data modelling's turn.

                   

                  • (...) it seems that with SQL an integer column must have a value, so 0 is the default value inserted. I think -1 is often used to represent NULL, but this also explained why the same process did not write all zeros and there was just empty data for a column that was NVARCHAR.

                    >> Not correct, actually. Both integer & string values in SQL can have NULL values (which are empty, not '-1'). The only time a value of some sort is required (commonly defaulting to '0') is if a field is defined as "must not be NULL" (there's a quick sketch at the end of this point to illustrate). You can "see" (sort of) this in regular inventory by us not displaying things. Essentially, by default we will not display a NULL value in the console / inventory tree (unless one specific clause applies).

                    So - for instance, you won't see a Linux "Software Package Manager" type tree for a Windows device (because such entries don't exist and/or are NULL). It's just "not seeing" stuff is a bit trickier to demonstrate.

                    This is something that is very controllable when using XML (which in 99% of cases is much preferable, ESPECIALLY for enterprise-level accounts, by the way, for numerous reasons) -- and better still, you can include indexing & display masks and various other tricks & useful items.

                    Much like the "Query tool" in the Windows console is a simplified version of SQL (no "need to actually know SQL"), where you inevitably lose a lot of functionality through the simplification, the LDDA data modelling tool has a similar problem. It attempts to simplify something (making it more accessible) ... but in the process, it loses out on a LOT of stuff.

                    It's why I'm getting around to finally demystifying this whole process this year.
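
                    A quick illustration of the NULL point above - a minimal T-SQL sketch with a made-up demo table (not the product schema): an INT column holds NULL just fine; you only get forced values such as 0 when the column is declared NOT NULL with a default.

                    ```sql
                    -- Hypothetical demo table, not the inventory schema.
                    CREATE TABLE NullDemo (
                        NullableInt INT NULL,                -- happily stores NULL (displayed as empty)
                        ForcedInt   INT NOT NULL DEFAULT 0   -- NULL not allowed; falls back to 0
                    );

                    INSERT INTO NullDemo (NullableInt) VALUES (NULL);  -- ForcedInt defaults to 0
                    SELECT NullableInt, ForcedInt FROM NullDemo;       -- returns: NULL, 0
                    ```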

                   

                  • So, after 2 months of trying to get a simple XML to convert the one field to an INT for reporting (I appreciate those from Ivanti who tried to jump in and assist with an XML, but I point out the 2 months just to reiterate how much more complicated this process was than it should have been), I ended up getting the data modeling to work by modeling the data through the GUI, allowing the bad zeros to appear, restarting the service, and then waiting for the correct values to show up. Reporting now works great!

                    >> It doesn't have to be. Again - it's not really complicated (I appreciate you sort of have to take my word on that at present, because I don't have a finished doc), but it does have rules that need to be followed. Knowing those rules (which is something I'll be clearly communicating) is by and large all the difference here.

                    It comes down to the "six Ps" rule, by and large - going into this forearmed with knowledge / information. Which is precisely the reason why I've had this on my "must get around to this" list for a long time: so that people CAN have this easily accessible & know what they need to know / watch out for, to make informed decisions & attain success.

                    =====

                    As an aside (and minor preview), no reboot of the server is ever needed for this (modelling of data, that is). It may not be a bad idea to "make Windows feel better" (since you may treat it as downtime, potentially) but it's not a requirement. Just a (re-)start of the Inventory service is all that's needed here.

                    The reason for this is that the service caches the database schema when it starts, and changing the schema behind its back is a great way to introduce logical corruption (which can be cleaned up) into your DB unnecessarily.

                   

                  Hope to perhaps talk / see you at Interchange.