The IT industry will be very familiar with Information Technology Asset Management (ITAM) - but it’s had a new twist as it’s grown up and dropped the ‘I’. The concept has morphed into Technology Asset Management (TAM) - because technology is no longer just ‘IT assets.’

Technology now encompasses a variety of assets coming online in every organization. These are not always controlled by IT, yet must necessarily be protected by it. This creates a conundrum for the teams trying to maintain a compliant and secure operational environment.


What’s changed?

In previous best practice, ITAM assets included devices such as servers, workstations, laptops and mainframes - along with all configuration items, including software. With Internet of Things (IoT) technology now increasingly pervasive, there are many, many more things in the enterprise environment and on the network, outside of what traditional IT could manage. Consider HVAC systems, security and environmental sensors, webcams and the like, potentially owned by many different departments. All of these, and many more, can now be connected to the internet.

IT departments need to ensure that these are well managed, in accordance with regulation and basic cyber- and IT-hygiene needs. That’s why it’s wise to consider a new approach to these problems. The TAM concept is to be agnostic to the type of technology device or operating system an organization may own, while still tracking, auditing, and understanding the status of the assets in question.

Gartner believes that an enterprise needs to draw all the information on its technology assets and their current configuration items from a single, trusted, unified and normalized source.

Imagine a company that builds products, but in the course of doing so has no control over the parts, sourcing or inventory of its products’ components. No assembly line will operate if a component is not available. IT faces a parallel problem - if we do not understand what we own, how it works, what it contains, its lifecycle, who is using it, and all the associated costs, then the organization runs a significant risk of the system becoming unstable. Problems might start small, but they will snowball.

The right way to build that assembly line or business is from a solid foundation capable of withstanding the vicissitudes of real situations. It’s the same for a proper TAM plan - if built on various independent data sources it will undoubtedly get out of sync quickly, causing inaccuracy and lag that resonate throughout the infrastructure and through every project relying on the data for accurate information and decision-making.

TAB in when you’re ready

Prepping a TAM program requires the creation of a Technology Asset Baseline (TAB) as the principal repository. It’s the first and most critical component of any asset management endeavor, which means it should be cost-effective to implement and maintain. A primary design goal of a good solution is to collect, memorialize, normalize and present the essence, in part or whole, of the organization’s technology in a single trusted and up-to-date source: the Technology Asset Baseline. The baseline supplies specific information to all use cases and to the infrastructure at large - ERP, HR, cybersecurity, CMDB, etc. - along with the projects that are ongoing at any given time.
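To make that concrete, a single baseline record can be pictured as a small, normalized data structure. The sketch below is a hypothetical shape, not any product’s actual schema; every field name is an assumption:

```python
# A hypothetical, minimal shape for one normalized baseline record.
# Every field name here is an assumption, not any product's schema.
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Optional

@dataclass
class AssetRecord:
    asset_id: str                    # stable identifier, e.g. a serial number
    category: str                    # "server", "laptop", "iot-sensor", ...
    owner: str                       # owning department, e.g. from Directory Services
    location: str
    os_name: Optional[str] = None    # None for devices with no reportable OS
    software: List[str] = field(default_factory=list)   # installed titles
    last_seen: datetime = field(default_factory=datetime.now)
```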

The TAB is the trusted source of data about the assets within the infrastructure. To gather asset and configuration items it must take advantage of all the instrumentation each device offers, drawing on a vast array of protocols in the process.

Those just beginning this process should take heed: if the organization already has tools, those tools must be able to interrogate an asset through more than one method, and must have additional intellectual property to fall back on where protocols and instrumentation are disabled or lacking.
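As an illustration of that fallback, a collector might try a sequence of protocols in order and flag the asset for follow-up when none succeed. This is a minimal sketch only; the probe functions are invented stubs standing in for real SNMP, WMI or SSH clients, not any product’s API:

```python
# A minimal sketch of agentless collection with protocol fallback.
# The probe_* functions are invented stubs standing in for real
# SNMP / WMI / SSH clients; none of this is a specific product's API.

class ProbeFailed(Exception):
    """Raised when a probe cannot reach or interrogate the device."""

def probe_snmp(host: str) -> dict:
    raise ProbeFailed("SNMP disabled on this host")      # stub for illustration

def probe_wmi(host: str) -> dict:
    raise ProbeFailed("WMI not available")               # stub for illustration

def probe_ssh(host: str) -> dict:
    return {"host": host, "os_name": "Ubuntu 22.04"}     # stub success

PROBES = [("snmp", probe_snmp), ("wmi", probe_wmi), ("ssh", probe_ssh)]

def collect(host: str) -> dict:
    """Try each protocol in turn, falling back until one succeeds."""
    for name, probe in PROBES:
        try:
            record = probe(host)
            record["collected_via"] = name
            return record
        except ProbeFailed:
            continue
    # Nothing worked: flag the asset for an alternative methodology
    return {"host": host, "collected_via": None, "needs_followup": True}

print(collect("10.0.0.5"))
# {'host': '10.0.0.5', 'os_name': 'Ubuntu 22.04', 'collected_via': 'ssh'}
```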

The baseline must be deep and granular, and requires the following capabilities:

  • Be agnostic to the operating system of the technology device
  • Not be limited to desktop, network, data center or IoT assets - it must be gathered from all categories of the active environment
  • Be able to capture, contain and update the current state of any technology asset, displaying the associated components of the technology at the time of the scan
  • Include all technology hardware and related software, regardless of what it is
  • Connect with, and by its nature combine, the intelligence of other standard infrastructure tools - such as Active Directory or Directory Services for ownership and location
  • Examine the binaries of installed software to expose challenging areas like Oracle and IBM software
  • Capture and serve assets in sensitive and secure areas
  • Be gathered with agentless scanning, with the intrinsic ability to normalize the scanned data (see the sketch after this list)
  • Be organized and technically capable of interacting and connecting with any area of the infrastructure
  • Not create or open security issues, and by its nature be able to work in the most secure areas of any infrastructure
  • Gather information utilizing virtually any protocol or, if protocols are disabled or not available, fall back on alternative methodologies to collect the required information
  • Serve any part of the business or project with the salient asset information required to power the organization
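
Normalization is what makes records gathered by different tools and protocols comparable. A minimal sketch of that step, with an invented alias table and invented field names, might look like this:

```python
# A minimal normalization pass: map inconsistent raw scan fields onto
# canonical values so records gathered over different protocols can be
# compared. The alias table and field names are invented for illustration.
OS_ALIASES = {
    "microsoft windows 10 pro": "Windows 10",
    "win10": "Windows 10",
    "ubuntu 22.04.3 lts": "Ubuntu 22.04",
}

def normalize(raw: dict) -> dict:
    os_raw = (raw.get("os") or raw.get("os_name") or "").strip().lower()
    return {
        "asset_id": raw.get("serial") or raw.get("asset_id"),
        "os_name": OS_ALIASES.get(os_raw, os_raw or None),
        "owner": raw.get("owner", "unassigned"),
    }

print(normalize({"serial": "SN-001", "os": "Microsoft Windows 10 Pro"}))
# {'asset_id': 'SN-001', 'os_name': 'Windows 10', 'owner': 'unassigned'}
```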

The baseline must have the ability to view technology assets over time, and by their status, to ensure accuracy and supply deltas that illustrate the evolution of all technology assets. After achieving this state, the business is capable of driving operations and any projects that require an accurate technology baseline to rely upon. Projects such as upgrading from Windows 7 to Windows 10, or producing an Oracle status for the LMS team, become deliverable in record time and will be reliable, accurate, and confirmable.
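Supplying those deltas reduces to comparing snapshots keyed by asset identifier. The sketch below is illustrative, assuming the snapshots are simple dictionaries rather than any particular database:

```python
# Compare two baseline snapshots (dicts keyed by asset_id) and report
# what was added, retired, or changed between scans. Illustrative only.
def baseline_delta(previous: dict, current: dict) -> dict:
    prev_ids, curr_ids = set(previous), set(current)
    return {
        "added": sorted(curr_ids - prev_ids),
        "retired": sorted(prev_ids - curr_ids),
        "changed": sorted(
            aid for aid in prev_ids & curr_ids if previous[aid] != current[aid]
        ),
    }

# Example: a Windows 7 machine upgraded to Windows 10 shows up as "changed"
old = {"A1": {"os_name": "Windows 7"}, "A2": {"os_name": "Windows 10"}}
new = {"A1": {"os_name": "Windows 10"}, "A3": {"os_name": "Ubuntu 22.04"}}
print(baseline_delta(old, new))
# {'added': ['A3'], 'retired': ['A2'], 'changed': ['A1']}
```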

Technology can then be broken down by corporate ownership, giving the finance team the information it needs to budget for and capitalize the technology in use in every area of the infrastructure.

What’s more, the ability to connect and share information with systems like ERP, ITSM and CMDB platforms makes the information value of the baseline extendable. The data helps the business find potential security weaknesses; reconcile with the fixed asset system to create the delta between what was purchased and what is in use; reconcile maintenance agreements with implementations; plan disaster recovery responses; handle software audits; update service management - and more.
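That fixed-asset reconciliation, for instance, is essentially a set difference over asset identifiers. A hypothetical sketch, with invented identifiers:

```python
# Reconcile what the fixed asset system says was purchased against what
# the baseline actually discovered on the network. Identifiers invented.
def reconcile(purchased_ids: set, discovered_ids: set) -> dict:
    return {
        "purchased_not_deployed": sorted(purchased_ids - discovered_ids),
        "deployed_not_on_books": sorted(discovered_ids - purchased_ids),
    }

print(reconcile({"A1", "A2", "A4"}, {"A1", "A2", "A3"}))
# {'purchased_not_deployed': ['A4'], 'deployed_not_on_books': ['A3']}
```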

The baseline is the foundation. By exchanging accurate, granular and specific data with other infrastructure solutions, the organization eliminates much of the mystery around the cost, value and risk associated with the technology in use. What’s more, it gets the whole system working as a team, making the entire infrastructure more useful. Fewer individual components, more pulling together - when there’s no ‘I’ in TAM.