Leading IT: Top Tech Trends and Business Value Chain



Every day, there seems to be a new technology that pops out of the woodwork and becomes the darling of IT fashion.

Often, these “hot” technologies are accompanied by exhortations that each is the best thing since sliced bread. Furthermore, you are cautioned and even threatened that if you don’t jump on board, you will suffer dire consequences, including presumably a career dead-end, possible job loss, and probably even incurable bad breath.

Perhaps one of the most important activities for any IT executive consists of helping to curate these emerging technologies and ferret out which ones are ready for investment and which ones are not. This is what many non-IT executives expect their IT executive to do for them. Given that the IT executive is supposed to be tech savvy, it seems fitting that the IT executive should be on top of this.

That being said, some IT executives get themselves into a bind by being the only one at their firm involved in keeping up to date on the latest technologies and assessing them. This solo approach is usually a bad sign, namely it suggests that the IT executive is acting separately from the business. Plus, there is at times a tendency toward a kind of “technology lust” for exciting new technologies, which is not necessarily the right business reason to want a new technology.

It is crucial that the CIO or CTO involve the business directly in the ongoing assessment of new technologies and collaboratively ascertain the value that such technologies might or might not provide to the business.

One increasingly popular organizational mechanism includes establishing an IT-related Center of Excellence (CoE), or an equivalent kind of R&D element to bring together the techies with the business members of the firm. The CIO or CTO goes out of their way to ensure that there is appropriate representation from other units and functions of the firm, and does so in more than a perfunctory way. A perfunctory way is usually just asking for anyone from other departments and then in a hollow manner assigning them to be part of the CoE. That’s not going to do much good, and in fact is likely to harm the CoE and the whole notion of having the CoE.

In past years, progressive CIO’s and CTO’s tried to get these kinds of new technology organizational assessment entities going and were often rebuked. Why should other parts of the business “waste” their limited resources towards such an internal effort? Won’t it just involve sitting around and getting to play with new toys? We have “real” work to do, those other business heads would say, and come back to me when you’ve vetted something that I should truly care about.

This head-in-the-sand response has lessened in many respects.

Today, the cry for all businesses to become digital businesses has spurred companies to reconsider how they do business. They are bombarded with examples of companies and competitors that are making facets of their business into digital winners, positively impacting not only the internal running of the firm but also the products and services being delivered to customers.

In today’s blog, I take a look at aspects of new technologies and how businesses are transforming in the fast-paced digital age.





Take a look at Figure 1 to see an indication of the highlights of what we’ll be covering.


[Figure 1: Lessons learned]


There is a lot of talk these days about transforming the business.

What does that mean?

It indicates that businesses are changing how they do business, undoing and altering prior practices and transforming into something that will presumably be stronger, more resilient, more efficient, more effective, and more profitable, maximizing shareholder wealth.

Businesses are doing so for purposes of being better at what they do, but also to avoid being squashed by competition or being otherwise put out of business or battered.

The word “disruption” is a key part of why companies are seeking to transform. The popular example of Uber is an indicator of what it means to disrupt an industry. Conventional taxi and limo services were dramatically disrupted by the introduction of Uber and other Uber-like car transportation services. It had been hard to hail a taxi, the quality of the taxi car was often dismal, the taxi driver was frequently sour and unpleasant, paying for the taxi ride was difficult and cumbersome, and so on. Uber and Uber-like services found a means to either eliminate those pain points or at least reduce them.

Some would say that the “friction” between the consumer and the act of getting a ride was reduced, and the whole experience has now become a pleasant one rather than one of dread.

Firms want to avoid getting disrupted, and they know that the Uber and Uber-like firms all used technology as a cornerstone for bringing about the disruption in the ride-hailing industry.

So, by appropriately embracing new technology, a firm can avoid being disrupted or at least mitigate the impact of disruption.

Embracing the new technology involves transforming the business from where it is today and into something different that integrates the technology into crucial aspects of the business.

Besides avoiding being disrupted, firms would like to take things a step further and be the disrupter. In other words, it’s one thing to make sure that your firm can withstand a disruption, which is a defensive posture, while it is another way of thinking to get ahead of the pack and be the actual disrupter, which is a proactive posture.

That sounds reasonable, you might say, and then wonder what new technologies might have that potential for being disruptive and therefore should be cooked into the transformation of the business.

Glad you asked.

Let’s take a look at this next.


I realize that at this juncture you might be eagerly awaiting a list of which technologies to consider for your firm.

Before we jump there, let’s again keep in mind that the technologies will only be of genuine value if they can have an impact on the business.

If a new technology is exciting as a technology but it cannot be ascertained how a business can leverage it, we are back in the realm of looking at something that, though interesting, does not have an actual payoff.

It can be hard to figure out whether a new technology will or will not make an impact on a business. This requires that the technologists and the business collaborate on making such a determination, as emphasized earlier herein. In fact, some businesses prefer to be “fast followers” that wait until someone else has figured out how to make the connection between a new technology and a positive business impact. They cleverly wait until others have tried and sometimes failed, and then cherry-pick from those that successfully integrated the new technology. They grab the brass ring quickly rather than letting it go around a few times, hoping to be early enough to lead the pack without having been among those that struck out by being too early.

A fast follower strategy has tradeoffs. You might not time it well and end up being a late follower when you were actually aiming to be a fast follower. Or, you might go too fast, falsely thinking that a technology is ripe and wanting so much to be a fast follower that you end up at the so-called “bleeding edge” of a new technology.

The laggards are firms that wait until the technology is so proven that most of the rest of the market has already embraced it. They view being a fast follower, or even worse a bleeding edger, as generally foolhardy. Once something is tried-and-true, then and only then will the laggard adopt it. This does have some merits, in that by then the way in which the adoption should occur is usually better established, as is the predictability of how much of an investment is needed and what the payback will be.

The laggard, though, faces potential business disruption like a large tidal wave that might wash over and collapse them. By the time the laggard acts, the wave of adoption by speedier firms can be so significant that the laggard has already lost the game. Today, we see conventional taxi and limo services struggling to adopt the same technologies and approaches as Uber and Uber-like companies, but their outdated business models and laggard-like adoption might ultimately doom them.

The well-known tech research firm Gartner has popularized a technology adoption graph known as the Hype Cycle.

Take a look at Figure 2.


[Figure 2: Hype Cycle]


Along the X-axis is time and along the Y-axis is expectation. There is an undulating curve that starts on the left side of the chart, low in expectations and early in time. This is what I will refer to as Stage A of a new technology.

During Stage A, a new technology gets introduced into the world. There is some kind of trigger that makes it seem attractive and interesting.

Gradually, momentum builds up around the idea that this new technology has great promise. The curve then proceeds upward, getting higher and higher on the expectations metric. This brings the technology into Stage B.

At some point, the expectation peaks, reaching a super high point, and then once the excitement starts to wear off, it begins to turn downward.

As shown in Stage C, a new technology can lose steam and drop down into a trough. This frequently happens when few can figure out how to profit from the technology in terms of turning it into something useful and usable for business.

Eventually, for some technologies, once the kinks are worked out, the technology begins to come back and rise again. This is shown in Stage D, whereby the technology has finally achieved successful adoptions and is clawing its way back upward in terms of expectations of merit.

Usually, the technology then flattens out at some level of expectation that is more than where it started but less than where it peaked. This is Stage E, when the technology has reached a plateau and has found its balance in the marketplace in terms of adoption.

For the latest version of Gartner’s Hype Cycle, take a look here:


It gets updated annually by Gartner.

Currently, Stage A has technologies such as Virtual Personal Assistants, Internet of Things (IoT) Platforms, Smart Robots, Neuromorphic Hardware, and so on.

Stage B, the peak reaching stage, currently has Machine Learning, Autonomous Vehicles, etc.

Stage C, the trough, currently has Natural Language Question Answering, Augmented Reality, etc.

Stage D, which they call the slope of enlightenment, contains Virtual Reality, and they don’t list any specific technologies in Stage E.

I am bringing the Hype Cycle to your attention to ensure that when you think about new technologies, you are also thinking about where they fit in the cycle of usefulness and utility to business.

Whether or not you agree with Gartner’s latest placement of current technologies into the particular stages, what matters more is that you are indeed considering where any new technology fits on the curve.

If you believe that a particular technology is currently in, say, Stage A, it likely means that if you adopt it, you are going to be a bleeding edger. Your risks are higher. You are betting on something not yet proven. It might not yet be clear how the technology will help the business. And so on.

If you believe that a particular technology is currently in, say, Stage D or Stage E, it likely means that the technology has stabilized and is better understood from a business ROI perspective. You are likely not a fast follower per se and more likely a laggard.

Seeing how a research firm like Gartner has classified new technologies is useful as a gauge to compare to your own firm and its assessment of the technologies. Keep in mind that your firm might see a particular technology in a different light than what an overarching graph depicts. For example, if your firm already has been using AI technologies and needs to be at the forefront to remain competitive, you might perceive that say Machine Learning is actually toward Stage D now rather than its position of Stage B.

Also, realize that the undulating curve is just one such curve of potential technology adoption. Not all technologies necessarily go through this shape of a curve and through each of the stages. It is possible that a technology might for example arise in Stage A and skip directly to Stage E.


We have now taken a look at how new technologies tend to arise over time and undergo adoption.

But, you might naturally ask, adoption for what aspects of the business?

One of the useful ways to consider how a new technology might impact a business involves looking at the Value Chain of the business and considering how technology plays a role in each piece of the business.

Take a look at Figure 3.


[Figure 3: Value Chain]


The Value Chain was initially popularized by Michael Porter, and the diagram provides a handy depiction of the elements and activities of a business that lead to creating value.

I have labeled each portion with a number, ranging from 1 to 9. The first four numbers (1 through 4) are considered support-related activities of a business. The last five numbers (5 through 9) are considered the primary activities of a business.

Under the support activities, we have:

1) Administrative, finance, IT, legal, accounting, infrastructure
2) Human resources management, recruitment, training
3) Product and technology development, R&D, market testing
4) Procurement, supplier management, subcontracting

There are of course other support-related activities that aren’t overtly named on the diagram for brevity, but you can assume they are included.

For your particular firm, fit any such unnamed support activities into whichever of the aforementioned four buckets seems most sensible for your business.

The primary activities of the business are considered:

5) Inbound logistics
6) Operations
7) Outbound logistics
8) Sales & Marketing
9) Servicing

These primary activities give rise to your products and services, and at some point they likely involve some kind of direct interaction with your customers.

I have found it quite useful to overlay new technologies onto the Value Chain, trying to ascertain which new technologies can be useful for a particular piece of it.

A new technology might be useful for more than just one piece. Also, sometimes a new technology might be initially best oriented towards an internal use among the support activities, and then once so proven it moves outward into the primary activities. This can happen in the other direction too, namely that a new technology is adopted in a primary activity and then once proven it gets adopted into a support activity.
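The overlay described above can be kept as a simple, living inventory. Here is a minimal sketch in Python; the technology-to-activity assignments are entirely hypothetical placeholders that a firm would replace with its own CoE assessments.

```python
# Porter's Value Chain activities, numbered as in Figure 3.
VALUE_CHAIN = {
    1: "Administration, finance, IT, legal, accounting, infrastructure",
    2: "Human resources management, recruitment, training",
    3: "Product and technology development, R&D, market testing",
    4: "Procurement, supplier management, subcontracting",
    5: "Inbound logistics",
    6: "Operations",
    7: "Outbound logistics",
    8: "Sales & Marketing",
    9: "Servicing",
}

# Hypothetical overlay: which activities might each technology touch?
tech_overlay = {
    "Machine Learning": {2, 3, 8},   # e.g. resume screening, R&D, lead scoring
    "IoT": {5, 6, 7, 9},             # e.g. tracking goods, monitoring equipment
    "Chat bots": {8, 9},             # e.g. sales inquiries, customer service
}

def activities_for(tech):
    """Return the named Value Chain activities a technology might impact."""
    return sorted(VALUE_CHAIN[n] for n in tech_overlay.get(tech, set()))

for tech in tech_overlay:
    print(tech, "->", activities_for(tech))
```

Even a table this crude makes cross-company opportunities visible at a glance, which supports the cost-justification argument discussed later.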

Part of the reason that I urge you to consider using a framework such as the Value Chain is that often when a new technology is being considered by a firm it is done so in a narrow manner. For example, suppose the HR group is interested in using Machine Learning as an approach to cull through thousands of applicants and help determine which are the most promising for hiring purposes. The IT group perhaps helps in this assessment and focuses on this particular use case. What might be missed is that Machine Learning might be applicable in other parts of the business too.

Why would that matter?

Sometimes trying to justify the cost and effort toward adopting a new technology can be hard on the basis of a single use case. If the same technology can be applied to other parts of the company, it can sometimes be more readily cost justified. There are economies of scale gained by leveraging across the company. Multiple portions of the company might be able to pony up the needed resources by each contributing toward the larger overall cost.

Often, a pinpoint use is what instigates a spark of interest in a company toward a particular new technology, and it is then incumbent upon the CIO or CTO to have an overarching perspective and help consider how the technology can be applied across the company. This is in fact one of the greatest joys and challenges often for a CIO or CTO, trying to aid the firm in seeing technology on a more macroscopic scale and across the breadth of the company. The flip side of this coin is that when trying to take a single use case and enlarge to more of the company, it can inadvertently lead to delays in adoption as more hands get involved and a larger dialogue takes place.


We are now finally at a point where it makes sense to start discussing specific new technologies.

Rather than just describing what each new technology consists of, we can now also consider what stage of marketplace adoption it is in, where it might be going, and why it might be useful to business as seen through the framework of the Value Chain.



One of the most touted “new technologies” is the rise of the power of algorithms.

What does this mean?

Generally, it means that the underlying series of steps or formulas that are used in a company’s computer systems is becoming increasingly valued and crucial to the business.

You might say that we’ve had algorithms since the start of the computer field, so why is this something new or worthy of any special attention?

The argument made for the advent of today’s algorithms is that we are able to take more of what humans might be doing algorithmically and infuse that into software and systems. This can be done more readily now since developing such software and systems is markedly easier and more economical than before.

Let’s take an example to showcase this. The Consumer Financial Protection Bureau (CFPB) has used an algorithm developed by the RAND Corporation that estimates the likelihood of someone being racially discriminated against by an automobile lending company. For example, the CFPB sought $80 million from Ally Financial (formerly GMAC, General Motors’ lending arm) based on alleged discriminatory lending practices as “revealed” by the use of the algorithm.

The RAND algorithm makes use of large data sets, such as census data and geographic data, and tries to estimate the likelihood that someone is of a particular race. It does this based only on the person’s address and surname. They call it the Bayesian Improved Surname Geocoding (BISG) method. There is heated debate about whether this algorithm is right or wrong, good or bad, but nonetheless the point is that it is an algorithm and it is being used for a business-related purpose.
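To make the Bayesian idea concrete, here is a hedged sketch of the kind of combination BISG-style methods perform: a surname-based probability over race categories is combined with geography-based census probabilities and normalized. The category names and all numbers below are made-up illustrations, not real BISG data, and the real method has additional refinements.

```python
def bisg_posterior(p_race_given_surname, p_race_given_geo, p_race_population):
    # Bayes combination under a conditional-independence assumption:
    #   posterior(r) proportional to P(r | surname) * P(r | geography) / P(r)
    unnormalized = {
        r: p_race_given_surname[r] * p_race_given_geo[r] / p_race_population[r]
        for r in p_race_given_surname
    }
    total = sum(unnormalized.values())
    return {r: v / total for r, v in unnormalized.items()}

# Entirely hypothetical illustrative numbers for three categories A, B, C:
surname = {"A": 0.6, "B": 0.3, "C": 0.1}     # from surname tables
geo     = {"A": 0.2, "B": 0.5, "C": 0.3}     # from census tract data
pop     = {"A": 0.4, "B": 0.4, "C": 0.2}     # overall population shares

posterior = bisg_posterior(surname, geo, pop)
print(posterior)
```

Note how the geography data shifts the estimate away from the surname-only prior; that sensitivity to input data quality is exactly why the debate over such algorithms is heated.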

There are inherent dangers in using algorithms. There is an implicit assumption that an algorithm is correct. There is an implicit assumption that the base algorithm has been correctly implemented in software or a system. There is an implicit assumption that the algorithm works correctly for the purposes for which it is intended, and that it is not being used for purposes for which it is not intended.

One illustrative example of these kinds of dangers is Microsoft’s introduction of Tay. This was an AI chat bot that ultimately went haywire due to how it took input from users and altered its approach accordingly. It began to emit insulting messages that were widely offensive. In this case, it did not do any substantive harm per se, but imagine if such an approach were used by a system with more life-and-death consequences, such as running factory machinery, monitoring hospital equipment, or controlling military systems.

A savvy CIO or CTO looks for ways in which aspects of the business can be turned into algorithms, exploring each area of the Value Chain, and ultimately identifying systems that could embody those algorithms, but also with the right protections and validation to help ensure that the algorithms are working appropriately. It is also important to ensure that the business is able to safeguard those algorithms including the Intellectual Property (IP) rights associated with them.

Machine Learning

Related to the topic of algorithms is the role of machine learning.

Most would agree that machine learning is a subset of the field of AI. AI is an umbrella term that includes natural language processing, vision and image processing, machine learning, and other facets.

Machine learning refers to having a computer-based system that is able to “learn” as it runs and thus presumably improve over time. It is therefore a kind of intelligence or intelligent-like behavior. I put the word “learn” in quotes because the notion that a computer learns the way a human learns is not quite what is meant by “learning” in machine learning.

Essentially, the details of how humans learn are still a mystery of science and it is not reasonable to claim that any computer system learns in the same manner per se as a human does. Therefore, the word “learn” is overloaded with all sorts of impressions and meanings, and should be used cautiously when referring to computer systems.

In any case, machine learning approaches are usually categorized as supervised learning, unsupervised learning, or reinforcement learning.

In the case of supervised learning, the system is guided by a human in a supervisory manner as to what is intended. For example, suppose we want a system that can identify houses and trees in a photograph or image. We could have humans label which are the houses and which are the trees, and the system then tries to generalize from the labeling to identify houses or trees in new images.

In contrast to supervised learning, unsupervised learning does not use explicit labeling. A closely related approach, reinforcement learning, instead relies on a reward function that provides guidance. The underlying reward function is part of a field of study known as the credit assignment problem, namely what kind of reward or credit should be used during learning and how it should best be utilized. In the game Go, the moves will ultimately lead to winning or losing the game. A reinforcement learning approach tries to utilize the win or loss to ascertain which moves were “good” and led to a win versus which moves were “bad” and led to a loss. For the image identification of houses and trees, the system could guess at which is a house or a tree, receive some kind of credit or reward for a right or wrong guess, and try to ascertain what to classify as a house or tree accordingly.
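The contrast between label-driven and reward-driven learning can be sketched in a few lines. The toy data below is hypothetical (a single "height" feature standing in for an image, and a two-action environment standing in for game moves); it is meant only to show the shape of each approach, not a realistic learner.

```python
import random

# Supervised: humans label examples, the system generalizes a rule.
labeled = [(30, "tree"), (35, "tree"), (10, "house"), (12, "house")]
threshold = sum(h for h, _ in labeled) / len(labeled)   # crude learned boundary

def classify(height):
    return "tree" if height > threshold else "house"

# Reinforcement-style: no labels; a reward signal guides which action
# the learner comes to prefer (a tiny credit-assignment loop).
values = {"a": 0.0, "b": 0.0}

def reward(action):
    # Hidden environment: action "b" pays off more (the learner doesn't know this).
    return 1.0 if action == "b" else 0.2

random.seed(0)
for _ in range(200):
    action = random.choice(list(values))
    # Move the running value estimate toward the observed reward.
    values[action] += 0.1 * (reward(action) - values[action])

print(classify(32), max(values, key=values.get))
```

The supervised rule comes straight from the labels, while the reinforcement learner only discovers that "b" is better by accumulating reward, which is the essence of the credit assignment problem described above.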



Artificial Neural Networks

Typically, a machine learning approach uses Artificial Neural Networks (ANNs) to implement the system, often for both supervised and unsupervised learning. An ANN is a system of simulated neurons, crudely and simply modeled after biological neurons, interconnected in a manner loosely analogous to biological neurons being interconnected via synapses. This is a simulated approach to what we generally believe biological neurons are accomplishing. I emphasize that this is not to be misinterpreted as actually replicating, in any ideal sense, what biological neurons are doing. It is an artificial and simulated approach.

Neural networks have been around for several decades, but are having a resurgence because we can now ramp up the magnitude of how many neurons and interconnections can be simulated and the speed at which they can be run.

The phrase “deep learning” tends to refer to machine learning that uses relatively large sets of simulated neurons and interconnections, being “deep” in the quantity involved. This increase in the number of simulated neurons has become feasible due to advances in the underlying hardware and its dropping prices. For example, the use of commonly available Graphics Processing Units (GPUs) and Field-Programmable Gate Arrays (FPGAs) for simulating these artificial neurons has become economical given the rapid reduction in price, the improvements in miniaturization, and the increase in their capabilities.
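To ground the "simulated neuron" idea, here is a minimal sketch: each artificial neuron computes a weighted sum of its inputs and passes it through an activation function. The weights below are hand-picked for illustration (not learned) so that a tiny two-layer network computes XOR, a function no single neuron can compute, which hints at why depth matters.

```python
import math

def neuron(inputs, weights, bias):
    # Weighted sum of inputs plus bias, through a sigmoid activation.
    s = sum(i * w for i, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-s))

def xor_net(x1, x2):
    h1 = neuron([x1, x2], [10, 10], -5)        # hidden neuron, roughly OR
    h2 = neuron([x1, x2], [-10, -10], 15)      # hidden neuron, roughly NAND
    out = neuron([h1, h2], [10, 10], -15)      # output neuron, roughly AND
    return round(out)

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, "->", xor_net(a, b))
```

In real deep learning the weights are learned from data rather than hand-set, and networks contain millions of such units, which is exactly where the GPU and FPGA hardware mentioned above comes in.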

Neuromorphic computers or architectures refer to whole computer systems that are shaped around the artificial neural network approach. One of the breakthroughs in this kind of hardware was IBM’s system launched in 2014 containing the SyNAPSE (TrueNorth) chip, with 5.4 billion transistors and an on-chip network of 4,096 neurosynaptic cores, allowing the chip to simulate 1 million artificial neurons and 256 million interconnections or synapses.

Per the earlier discussion of the Hype Cycle, Machine Learning and Artificial Neural Networks are still in the early stages, likely Stage A and Stage B. Businesses are still grappling with how best to use this technology. In natural language processing, for example, incorporating machine learning and ANNs has brought impressive improvements in the ability to interact in a human-like natural language way. For your firm, consider what areas of the Value Chain could potentially be aided or disrupted by utilizing these technologies.

Internet of Things

The Internet of Things (IoT) refers to connecting onto the Internet “things” that previously were not considered electronic devices that could be added to the Internet. Brita (a Clorox brand), for example, recently came out with a water pitcher that has a computer embedded in it for purposes of connecting the pitcher to the Internet. The user of the water pitcher can set up an automatic reorder of water filters via the Internet (using Amazon Dash Replenishment technology).

Gradually, we will see more and more of these kinds of additions onto the Internet. For example, clothing makers are anticipating having a computer chip embedded into the label of your shirt, pants, dress, etc. This will allow the clothing item to be connected to the Internet. You would be able to track where the clothing item is and other facets about the clothing item.

Some say that instead of calling this the Internet of Things that we should call it the Internet of Everything (IoE), implying that the word “things” is just not wide enough in terms of encompassing “everything” that will ultimately be connected onto the Internet.

Another phrase is that the IoT or IoE will be a Digital Mesh. This suggests that we should think of not just things but also think of how they are interconnected via the Internet, along with the information and services associated with them.

Either way you opt to name it, the point is that for businesses the advent of all sorts of products being able to connect to the Internet will be a boon for some businesses and a bust for others. Similarly, there will be old services that will no longer be needed or might need to change, and new services that are not needed or known today but that become feasible and needed once we have IoT or IoE.

IT executives need to be working collaboratively with the business to ascertain how IoT et al will impact their business and their industry.



User Experience (UX)

In earlier times of IT, we focused on the User Interface (UI): how a system was designed and built to interact with its user. This was broadened into the entire sense of a User Experience (UX) when using a system. As an example of the power of UI and UX, Uber’s interface has been recognized as one of the best designed, and is considered one of the key factors in the rapid adoption and growth of Uber.

An expression “Zero UI” has arisen to suggest that a user interface should be so easy to use that it is almost like the user doesn’t even realize it is there at all.

Another new buzzword is the “Ambient Experience” which refers to a UI or UX that remains continuous over time and adaptable across devices. For example, you might be using a system first on your mobile phone, then switch to using a laptop, then switch to using Augmented Reality via glasses, then switch to using Virtual Reality using a headset.

For CIOs and CTOs, consider what kinds of UI and UX you currently have on your existing systems. If the existing systems are hard to use and the user interface is clunky or obtrusive, you might consider some of the newer technologies and techniques for user interfaces. The nature of the UI or UX can be a substantial determiner of whether a system is actively used. Sometimes firms put a lot of effort into the guts of a system but little effort into the interface, and then find to their chagrin a low adoption rate due to the lousy interface, in spite of whatever beauty might lie underneath it.

Chat Bots and Virtual Personal Assistants (VPAs)

Chat bots have become popularized in messenger applications. When using Facebook Messenger, for example, you can invoke a software add-on that tries to “chat” with you using natural language processing. The pizza chain Domino’s has a chat bot that allows you to order a pizza. These chat bots will gradually become more pervasive, and we’ll see them occurring in all sorts of systems.

Virtual Personal Assistants (VPAs) are similar to chat bots. They are usually, though, oriented toward so-called personal assistant activities. For example, you are in Facebook Messenger and suddenly realize that you want a meeting put onto your schedule for next Thursday. You bring up your VPA and, in natural language, indicate that you want to meet with George next Thursday; the VPA then tries to figure out how to put this onto your schedule, akin to what a human personal assistant would do.
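Under the hood, chat bots and VPAs of the kind described above typically begin with intent detection: mapping an utterance to a known intent before handing off to a business action. Here is a toy keyword-matching sketch; the intents and keywords are hypothetical, and real systems use far richer natural language processing.

```python
# Hypothetical intents, each with a few trigger keywords.
INTENTS = {
    "order_pizza": ["order", "pizza"],
    "schedule_meeting": ["meet", "schedule", "meeting"],
}

def detect_intent(utterance):
    """Pick the intent whose keywords best match the utterance (None if no hit)."""
    words = utterance.lower().split()
    best, best_hits = None, 0
    for intent, keywords in INTENTS.items():
        hits = sum(1 for k in keywords if k in words)
        if hits > best_hits:
            best, best_hits = intent, hits
    return best

print(detect_intent("I want to order a large pizza"))
print(detect_intent("please schedule a meeting with George on Thursday"))
```

Even this crude matcher illustrates the business question: every intent your bot can detect is a customer conversation your firm is present for, and every intent it cannot detect is one it misses.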

As indicated by the Domino’s example above, businesses need to consider what chat bots and VPAs they should be involved in. If chat bots and VPAs are going to be “conferring” with customers and consumers, you don’t want to be left out of that conversation. Consider the ways chat bots and VPAs might impact your Value Chain at the primary activities, your company’s products and services, and your internal support activities.


We have touched upon some of the top technologies such as machine learning, neural networks, IoT, chat bots, VPAs, and so on, but there are many others that continue to emerge. You need to establish an organizational mechanism to help you remain aware of what is emerging.

As discussed earlier, it is useful for an IT function to establish a Center of Excellence or the equivalent, and use that CoE to keep up with the latest in technology, and how it fits or does not fit to your business. The CoE should be continually scanning the horizon for new technologies, even ones in their most nascent form. The CoE needs to be assessing the new technologies and be ready to communicate its findings. Sometimes this is done proactively by various communiques throughout the firm, and sometimes it happens reactively, as mentioned next.

There is an often-told story of the CEO who, while flying on an airplane, informally talks with someone seated next to them, gets told about some new whiz-bang technology, and immediately after getting off the flight contacts the entire C-suite executive team and orders that, by gosh, the company needs to adopt that technology.

This happens more frequently than you might think.

Staying ahead of that kind of occurrence is important. Proactively having a CoE allows the CIO or CTO to respond quickly that the company has already looked at that technology and, based on collaboration with the business, has determined its readiness for adoption. This is better than madly and reactively scrambling to figure out what the technology is and whether it might be applicable to the business.

If the CIO or CTO does not take on these kinds of trend-tracking and assessment efforts, it leaves a vacuum as to who in the company should be doing this. Some CIOs or CTOs will think that perhaps the head of R&D should do it. That might be appropriate, but at least the IT executive should have a hand in it.

Some companies don’t trust or believe that their IT function is up to the task of running a CoE on technology. Thus, it can be difficult for those CIOs or CTOs to break into this role within their firm. For them, it is often a long, slow, painful process of gradually gaining trust that the IT function can have a role and be a significant player in this.

Some CIOs or CTOs will say that they don’t have the budget available for something like a CoE. This can at times be overcome by starting small, providing proof of the value of the CoE, and then setting aside larger amounts for expanding it. It also can be wise to seek out fellow non-IT executives who could help shoulder the cost of the CoE, involving the business units or other functional areas, which is desirable for collaboration anyway.

There are many ways to skin a cat, as they say, and it is vital to find a means to keep up with the latest technologies and technology trends, and to do so in a business-case manner.

Enough said.