August 07, 2019 in Business

Analysing Third-Wave Change: Management and Policy

With the advent of the Third Wave of ICT-driven solutions that marked a productivity upheaval, a host of disparate yet intertwined policy and change-management issues have gained extra relevance. The increasingly divergent, or polarised, nature of the derived demand for labour has persisted throughout as the burning agenda affecting the bulk of agency-based trade-offs. At the same time, the seemingly stand-alone areas of change management, notably the PDCA (plan-do-check-act) cycle as in Vora and the theory of change à la Brest, have revealed natural overlaps as part of socially responsive planning.

The present report seeks to determine whether these emerging areas can be reconciled with the long-established approaches to cyclical project management and budgeting, as shaped by the managerial-accounting analysis of variances that informs relevance-driven management-by-exceptions (MbE). Apart from the testable and partially formalised analytic approaches and their implications, it remains to be seen exactly how the new industrial organisation characteristic of the knowledge-intensive and largely derivative segments will resolve between the corner cases, e.g. old-hat giant barons versus brand-new micro-MNEs (multinational enterprises), subject to endogenous or ICT-induced contingency. In fact, it is this inconclusive layer of structural uncertainty that may outmatch strategic facility and policy-response efficacy in the interim run.

Discussion

Background

Over the past two decades, a dual trend has loomed large pertaining to how the “positive”, or non-emotive, allocation impacts have triggered conflict and hard bargaining over their distributional implications. The bulk of the effect has mimicked the way technology ushers in extra productivity and efficiency savings while suppressing adequate deployment of the factors of production early on. Although human capital rarely rests idle for long, the interim-run consequences for low-skilled labour may well prove incentive-distorting or irreversible vis-à-vis the economy’s sectoral structure. It remains to be seen whether the ICT surpluses can suffice as automatic stabilisers or sustainability tools in their own right over a reasonably finite time horizon.


Third Great Wave: Positive Trade-Offs

The first two waves of the industrial revolution, occurring in the 1800s and 1900s respectively, secured dramatic productivity boosts, with the employment flip side rather mixed or inter-temporally distributed. In particular, the First Wave spotlighted central inventions such as the steam engine, with its internal-combustion counterpart marking the Second Wave. For all the apparent advancement along comparable lines, however, the large-scale advent of electricity and radio was even more characteristic of the latter development, as these augmented living standards and trade alike by driving down the costs of communication and power.

That said, the consequences were mixed or inconclusive from day one. For instance, at the outset of the First Industrial Revolution, reduced transportation costs ushered in an era of colonisation plagued by looting and slavery. In other words, this was an instance of supply, or “better” opportunities, giving rise to extra demand, be it in terms of plantation employment or monopolistic trade with the colonies. Although some technological and infrastructure spill-overs did eventually carry over to these territories, alongside a productivity-driven erosion of the demand for slave labour that culminated in emancipation, the early savings were only partially sustainable as far as scrupulous global and inter-generational agency is concerned. The latter could in fact be seen as centred on the environmental externalities, or social costs, that took the social planner a long while to recognise as observable and relevant to the standard of living.

One way of rationalising asynchronicity of this sort is to appreciate that a rapid technological revolution cannot possibly be accommodated by an equally prompt adaptive change in management style along each and every possible line. In other words, a weak policy response may prove even slower than a strong political reaction, as too many uncontrollable parameters capture a setup yet to be arrived at or pinned down as the dimensions of structural uncertainty.

Incidentally, the latter is most manifest in the complexity of the multi-dimensional inter-linkages, as well as possible complementarities, across the distinct yet entangled domains. Needless to say, the policy maker or social planner, no matter how benevolent and rational, would hardly have responded ahead of the research community, bearing in mind that the latter has long seen its reports overlooked or only partly acted upon. One other facet of inertia may have to do with the inherently elusive nature of scientific evidence, which draws upon the weak and non-intuitive criterion of refutation alongside a host of trade-offs or alternative scenarios that may not garner policy buy-in.

In any event, the domestic as well as geopolitical objectives may long have prevented many of the aforementioned concerns from being incorporated into agency-laden reasoning, with short-term windfalls possibly counting more heavily than marginal increments of health or social cohesion. Class struggle may indeed have intensified at the early stages, with capital owners more concerned with productivity or supply-side shifts while overlooking the labourers’ dissipating buying power, which was coupled with the demand side of equilibrium and growth alike. As tension mounted, so did escalation, with employers seeking to ensure that unionism did not sterilise their unqualified bargaining power. Since scale efficiency beyond the break-even point was essential, the allegedly positive returns to scale would appear enough to justify monopolisation or oligopoly in the first place.

In fact, the very scientific revolution that had underpinned the early waves of industrial upheaval was not immune to lasting excesses. To begin with, there appears to be little in the First Wave that could not have been invented millennia earlier, as part of the antique notions of engineering. On the other hand, the inherently mechanical and hence deterministic nature of the early revolution may have paved the way for similar approaches to, or philosophies of, the grand design, natural and social alike. Put simply, one wonders how the 20th-century rethinking of Newtonian and Cartesian mechanics advanced to the more abstract and full-fledged visions of “mechanisms” and non-deterministic structures or representations. Induction of this sort may have questioned continuity while amounting to a “big bang” as opposed to gradual shifts. Since this did not occur on every level of change management, the forgone complementarity has long remained of central and lasting concern.

This may, for one, have been ushered in by modernity or post-modern paradigms stressing inherent complexity and change, while simultaneously giving rise to early perspectives on change management beyond large-scale and repetitive scientific management. At the same time, probabilistic and statistical methods, which may have stemmed from early war census records and surface-to-air fire-control algorithms as well as theorising in thermodynamics, could not have been put to wide use until the advent of sufficiently powerful computing or processing devices.

In fact, this could illustrate another gap, pertaining to how research or knowledge creation may long fare idle or under-utilised until adequate applications come right up its alley. By and large, this might be typical of abstract mathematics, with present-day smartphones hardly making full use of even early-1930s group theory, let alone quantum mechanics, despite the underlying algorithms having grown so much more efficient as to outmatch the transistor-count doubling known as Moore’s law.

Apparently, all of the above qualifies the smooth transition to the Third Wave, earmarked by utter reliance on ICT for creating the very basic output or securing a bare-bones survival edge for the players involved. In other words, as all industries have grown information-intensive (which need not suggest knowledge intensity as the next stage) amid plummeting processing costs and exploding efficacy, the phasing-in of ICT becomes more of a cut-off prerequisite, or weak competitive response, than a proactive change-managerial action sufficient in its own right. The bulk of scarce time and labour alike has been freed up toward more meaningful, creative, and edge-informing choices or solutions, thus rendering ICT leverage a power tool. At the same time, this has favoured the slim minority of skilled and versatile employees while leaving the blue-collar mass largely under-employed “up North” as well as “down South.”

As ever, this early surge of enthusiasm comes at a cost, with the data-processing, search-intensive, and what-if routines increasingly turning so complex as to deny intuitive insight into, much less adequate control of, the underlying planning or project-management cycles. In other words, cyclical change management (whether in line with PDCA or with project-managerial continuity and ongoing variance-based budgeting) is about as much called for as it is denied under a planning setup growing ever more prohibitively complex.
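The variance-based side of this tension can be illustrated with a minimal sketch of relevance-driven MbE, where only budget variances breaching a materiality threshold are escalated; the line items, figures, and the 10% cut-off below are hypothetical assumptions, not drawn from any real budget:

```python
# Hypothetical budget vs. actuals (USD k) and a 10% materiality threshold --
# all figures are illustrative assumptions.
budget = {"materials": 500.0, "labour": 300.0, "overhead": 200.0}
actual = {"materials": 520.0, "labour": 390.0, "overhead": 205.0}
THRESHOLD = 0.10

# Flag only the variances that breach the threshold -- the MbE filter that
# keeps managerial attention on material exceptions rather than every line.
exceptions = {
    item: actual[item] - budget[item]
    for item in budget
    if abs(actual[item] - budget[item]) / budget[item] > THRESHOLD
}
print(exceptions)  # {'labour': 90.0}
```

Here the 4% and 2.5% variances on materials and overhead are filtered out as irrelevant, while the 30% labour overrun surfaces as the single exception warranting managerial attention.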

Varian showed early on that explosive platformisation and the digital revolution have triggered a dual pattern. On the one hand, the bulk of innovation or differentiation within the emerging knowledge-intensive sector has accrued to the inherently combinatorial nature of spawning further applications and standards from within, based on a handful of encoding principles or protocols (e.g. C++, Java, and ready-made HTTP-based platforms alongside open-source OSs such as Linux and Android). Although such knowledge products involve little marginal or variable cost and can thus be replicated at arbitrarily large scale, they border on progressively exploding complexity. This amounts to extra risk, with an eye toward the added contingencies as well as derived demand to be addressed later in the text, albeit in ways that do not posit digital or e-commerce leverage as an unconditional boon or a source of superior edge.

At the same time, an entirely new notion of industrial organisation and vision for the size-versus-growth trade-off has been ushered in, beyond the BCG matrix and in line with “blue ocean” strategic change. The latter refers to superior opportunities created by those reluctant to compete amidst intense rivalry and a sheer lack of edge slack to be picked up at no extra risk or cost. Varian has argued that the emerging market can hover anywhere between a low-concentration form (with micro-MNEs expanding globally from an initially small size on the strength of low-cost online sourcing) and a highly concentrated, monopoly-dominated setup whereby large standard- and trend-setters seek to gobble up as many small contractors as possible in order to preserve the high switching cost, or specificity lock-in, for their external as well as internal customers.

In particular, the entry barriers may well be rather low ex ante, which lures many start-ups in on a superior valuation outlook. However, those barriers may prove all too high ex post, as the major rivals end up forced to incur heavy R&D outlays as part of their incremental, or strategy- and project-specific, expenses. In other words, the operating leverage (formally, the proportion of fixed cost or overhead within the structure of the contribution margin) will inevitably turn near-explosive at some point, and prove hard to minimise for that matter. Along with mounting risks that are hard to pin down and hence mitigate, the overall sector profile proves utterly “fast cycle” in nature: product life-cycles are inevitably too short or most uncertain, as many pilot value concepts or pillars of the wide moat have proven, in retrospect, to be flukes, revealing mean-reversion on their initially hyper-optimistic valuations. This agenda will be treated in greater depth in the subsequent section.

A tentative bottom line can be drawn at this stage, pointing out just how drastic the above-said transition away from conventional industrial barons and into sheer knife-edge instability has proven to be, with new challenges posed by networks of support or battles of standards mapping into inconclusive scenarios, albeit amid well-defined structural trade-offs. In fact, this could be the single most characteristic hallmark of the excessive complexity or post-modern “chaos” that questions continuity-based change management while also offering mitigating solutions.

Political Response & Normative Policy Implications

Following the initial dotcom bubble burst in the late 1990s through early 2000s, it appears that the investor community has learned enough to exercise due diligence, unless overly hefty offers are at stake. The recently re-emerging instances of prior over-valuation at billion-dollar levels, followed by a tantamount contraction and a call for a major safety cushion or “ratcheting” on the IPO bets, have showcased the phenomenon of “unicorns” as one interim and transient state between the formerly dominant barons and the recently preponderant small-sized, knowledge-intensive MNEs.

In fact, the entire setup has been prone to incessantly recurring blunders that securitisation has for the most part accentuated and aggravated rather than curbed or mitigated. For instance, the early Enron scandal ushered in the Sarbanes-Oxley (2002) regulatory response, which pertained to shared (as opposed to narrowly delegated) risk management. This could be referred to as an early, macro-level political response acting to sterilise the moral hazards of indulging in indiscriminate diversification in ways that might prove inordinately arcane to professionals, let alone the lay investor public.

However, it would appear to have been largely non-binding as well as inefficient. The latter finding has to do with SOX amounting to a lump-sum outlay, or fixed cost, that only the larger companies could spread over an efficient scale early on. The former dimension could prove of crucial importance in terms of micro-level policy response, inasmuch as investors have ever since continued making heavy use of naïve market multiples, notably P/S and P/E (price-to-sales and price-to-earnings respectively), as one way of inferring a “reasonable” stock price for unicorn IPOs. Not only do these metrics scarcely ever converge within their respective categories enough to substantiate the deployment of an industry average or aggregate in the first place, they may moreover embark on largely recursive or circular assessment beyond the pre- versus post-money dichotomy.
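The convergence problem with naïve multiples can be sketched by inferring a price from a peer-average P/E; all figures below are hypothetical assumptions, not data on any real firm:

```python
# Hypothetical peer P/E ratios and target EPS -- illustrative assumptions only.
peer_pe = [35.0, 80.0, 12.0, 150.0]   # note the dispersion, 12x to 150x
eps = 2.0                              # target's earnings per share

avg_pe = sum(peer_pe) / len(peer_pe)   # the "industry average" multiple
implied_price = avg_pe * eps           # naive P/E-implied stock price
print(avg_pe, implied_price)           # 69.25 138.5
```

The wide dispersion of the peer multiples (12x to 150x) illustrates why such an average rarely converges enough to anchor a defensible valuation, which is the point made above.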

For starters, suppose a unicorn allegedly worth USD 1bn is seeking to further boost its perceived upside and keeps raising extra funds toward that end. The P/BV (price-to-book-value) would appear to be the applicable multiple in such a setup, whereby a boost in the pricing or valuation is seen as a response to fund-raising that boosts (as well as dilutes) the shareholder equity stakes:

P + ∆P = (P / BV) × (BV + ∆BV)

Whereas it would be somewhat legitimate for the valuation to build on a fixed multiple as above, the perceived intrinsic value might in fact follow the cyclical change by positing an adjusted multiple on the right-hand side, as below:

P + ∆P = ((P + ∆P) / BV) × (BV + ∆BV)

In this case, it is no wonder that the equation solves either for a degenerate price (a penny stock at the low end, a unicorn one at the prohibitively high end) or for near-zero extra funding: dividing both sides by (P + ∆P) leaves BV = BV + ∆BV, i.e. ∆BV ≈ 0 for any finite positive price. In fact, the latter outcome reflects how investment policy responds at later stages, with ratcheting or preferred-stock options effectively detracting from the expected BV increment, thus resulting in a near-zero delta. Largely the same rationale may apply to P/S and P/E based valuation, as stand-alone earnings or revenues are expected to be high based on the market total, which itself might be subject to a zero-sum, knife-edge unstable setup.
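The contrast between the fixed-multiple and recursive-multiple readings can be checked numerically; the price, book-value, and funding figures below are hypothetical assumptions:

```python
# Hypothetical figures: current valuation P, book value BV (both USD m),
# and a proposed funding round dBV that boosts the book value.
P, BV, dBV = 1_000.0, 250.0, 100.0

# Fixed multiple: P + dP = (P / BV) * (BV + dBV)
multiple = P / BV                       # 4.0x P/BV, held constant
dP_fixed = multiple * (BV + dBV) - P    # valuation gain under a fixed multiple
print(dP_fixed)                         # 400.0

# Recursive multiple: P + dP = ((P + dP) / BV) * (BV + dBV).
# Dividing both sides by (P + dP) leaves BV = BV + dBV, so the equation
# only balances when dBV is ~0 (near-zero extra funding) for any finite price.
```

Under the fixed multiple, every dollar of new book value mechanically scales the valuation by the 4.0x ratio; under the recursive reading, the same arithmetic admits no consistent solution with positive extra funding.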

Insofar as manipulative securitisation has worked along similar lines of fallacious adaptability, at odds with internal controls (the upside-efficacy counterpart of risk management) at the company level, it is no wonder it has induced self-spawned crises over the past two decades.

At the “fundamentals” end, when it comes to the complementarity between technology and training that results in wage gaps subject to learning attainment, Acemoglu & Autor have seconded Goldin & Katz in observing that job polarisation between the high- and low-skilled sub-markets has been excessive or disproportionate with respect to the mid-skill layer. In a sense, this could be in line with the oft-overlooked role of the mid-stream core of processes bridging upstream operations and downstream logistics or marketing.

It should come as little surprise that the bulk of such polarisation has been observed because low-skilled labour faces more direct ICT substitution, with routine tasks the first to be spared. However, these findings could be reconciled most meaningfully in line with Baldwin & Venables, as strictly sequenced or complementary jobs could exhibit the utmost outsourcing or off-shoring risks. Arguably, it is the mid-stream layer that could strike an optimum balance between complementarity gains and costs, while affording the most room for creating shared value à la Porter & Kramer.

It is important to note, for change-policy implications along the top-down (education) and bottom-up (allocation of human capital) lines, that this mid-layer will call for even more skilled labour, while disadvantaging the mid-skill host further still.

Conclusion

It has been demonstrated how the ICT-centred Third Wave was prone to usher in dual trends and multi-faceted policy trade-offs from the outset. Insofar as these could apply as invariants across a variety of sector profiles, it is around the mid-stream slack that normative policy implications, pertaining to derived demand and management-by-exceptions (MbE) or relevance-driven analysis of budgeting variances, could be wedded across the macro and micro levels.

By the same token, the change-theoretic toolkit, as routinely applied to NGO and political setups, could carry over to regular B2B agency-based management, in that ends-to-means backward induction lends itself to the change cycle, the project-managerial cycle, and MbE alike. Irrespective of how labour demand, derived in nature, is reshuffled among the increasingly ICT-intensive industries, human capital will remain the ultimate control tool with which to harness the excessive contingencies, whether ICT-specific or exogenous.
