This article is the second installment of a three-part series. In the previous installment, we examined how technology evolved and how it has culminated in the streamlined, automated intelligence of Internet of Things ecosystems. We paid special attention to the healthcare sector and looked at how the Internet of Medical Things is availing itself of new technologies as it pushes to open new healthcare frontiers and improve patient outcomes.
The last hundred years have seen a lot of technological change, to be sure, but they have also seen equal and often parallel changes to business practices. In the space of this article, we’ll unpack many of those changes, note the common threads that tie them together, and add that information to what we’ve already established in the previous article to predict the trends most likely to unfold going forward.
The Evolution of Business Practices
The invention of the wheel won’t take you very far without the vision to build some form of a basic suspension system. The same is true with most technologies – without changing your systems, processes, and practices to better support the new technology, the gains are largely theoretical.
Accordingly, the stages of technological evolution mentioned in the previous article – from the Machine Age (spanning the first half of the 20th century) to the Digital Revolution (starting in the 1950s) to the Information Age (beginning in the late ‘70s) to the current epoch of Digital Transformation (that began in the mid-2000s) – have seen parallel changes in the business practices of the industries looking to most effectively leverage them. Across it all, a fundamental embrace of business agility can be seen.
In examining the evolution of business practices, it is helpful to break the changes down into four cardinal groups:
1. Maintenance models

As technology evolved, so did the methods for getting it to produce as much as possible for as long as possible. Over the last hundred years or so, maintenance models have all boiled down to one of the following paradigms:
- Reactive –
Most factors of production cannot run continuously without seeing degradation in their performance and depreciation in their value, which is why they need to be maintained. Since the dawn of the Industrial Revolution, the dominant model for asset maintenance has been reactive. That is to say, when things broke, they would be fixed. Simple and effective.
- Preventative –
Eventually, the owners of those factors of production realized that reactive maintenance models were inefficient and wasteful. The costs of repairs and replacement are almost always higher than the costs of preemptive servicing. What’s more, the production lost during repairs and replacement would also exceed the production lost to regular servicing.
The idea that overall costs would be lower and production higher with planned maintenance interventions really started gaining traction in the 1950s.
- Predictive –
Preventative maintenance offers huge improvements over a reactive approach, but it’s far from perfect. To scale, it has to work according to the law of averages. So if the goal is to prevent breakdowns, and the average factory press develops problems in its hydraulic system every 6 months, you’ll need to service your machines every 5 months. In effect, that means taking roughly as many machines offline that don’t yet need servicing as ones that do. (That’s how averages work!)
When decision makers realized that regularly shutting down equipment, irrespective of each individual unit’s circumstances, was costing them unnecessarily, they started looking for a better alternative. With the computing boom of the 1980s and the advances in sensor technology that would soon follow, it became possible to collect data points (in both individualized and aggregated forms) at scale and feed them into predictive models that would make more tailored maintenance recommendations.

As the technology improved, this became cheaper and easier to pull off, as well as more and more accurate.
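To make the shift concrete, here is a minimal sketch of the kind of rule a condition-based, predictive system applies: flagging an individual machine for servicing only when its own sensor data says so, rather than on a fixed calendar. The sensor type, window size, and threshold here are illustrative assumptions, not a reference to any real monitoring product.

```python
# Illustrative sketch of a condition-based maintenance check.
# The sensor, window, and threshold values are assumptions chosen
# for demonstration, not taken from any real system.

def needs_service(vibration_readings, window=5, threshold=7.0):
    """Flag a machine for servicing when the rolling average of its
    most recent vibration readings (mm/s) exceeds a threshold."""
    if len(vibration_readings) < window:
        return False  # not enough data yet to judge this unit
    recent = vibration_readings[-window:]
    return sum(recent) / window > threshold

# A healthy machine stays below the threshold; a degrading one trips it.
healthy = [3.1, 3.0, 3.2, 2.9, 3.1, 3.0]
degrading = [3.1, 4.5, 6.2, 7.8, 8.5, 9.1]
```

The point of the sketch is the contrast with preventative servicing: the healthy machine above would never be taken offline, while the degrading one is caught before it fails.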
2. Inventory management

Just as maintenance practices shifted from reactive, “heavier” models to more efficient and dynamic alternatives, inventory management practices have evolved too. Since the Machine Age, inventory strategies have gone through a few major revisions. Here’s what that looked like:
- Maintained surplus –
There’s nothing worse than over-promising and under-delivering, which is why businesses all over the world have historically taken a better-safe-than-sorry approach to inventory management. Starting in the 1940s, this approach inched toward greater precision and scalability with the introduction of punch card management systems.

By the 1950s, businesses had begun using early barcode systems to improve record keeping – allowing them to more accurately compare demand and inventory patterns and move toward aligning them more tightly.
- Just-in-Time (JIT) stock management –
Pioneered by Japanese manufacturers (the Toyota Motor Corporation in particular) in the 1960s and 70s, Just-in-Time supply management was a re-imagining of the production cycle with the stated goal of increasing efficiencies, removing unnecessary process stages, and reducing operating overhead. In practice, this meant designing the production floor with greater forethought to eliminate all wasted motion and spatially concentrate operations. It also meant revising supplier arrangements to physically shorten supply chains and allow orders to be fulfilled in smaller volume and with shorter notice.
Breaking from the standard established by Henry Ford and his assembly line process, there was also a concerted effort to build and expand the skill-set of workers so that they would understand each other’s tasks, could smoothly swap in and out of roles when necessary, and collaborate more intelligently to solve problems and improve processes.
Combined with smart flow scheduling, strict oversight, and hardened operational discipline – empowered by then-new technologies such as Universal Product Codes and computerized tracking/management systems – these efforts helped JIT practitioners to dramatically reduce inventory holding costs, limit labor costs, accelerate production cycles, and increase total output. The principles guiding this breakthrough methodology came to be more broadly referred to as “lean.”
- Decentralized responsive fulfillment –
Consumer-facing technologies like the mainstream arrival of the Internet in the late 1990s, along with business-facing technologies like Radio-Frequency Identification (RFID), would once again push the bounds of what was possible. Seizing on the principles of lean, forward-thinking strategists rebuilt their businesses around the idea that, with smart systems in place, supply can be matched to demand in near real-time, with little upfront investment, little overhead, little risk, and tremendous scale economies. The idea was essentially to use superior technology and processes to create and exploit an arbitrage opportunity between existing wholesale and retail models.
No one executed this strategy more famously or more successfully than Amazon. At the turn of the millennium, Amazon built its online book-selling empire through a partner-based distributed business model predicated on inventory management with virtually no excess stock. As the company grew, its inventory management methods grew more advanced, and it relied increasingly on sophisticated predictive modeling to stay ahead of the curve while keeping overhead to a minimum. Eventually, the company even built its own inventory management system to better facilitate its unique business model.
Continuing to innovate this decentralized responsive fulfillment model, Amazon has more recently been a champion and early adopter of using autonomous retrieval robots in their order fulfillment centers.
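The core logic of demand-responsive fulfillment can be illustrated with a deliberately simplified sketch: forecast near-term demand from recent sales and order only the shortfall over the supplier’s lead time. The moving-average forecast and all figures below are illustrative assumptions; production systems of the kind described above rely on far more sophisticated models.

```python
# Simplified illustration of demand-responsive replenishment.
# The moving-average forecast and the parameters are assumptions
# for demonstration only, not a description of any real system.

def replenishment_order(daily_sales, on_hand, lead_time_days=3):
    """Order just enough stock to cover forecast demand over the
    supplier's lead time, given what is already on hand."""
    window = min(len(daily_sales), 7)           # up to a week of sales history
    forecast_per_day = sum(daily_sales[-window:]) / window
    needed = forecast_per_day * lead_time_days  # expected demand until resupply
    return max(0, round(needed - on_hand))      # never order a negative amount
```

Compare this with the maintained-surplus approach: instead of warehousing a fixed safety buffer, the order size shrinks and grows continuously with observed demand, which is what keeps excess stock near zero.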
3. Organizational structure
Perhaps most profound of all is the impact that the push for greater business agility has had on the fundamental structure of organizations. Looking to enjoy the full range of benefits that improved technology has to offer, businesses have gone from simple to complex, from clunky to swift, rigid to flexible, and static to dynamic.
Consider the following:
- Static hierarchies –

Large operations need to organize to run effectively, which is why they’re called organizations. Historically, most organizational structures have taken the form of simple hierarchies – with more people at the lower levels of the operation reporting to and doing the bidding of fewer people at the upper levels. Roles are clearly defined, and daily tasks are largely dictated by the chain of command. This basic design – whether implemented along functional or divisional fault lines – continues all the way up until power concentrates in the hands of a small number of individuals.

The benefit of this model is that it is well-defined and simple, allowing it to scale with relative ease. The disadvantage is that it is not very efficient, discourages a big-picture orientation among employees, and does not lend itself very well to meaningful collaboration or innovation.
- Dynamic hierarchies –
Over time, the shortcomings of these rigid hierarchy structures became increasingly apparent and forward-thinking business leaders began to rethink how they were organizing their companies. In the 1970s, the idea of matrix management structures began to gain prominence as a way of breaking down silos, boosting creative and pro-active thinking, fostering collaboration, and giving employees an improved sense of ownership over their work.
In this model, hierarchies remain, but they are in constant transition and exist multi-directionally. So, for example, a given employee might report to multiple supervisors depending on the different projects he or she is working on. These supervisors may express their oversight in more direct or more laissez-faire ways depending on the circumstances of the collaboration.
Focused on developing cross-functional business groups, this approach has everyone in the organization – from the bottom to the top – wearing multiple hats that change with the season, so to speak. For example, by putting people with marketing skills on your engineering teams, reporting to both engineering and marketing supervisors, you not only prevent competing interests from driving projects in incongruous directions (that would later need to be crudely patched together), but you build more thoughtful, synergistic products from the outset – winning widespread internal buy-in as a natural consequence.
- Flat organizations –

Around the same time that dynamic hierarchies were gaining popularity, some began to question the need for elaborate hierarchy altogether. These business leaders argued that layers of middle management only added organizational glut and inefficiency, and they redesigned their businesses to allow greater professional independence for individual workers as well as for business teams. Flat organizations, say advocates, are able to respond to market conditions faster and with greater creativity.
It’s important to note that this is not a binary choice: hierarchical or flat. Proponents of flattening point out that while truly flat organizational models in the style of the Valve Corporation may be rare and often fickle things, most organizations can benefit from some degree of flattening. Over the years, this has increasingly become accepted wisdom. As Wikipedia notes, “In general, over the last decade, it has become increasingly clear that through the forces of globalization, competition and more demanding customers, the structure of many companies has become flatter, less hierarchical, more fluid and even virtual.”
- Wirearchies –

Tracing back to the early 2000s, wirearchies are the organizational structures that emerged from the societal and business conditions of the Information Age. Specifically, this model sees authority dynamically derived from the information, trust, credibility, and business results delivered by each employee.
Though this approach is still far from mainstream, it is likely that it will provide the framework on which more and more businesses will be built going forward.
4. Process strategies
From the beginning of the 20th century to today, the fundamental view of how processes ought to be undertaken and managed has evolved too. All told, we can see at least four distinct stages in this evolution:
- Production oriented –
At the turn of the century, business processes and practices were mostly governed by production-oriented economic philosophies of the sort described by Robert Keith. The underlying logic was simple: due to scale advantages, the producer that outproduces his competition will come out on top.
- Value oriented –
Then, beginning in the 1930s with the Great Depression, being able to do more was no longer the goal; increasingly, the goal was to do better. In practice, this boiled down to a simple mantra: don’t make what you can’t sell. Specifically, that meant more thoughtfully mapping internal business motivations to external market values.

Notably, this period saw those market values as soft and somewhat suggestible. So, more important than changing your products and services to better align with the public’s wants was adding complementary distribution and sales capabilities to help them “see the value”.
- Lean oriented –
While not departing from the value-driven ethos of the previous period, process strategies continued to evolve. In the 1970s, organizations of all types began to more deliberately embrace lean principles. This saw administrators focus more on improving efficiencies and removing any procedures that added cost or time without improving output/outcome quality.
This period ushered in an explosion of organizational creativity and diversification – with managers all over the world suddenly taking little to nothing for granted and constantly asking if there wasn’t a better way to do x, y, or z. It also gave rise to Six Sigma strategies that aimed to systematically seek out and eliminate sources of defects and process variation, and to create highly customized processes that produce very specific comparative advantages.
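The arithmetic behind Six Sigma’s quality focus is simple to sketch: count defects per million opportunities (DPMO) and drive that number down, with the canonical “six sigma” quality level corresponding to 3.4 DPMO. A minimal illustration, with made-up figures:

```python
# Defects per million opportunities (DPMO), the core Six Sigma metric.
# The example figures below are invented purely for illustration.

def dpmo(defects, units, opportunities_per_unit):
    """Scale an observed defect rate to a per-million-opportunities basis."""
    return defects / (units * opportunities_per_unit) * 1_000_000

# e.g. 25 defects across 10,000 units, each with 5 ways to be defective
rate = dpmo(25, 10_000, 5)  # 500 DPMO
```

Expressing quality on a common per-million scale is what lets managers compare wildly different processes and target the worst offenders first.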
- Agile oriented –
Less a distinct approach to business processes than an extension of lean practices, agile process management strategies emerged in the 1990s. The approach gained its first footholds in manufacturing and software development and later percolated through to the wider business world.
Based on principles of interoperability, de-siloing, strong interdepartmental communication and collaboration, smart tooling, and automation, agile businesses respond quickly to changing demands without seeing a decrease in product quality or reliability.
Putting Business Agility in Context
Whether in the approach to asset maintenance, inventory management, organizational structure, or process strategies, the last hundred-some years have seen tremendous innovation in pursuit of greater business agility. Through it all, there are some common patterns and themes that have guided these forces of change.
- Brute logic has been replaced by precision thinking, enabling leaner operations that work – by design – within tighter margins.
- Wherever possible, predictive modeling and proactive planning have supplanted reactive methodologies.
- Organizations have been re-imagined and redesigned to better capture and transfer information at every level of their operation.
- Static processes have been fundamentally reconstructed to continuously adapt and be refined on the basis of data insights.
- Tools and techniques have been developed to systematically seek and destroy waste and inefficiencies.
- Automation has been introduced to reduce errors, drive down costs, increase scalability, and streamline workflows.