With technological change accelerating, companies need to make four fundamental shifts: innovation at the edge, a perpetual-learning culture, IT as a service, and expanded trust boundaries.
It is easy to become numb to the onslaught of new technologies hitting the market, each with its own promise of changing (more often “revolutionizing”) the business world. But our analysis of some of the more meaningful tech trends lays out a convincing case that something significant is happening.
These tech trends are generally accelerating the primary characteristics that have defined the digital era: granularity, speed, and scale. But it’s the magnitude of these changes—in computing power, bandwidth, and analytical sophistication—that is opening the door to new innovations, businesses, and business models.
The emergence of cloud and 5G, for example, exponentially increases compute power and network speeds that can enable greater innovation. Developments in the metaverse of augmented and virtual reality open the doors to virtual R&D via digital twins, for example, and immersive learning. Advances in AI, machine learning, and software 2.0 (machine-written code) bring a range of new services and products, from autonomous vehicles to connected homes, well within reach.
Much ink has been spilled on identifying tech trends, but less attention has been paid to the implications of those changes. To help understand how management will need to adapt in the face of these technology trends in the next three to five years, we spoke to business leaders and leading thinkers on the topic. We weren’t looking for prognostications; we wanted to explore realistic scenarios, their implications, and what senior executives might do to get ready.
The discussions pinpointed four broad, interrelated shifts. Technology’s radically increasing power is exerting a centrifugal force on the organization, pushing innovation to expert networks at the edges of the company. The pace and proliferation of these innovations call for radically new approaches to continuous learning, built around skills deployed at points of need. These democratizing forces mean that IT can no longer act as a centralized controller of technology deployment and operations but instead needs to become a master enabler and influencer. And these new technologies are creating more data about, and touchpoints with, customers, reshaping the boundaries of trust and requiring a much broader understanding of a company’s security responsibilities.
1. Innovation at the edge
Key tech trends
We estimate that 70 percent of companies will employ hybrid or multicloud management technologies, tools, and processes. At the same time, 5G will deliver network speeds about ten times faster than those of current 4G LTE networks, with expectations of speeds up to 100 times faster and latency up to 40 times lower. By 2024, more than 50 percent of user touches will be augmented by AI-driven speech, written-word, or computer-vision algorithms, while global data creation is projected to grow to more than 180 zettabytes by 2025, up from 64.2 zettabytes in 2020. The low-code development platform market’s compound annual growth rate (CAGR) is projected at about 30 percent through 2030.
Shift: Innovation develops around personal networks of experts at the porous edge of the organization and is supported by capabilities that scale the benefits across the business.
These technologies promise access to virtually unlimited compute power and massive data sets, as well as a huge leap in bandwidth at low cost, making it cheaper and easier to test, launch, and scale innovations quickly. The resulting acceleration in innovation will mean that companies can expect more disruptions from more sources. Centralized strategic and innovation functions cannot hope to keep pace on their own. Companies will need to be much more involved in networks outside their organizations to spot, invest in, and even acquire promising opportunities.
Corporate venture-capital (VC) funds with centralized teams have looked to find and fund innovation, but their track record has been spotty, often because the teams lack the requisite skills and are simply too far removed from the constantly evolving needs of individual business units. Instead, companies will need to figure out how to tap their front lines, particularly business domain experts and technologists, to enable them to act, in effect, as the business’s VC arm. That’s because the people who are writing code and building solutions are often well plugged into strong external networks in their fields and have the expertise to evaluate new developments. One pharma company, for example, taps its own expert researchers in various fields, such as gene expression, who know well the people outside the company who are leaders in the field.
While companies will need to create incentives and opportunities for engineers to build up and engage with their networks, the key focus must be on empowering teams so they can spend their allocated budget as they see fit—for example, experimenting and failing without penalty (within boundaries) and deciding on technologies to meet their goals (within prescribed guidelines).
The IT organization of the future can play an important role in building up a scaling capability to make that innovation work for the business, something that has traditionally been a challenge. Individual developers or small teams working fast don’t tend to naturally think about how to scale an application. That issue is likely to be exacerbated as nontechnical users working in pockets across organizations use low-code/no-code (LC/NC) applications to design and build programs with point-and-click or pull-down-menu interfaces.
One pharma company has taken this idea to heart by giving local business units the flexibility to run with a nonstandard idea when it has proven to be better than what the company is already doing. In return for that flexibility, the business unit must commit to helping the rest of the organization use the new idea, and IT builds it into the company’s standards.
In considering how this scaling capability might work, companies could, for example, assign advanced developers to “productize” applications by refactoring code so they can scale. IT leadership can provide tools and platforms, reusable-code libraries that are easily accessible, and flexible, standards-based architecture so that innovations can be scaled across the business more easily.
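To make the idea of “productizing” concrete, here is a minimal, hypothetical sketch (the function names and data are invented for illustration, not drawn from the companies above): refactoring typically means replacing a prototype’s hard-coded values with parameters, adding validation, and documenting the result so it can live in a shared library.

```python
# Hypothetical sketch of "productizing" a one-off prototype for reuse.

# Before: a prototype with hard-coded values, usable only by its author.
# def shelf_report():
#     data = load_csv("C:/users/alice/store_17.csv")
#     return [row for row in data if row["stock"] < 5]

# After: parameterized, validated, and documented for a shared code library.
def low_stock_report(rows, threshold=5, stock_field="stock"):
    """Return rows whose stock level is below `threshold`.

    Accepts any iterable of dicts, so callers control their own data source
    instead of depending on one developer's file path.
    """
    if threshold < 0:
        raise ValueError("threshold must be non-negative")
    return [row for row in rows if row.get(stock_field, 0) < threshold]
```

A business user can then call `low_stock_report(my_rows, threshold=3)` against any data source, which is the kind of reuse a hard-coded prototype cannot support.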
Questions for leadership
- What incentives will best encourage engineers and domain experts to develop, maintain, and tap into their networks?
- What processes are in place for tracking and managing VC activity at the edge?
- What capabilities do you need to identify innovation opportunities and “industrialize” the best ones so they can be shared across the organization?
For more on how to empower workers at the edge, see “Tech companies innovate at the edge. Legacy companies can too,” in Harvard Business Review.
2. A perpetual-learning culture
Key tech trends
Advances in AI, machine learning, robotics, and other technologies have increased the pace of change tenfold. By 2025, we estimate that 50 billion devices will be connected to the Industrial Internet of Things (IIoT), while 70 percent of manufacturers were expected to be using digital twins regularly by 2022. Some 70 percent of new applications will use LC/NC technologies by 2025, up from less than 25 percent in 2020. The global metaverse revenue opportunity could approach $800 billion in 2024, up from about $500 billion in 2020. This proliferation of technological innovations means we can expect to experience more progress in the next decade than in the past 100 years combined, according to entrepreneur and futurist Peter Diamandis.
Shift: Tech literacy becomes core to every role, requiring learning to be continuous and built at the level of individual skills that are deployed at the point of need.
With the pace and proliferation of technologies pushing innovation to the edge of the organization, businesses need to be ready to incorporate the most promising options from across the front lines. This will create huge opportunities, but only for those companies that develop true tech intelligence through a perpetual-learning culture. The cornerstone of this effort is training personnel at all levels, from “citizen developers” working with easy-to-use LC/NC tools or in entirely new environments such as the metaverse, to full-stack developers and engineers, who will need to continually evolve their skills to keep up with changing technologies. We’re already seeing situations where poorly trained employees use LC/NC to churn out suboptimal products.
While there will always be a need for more formalized paths for foundational learning, we anticipate an acceleration in the shift from teaching curricula periodically to continuous learning that can deliver varying technical skills across the entire organization. In practice, that will mean orienting employee development around delivering skills. This requires breaking down a capability into its smallest sets of composite skills. One large tech company, for example, created 146,000 skills data points for the 1,200 technical skills it was assessing.
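One way to picture this decomposition is a simple data structure mapping a capability to its smallest assessable skills. The sketch below is hypothetical (the skill names and proficiency scale are invented, not the tech company’s actual taxonomy); its point is that once skills are tracked individually, gaps can be surfaced for point-of-need learning.

```python
# Hypothetical sketch: a capability broken into its smallest composite
# skills, each assessed independently so learning can target the gaps.
from dataclasses import dataclass, field

@dataclass
class Skill:
    name: str
    proficiency: int = 0  # e.g., a 0-4 assessed level (invented scale)

@dataclass
class Capability:
    name: str
    skills: list = field(default_factory=list)

    def gaps(self, target=2):
        """Skills below the target level: candidates for the learning
        'snippets' delivered at the point of need."""
        return [s.name for s in self.skills if s.proficiency < target]

cloud_eng = Capability("cloud engineering", [
    Skill("containerization", 3),
    Skill("infrastructure as code", 1),
    Skill("cost monitoring", 0),
])
```

Here `cloud_eng.gaps()` would flag "infrastructure as code" and "cost monitoring" as the skills to route learning content toward.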
The key point is that these skills “snippets”—such as a block of code or a video of a specific negotiating tactic—need to be integrated into the workflow so that they’re delivered when needed. This might be called a “LearnOps” approach, where learning is built into the operations. This integration mentality is established at Netflix, where data scientists partner directly with product managers, engineering teams, and other business units to design, execute, and learn from experiments.
As important as being able to deploy learning is building a learning culture by making continuous learning expected and easy to do. The way top engineers learn can be instructive. This is a community that is highly aware of the need to keep their skills up to date. They have ingrained habits of sharing code, and they gravitate to projects where they can learn. One advantage of using open source, for example, is the built-in community that constantly updates and reviews code. In the same spirit, we’re seeing companies budget extra time to allow people to try new tools or technologies when they’re building a product. Other companies are budgeting for “learning buffers” to allow for setbacks in product development that teams can learn from.
Netflix, which makes broad, open, and deliberate information sharing a core value, built the Netflix experimentation platform as an internal product that acts as a repository of solutions for future teams to reuse. It has a product manager and innovation road map, with the goal of making experimentation a simple and integrated part of the product life cycle.
To support this kind of continuous learning and experimentation, companies will need to accept mistakes. The art will be in limiting the impact of potentially costly mistakes, such as the loss or misuse of customer data. IT will need to architect protocols, incentives, and systems to encourage good behaviors and reduce bad ones. Many companies are beginning to adopt practices such as automated testing to keep mistakes from happening in the first place; creating spaces where mistakes won’t affect other applications or systems, such as isolation zones in cloud environments; and building in resiliency protocols.
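A small example of the first practice, automated testing as a guardrail, is sketched below. The export function and the list of sensitive fields are invented for illustration; the point is that a test run in CI can block a costly mistake (leaking customer data) before it ships.

```python
# Hypothetical guardrail: an automated check that refuses to export
# fields flagged as sensitive customer data, plus the CI test that
# documents and enforces the rule.
SENSITIVE_FIELDS = {"ssn", "card_number", "email"}  # invented example list

def safe_export(record, allowed_fields):
    """Export only explicitly allowed, non-sensitive fields."""
    leaked = set(allowed_fields) & SENSITIVE_FIELDS
    if leaked:
        raise ValueError(f"sensitive fields in export: {sorted(leaked)}")
    return {k: record[k] for k in allowed_fields if k in record}

def test_export_blocks_sensitive_fields():
    record = {"sku": "A1", "email": "a@example.com"}
    try:
        safe_export(record, ["sku", "email"])
        assert False, "expected ValueError for sensitive field"
    except ValueError:
        pass
    assert safe_export(record, ["sku"]) == {"sku": "A1"}
```

Running such tests automatically on every change is what keeps the experimentation described above from turning into the loss or misuse of customer data.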
Questions for leadership
- Do you have a list of the most important skills your business needs?
- What is the minimum level of learning needed for advanced users of analytics and manipulators of data?
- How do you track what people are learning and whether that learning is effective and translating into better performance?
3. IT as a service
Key tech trends
It is estimated that the global cloud microservices platform market will generate $4.2 billion in revenue by 2028, up from $952 million in 2020. GitHub has more than 200 million code repositories and expects 100 million software developers by 2025. Nearly 90 percent of developers already use APIs. Software 2.0 creates new ways of writing software and reduces complexity. Software sourced by companies from cloud-service platforms, open repositories, and software as a service (SaaS) is growing at a CAGR of 27.5 percent from 2021 to 2028.
Shift: IT becomes the enabler of product innovation by serving small, interoperable blocks of code.
When innovation is pushed to the edge and a perpetual-learning culture permeates an organization, the role of IT shifts dramatically. IT can’t support this dynamic environment by sticking to its traditional role as a controlling entity managing technology at the center. The premium will now be on IT’s ability to enable innovation, requiring a shift from its traditional role as protector of big tech assets to that of purveyor of small blocks of code. The gold standard of IT effectiveness will be its ability to help people stitch together snippets of code into a useful product.
We are already seeing what that might look like. Employees at G&J Pepsi-Cola Bottlers with little to no software development experience created an app that examines images of a store shelf to identify the number and type of bottles on it, then automatically restocks it based on historic trends. One pharmaceutical company grew its low-code platform base from eight users to 1,400 in just one year; business users outside of IT are now building applications with thousands of monthly sessions. Companies that empower “citizen developers” score 33 percent higher on innovation than bottom-quartile companies that don’t provide that level of support, according to a McKinsey survey.
These developments point toward much more of a “buffet” approach to technology, where IT builds useful blocks of reusable code, sometimes assembles them into specific products, and makes them available through a user-friendly cataloging system for the business to use to create the products it needs. IT provides guiderails, such as API standards and directives on the environments in which the code might be most useful; protects the most sensitive information, such as customer data and financial records; and tracks the adoption of those code blocks. This tracking capability will become particularly crucial as bots, AI, algorithms, and APIs proliferate. Transparency isn’t sufficient. IT will need to make sense of all the activity through advanced tech performance and management capabilities and the development of new roles, such as data diagnosticians and bot managers.
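A minimal sketch of such a catalog follows. Everything here is hypothetical (class names, metadata fields, and the sample block are invented); it simply shows the shape of the idea: publish reusable blocks with guiderail metadata, and record every checkout so adoption is visible.

```python
# Hypothetical sketch of a "buffet" catalog: IT registers reusable code
# blocks with guiderail metadata and records checkouts for adoption tracking.
from collections import Counter

class CodeCatalog:
    def __init__(self):
        self._blocks = {}
        self._checkouts = Counter()

    def register(self, name, block, environments, sensitive=False):
        """Publish a reusable block with metadata as guiderails."""
        self._blocks[name] = {
            "block": block,
            "environments": environments,  # where the code is most useful
            "sensitive": sensitive,        # flags extra protection needs
        }

    def checkout(self, name, team):
        """Hand a team the block and record the adoption event."""
        self._checkouts[(name, team)] += 1
        return self._blocks[name]["block"]

    def adoption_report(self):
        """The tracking data IT reviews as usage proliferates."""
        return dict(self._checkouts)

catalog = CodeCatalog()
catalog.register(
    "dedupe-customers",
    lambda rows: list({r["id"]: r for r in rows}.values()),
    environments=["batch", "api"],
)
dedupe = catalog.checkout("dedupe-customers", team="marketing")
```

A real catalog would sit behind access controls and a searchable UI, but even this skeleton shows how reuse and adoption tracking can live in one place.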
This IT-as-a-service approach puts the product at the center of the operating model, requiring a commitment to organizing IT around product management. Some companies have been moving in this direction. But reaching the scale needed to support fast-paced and more diffuse innovation will require a deeper commitment to product owners, working with leaders in the business side of the house, to run teams with real P&L responsibility. Many organizations, from traditional enterprises to digital natives, have found that putting in place product leaders who set overall product and portfolio strategy, drive execution, and empower product owners to drive innovation aligned with business outcomes and P&L metrics can increase the return on the funding that flows to technology delivery and quicken the pace of innovation.
Questions for leadership
- Do you have a vision for how the role of the IT organization will change to enable democratization of technology?
- How will you elevate the role of the technology product manager, and do you have a road map for developing that role?
- What systems will you need to put in place to manage and track the use, reuse, and performance of code?
4. Expanded trust boundaries
Key tech trends
It was estimated that almost 100 percent of biometrics-capable devices (such as smartphones) would be using biometrics for transactions by 2022. The effectiveness of these technologies has advanced dramatically, with the best facial-identification algorithms having improved 50 times since 2014. These developments are contributing to profound unease in the relationship between technology and consumers of technology. Research from The Pearson Institute and the Associated Press-NORC Center for Public Affairs Research shows that “about two-thirds of Americans are very or extremely concerned about hacking that involves their personal information, financial institutions, government agencies, or certain utilities.”
Shift: Trust expands to cover a broader array of stakeholder concerns and becomes an enterprise-wide responsibility.
These enormous shifts in technology power and capacity will create many more touchpoints with customers and an exponential wave of new data about customers. Even as IT’s role within the organization becomes more that of an enabler, the expanding digital landscape means that IT must broaden its trust capabilities around security, privacy, and cyber. To date, consumers have largely embraced the convenience that technology provides, from ordering a product online to adjusting the temperature in their homes remotely to monitoring their health through personal devices. In exchange for these conveniences, consumers have traditionally been willing to provide some personal information. But a steady undercurrent of privacy and trust concerns around these ever-more-sophisticated conveniences is raising the stakes on the broad topic of trust. Consumers are becoming more aware of their identity rights, making decisions based on values, and demanding the ethical use of data and responsible AI.
The most obvious concern is around cybersecurity, an ongoing issue that is already on the board-level agenda. But tech-driven trust issues are much broader and are driven by three characteristics. One is the sheer quantity of personal data, such as biometrics, that companies and governments collect, creating concerns about privacy and data misuse. The second is that personal security issues are becoming more pervasive in the physical world. Wired homes, connected cars, and the Internet of Medical Things, for example, are all vectors for attack that can affect people’s well-being. Third is the issue that advanced analytics seem too complex to be understood and controlled, leading to deep unease about people’s relationship with technology. This issue is driving the development of “explainable AI” and the movement to debias AI.
Adding to the complexity is the frequent need to manage and secure trust across an entire ecosystem of technologies. Take the wired home, for example. The proliferation of devices—think virtual assistants, security, communications, power management, and entertainment systems—means that a large group of providers will need to agree on standards for managing, in effect, an interconnected security net in the home.
These developments require a complex extension of the boundaries of trust. The significant advantages that many incumbents enjoy—existing relationships with customers and proprietary data—are at risk unless businesses rethink how they manage and nurture that trust. Companies need to consider putting identity and trust management at the core of their customer experience and business processes. That can happen effectively only when companies assign a dedicated leader with real power and board-level prioritization with enterprise-wide responsibility across the entire trust and security landscape. Given the tech underpinnings of this trust environment, IT will need to play a key role in monitoring and remediating, such as assessing the impact of new legislation on AI algorithms, tracking incidents, identifying the number and nature of high-risk data-processing activities and automated decisions, and—perhaps most important—monitoring consumer trust levels and the issues that affect them.
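The monitoring tasks listed above can be made concrete with a simple register. The sketch below is hypothetical (the class, categories, and sample entries are invented); it illustrates one of the duties described: counting high-risk data-processing activities and automated decisions so a trust leader can review them regularly.

```python
# Hypothetical sketch: a register that tallies data-processing activities
# and automated decisions, surfacing the high-risk count for review.
from collections import Counter

class TrustRegister:
    def __init__(self):
        self._activities = []

    def log(self, activity, kind, high_risk=False):
        """Record one activity, flagging it if it is high risk."""
        self._activities.append(
            {"activity": activity, "kind": kind, "high_risk": high_risk}
        )

    def summary(self):
        """Counts by kind, plus the number of high-risk items:
        the figures an enterprise trust leader would track."""
        kinds = Counter(a["kind"] for a in self._activities)
        high_risk = sum(a["high_risk"] for a in self._activities)
        return {"by_kind": dict(kinds), "high_risk": high_risk}

register = TrustRegister()
register.log("credit scoring model", "automated decision", high_risk=True)
register.log("newsletter analytics", "data processing")
```

In practice this data would feed dashboards alongside incident counts and consumer-trust metrics, but the principle, a single enterprise-wide view of trust exposure, is the same.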
Questions for leadership
- Who is responsible for the enterprise-wide trust and risk landscape?
- How have you integrated your efforts around customer trust with overall cybersecurity processes?
- What privacy, trust, and security processes are in place to manage the entire life cycle of your data?