Top 8 Emerging Information Technology Trends

For any information technology professional, staying on top of IT trends and developments does more than satisfy casual interest — it’s essential for staying grounded in an environment of constant disruptions.

The year 2023 proved to be a watershed moment for artificial intelligence, which has helped to spur the development and adoption of other technologies. Where are the information technology trends likely to emerge in 2024?

Here are the top eight trends in information technology as identified by consultants, analysts and technologists.

1. Further Advances in Artificial Intelligence

The reality of applied AI became even more apparent after generative AI (GenAI) broke big in 2023 thanks to programs like ChatGPT, DALL·E 3, Scribe, Google Bard and AlphaCode. The GenAI movement is expected to continue strong into 2024, with more companies developing AI-assisted programs and application tools such as Amazon CodeWhisperer, HubSpot's ChatSpot and Salesforce's Einstein.

According to Gartner, fewer than 5% of enterprises currently use generative AI models or have deployed GenAI-enabled applications in production environments. However, that figure is predicted to rise to more than 80% of enterprises by 2026, helping to drive a projected compound annual growth rate (CAGR) of 35.6% for the GenAI market from 2023 to 2030. That aligns with Accenture's surveys, which found that 95% of global executives and 86% of executives in the software industry believe GenAI will play a central role in enterprise intelligence.
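For context on that figure, CAGR is the constant annual growth rate that connects a starting value to an ending value over a period. Here is a minimal sketch of the arithmetic in plain Python; the roughly 8.4x multiple is simply what compounding the cited 35.6% rate over the seven years from 2023 to 2030 works out to, not an additional forecast.

```python
def cagr(start_value: float, end_value: float, years: int) -> float:
    """Compound annual growth rate: the constant yearly rate linking start to end."""
    return (end_value / start_value) ** (1 / years) - 1

def growth_multiple(rate: float, years: int) -> float:
    """Total growth implied by compounding a rate over a number of years."""
    return (1 + rate) ** years

# A 35.6% CAGR sustained from 2023 to 2030 (7 years) implies roughly an
# 8.4x increase over the period -- illustrative arithmetic only.
print(round(growth_multiple(0.356, 7), 1))  # ~8.4
```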

The promise of GenAI is that it will democratize access to knowledge and skills while minimizing or eliminating menial work. This will enable faster information retrieval, more time for complex tasks and a potential rise of citizen developers using low-code and no-code product development applications. GenAI will impact nearly every industry, but it will likely have the biggest repercussions for fintech, telecommunications, healthcare, aerospace and defense.

Expect further advances in predictive AI as well, as this field also becomes more democratized and continues to gain in popularity. However, the drive for more predictive analytics will demand an increase in real-time data and could push organizations to embrace “synthetic” data — artificially generated data that closely mimics real-world use cases — to fill in the gaps.
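As a rough illustration of the idea, here is a minimal sketch in plain Python of generating synthetic records that mimic the statistical shape of a small real-world sample. The sample values and the simple Gaussian approach are hypothetical; production systems use far more sophisticated generative models.

```python
import random
import statistics

# Hypothetical real-world readings (e.g., response times in ms) -- illustrative only.
real_sample = [102.4, 98.7, 110.2, 95.3, 104.8, 99.9, 107.1, 101.6]

def synthesize(sample: list[float], n: int) -> list[float]:
    """Generate synthetic values that mimic the mean and spread of the real sample."""
    mu = statistics.mean(sample)
    sigma = statistics.stdev(sample)
    return [random.gauss(mu, sigma) for _ in range(n)]

synthetic_data = synthesize(real_sample, 100)  # fills gaps where real data is scarce
print(synthetic_data[:5])
```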

Uncertainty about the use of AI and concerns about its risks are likely to continue as companies face questions about inaccuracy, security, intellectual property, privacy and more. The question of how AI is used and what it should be used for led to turmoil at OpenAI at the end of 2023, when CEO Sam Altman was fired and then reinstated within the span of a week.

Expect disruption to be the driving force behind future AI advancements — both for industries and AI development itself. We should expect many of these issues to raise the possibility of federal AI regulations over the next few years.

2. Expansion of 5G and IoT Networks

Will 2024 be the year that 5G broadband comes into its own? There are certainly signs pointing that way, as 5G networks continue to come online and more supporting applications and tools arrive.

  • In December 2023, Verizon and Zebra Technologies announced the launch of purpose-built mobile devices and software solutions designed to help their 5G private-network customers.
  • Also in December, Huawei Technologies promised to launch “leading, innovative and disruptive” products following the release of its 5G-capable Mate 60 smartphone series.
  • Juniper Research predicts 2024 will see the first commercial launch of a 5G satellite network, eventually leading to over “110 million 3GPP‑compliant 5G satellite connections in operation by 2030.”
  • Samsung expects 2024 to be a “breakout year” for commercial 5G network slicing services, which would enable features such as enhanced live event broadcasting or mission-critical push-to-talk devices.

Network slicing, which is the overlay of multiple virtual networks on top of a shared physical network, is a key feature of 5G. With slicing, a 5G network operator has more control over traffic to meet the many requirements of different customers, resulting in greater network performance, capacity and security. For example, one 5G network could simultaneously support an ultra-low latency slice for internet of things (IoT) devices while also streaming video on a high bandwidth slice.
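A conceptual sketch of that idea in plain Python might look like the following, with made-up slice profiles and traffic requirements. Real slicing is implemented in the 5G core and radio network, not in application code; this only illustrates how different traffic classes map onto different virtual slices of one physical network.

```python
from dataclasses import dataclass

@dataclass
class Slice:
    name: str
    max_latency_ms: float      # latency ceiling the slice guarantees
    min_bandwidth_mbps: float  # bandwidth floor the slice guarantees

# Hypothetical slices sharing one physical 5G network.
SLICES = [
    Slice("iot-ultra-low-latency", max_latency_ms=5, min_bandwidth_mbps=1),
    Slice("video-high-bandwidth", max_latency_ms=50, min_bandwidth_mbps=25),
]

def assign_slice(latency_need_ms: float, bandwidth_need_mbps: float) -> str:
    """Pick the first virtual slice whose guarantees satisfy the traffic's needs."""
    for s in SLICES:
        if s.max_latency_ms <= latency_need_ms and s.min_bandwidth_mbps >= bandwidth_need_mbps:
            return s.name
    return "best-effort"

print(assign_slice(latency_need_ms=10, bandwidth_need_mbps=0.5))  # iot-ultra-low-latency
print(assign_slice(latency_need_ms=100, bandwidth_need_mbps=20))  # video-high-bandwidth
```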

The development of 5G and IoT will also determine how far vehicle-to-everything (V2X) technology takes us toward fully autonomous vehicles. A robust infrastructure will need to exist alongside refined V2X technology — from vehicle-to-vehicle (V2V) communication to vehicle-to-infrastructure (V2I) and vehicle-to-pedestrian (V2P) capabilities — along with machine learning (ML) to enable better communication between cars and city infrastructure, predict collisions and reduce accidents. Expect to see more development in V2X technology in 2024 as companies work to win over consumers who remain cautious about the idea of self-driving cars.

Expect further innovations in 5G to accelerate the creation and use of IoT technology, such as connected vehicles, smart buildings, agriculture drones, supply chain management and remote applications ranging from visual building inspections to remote surgery. As 5G is further integrated into connectivity technologies such as Wi-Fi, cellular data and low-Earth-orbit (LEO) satellites, there will be stronger calls for defining standards of security, privacy and device interaction.

3. Advances in Quantum Computing

2024 could be the year that we see tangible gains in quantum computing. In October 2023, Atom Computing announced it had created a 1,225-site atomic array, currently populated with 1,180 qubits, in its next-generation quantum computing platform. It expects its systems to be available in 2024, with the potential to realize fault-tolerant quantum computing within a decade.

So what does that mean? If you’re unfamiliar with quantum computing, it operates on the unique principles of quantum mechanics, using units called qubits. Unlike classical bits, which must be in a state of either 0 or 1, qubits leverage the concept of superposition: a qubit can be a 0, a 1, or both at the same time, allowing it to represent multiple possibilities at once.

Quantum computing exploits this superposition to evaluate many possibilities in parallel. Where a traditional computer must work through candidate solutions one after another, a quantum computer can use the combined effects of superposition and interference to home in on a solution without explicitly examining every potential variant of the system’s states. For certain problems, a calculation that could take the most advanced supercomputer years could be done by a quantum computer in seconds.
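To make superposition a little more concrete, here is a minimal, purely classical sketch in plain Python (no quantum hardware or SDK involved) that simulates a single qubit's two-amplitude state and shows how a Hadamard gate puts it into an equal superposition of 0 and 1.

```python
import math

# A qubit's state is a pair of amplitudes (alpha, beta) for the states |0> and |1>,
# with |alpha|^2 + |beta|^2 = 1.
ket_zero = (1.0, 0.0)  # the classical-like state |0>

def hadamard(state):
    """Apply a Hadamard gate, which creates an equal superposition from |0> or |1>."""
    alpha, beta = state
    s = 1 / math.sqrt(2)
    return (s * (alpha + beta), s * (alpha - beta))

def measure_probabilities(state):
    """Probability of reading 0 or 1 if the qubit were measured."""
    alpha, beta = state
    return abs(alpha) ** 2, abs(beta) ** 2

superposed = hadamard(ket_zero)
print(measure_probabilities(superposed))  # ~(0.5, 0.5): equally likely to read 0 or 1
```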

Because quantum computers and classical computers process information differently, they’re better suited to solving different problems. Quantum computers are intended to tackle extremely complex computational challenges, such as novel drug discovery, genome sequencing, cryptography, financial modeling and meteorology. It’s likely the first real utilization of quantum processors will be in hybrid high-performance computing (HPC) centers that combine the unique capabilities of both classical and quantum machines.

As quantum computing continues to mature from theoretical possibilities to hard reality, it’s expected that companies will shift away from pushing higher qubit counts and toward the incremental work of error correction and developing practical problem-solving applications. While analysts such as Gartner do not expect quantum computing to have direct impacts on enterprises in the immediate future, they do expect to see actionable developments by 2030.

4. Greater Adoption of Edge Computing

The rise of AI-powered devices and the increased availability of 5G networks are also driving another emerging trend in information technology — the adoption of edge computing. Analyst forecasts put the edge computing market’s CAGR at roughly 37.9% to 38.9% from 2023 to 2030.

Unlike cloud computing, where data is sent to an offsite server for storage or processing, edge computing situates data processing nodes closer to the data source and the user. This allows for quicker and more efficient processing than transferring data to centralized platforms. Edge computing enables near-instant responses, tightens the feedback loop and helps improve resilience against network downtime.
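A simplified sketch of that pattern in plain Python, using a hypothetical temperature sensor, shows an edge node filtering and summarizing raw readings locally so that only a compact summary travels to the central cloud. The function names, values and threshold are all illustrative assumptions.

```python
import statistics

def read_sensor_batch() -> list[float]:
    """Stand-in for raw readings from a local device (hypothetical values)."""
    return [21.4, 21.6, 21.5, 35.2, 21.7]  # one spike among normal readings

def process_at_edge(readings: list[float]) -> dict:
    """Summarize and flag anomalies locally instead of shipping every reading upstream."""
    return {
        "mean": round(statistics.mean(readings), 2),
        "max": max(readings),
        "anomalies": [r for r in readings if r > 30.0],  # threshold chosen for illustration
    }

def send_to_cloud(payload: dict) -> None:
    """Placeholder for an upload call; only the small summary crosses the network."""
    print("uploading:", payload)

send_to_cloud(process_at_edge(read_sensor_batch()))
```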

Part of what’s helping to support recent efforts in edge computing is containerization. A container packages an application together with its dependencies into a single unit of software, allowing multiple isolated applications to run on a single device. Containers are a key tool for enabling microservice architecture: an application can be packaged up, deployed onto a piece of hardware and run in isolation, improving reliability, scalability, portability and security.

By processing data closer to its origin, edge computing can drastically reduce latency, which is especially valuable for companies with customer-facing digital applications in remote locations. Industries such as retail, manufacturing, utilities, and transportation and logistics will benefit the most from these real-time edge operations.

Edge computing can also improve cybersecurity, as its distributed nodes are less vulnerable to cyberattacks than a single centralized platform. This is a great benefit for locations that process sensitive data or power critical systems, such as hospitals, factories and government agencies. Edge computing can allow these sites to operate without a network connection and at low latency, which can improve overall reliability.

5. Exploration of Augmented and Virtual Reality

While the Metaverse concept itself fell flat in 2023, the virtual reality (VR) and augmented reality (AR) sectors are likely to expand in 2024. The AR market in particular is projected to have a CAGR of 36% through 2030 for related software and hardware.

While it’s true that both AR and VR are still in their infancy and aren’t projected to pass 100 million users until 2027 (far below the billions of people worldwide who use smartphones or computers), more industries will likely use AR and VR on a regular basis thanks to increased fidelity, better ease of use and most importantly, greater affordability.

In particular, the use of digital twins — a virtual representation of a real-world object, system or process — is expected to see an even greater degree of growth, with a projected CAGR of 39% through 2030. A Deloitte survey of manufacturing executives found that 92% of their companies are experimenting with or implementing at least one metaverse-related use case (in the general sense of the term, not the “capital M” Metaverse), averaging six or more different use cases.
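In its simplest form, a digital twin is just a software object kept in sync with telemetry from its physical counterpart. Here is a minimal sketch of that idea in plain Python, with a hypothetical pump, made-up telemetry values and a toy maintenance rule standing in for a real predictive model.

```python
from dataclasses import dataclass, field

@dataclass
class PumpTwin:
    """A toy digital twin that mirrors a physical pump's latest telemetry."""
    asset_id: str
    rpm: float = 0.0
    temperature_c: float = 0.0
    history: list = field(default_factory=list)

    def sync(self, telemetry: dict) -> None:
        """Update the virtual state from a reading sent by the real device."""
        self.rpm = telemetry["rpm"]
        self.temperature_c = telemetry["temperature_c"]
        self.history.append(telemetry)

    def needs_maintenance(self) -> bool:
        """Simple rule standing in for a real predictive-maintenance model."""
        return self.temperature_c > 80.0 or self.rpm < 500.0

twin = PumpTwin(asset_id="pump-07")
twin.sync({"rpm": 430.0, "temperature_c": 76.5})  # hypothetical reading
print(twin.needs_maintenance())  # True: low rpm suggests the real pump should be checked
```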

AR and VR technologies are set to have a powerful impact on product lifecycle management (PLM):

  • VR enables remote teams to walk, talk and work together in a shared virtual space where they can manipulate 3D models and iterate on visual designs in real time, eliciting instant feedback.
  • AR allows designers to project 2D or 3D virtual models of potential modifications and enhancements over physical locations or projects, improving comprehension and helping to inform decision-making.

The extent of AR and VR influence on PLM could be as simple as improving an individual component or as complex as planning out the entire development and delivery process. Being able to superimpose virtual minimum viable products into actual scenarios has the potential to save manufacturers valuable time and resources in the product development lifecycle.

And the casual adoption of AR isn’t entirely dead. While Google and Apple have put their plans for wearable augmented reality glasses on hold, Meta recently released its own first generation Ray-Ban Meta smart glasses collection, which it claims is a major step toward true AR glasses.

6. Enhanced Cybersecurity Risks

Unfortunately, the cybersecurity trend for 2024 isn’t a positive one. Research suggests that one in two businesses has been the victim of a successful cyberattack in the past three years, and the cost of these attacks to industry is expected to grow to over $10 trillion by the end of 2024.

That’s because cyberattacks are projected to become even more intricate as hackers and cybercriminals look for ways to exploit AI, while an expanding array of connected devices makes networks even more vulnerable. The outlook is, honestly, rather grim.

To counter these threats, expect to see advancements in software security, updated training for employees and the development of more sophisticated automated responses to cyberattacks.

7. Investment in Sustainable Technology

If combatting the existential threat of climate change isn’t enough of a reason to spur the market to invest in sustainable technology, research shows that consumers will increasingly make sustainability a priority and demand products and practices that are environmentally and socially responsible. Whether you call it green tech or sustainable technology, there will be increased pressure in 2024 to both realize sustainability within technology and to develop technology that can improve sustainability.

Technology has an important role to play, as IoT devices can provide a higher level of transparency across the supply chain. This would support traceability and more effective material flows, as well as ensure the ethical sourcing of materials. IoT platforms could also be used to support sustainable land and water use, whether it’s monitoring water levels, the movement of grazing animals or nutrient levels in soil.

Companies and organizations will increasingly look to AI for its potential to both support the development of new sustainable technology and act as a tool to repair and protect the environment. AI models can offer an unprecedented analysis of environmental data, highlighting areas of ecological impact and suggesting where immediate action is required. AI can also be used in product development to help craft innovative sustainable solutions, from more efficient electrical systems to designs that use fewer materials and produce less waste.

These concepts of recyclability and reusability are central to the idea of the circular economy, which will become an increasingly important concept across all industries. According to the EPA, a circular economy is one that “keeps materials, products, and services in circulation for as long as possible.” It is an economic model designed around reducing material use by redesigning products and services to be less resource intensive and by recapturing “waste” as a resource.

The implications of the circular economy for IT include:

  • Designing electronic devices to be easier to repair
  • Designing a product life cycle that allows for materials to be reused as much as possible
  • Recycling materials such as plastic into materials for making new products
  • Conceptualizing products that support consumers in using them longer or repairing them

8. Renewed Interest in Blockchain

The last few years have, to put it mildly, not been kind to the concept of cryptocurrency. It remains to be seen how far cryptocurrency can bounce back, with future value depending on how it can mitigate its environmental impacts and whether the U.S. Securities and Exchange Commission approves a spot Bitcoin exchange-traded fund. What is certain is that blockchain technologies and decentralized systems remain powerful tools for the financial industry and for enterprise business in the immediate future.

As a tool for the permanent, immutable and transparent recording of data and transactions, blockchain has applications beyond finance. Blockchain can help businesses meet critical security needs while enhancing efficiency and transparency within their operations. For example, blockchain can be used to create automated smart contracts whose instructions are coded into tokens and self-execute when specific conditions are met, immediately fulfilling the contract.
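As a rough illustration of the self-executing idea, here is a toy escrow written in plain Python rather than an actual on-chain contract language. The class, parties and delivery condition are all hypothetical; the point is simply that the contract logic runs itself the moment its coded condition is satisfied.

```python
class EscrowContract:
    """Toy stand-in for a smart contract: funds release only when the condition holds."""

    def __init__(self, buyer: str, seller: str, amount: float):
        self.buyer = buyer
        self.seller = seller
        self.amount = amount
        self.delivered = False
        self.settled = False

    def confirm_delivery(self) -> None:
        """Event standing in for the real-world condition being met."""
        self.delivered = True
        self._maybe_execute()

    def _maybe_execute(self) -> None:
        """Self-execute: transfer funds as soon as the coded condition is satisfied."""
        if self.delivered and not self.settled:
            self.settled = True
            print(f"Released {self.amount} from {self.buyer} to {self.seller}")

contract = EscrowContract(buyer="alice", seller="bob", amount=100.0)
contract.confirm_delivery()  # condition met -> contract executes itself
```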

Blockchain will also be used to create more decentralized systems that can enable trustless, peer-to-peer interactions that don’t need intermediaries. We’ll likely see greater use of AI analytics and ML to develop more efficient data analysis methods, enhance data verification procedures and quickly detect cyberattacks or fraudulent transactions in the blockchain.

While the mix of recent cryptocurrency crashes and low adoption rates has called the concept of Web3 into question, Web3 developers are still claiming that blockchain-based video games and NFT collectibles are set to take off in 2024. Whether that actually happens is up for debate, but with the massive amount of money in play, expect to hear more hype around the myriad possibilities of blockchain.

Are you excited about the future of IT and the evolution of our modern IT ecosystem? Then you may be ready for a career at the higher levels of information technology. The Master of Science in Information Technology Leadership program at the University of San Diego was designed to provide IT professionals with the skills and knowledge to critically analyze complex technical systems and guide their teams to success.

Is this the right master’s degree for your technical career? Get our free eBook Choosing an Online Master’s Degree to learn more.