Whether you’re a CIO wondering how to stay ahead of the competition or a tech enthusiast thinking about a career change to information technology, staying on top of IT trends and developments does more than satisfy casual interest — it’s essential for staying grounded in an environment of constant disruptions.
Artificial Intelligence continues to be the most important and innovative area within IT. This shouldn’t be too surprising, since it affects all other trends on this list. Whether it’s the continuing interest in generative AI or explorations into agentic AI, organizations across every industry are investigating how they can stay ahead of the ever-accelerating curve. But when it comes to the latest trends in information technology, that’s just the tip of the iceberg.
Here’s what we’re seeing as the next big IT trends for 2025 and heading into 2026, as identified by consultants, analysts and technologists.
1. Advances in Artificial Intelligence
The next big thing in AI looks to be agentic AI. Agentic AI takes generative AI a step further by performing autonomous tasks, planning and achieving goals on behalf of the user or another system, all without relying solely on human prompts or oversight. While agentic AI was virtually unused just last year, Gartner now estimates that by 2028, at least 15% of day-to-day work decisions will be made autonomously using this technology.
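To make the idea more concrete, here is a minimal, purely illustrative Python sketch of the plan-act-observe loop that underlies most agentic systems. The planner is a stub standing in for a call to a language model, and the tool name (`search_inventory`) is hypothetical rather than any vendor’s actual API.

```python
# Conceptual sketch of an agentic loop (hypothetical names, not a real framework).
# The "planner" is a stub standing in for an LLM call that chooses the next action.
from dataclasses import dataclass, field

@dataclass
class AgentState:
    goal: str
    observations: list = field(default_factory=list)

def plan_next_step(state: AgentState) -> dict:
    """Stub planner: a real agent would ask an LLM to pick the next action."""
    if not state.observations:
        return {"tool": "search_inventory", "args": {"query": state.goal}}
    return {"tool": "finish", "args": {"summary": state.observations[-1]}}

# Hypothetical tools the agent is allowed to call autonomously.
TOOLS = {
    "search_inventory": lambda query: f"3 suppliers found for '{query}'",
}

def run_agent(goal: str, max_steps: int = 5) -> str:
    state = AgentState(goal=goal)
    for _ in range(max_steps):
        action = plan_next_step(state)                     # plan
        if action["tool"] == "finish":
            return action["args"]["summary"]               # goal reached, report back
        result = TOOLS[action["tool"]](**action["args"])   # act
        state.observations.append(result)                  # observe, then re-plan
    return "stopped: step budget exhausted"

print(run_agent("restock 40mm bearings"))
```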
Generative AI continues to advance, with programs such as ChatGPT, DALL·E 3, Scribe and Google Gemini joined by other tools such as Amazon Q Developer, HubSpot’s Breeze Copilot and Salesforce’s Einstein. According to a McKinsey survey, 65% of respondents reported using generative AI in at least one business function, up from roughly one-third of respondents in 2023.
That increase in adoption has helped raise the projected Compound Annual Growth Rate (CAGR) of the GenAI market from 35.6% to 37.6%. GenAI looks set to affect nearly every industry, but will likely have the biggest impact on fintech, telecommunications, healthcare, aerospace and defense. Don’t expect GenAI to be ubiquitous just yet, though: according to an IEEE survey, one in five respondents is either just beginning to explore the technology or still working to overcome challenges.
Applied AI, especially its ability to improve efficiency and accuracy, cut costs and support better decision-making through predictive analytics, will continue to gain in popularity. However, the appetite for ever-larger data sources will demand more real-time data and raise concerns about implementation costs, bias and privacy violations.
AI governance will increasingly become a priority for organizations, and Gartner predicts that by 2028, enterprises using AI governance platforms will achieve higher customer trust ratings and better regulatory compliance scores than their competitors. We’ll also see the first regional AI regulations take hold, such as the EU’s Artificial Intelligence Act and, in the U.S., proposed legislation like the Artificial Intelligence Research, Innovation, and Accountability Act of 2023.
2. Push for Greater Connectivity
The implementation of 5G by the telecommunications industry continues, as does the private adoption of 5G networks across different industries. Enterprise adoption is progressing at a slow but steady pace, as 5G integration, despite successful pilot programs, remains costly and is delivering lower-than-expected ROI. Expect the conversation around 5G to shift toward 5G Advanced, especially with T-Mobile announcing the first nationwide 5G Advanced network, and toward the future possibilities of 6G cellular.
As 5G matures, we’ll continue to see growth in IoT devices — such as connected vehicles, smart buildings, agricultural drones, and supply chain management tools — along with the expansion of remote visual inspection for tasks ranging from building assessments to remote surgery. The development of 5G and IoT will also determine how far vehicle-to-everything (V2X) technology can take us toward fully autonomous vehicles. A robust infrastructure will need to exist alongside refined V2X technology and machine learning (ML) to enable better communication between cars and city infrastructure. Companies still have work to do if they want to win over consumers who remain cautious about self-driving cars.
If space is the final frontier, then low-Earth-orbit (LEO) satellites might be part of our daily commute. Amazon is launching its own series of satellite constellations to compete with SpaceX in bringing internet access and communication to remote parts of the world. The new space race is already raising concerns about space debris — especially as more orbital objects are seen burning up on re-entry — and is likely to prompt stronger calls for regulation and oversight.
3. Greater Adoption of Edge Computing
The rise of AI-powered devices and the increased availability of 5G networks are also driving another emerging trend in information technology: the adoption of edge computing. Analysts project that the edge computing market will expand at a 33% CAGR from 2025 to 2033.
Unlike cloud computing, which sends data to off-site servers for storage or processing, edge computing places processing nodes closer to the data source and the end user. This allows for quicker and more efficient processes than having to transfer data to centralized platforms. Edge computing allows for an instant response, tightens the feedback loop and helps to improve resilience against network downtime.
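As a simple illustration of that pattern, here is a short, hypothetical Python sketch of an edge node that reacts to sensor readings locally and forwards only a compact summary upstream; the threshold and sensor values are invented for the example, not drawn from any real deployment.

```python
# Illustrative sketch of edge-style processing: react locally to each reading,
# then send only a compact summary upstream instead of streaming raw data to the cloud.
from statistics import mean

def handle_reading(temp_c: float) -> None:
    # Immediate local response: no round trip to a central server required.
    if temp_c > 85.0:
        print(f"local shutdown triggered at {temp_c:.1f} C")

def summarize(batch: list[float]) -> dict:
    # Only this small aggregate would be forwarded to the central platform.
    return {"count": len(batch), "mean_c": round(mean(batch), 2), "max_c": max(batch)}

readings = [71.2, 73.8, 86.4, 72.1, 70.9]   # simulated sensor samples
for r in readings:
    handle_reading(r)
print("summary for upstream:", summarize(readings))
```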
Part of what’s supporting efforts in edge computing is containerization. A container packages an application and its dependencies into a single software unit, allowing multiple isolated applications to run on a single device. Containers are a key tool for enabling microservice architecture: an application can be packaged, deployed onto a piece of hardware and run consistently, increasing reliability, scalability, portability and security.
By processing data closer to its origin, edge computing can drastically reduce latency, which is especially valuable for companies with customer-facing digital applications in remote or distant locations. Industries such as retail, manufacturing, utilities and transportation and logistics will benefit the most from these edge, real-time operations.
Edge computing can also improve cybersecurity, as its distributed nodes are less vulnerable to cyberattacks than a single platform. This is a great benefit to locations that process sensitive data and/or power critical systems, such as hospitals, factories and government agencies. Edge computing can allow these places to operate without a network connection and at low latency, which can improve overall reliability.
4. More Defenses Against Cyberattacks and Disinformation
Advances in technology also mean advances in cyberattacks. Research suggests that one in two businesses has been the victim of a successful cyberattack in the past three years, with one of the primary accelerators being the use of AI by cybercriminals and state-sponsored hackers. The cost of these attacks to global industry is expected to grow steadily over the next several years, potentially reaching $15.63 trillion by 2029.
An even greater threat could be the rise of disinformation, as malicious actors increasingly use generative AI to produce it at scale. Disinformation can aid cyberattacks by making phishing emails and similar social engineering deceptions more convincing. Disinformation campaigns aren’t just a risk to individuals; for businesses, they can lead directly to losses from fraud, boycotts and reputational damage. Other top cybersecurity trends to be aware of:
- Deepfake images and videos will be nearly indistinguishable from authentic content. This will coincide with a rise in hyper-personalized scams that directly target individuals.
- Identity theft and account takeovers are likely to increase as a result of social-engineering data theft and breaches.
- The constant demands and work stressors on cybersecurity teams could increase burnout among cybersecurity professionals. The combination of high demand and turnover may continue to make cybersecurity an appealing field for aspiring IT leaders.
To counter these threats, you can expect to see continued advancements in software security, updated training for employees and the development of more sophisticated automated responses to cyberattacks.
Stay informed in the ever-evolving world of cybersecurity with our list of the 24 Best Cybersecurity Podcasts for 2025.
5. Investments in Sustainable Technology
If combating the existential threat of climate change isn’t enough of a reason to spur the market to invest in sustainable technology, research shows that consumers will increasingly make sustainability a priority and demand products and practices that are environmentally and socially responsible. Whether you call it green tech or sustainable technology, tech will play an essential role in achieving the United Nations’ goal of net-zero emissions by 2050.
For example, IoT devices can provide a higher level of transparency across the supply chain. This would support traceability and more effective material flows as well as ensure the ethical sourcing of materials. IoT platforms could also be used to support sustainable land and water use, whether it’s monitoring water levels, the movement of grazing animals or nutrient levels in soil. We’re also likely to see rising private sector interest in carbon capture, utilization and storage (CCUS) technologies and the start-ups developing them.
Companies and organizations will increasingly look to AI for its potential to support the development of new sustainable technology and act as a tool to repair and protect the environment. AI models can offer an unprecedented analysis of environmental data, highlighting ecological impacts and flagging areas that require immediate action. AI can also be used in product development to help craft innovative sustainable solutions, from more efficient electrical systems to designs that use fewer materials and produce less waste.
These concepts of recyclability and reusability are central to the idea of the circular economy, which will become an increasingly important concept across all industries. According to the EPA, a circular economy is one that “keeps materials, products, and services in circulation for as long as possible.” This economic model focuses on reducing material use by redesigning products and services to be less resource intensive, aiming to recapture “waste” as a resource.
The implications of the circular economy for IT include:
- Designing electrical devices to be easier and simpler to repair
- Designing a product life cycle that maximizes the reuse of materials
- Recycling materials such as plastic into materials for making new products
- Conceptualizing products that encourage consumers to use them longer or repair them
6. Continued Adoption of Blockchain
Cryptocurrency continues its bounce-back, with multiple Bitcoin exchange-traded funds (ETFs) approved by the Securities and Exchange Commission (SEC) for listing and trading in 2024. It remains to be seen how far and how fast cryptocurrency will rise, though at this point it’s clear that blockchain technologies and decentralized systems will remain a powerful tool for the financial industry and for enterprise business in the immediate future.
As a tool for the permanent, immutable and transparent recording of data and transactions, blockchain’s applications go beyond finance. Blockchain can be used to help businesses meet critical security needs while enhancing efficiency and transparency within their operations. For example, blockchain can support automated smart contracts, whose terms are encoded on the chain and self-execute when specific conditions are met, immediately fulfilling the contract.
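To illustrate why such a ledger is tamper-evident, here is a small conceptual Python sketch (not a real blockchain network, with no consensus or smart contract layer) in which each block stores the hash of the previous block, so altering any earlier record invalidates everything after it.

```python
# Conceptual sketch of a hash-chained ledger: each block records the hash of the
# previous block, so changing any record breaks every hash that follows it.
import hashlib
import json

def block_hash(block: dict) -> str:
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain: list, data: dict) -> None:
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "data": data})

def is_valid(chain: list) -> bool:
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

ledger: list = []
append_block(ledger, {"tx": "A pays B 10"})
append_block(ledger, {"tx": "B pays C 4"})
print(is_valid(ledger))                       # True
ledger[0]["data"]["tx"] = "A pays B 1000"     # tamper with an early record
print(is_valid(ledger))                       # False: the hash chain no longer matches
```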
Blockchain will also be used to create more decentralized systems that can enable trustless, peer-to-peer interactions that don’t need intermediaries. We’ll likely see greater use of AI analytics and ML to develop more efficient data analysis methods, enhance data verification procedures and quickly detect cyberattacks or fraudulent transactions in the blockchain.
The use of tokenization — the substitution of a sensitive data element with a non-sensitive equivalent — will continue to scale as more financial institutions adopt it as a security measure. The increased use of digital assets and tokens is expected to further drive the adoption of Web3 technology — a decentralized internet built on blockchains. Whether widespread adoption of Web3 will happen or not is up for debate, but with the massive amount of money in play, expect to hear more hype around the myriad possibilities of blockchain.
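As a rough illustration of the concept, the following Python sketch swaps a sensitive value for a random token held in a toy “vault.” Real tokenization services are hardened, audited systems; the class and method names here are hypothetical and the card number is a standard test value.

```python
# Minimal illustration of tokenization: a sensitive value is replaced by a random
# token, and only the (hypothetical) vault can map the token back to the original.
import secrets

class TokenVault:
    """Toy vault for illustration; production systems use hardened, audited services."""
    def __init__(self):
        self._store = {}             # token -> original sensitive value

    def tokenize(self, sensitive: str) -> str:
        token = "tok_" + secrets.token_hex(8)
        self._store[token] = sensitive
        return token                 # safe to pass to downstream systems

    def detokenize(self, token: str) -> str:
        return self._store[token]    # only allowed inside the trusted boundary

vault = TokenVault()
token = vault.tokenize("4111 1111 1111 1111")   # example card number
print(token)                   # e.g. tok_9f2c4e...; reveals nothing about the card
print(vault.detokenize(token))
```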
7. Developments in Quantum Computing
2025 is the International Year of Quantum Science and Technology (IYQ) — but will that translate into advances in quantum technology? Following excitement about developments in quantum computing in 2024, Nvidia CEO Jensen Huang threw cold water on the hype by suggesting that practical quantum computers were still 15 to 30 years away. Others are more bullish on the idea that we’ll see more breakthroughs in quantum computing in 2025, with Microsoft saying that companies should be “quantum ready.”
If you’re unfamiliar with quantum computing, it operates on the unique principles of quantum mechanics, using units called qubits. Unlike classical bits, which must be either 0 or 1, qubits leverage the concept of superposition: a qubit can exist in a combination of 0 and 1 at the same time, allowing it to represent multiple possibilities simultaneously.
Quantum computing can exploit this superposition to perform many calculations in parallel. Where a traditional computer works through possible combinations of states one after another to solve a complex problem, a quantum computer can leverage the parallelism of qubits, along with interference and entanglement, to home in on a solution without examining every potential state individually. For certain problems, a calculation that could take the most advanced supercomputer years could be completed by a quantum computer in seconds.
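The single-qubit math behind superposition can be sketched in a few lines of Python: a qubit is a pair of amplitudes, and a Hadamard gate turns a definite 0 into an equal mix of 0 and 1. This toy example only illustrates the state description and measurement probabilities, not a quantum speedup.

```python
# A qubit's state is a pair of amplitudes (alpha, beta) for the 0 and 1 outcomes;
# the probability of measuring each outcome is the squared magnitude of its amplitude.
from math import sqrt

def hadamard(state):
    # Hadamard gate: maps a definite 0 into an equal superposition of 0 and 1.
    a, b = state
    return ((a + b) / sqrt(2), (a - b) / sqrt(2))

def probabilities(state):
    a, b = state
    return {"0": abs(a) ** 2, "1": abs(b) ** 2}

qubit = (1.0, 0.0)              # definitely 0, like a classical bit
print(probabilities(qubit))     # {'0': 1.0, '1': 0.0}
qubit = hadamard(qubit)         # now in superposition
print(probabilities(qubit))     # roughly {'0': 0.5, '1': 0.5}: both outcomes possible
```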
Because quantum computers and classical computers process information differently, they’re better suited to solving different problems. Quantum computers are intended to tackle extremely complex computational challenges, such as novel drug discovery, genome sequencing, cryptography, financial modeling and meteorology. The first real utilization of quantum processors will likely be in hybrid high-performance computing (HPC) centers that combine the unique capabilities of both classical and quantum machines.
As quantum computing continues to mature from theoretical possibility to hard reality, companies are expected to shift away from pushing higher qubit counts and toward the incremental work of making quantum computers more stable and scalable. Analysts such as Gartner are already suggesting that organizations start transitioning to post-quantum cryptography, as sufficiently powerful quantum computers are expected to break widely used forms of public-key encryption.
8. Greater Proliferation of Robotics
In 2025, robotics is set to rapidly advance due to breakthroughs in generative and agentic AI, enabling machines to become more autonomous, adaptable and humanoid in behavior. These technologies are expanding the roles robots can play — from assisting in surgery and elder care in healthcare, to supporting daily living in homes. According to Gartner, by 2030, 80% of people will interact with smart robots daily — up from just 10% today — signaling a profound shift in how we work and live.
In industry, rising labor costs and operational demands are accelerating adoption in manufacturing and logistics, driving investment in polyfunctional robots that can learn by example and perform a variety of tasks. The capability and flexibility of these robots allow industries to streamline operations, reduce costs and respond quickly to changing production demands.
Autonomous transport is also gaining momentum, with companies exploring robo-taxis and self-driving trucks to address driver shortages and optimize delivery networks. For IT leaders, this trend presents opportunities and challenges: integrating robotics into systems, ensuring data security, addressing ethical concerns and preparing the workforce for human-robot collaboration.
9. Breakthroughs in Bioengineering
One trend that’s set to make headlines in 2025 is biotechnology, which is on the cusp of several big advances. Investment in and adoption of biotech remains relatively limited due to high upfront investment costs and its specialized industry focus. Despite these barriers, innovations in bioengineering are gaining momentum across healthcare, agriculture, consumer products, sustainability and materials. In healthcare, technologies such as gene therapy and CRISPR-based editing are reshaping treatment possibilities. One such milestone was the FDA and European Commission’s approval of Casgevy by Vertex Pharmaceuticals — the first CRISPR-Cas9 gene therapy, offering new hope for curing genetic disorders.
One particular area of research that sounds almost like science fiction is neurotechnology. From helping restore mobility and speech to enhancing human cognition through brain-computer interfaces, these developments could eventually improve communication, memory and learning. For IT and innovation leaders, these trends signal a growing need for secure data infrastructure, interdisciplinary collaboration and ethical oversight.
Other examples of bioengineering developments include:
- Advances in genomics are driving personalized medicine, allowing doctors to tailor treatments to individual genetic profiles, increasing effectiveness and minimizing side effects.
- In agriculture and food production, bioengineering supports more resilient crops and alternative-protein production, which contributes to food security and sustainability.
- Consumer goods are increasingly integrating bio-based materials, and biotechnology is enabling the development of biodegradable plastics and synthetic biofuels.
Though slower to commercialize than other technologies, biotechnology’s transformative potential — particularly in health and environmental impact — is becoming increasingly evident and is poised to reshape multiple sectors in the coming decade.
10. Exploration of Immersive Reality
Interest in immersive reality still hasn’t fully rebounded from its 2023 decline in demand, but there’s still movement in the virtual reality (VR) and augmented reality (AR) sectors. While both AR and VR remain in their infancy and aren’t projected to pass 100 million users until 2027 (far below the billions of people worldwide who use smartphones or computers), more industries will likely use AR and VR regularly thanks to increased fidelity, better ease of use and, most importantly, greater affordability.
In particular, the use of digital twins (virtual representations of real-world objects, systems or processes) is expected to see even greater growth, with a projected CAGR of 37.9% through 2030. AR and VR technologies are set to have a powerful impact on product lifecycle management (PLM):
- VR enables remote teams to walk, talk and work together in a shared virtual space where they can manipulate 3D models and iterate on visual designs in real time to elicit instant feedback.
- AR allows designers to project 2D or 3D virtual models of potential modifications and enhancements over physical locations or projects, improving comprehension and helping to inform decision-making.
The extent of AR and VR influence on PLM could be as simple as improving an individual component or as complex as planning out the entire development and delivery process. Being able to superimpose virtual minimum viable products into actual scenarios has the potential to save manufacturers valuable time and resources in the product development life cycle.
And the casual adoption of immersive reality isn’t entirely dead. The popular gaming platforms Roblox and Fortnite now regularly host virtual events that encourage users to log on and take part. Disney is one company exploring the viability of a persistent virtual entertainment universe, having acquired a $1.5 billion stake in Epic Games.
Are you excited about the future of IT and the evolution of our modern IT ecosystem? If so, then you may be ready for a career at the higher levels of information technology. The Master of Science in Information Technology Leadership program at the University of San Diego was designed to provide IT professionals with the skills and knowledge to critically analyze complex technical systems and guide their teams to success.