
What's the next step for mobile computing, looking at 5G and AI?

2026-04-06 05:17:11 · #1

What will be the next screen after smartphones?

AR contact lenses can perform image computation on the eye anytime, anywhere, integrating image processing chips, display control, and wireless communication systems on a lens less than 2 cm in diameter.

Wireless design references for AR glasses have also emerged, allowing for the distribution of computing load between smartphones and AR glasses, resulting in a smoother integration and raising expectations for consumer-grade AR glasses.

On the other hand, ARM-based PCs are gradually becoming an industry consensus. The ARM architecture's low power consumption and high energy efficiency have been brought over to the PC, breaking through the long-standing problem of poor battery life in mobile PCs.

Whether it's the currently popular metaverse field or previous generations of mobile computing platforms, they all seem to be developing toward the same set of requirements:

high mobility, low power consumption, long battery life, and high performance.

These are precisely the characteristics that smartphones possess, and they are also the most important reason why they have become the mainstream mobile computing platform in just over a decade.

It's unclear when it started, but smartphones are no longer the sole protagonist among various mobile computing platforms—at least from the perspective of the underlying technological trends, this is quite obvious.

New mobile computing platforms are built on the upgrading and integration of what came before.

We have indeed reached a new juncture in the development of mobile computing.

By definition, mobile computing encompasses the entire process of generating, sharing, and displaying various types of data on mobile devices. This data may originate from human-computer interaction or from device sensing.

The most popular mobile computing platform today is undoubtedly the mobile phone.

With the popularization of 5G and AI technologies, human-computer interaction is being further innovated, the form of device perception is gradually changing, and new forms of mobile computing are emerging, such as XR hardware and traditional PC upgrades.

First, let's look at 5G. No matter how the form of the mobile computing platform iterates, communication technology remains indispensable, and new platforms place even higher demands on the quality of network connectivity.

Looking back at the development from 2G to 3G and then to 4G, the focus was actually on expanding applications on mobile phones. But with the advent of 5G, this trend has quietly changed.

The latest 5G features released by 3GPP, the standards-setting organization, target applications that are no longer limited to mobile phones, but are far more diversified.

Underlying AI capabilities are another major factor driving the development of mobile computing. They ensure that mobile devices have sufficient data processing capabilities under limited computing power and power conditions to cope with diverse scenarios, such as games, entertainment, photography, and other functions that are closely related to our lives.

In today's era of intelligence and digitalization, AI has undoubtedly become the object of fierce competition among major platform manufacturers and chip suppliers.

It is only natural, then, that the expansion and migration of 5G and AI have given rise to brand-new mobile computing platforms with newer and better experiences.

If we consider 5G as a fundamental capability, and use AI capabilities as a dividing line, the development direction of mobile computing can be roughly divided into two categories:

One is dimensional elevation; the other is convergence.

Let's start with dimensional elevation, with XR-based mobile computing platforms as the prime example. For millennia, the media through which humans transmit information have been two-dimensional and planar; the XR devices of the metaverse elevate that plane into three dimensions. Our entire space becomes virtualized and computable, which places higher demands on the underlying algorithms and computing power.

Taking the visual algorithms that we can perceive most as an example, mobile phones mainly focus on two-dimensional photography scenarios, including face detection, super-resolution noise reduction, image quality enhancement, and ultra-high-definition portraits.

Taking Qualcomm, a player at the bottom level, as an example, Qualcomm AI Engine on the Snapdragon platform has made continuous progress in this regard: the current seventh-generation Qualcomm AI Engine has increased the number of facial feature recognition points to 300, the face detection speed has been improved by 300% compared to the previous generation, and there are also functions such as multi-frame noise reduction and local motion compensation.
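As a rough illustration of how multi-frame noise reduction works, the core idea is to align several exposures of the same scene and average them, so that random sensor noise cancels while the static scene content reinforces. The sketch below is a toy version under simplifying assumptions (global integer-pixel alignment, static scene); it is not Qualcomm's actual pipeline, and the function names are hypothetical:

```python
import numpy as np

def align(frame, ref):
    """Crude global motion compensation: shift `frame` by the integer
    offset that best matches `ref` within a small search window.
    This stands in for the per-block motion estimation a real ISP uses."""
    best, best_err = (0, 0), np.inf
    for dy in range(-2, 3):
        for dx in range(-2, 3):
            shifted = np.roll(frame, (dy, dx), axis=(0, 1))
            err = np.mean((shifted - ref) ** 2)
            if err < best_err:
                best, best_err = (dy, dx), err
    return np.roll(frame, best, axis=(0, 1))

def multi_frame_denoise(frames):
    """Average several aligned exposures: random noise averages out."""
    ref = frames[0]
    aligned = [ref] + [align(f, ref) for f in frames[1:]]
    return np.mean(aligned, axis=0)

# Synthetic demo: one clean scene observed through independent noise.
rng = np.random.default_rng(0)
scene = rng.uniform(0, 1, (64, 64))
noisy = [scene + rng.normal(0, 0.2, scene.shape) for _ in range(8)]
out = multi_frame_denoise(noisy)
# Averaging N frames reduces noise std roughly by sqrt(N).
print(round(np.std(noisy[0] - scene), 3), round(np.std(out - scene), 3))
```

A production pipeline would add per-block motion vectors (the "local motion compensation" mentioned above), sub-pixel alignment, and rejection of pixels that moved between frames.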

On the XR device side, real-time computing across the entire scene begins as soon as it is put on, including position tracking, 3D object tracking, plane detection, spatial mapping and meshing, scene understanding, etc.; not only are the algorithms upgraded from two-dimensional to three-dimensional, but higher requirements are also placed on realism and low latency.
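Of the steps above, plane detection is a good example of the jump to 3D: a common textbook approach is RANSAC fitting over a 3D point cloud. The following is a minimal illustrative sketch of that generic technique, not any vendor's implementation:

```python
import numpy as np

def detect_plane(points, iters=200, thresh=0.02, rng=None):
    """RANSAC plane fit: repeatedly pick 3 points, fit the plane through
    them, and keep the candidate that explains the most points."""
    rng = rng or np.random.default_rng(0)
    best_plane, best_inliers = None, 0
    for _ in range(iters):
        a, b, c = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(b - a, c - a)          # plane normal from 3 points
        norm = np.linalg.norm(n)
        if norm < 1e-9:                     # degenerate (collinear) sample
            continue
        n = n / norm
        d = -n @ a
        dist = np.abs(points @ n + d)       # point-to-plane distances
        inliers = int((dist < thresh).sum())
        if inliers > best_inliers:
            best_plane, best_inliers = (n, d), inliers
    return best_plane, best_inliers

# Synthetic scan: a floor (z ~ 0) plus scattered clutter above it.
rng = np.random.default_rng(1)
floor = np.column_stack([rng.uniform(-1, 1, (300, 2)),
                         rng.normal(0, 0.005, 300)])
clutter = rng.uniform(-1, 1, (60, 3)) + [0, 0, 1]
(n, d), count = detect_plane(np.vstack([floor, clutter]))
print(count)  # most inliers lie on the floor plane
```

Real XR runtimes run refinements of this at interactive rates and feed the detected planes into spatial mapping and meshing, which is where the realism and low-latency requirements come from.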

Another type is convergence, a trend exemplified by PC-based mobile computing. With the rise in demand for mobile and remote work, the PC market has experienced unprecedented growth in recent years. Last year, the global PC market saw its largest shipment volume in nearly a decade, reaching 341 million units.

It's worth noting that, aside from some chip manufacturers starting to use ARM for PCs, more mobile phone manufacturers and internet companies are also deploying tablet and laptop products, and software developers are beginning to break down the boundaries between mobile and PC.

The reasons are not hard to understand; it boils down to the advantages of an integrated ecosystem. Mobile phones are developing rapidly, while the value of PCs remains irreplaceable. Therefore, it's better to integrate the two, allowing for seamless switching and achieving full-scenario coverage.

This has become an industry consensus. For example, in order to achieve better "integration," some manufacturers have gradually migrated some algorithms that were originally on mobile devices, such as facial recognition, voice recognition, and edge AI acceleration in video conferencing, to PCs. Emerging functions such as multi-screen collaboration and universal control are becoming essential for productivity.

In summary, it's clear that the evolution of AI-driven mobile computing is rapidly shifting from a single form factor based on smartphones to a multi-converged, multi-scenario approach encompassing smartphones, XR, and new mobile PCs. For example, according to Counterpoint data from January of this year, XR headset shipments are projected to reach 105 million units by 2025, a tenfold increase from 11 million units in 2021.

However, how to get there, together with unresolved technical issues such as computing power, algorithms, and architecture compatibility, remains a major challenge for companies, and will shape the future development and iteration of mobile computing.

What are your thoughts on the future development and iteration of mobile computing?

From an external perspective, the present and future of mobile computing seem to lie in hardware with different forms, such as smartphones, XR devices, and mobile PCs.

However, regardless of how the form of the terminal or platform used for mobile computing changes, the fundamental elements that determine its attributes remain the chip, the underlying software, and the algorithm.

As the representative of current mobile computing platforms, smartphones have matured in terms of technology, and their mobile computing capabilities are constantly being iterated and upgraded.

Therefore, the question of "what should be the next step for mobile computing" is no longer entirely about technological innovation starting from scratch, but rather about how to enable existing technologies to be migrated across different fields.

How, exactly, can this be done?

Qualcomm, an industry player, has offered a technical roadmap for reference:

take the general-purpose hardware architecture and software features that originated on smartphones as the key factors, and gradually extend them to new mobile computing platforms such as XR and PCs.

The most fundamental capabilities of this approach can be traced back to 5G and AI.

On the one hand, driven by both factors, the human-computer interaction methods on mobile devices are no longer limited to touch screens.

Features such as voice assistants and gesture interaction, built on NLP and CV technologies, make human-computer interaction closer to the way humans naturally communicate and have inspired many unprecedented applications. For example, the seventh-generation Qualcomm AI Engine can analyze a user's voice for signs of depression or other health conditions.

However, these large amounts of AI application data need to be transmitted quickly on mobile computing platforms, which requires the platforms to have strong 5G communication capabilities.

These features, which are "icing on the cake" on smartphones, may become essential on XR devices, further increasing the demand for 5G capabilities.

On the other hand, the support of massive algorithms has allowed mobile phone performance to break through limits time and time again.

For example, in terms of photography, features such as noise reduction, autofocus, filter application, and 8K HDR can all be achieved with the help of algorithms, which have become the key to shooting blockbuster videos with mobile phones.

In the gaming sector, deep learning super sampling (DLSS) technology can be used to extract multi-dimensional features of the rendered scene on a mobile phone and intelligently combine details from multiple frames to construct high-quality images, achieving results that surpass traditional approaches such as cloud rendering. This places higher technical demands on 5G and AI.
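To see why combining details from multiple frames can recover resolution at all, here is a toy, static-scene sketch of temporal super sampling: each low-resolution frame samples the scene at a different sub-pixel jitter offset, and scattering those samples back onto the high-resolution grid recovers the full image. Real DLSS-style pipelines add motion vectors and a learned network to handle moving scenes; this is illustrative only:

```python
import numpy as np

def temporal_supersample(scene, scale=2):
    """Accumulate jittered low-res frames onto a high-res grid.
    With a static scene and scale*scale jitter offsets, every high-res
    pixel is eventually sampled exactly once."""
    recon = np.zeros_like(scene, dtype=float)
    for dy in range(scale):
        for dx in range(scale):
            # One jittered low-res frame: every scale-th sample of the
            # scene, shifted by the (dy, dx) sub-pixel offset.
            frame = scene[dy::scale, dx::scale]
            # Scatter its samples back to their true positions.
            recon[dy::scale, dx::scale] = frame
    return recon

rng = np.random.default_rng(2)
scene = rng.uniform(0, 1, (8, 8))
print(np.allclose(temporal_supersample(scene), scene))  # True
```

The hard part in practice is that the scene moves between frames, which is why motion estimation and learned history rejection (and hence serious AI compute) are required.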

Issues such as stable signal transmission and battery life can also be intelligently optimized through AI.

These methods and approaches can also be used to improve the performance of XR devices and mobile PCs.

In fact, using AI as a common underlying capability to provide technological support for smartphones, XR devices, and mobile PCs is a technology route that has been recognized in the industry.

For example, based on a "unified technology roadmap," Qualcomm's AI capabilities have gradually penetrated into almost all types of terminals, such as XR and PCs.

In the XR field, almost all mainstream XR manufacturers, including Meta Oculus, Microsoft HoloLens, ByteDance Pico, and Skyworth VR, have adopted the Snapdragon XR series platform for their head-mounted displays. Among them, the Snapdragon XR2 is one of the current representative solutions, which Qualcomm claims is the first XR chip to combine 5G and AI.

It incorporates seven-camera support and a customized computer vision processor, enabling real-time tracking of the user's head, lips, and eyes, as well as 26-point hand skeleton tracking.

Scene understanding and 3D reconstruction can better integrate virtual information with the physical world, bringing a more immersive interactive experience.

At the same time, the device will also sense the user's external environment and use AI to recognize sounds such as doorbells and children crying to alert people to handle emergencies. The voice assistant will also be on standby in real time and can recognize commands in noisy environments, sending messages to the user's terminal (mobile computing platform) in a timely manner based on 5G signal capabilities.

Qualcomm recently released the Snapdragon XR2+ Gen 1 platform, which not only brings significant improvements in battery life and heat dissipation, but also introduces a new image processing pipeline supporting parallel perception technologies, including head, gesture, and controller tracking, 3D reconstruction, and low-latency video passthrough. The platform's high pixel density can support PC-grade virtual landscapes, and it can drive multiple sensors and cameras simultaneously, giving more realistic virtual avatars detailed facial expressions.

Meta has released its first product based on this platform, the Meta Quest Pro. Combined with a controller also powered by the Snapdragon platform, it features self-tracking via multiple embedded positioning cameras and ultra-low latency with the headset. When combined with facial and eye tracking, it can create a more natural virtual avatar for users in VR. This can also be considered a fusion focused on enhancing the user experience.

In the mobile PC field, Qualcomm is also trying to use 5G and AI to improve the productivity of mobile office and protect terminal privacy.

For example, when holding a remote meeting, the device can accurately sense the subject's face, and can achieve precise focusing even in a noisy street-side coffee shop, so that passersby will not appear in the meeting.

On the third-generation Snapdragon 8cx computing platform, the Qualcomm AI Engine can provide 29+ TOPS of acceleration, with up to 3x performance improvement. Cloud deployment and local operation also allow thin and light laptops to tackle high-performance tasks without consuming excessive resources.

Beyond its own applications, Qualcomm folds these underlying AI capabilities into a unified AI software stack, giving mobile computing a common foundation. At the top sits the unified AI framework, Qualcomm AI Engine Direct. From there, the capabilities are distributed to different mobile computing platforms, such as smartphones, XR, and ACPC (Always Connected PC), through developer services, system software, and operating systems.

Qualcomm also provides developers with a range of toolkits, including AI model enhancement toolkits, neural network architecture search, and model analyzers.

In this way, it can achieve interoperability of application development between different terminals, such as from mobile phones to XR devices, so as to realize the capability integration between different terminals.

Starting with mobile phones, expanding into new forms such as XR and PCs, and finally integrating them with developers to build a diverse and open ecosystem, it has a bit of a "one begets two, two begets three, three begets all things" feel to it.

However, if you observe Qualcomm's exploration of cutting-edge 5G and AI technologies, you will find that their vision for mobile computing goes far beyond this.

Qualcomm's leadership in communications needs no introduction. In March of this year, Qualcomm released its fifth-generation 5G baseband and RF solution, the Snapdragon X70. It is not only the world's only solution supporting all commercial 5G bands from 600 MHz to 41 GHz, but also the world's first to integrate a dedicated 5G AI processor, which uses AI to optimize 5G links in the Sub-6 GHz and millimeter-wave bands, improving speed, coverage, mobility, link robustness, and energy efficiency while reducing latency.

In its exploration of AI, Qualcomm published a paper in April this year introducing a new neural network architecture for panoptic segmentation. Based on learning from instances and semantic relationships, it can label images at the pixel level, effectively identify features and make predictions, and automatically focus on important things. It also achieved state-of-the-art performance in all benchmark tests.

This research can be applied to scenarios such as autonomous driving and AR, and has been accepted by CVPR 2022.
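For intuition about what panoptic segmentation produces: it combines per-pixel "stuff" labels from a semantic map with per-object "thing" instances into a single output. The sketch below shows only that merge step and output format, using the common `class_id * 1000 + instance` encoding convention; it is not the architecture from the paper, and all names here are illustrative:

```python
import numpy as np

def panoptic_merge(semantic, instances, thing_classes):
    """Merge a per-pixel semantic map with a list of (mask, class_id)
    instance detections into one panoptic map: 'stuff' pixels keep
    their class id, and each 'thing' instance gets a unique id."""
    panoptic = semantic.astype(np.int64).copy()
    for idx, (mask, cls) in enumerate(instances, start=1):
        if cls in thing_classes:
            panoptic[mask] = cls * 1000 + idx
    return panoptic

# Tiny example: class 1 = road (stuff), class 2 = car (thing).
semantic = np.array([[1, 1, 2, 2],
                     [1, 1, 2, 2]])
car_mask = semantic == 2
pan = panoptic_merge(semantic, [(car_mask, 2)], thing_classes={2})
print(pan)
# [[   1    1 2001 2001]
#  [   1    1 2001 2001]]
```

The research contribution lies in producing the semantic and instance predictions jointly and accurately from a single network; the merge itself is the easy part.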

Imagine this technology being put into practice: wouldn't that be like the sci-fi movie "Free Guy" becoming reality?

Future mobile computing will also carry the transformation of human-computer interaction.

Undeniably, Qualcomm and many other players have recognized the unprecedented market potential and value of emerging scenarios spawned by new mobile computing platforms such as XR and ACPC.

The metaverse unlocked by XR devices can be applied to all virtualizable real-world scenarios, including industry, entertainment, gaming, and social interaction. With widespread adoption, it could unlock a market exceeding $800 billion in the coming years. Furthermore, as mobile computing platforms represented by new mobile PCs become more versatile and collaborate more effectively with other computing platforms, even more scenarios and value await discovery…

If in the past, the scenarios extended from mobile phones only resulted in a few hundred vertical software ecosystems, then the new computing era brings about hundreds of horizontal industrial expansions, which in turn bring thousands or tens of thousands of developer ecosystems and business opportunities.

This is AI, as a foundational technology, driving innovation that expands from points to lines to planes.

Including the automotive and IoT scenarios mentioned in previous issues, the future mobile computing system will also carry a more profound transformation in human-computer interaction.

Throughout the entire machine revolution, every innovation in human-computer interaction has brought about tremendous changes to human lifestyles.

Looking at it from a timeline, isn't the current innovation in mobile computing platforms a direct reflection of the ongoing human-computer interaction revolution?

The era of personal computers began with traditional PCs and advanced semiconductor technology, which propelled the development of human-computer interaction 1.0. Inventions such as the mouse, keyboard, and graphical user interface made computers accessible to ordinary people, even those without formal training.

Currently, mobile computing platforms, led by smartphones, have begun to dominate the human-computer interaction 2.0 paradigm.

Around 2007, touchscreen interaction, primarily driven by iPhones and various Android devices, gradually became the mainstream in the mobile era. People replaced keyboards with their fingers, and many everyday applications could be handled with just a mobile phone. Due to its portability, everyone could own their own human-computer interaction device.

In the foreseeable future, driven by 5G and AI technologies, human-computer interaction 3.0, led by new mobile computing platforms such as XR hardware, will make interaction anytime and anywhere possible. Just like those AR contact lenses, you can receive information as soon as you open your eyes and process information by speaking. Once it is widely adopted, it will mark the beginning of a new era of human-computer interaction. Now, only the last 100 meters remain, which will be driven by underlying capabilities.

In the era of Human-Computer Interaction 3.0, some old mobile computing forms are also bringing new value innovations driven by AI and 5G connectivity, such as a more intelligent and convenient new mobile PC experience.

However, changes in interaction methods and other aspects of the user experience are merely superficial; the fundamental and far-reaching impact of mobile computing innovation lies in the widespread adoption and accessibility of the technology.

From a PC in every household to a smartphone for everyone, human-computer interaction has undergone a tremendous transformation. In the not-too-distant future, everyone will be able to access various new forms of human-computer interaction experiences anytime, anywhere. As technologies such as AI and 5G further influence each of us, we too can perceive the world in more diverse ways, just like the protagonists in science fiction movies.

This, to some extent, confirms that disruptive change is not felt only by enabling companies like Qualcomm at the forefront of foundational technologies such as AI and 5G. In fact, we are all in the midst of this wave, able to personally experience the new opportunities of the era and become a part of this innovation.

