
Can artificial intelligence enable machines to possess fluid intelligence?

2026-04-06

In fields where data streams change over time, a key task is developing more flexible artificial intelligence that can learn quickly. Practical applications involving time-series data include video processing, epidemiology, financial markets, economic indicators such as GDP, health monitoring, weather forecasting, air-pollution monitoring, autonomous vehicles, robotics, aviation, and medical imaging, among others.

The concepts of fluid intelligence and crystallized intelligence can be traced back to 1963, when they were proposed by Raymond Cattell (1905-1998), one of the most influential psychologists of the 20th century. Fluid intelligence is the ability to think flexibly, reason, and process new information in real time. In contrast, crystallized intelligence refers to knowledge acquired from previously learned facts, skills, and experiences.

Fluid intelligence is a physiologically based cognitive ability encompassing perception, memory, processing speed, and reasoning. It is a fundamental human capacity, shaped strongly by genetic factors and less by education and culture, and it is closely tied to age: it generally peaks in the early twenties and declines after age 30. Crystallized intelligence, by contrast, does not decline with age; it covers learned skills, language and literacy abilities, judgment, and associative thinking.

“We introduce a new class of time-continuous recurrent neural network models,” the study’s authors wrote. Ramin Hasani, a postdoctoral researcher at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), is the lead author of the study. Other researchers on the team include Daniela Rus, a professor at MIT and director of CSAIL; Alexander Amini, a doctoral student at MIT; Mathias Lechner of the Institute of Science and Technology Austria (IST Austria); and Radu Grosu of the Vienna University of Technology.

When time-series data is available, recurrent neural networks (RNNs) are often used, with ordinary differential equations (ODEs) determining the continuous-time hidden states. The research team set out to improve this structure to "achieve richer representation learning and expressive capabilities."
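To make the continuous-time idea concrete, here is a minimal sketch of an ODE-based RNN hidden state integrated with explicit Euler steps. The dynamics dx/dt = -x/tau + tanh(Wx + UI + b) are a standard CT-RNN form, and all weight names here are illustrative assumptions, not the authors' code:

```python
import numpy as np

def ct_rnn_step(x, I, W, U, b, tau, dt=0.1):
    """One explicit-Euler step of a continuous-time RNN ODE:
    dx/dt = -x / tau + tanh(W @ x + U @ I + b)."""
    dxdt = -x / tau + np.tanh(W @ x + U @ I + b)
    return x + dt * dxdt

# Toy usage with random weights (illustrative only).
rng = np.random.default_rng(0)
n, m = 4, 3                      # hidden size, input size
x = np.zeros(n)                  # hidden state
W = rng.normal(size=(n, n))
U = rng.normal(size=(n, m))
b = np.zeros(n)
for t in range(10):              # integrate over a short input stream
    x = ct_rnn_step(x, rng.normal(size=m), W, U, b, tau=1.0)
```

Because the leak term -x/tau pulls the state toward zero while the tanh drive is bounded, the hidden state stays bounded under this integration scheme.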

The researchers wrote, "Instead of declaring the dynamics of the learned system through implicit nonlinearities, we construct networks of linear first-order dynamical systems modulated via nonlinear interlinked gates."

Building on this idea, the researchers created the liquid time-constant (LTC) recurrent neural network. The advantage of this new type of RNN is that it is more expressive by design and therefore more transparent and interpretable.

This expressiveness allows researchers to better understand some of the “thinking” processes of neural networks, a benefit that helps to unveil some of the complex cognitive processes of the “black box” of artificial intelligence machine learning.

The research team wrote, "The resulting model represents a dynamical system whose varying (i.e., liquid) time constant is coupled to its hidden state, with the output computed by a numerical differential equation solver. These neural networks exhibit stable and bounded behavior, yield superior expressivity within the family of neural differential equations, and improve performance on time-series prediction tasks."
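The quoted dynamics can be sketched as follows. This assumes the LTC form dx/dt = -(1/tau + f)·x + f·A with a sigmoid gate f, solved with a semi-implicit (fused) Euler step; the weight names (W, U, b, A) and the specific solver choice are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def ltc_step(x, I, W, U, b, tau, A, dt=0.05):
    """One step of an LTC-style cell with assumed dynamics
    dx/dt = -(1/tau + f) * x + f * A, f = sigmoid(W @ x + U @ I + b).
    The gate f modulates the effective time constant, so the "speed" of the
    state depends on both the hidden state and the current input."""
    f = 1.0 / (1.0 + np.exp(-(W @ x + U @ I + b)))   # nonlinear gate in (0, 1)
    # Semi-implicit (fused) Euler update: treating the decay term implicitly
    # yields a closed-form step that is stable for any dt > 0.
    return (x + dt * f * A) / (1.0 + dt * (1.0 / tau + f))

# Toy usage with random weights (illustrative only).
rng = np.random.default_rng(1)
n, m = 4, 3
x = np.zeros(n)
W = rng.normal(size=(n, n))
U = rng.normal(size=(n, m))
b = np.zeros(n)
A = np.ones(n)                   # bias vector bounding the state
tau = np.ones(n)
for t in range(20):
    x = ltc_step(x, rng.normal(size=m), W, U, b, tau, A)
```

Note how the "liquid" behavior arises: since f depends on x and the input I, the effective time constant tau / (1 + tau * f) varies over time, and the state remains bounded by the fixed vector A.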

To evaluate their new model, the team conducted extensive experiments with the liquid time-constant recurrent neural network. These included training classifiers to recognize gestures from motion data, to predict room occupancy from sensor data streams (temperature, carbon dioxide levels, humidity, and others), and to identify human activities (e.g., standing, walking, and sitting) from smartphone data. Other tests covered sequential MNIST, motion-dynamics modeling, traffic prediction, hourly household electricity consumption, ozone concentration levels, and several further human-activity datasets.

Compared to other recurrent neural network models (LSTM, CT-RNN, Neural ODE, and CT-GRU), researchers observed improvements of 5% to 70% in four out of seven experiments on time series prediction.

Artificial intelligence is rapidly expanding across industries and into many functions. The more flexible, fluid, and transparent AI machine learning becomes, the greater its potential to improve AI security and performance in the future.
