I. New Breakthroughs in Machine Learning
The research findings were accepted at NeurIPS 2022, one of the world's top academic conferences in machine learning and computational neuroscience, standing out from more than 10,000 submissions worldwide. The paper will be published at the end of November.
As we understand it, the term "barren plateau" refers to a phenomenon in which, as the number of qubits in a quantum computer grows, quantum neural networks under the current framework become hard to train: the objective function becomes extremely flat, so training takes excessively long or fails outright.
Regarding the research findings, Director Hsieh Ming-hsiu stated that the proposed solution to the barren plateau phenomenon allows quantum learning machines to demonstrate their true advantage over classical machines. Building on this solution, the Quantum Computing Institute showcased its research on quantum simulation for battery development at this year's Foxconn Technology Day (HHTD22), significantly reducing the quantum resources required.
On the barren plateau phenomenon, Director Hsieh further explained that, generally speaking, quantum machine learning proceeds by tuning the adjustable parameters of logic gates to obtain the desired quantum circuit model. In practice, however, when the circuit contains many gates and is deep, those parameters often become difficult to update.
Director Hsieh stated that by choosing appropriate initial values for the adjustable parameters, the team mitigated the barren plateau phenomenon, resolving a long-standing problem in quantum machine learning and achieving a breakthrough in the field.
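The flattening Director Hsieh describes can be made concrete with a toy numerical experiment. The sketch below is a minimal NumPy statevector simulation, not the Institute's actual method; the circuit layout (layers of RY rotations plus nearest-neighbor CZ gates), the depth, and the sample counts are all illustrative assumptions. It estimates the variance of a cost-function gradient over random parameter settings: as the qubit count grows, the variance shrinks, i.e. the training landscape flattens.

```python
import numpy as np

rng = np.random.default_rng(0)

def ry(theta):
    """Single-qubit Y-rotation matrix (real-valued)."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def apply_1q(state, gate, q, n):
    """Apply a 2x2 gate to qubit q of an n-qubit statevector."""
    psi = state.reshape([2] * n)
    psi = np.moveaxis(np.tensordot(gate, psi, axes=([1], [q])), 0, q)
    return psi.reshape(-1)

def apply_cz(state, q1, q2, n):
    """Apply a controlled-Z gate between qubits q1 and q2."""
    psi = state.reshape([2] * n).copy()
    idx = [slice(None)] * n
    idx[q1], idx[q2] = 1, 1
    psi[tuple(idx)] *= -1
    return psi.reshape(-1)

def cost(thetas, n, depth):
    """Expectation of Z on qubit 0 after a layered RY + CZ circuit."""
    psi = np.zeros(2 ** n)
    psi[0] = 1.0
    t = iter(thetas)
    for _ in range(depth):
        for q in range(n):
            psi = apply_1q(psi, ry(next(t)), q, n)
        for q in range(n - 1):
            psi = apply_cz(psi, q, q + 1, n)
    signs = 1.0 - 2.0 * (np.arange(2 ** n) >> (n - 1))  # +1 where qubit 0 is |0>
    return float(np.sum(signs * psi ** 2))

def grad_first_param(thetas, n, depth):
    """Parameter-shift gradient of the cost w.r.t. the first angle."""
    plus, minus = thetas.copy(), thetas.copy()
    plus[0] += np.pi / 2
    minus[0] -= np.pi / 2
    return 0.5 * (cost(plus, n, depth) - cost(minus, n, depth))

def grad_variance(n, depth=20, samples=150):
    """Sample variance of the gradient over uniformly random parameters."""
    grads = [grad_first_param(rng.uniform(0, 2 * np.pi, size=n * depth), n, depth)
             for _ in range(samples)]
    return float(np.var(grads))

# The gradient variance shrinks as the qubit count grows: the plateau.
variances = {n: grad_variance(n) for n in (2, 4, 6)}
```

The shrinking gradient variance with qubit count is the barren plateau in miniature; initialization strategies like the one described above aim to start training in a region of parameter space where gradients remain sizable.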
II. Application Areas of Machine Learning
1. Data Mining
Data mining = machine learning + database. The concept of data mining has become so ubiquitous in recent years that it borders on hype: everyone touts its wonders, such as finding gold in data or turning discarded data into value. But where one might find gold, one might also find stones. Data mining is simply a mindset: try to extract knowledge from data. Not every dataset yields gold, so don't mythologize it. A system never becomes omnipotent just because a data mining module is bolted on (a claim IBM is fond of making). What matters is a person with a data mining mindset and a deep understanding of the data, who can derive patterns that guide business improvements. Most data mining algorithms are machine learning algorithms optimized to run inside a database.
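The idea of "extracting knowledge from data" can be illustrated with the frequent-itemset step of Apriori, a classic data mining algorithm. The sketch below runs on a hypothetical toy transaction list (the items and the support threshold are made up for illustration, not drawn from any real system):

```python
from itertools import combinations

def frequent_itemsets(transactions, min_support):
    """Apriori-style search: keep itemsets whose support (fraction of
    transactions containing them) meets the threshold, and grow candidates
    only from itemsets that are already frequent."""
    n = len(transactions)
    result = {}
    # Start with all single items as candidates.
    candidates = sorted({frozenset([i]) for t in transactions for i in t})
    k = 1
    while candidates:
        frequent = []
        for c in candidates:
            support = sum(1 for t in transactions if c <= t) / n
            if support >= min_support:
                result[c] = support
                frequent.append(c)
        # Grow (k+1)-item candidates from frequent k-itemsets only.
        candidates = sorted({a | b for a in frequent for b in frequent
                             if len(a | b) == k + 1})
        k += 1
    return result

# Hypothetical toy "market basket" data.
transactions = [
    {"milk", "bread"},
    {"milk", "diapers"},
    {"milk", "bread", "diapers"},
    {"bread", "diapers"},
]
result = frequent_itemsets(transactions, min_support=0.5)
```

Here every single item and every pair appears in at least half the transactions, but the full triple does not, so Apriori never needs to scan for larger combinations.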
2. Statistical Learning
Statistical learning is roughly equivalent to machine learning; the two disciplines overlap heavily. Most methods in machine learning originate from statistics; indeed, one could argue that the development of statistics fostered the flourishing of machine learning. The well-known Support Vector Machine (SVM) algorithm, for example, has its roots in statistics. Still, the two differ in emphasis: statistical learning researchers focus on developing and refining statistical models, leaning toward mathematics, while machine learning researchers focus more on solving problems in practice, and so concentrate on improving the efficiency and accuracy of learning algorithms on computers.
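The SVM mentioned above illustrates this statistics-to-algorithm pipeline well: the statistical model is a maximum-margin separator, and the computational side is an optimizer for it. The sketch below is a minimal soft-margin linear SVM trained by subgradient descent on the regularized hinge loss, using a hypothetical toy dataset of two Gaussian clusters; the hyperparameters are illustrative, not tuned.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical toy data: two linearly separable 2-D clusters, labels +1 / -1.
X = np.vstack([rng.normal([2.0, 2.0], 0.4, size=(20, 2)),
               rng.normal([-2.0, -2.0], 0.4, size=(20, 2))])
y = np.array([1] * 20 + [-1] * 20)

def train_linear_svm(X, y, lam=0.01, lr=0.1, epochs=200):
    """Soft-margin linear SVM via subgradient descent on:
       lam/2 * ||w||^2 + mean(max(0, 1 - y * (X @ w + b)))"""
    n = len(y)
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        margins = y * (X @ w + b)
        mask = margins < 1  # points inside or on the wrong side of the margin
        grad_w = lam * w - (y[mask, None] * X[mask]).sum(axis=0) / n
        grad_b = -y[mask].sum() / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

w, b = train_linear_svm(X, y)
pred = np.sign(X @ w + b)
accuracy = float((pred == y).mean())
```

Only the margin-violating points contribute to the update, which is the subgradient analogue of the "support vector" idea: well-classified points far from the boundary do not move the model.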
3. Computer Vision
Computer vision = image processing + machine learning. Image processing techniques transform raw images into suitable inputs for machine learning models, while machine learning identifies the relevant patterns in those images. Computer vision has numerous applications, such as Baidu Image Search, handwritten character recognition, and license plate recognition. It is a popular research area with a very promising future: with the rise of deep learning, a newer branch of machine learning, the accuracy of computer image recognition has improved dramatically.
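The "image processing + machine learning" split can be sketched end to end on a toy problem. In the sketch below (entirely synthetic and illustrative: the 8x8 "images", the two bar patterns, and the nearest-centroid classifier are all assumptions, not any production pipeline), the image-processing step normalizes and flattens each image, and the machine-learning step classifies the resulting feature vectors.

```python
import numpy as np

rng = np.random.default_rng(2)

def make_image(kind):
    """Hypothetical 8x8 grayscale image: a bright horizontal or
    vertical bar on top of low-level noise."""
    img = rng.normal(0.1, 0.05, size=(8, 8))
    if kind == "horizontal":
        img[3:5, :] += 0.8
    else:
        img[:, 3:5] += 0.8
    return img

def preprocess(img):
    """Image-processing step: rescale intensities to [0, 1] and
    flatten the image into a feature vector."""
    img = (img - img.min()) / (img.max() - img.min())
    return img.reshape(-1)

# Build a small labelled training set and fit one centroid per class.
labels = ["horizontal", "vertical"]
train = {k: np.array([preprocess(make_image(k)) for _ in range(30)])
         for k in labels}
centroids = {k: v.mean(axis=0) for k, v in train.items()}

def classify(img):
    """Machine-learning step: assign the nearest class centroid."""
    x = preprocess(img)
    return min(centroids, key=lambda k: np.linalg.norm(x - centroids[k]))

# Evaluate on fresh synthetic images.
correct = sum(classify(make_image(k)) == k for k in labels for _ in range(20))
accuracy = correct / 40
```

Real systems replace each stage with something far more powerful (learned convolutional features instead of normalize-and-flatten, deep networks instead of centroids), but the division of labor between preprocessing and pattern recognition is the same.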