A Review on Self-Supervised Learning
(Volume 9, Issue 1, January 2023) OPEN ACCESS
Author(s):
Athul Raj, Srinjoy Dutta
Keywords:
Self-supervised learning, Artificial Intelligence
Abstract:
In recent years, the field of AI has made great strides in building systems that learn from massive amounts of carefully labeled data. This supervised learning paradigm has proven successful in training specialized models that perform exceptionally well on the task for which they were trained. Unfortunately, there is a limit to how far AI can go with supervised learning alone. Supervised learning is a bottleneck for building smarter, general-purpose models that can multitask and acquire new skills without large amounts of labeled data. Practically speaking, it is impossible to label everything in the world. Some tasks also lack sufficient labeled data, such as training a translation system for low-resource languages, while others involve data that only experts can label, such as medical data. If AI systems can gather deeper and more nuanced insights into reality beyond what is specified in the training dataset, they will be more useful and will ultimately bring AI closer to human-level intelligence. Self-supervised learning (SSL), also known as self-supervision, is an emerging solution to the challenge posed by data labeling: by deriving supervisory signals from the data itself, SSL reduces the cost and time needed to build machine learning models. In this paper, we first examine SSL and how it addresses the challenge of data labeling. We then survey approaches to SSL developed over the past years, and we conclude with a look at what the future holds for SSL in the domain of Artificial Intelligence.
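The abstract's central idea, that SSL obtains its training signal from the data itself rather than from human annotation, can be illustrated with a minimal sketch. This example is our own illustration, not taken from the paper: it sets up a "pretext task" that predicts each value of an unlabeled series from the values immediately before it, so the (input, target) pairs are generated automatically.

```python
# Minimal sketch of the SSL idea (illustrative only, not the paper's method):
# the labels for the pretext task are manufactured from the raw data itself.
import numpy as np

rng = np.random.default_rng(0)
# An unlabeled signal: a noisy sine wave stands in for raw, label-free data.
series = np.sin(np.linspace(0, 20, 500)) + 0.05 * rng.standard_normal(500)

window = 5
# Pretext task: predict each value from the `window` values before it.
# No human labeling step -- inputs and targets both come from `series`.
X = np.stack([series[i:i + window] for i in range(len(series) - window)])
y = series[window:]

# Fit a simple linear predictor for the pretext task via least squares.
w, *_ = np.linalg.lstsq(X, y, rcond=None)

# Low error on the pretext task shows useful structure was learned
# without any manually labeled examples.
pred = X @ w
mse = float(np.mean((pred - y) ** 2))
print(f"pretext-task MSE: {mse:.4f}")
```

In a full SSL pipeline the same principle scales up: a model pretrained on such an automatically generated objective is then fine-tuned on a small labeled set for the downstream task.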