Prototype-based calibration distribution for few-shot learning
Affiliation: School of Information Engineering, Jiangxi University of Science and Technology, Ganzhou 341000, China

CLC Number: TP391.4

Abstract:

Aiming at the problem that the small number of samples in few-shot learning cannot adequately represent the characteristics of a data category, a few-shot learning method that combines prototype calibration with data-distribution calibration is proposed. First, an embedding network preprocesses the images to extract novel-class features, and the extracted features are processed with a power transformation. Then, the prototype of each novel class is represented as a similarity-weighted combination of base-class prototypes, making full use of the knowledge learned from the base classes to reduce the deviation between the computed prototype and the true prototype. Finally, we sample from a uniform distribution bounded by the features of a novel-class instance and its prototype representation, generating a large amount of feature data to expand the novel-class support set. We also propose adjusting the boundary of this uniform distribution according to the number of available samples, so that the generated samples concentrate in high-confidence regions. Our method achieves 68.94% and 84.75% accuracy in the 5-way 1-shot and 5-way 5-shot settings on the miniImageNet dataset, respectively, and 81.75% and 91.88% on the CUB dataset, outperforming the best results of existing methods. The experimental results show that our method effectively improves prediction accuracy in few-shot classification.
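The pipeline described in the abstract (power transform, similarity-weighted prototype calibration, and uniform sampling between an instance and its calibrated prototype) can be illustrated with the following NumPy sketch. This is not the authors' implementation: the exponent `beta`, the `top_k` base classes used for weighting, the equal-weight blend with the instance feature, and the boundary-shrinking rule are all illustrative assumptions.

```python
import numpy as np

def power_transform(features, beta=0.5, eps=1e-6):
    # Element-wise power transform; assumes non-negative (post-ReLU)
    # backbone features, with `beta` as an assumed exponent.
    return np.power(features + eps, beta)

def calibrate_prototype(feat, base_prototypes, top_k=2):
    # Estimate a novel-class prototype as a cosine-similarity-weighted
    # combination of the top_k closest base-class prototypes, then
    # blend it with the instance feature (equal weighting is an
    # assumption, not the paper's stated rule).
    sims = base_prototypes @ feat / (
        np.linalg.norm(base_prototypes, axis=1) * np.linalg.norm(feat) + 1e-12)
    idx = np.argsort(sims)[-top_k:]
    w = sims[idx] / sims[idx].sum()
    weighted = (w[:, None] * base_prototypes[idx]).sum(axis=0)
    return 0.5 * (weighted + feat)

def sample_support(feat, prototype, n_samples, n_shot=1):
    # Sample uniformly from the box spanned by the instance feature
    # and its calibrated prototype. The boundary is shrunk toward the
    # box center as the shot count grows (an assumed stand-in for the
    # paper's boundary-adjustment scheme), so generated features
    # concentrate in high-confidence regions.
    lo, hi = np.minimum(feat, prototype), np.maximum(feat, prototype)
    center, half = (lo + hi) / 2.0, (hi - lo) / 2.0
    shrink = 1.0 / n_shot
    return np.random.uniform(center - half * shrink,
                             center + half * shrink,
                             size=(n_samples, feat.shape[0]))

# Hypothetical 5-way 1-shot usage: 640-d backbone features, 64 base classes.
feat = power_transform(np.abs(np.random.randn(640)))
base = power_transform(np.abs(np.random.randn(64, 640)))
proto = calibrate_prototype(feat, base)
augmented = sample_support(feat, proto, n_samples=500, n_shot=1)  # (500, 640)
```

The augmented features would then train a simple classifier (e.g., logistic regression) over the expanded support set, which is the usual way distribution-calibration methods of this kind turn generated features into predictions.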

History
  • Online: May 30, 2024