
Unsupervised Learning using Pretrained CNN and Associative Memory Bank (1805.01033v1)

Published 2 May 2018 in cs.CV, cs.LG, and stat.ML

Abstract: Deep convolutional features extracted from a comprehensive labeled dataset contain substantial representations that can be used effectively in a new domain. Although generic features achieve good results on many visual tasks, pretrained deep CNN models require fine-tuning to be more effective and to provide state-of-the-art performance. Fine-tuning with the backpropagation algorithm in a supervised setting is a time- and resource-consuming process. In this paper, we present a new architecture and approach for unsupervised object recognition that addresses the fine-tuning problem associated with pretrained CNN-based supervised deep learning approaches while allowing automated feature extraction. Unlike existing works, our approach is applicable to general object recognition tasks. It uses a CNN model, pretrained on a related domain, for automated feature extraction, pipelined with a Hopfield-network-based associative memory bank that stores patterns for classification. The use of an associative memory bank in our framework eliminates backpropagation while providing competitive performance on an unseen dataset.
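
The sketch below illustrates the pipeline the abstract describes: a frozen pretrained CNN as the feature extractor, followed by a classical Hopfield network used as an associative memory that stores patterns for classification, with no backpropagation anywhere. The choice of ResNet-18, the median-threshold binarization of features, and the nearest-stored-pattern labeling rule are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch: pretrained CNN features + Hopfield associative memory.
# Assumptions (not from the paper): ResNet-18 backbone, median-threshold
# binarization to bipolar codes, nearest-pattern labeling after recall.
import numpy as np
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

# Fixed, pretrained feature extractor: no fine-tuning, no backpropagation.
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()  # drop the supervised classifier head
backbone.eval()

preprocess = T.Compose([
    T.Resize(256), T.CenterCrop(224), T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

@torch.no_grad()
def extract(img: Image.Image) -> np.ndarray:
    """Return a bipolar (+1/-1) code for one image (median-thresholded)."""
    feats = backbone(preprocess(img).unsqueeze(0)).squeeze(0).numpy()
    return np.where(feats > np.median(feats), 1.0, -1.0)

class HopfieldMemory:
    """Classical Hopfield network with one-shot Hebbian pattern storage."""
    def __init__(self, dim: int):
        self.W = np.zeros((dim, dim))
        self.patterns, self.labels = [], []

    def store(self, pattern: np.ndarray, label) -> None:
        self.W += np.outer(pattern, pattern)  # Hebbian update
        np.fill_diagonal(self.W, 0.0)         # no self-connections
        self.patterns.append(pattern)
        self.labels.append(label)

    def recall(self, probe: np.ndarray, steps: int = 20):
        """Run the network to a fixed point, then label by nearest pattern."""
        s = probe.copy()
        for _ in range(steps):  # synchronous updates
            s_new = np.sign(self.W @ s)
            s_new[s_new == 0] = 1.0
            if np.array_equal(s_new, s):
                break
            s = s_new
        overlaps = [float(s @ p) for p in self.patterns]
        return self.labels[int(np.argmax(overlaps))]
```

In use, each labeled exemplar is binarized with `extract` and written into the memory with `store`; a test image's code is passed to `recall`, where the network dynamics denoise it toward a stored attractor and the label of the most similar stored pattern is returned. Note the classical Hopfield capacity limit (roughly 0.14 x dimension patterns), which bounds how many exemplars such a memory can hold reliably.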

Citations (35)
