Crop mapping in the small sample/no sample case: an approach using a two-level cascade classifier and integrating domain knowledge (2302.10270v1)

Published 26 Dec 2022 in cs.CV and cs.LG

Abstract: Mapping crops using remote sensing technology is important for food security and land management. Machine learning-based methods have become a popular approach to crop mapping in recent years. However, acquiring the ample and accurate samples that machine learning requires is usually time-consuming and laborious. To solve this problem, we propose a crop mapping method for the small-sample/no-sample case that integrates domain knowledge and uses a cascaded classification framework combining a weak classifier learned from samples with strong features and a strong classifier trained on samples with weak features. First, based on domain knowledge about the various crops, a low-capacity classifier such as a decision tree is applied to acquire pixels with distinctive features and complete observation sequences as "strong feature" samples. Then, to improve the representativeness of these samples, a sample augmentation strategy is applied that artificially removes observations from the "strong feature" samples according to the average valid-observation proportion in the target area. Finally, a large-capacity classifier such as a random forest is trained on the original and augmented samples for crop mapping. The method achieved an overall accuracy of 82% in the MAP crop recognition competition held by Syngenta Group, China in 2021 (third prize, ranked fourth). By integrating domain knowledge to overcome the difficulty of sample acquisition, the method provides a convenient, fast and accurate solution for crop mapping.
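
The three-step cascade described in the abstract can be sketched as follows. This is a minimal illustration under assumed details, not the authors' implementation: the per-pixel time-series features, the domain-knowledge rule used to produce pseudo-labels, the confidence threshold, the masking rate, and the use of scikit-learn's DecisionTreeClassifier and RandomForestClassifier as the "low-capacity" and "large-capacity" classifiers are all hypothetical stand-ins.

```python
# Sketch of the two-level cascade described in the abstract.
# Assumptions (not from the paper): features are per-pixel time series,
# missing observations are encoded as 0, a stand-in rule supplies
# domain-knowledge pseudo-labels, and all thresholds are illustrative.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

def select_strong_feature_samples(X, pseudo_labels, confidence=0.9):
    """Step 1: a low-capacity classifier (decision tree) keeps only pixels
    with distinctive features and complete observation sequences."""
    tree = DecisionTreeClassifier(max_depth=3, random_state=0)
    tree.fit(X, pseudo_labels)
    proba = tree.predict_proba(X).max(axis=1)
    fully_observed = (X != 0).all(axis=1)        # no missing observations
    keep = (proba >= confidence) & fully_observed
    return X[keep], pseudo_labels[keep]

def augment_by_masking(X, y, valid_obs_proportion, n_copies=3):
    """Step 2: imitate incomplete observation sequences by randomly dropping
    time steps so roughly `valid_obs_proportion` of them remain."""
    X_aug, y_aug = [X], [y]
    for _ in range(n_copies):
        mask = rng.random(X.shape) < valid_obs_proportion
        X_aug.append(np.where(mask, X, 0.0))
        y_aug.append(y)
    return np.vstack(X_aug), np.concatenate(y_aug)

# Toy data: 500 pixels x 12 acquisition dates; labels from a stand-in rule.
X = rng.random((500, 12))
pseudo_labels = (X[:, 5] > 0.5).astype(int)      # hypothetical domain rule

X_strong, y_strong = select_strong_feature_samples(X, pseudo_labels)
X_train, y_train = augment_by_masking(X_strong, y_strong, valid_obs_proportion=0.6)

# Step 3: train the large-capacity classifier on original + augmented samples.
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
crop_map = rf.predict(X)                         # apply to every pixel in the scene
```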
