Gradient-based Quadratic Multiform Separation (2110.13006v2)

Published 25 Oct 2021 in stat.ML and cs.LG

Abstract: Classification, as a supervised learning concept, is an important topic in machine learning. It aims at categorizing a set of data into classes. Several classification methods are in common use today, such as k-nearest neighbors, random forest, and support vector machine. Each has its own pros and cons, and none of them dominates across all kinds of problems. In this thesis, we focus on Quadratic Multiform Separation (QMS), a classification method recently proposed by Michael Fan et al. (2019). Its fresh concept, rich mathematical structure, and innovative definition of the loss function set it apart from existing classification methods. Inspired by QMS, we propose utilizing a gradient-based optimization method, Adam, to obtain a classifier that minimizes the QMS-specific loss function. In addition, we provide suggestions for model tuning through explorations of the relationships between hyperparameters and accuracy. Our empirical results show that QMS performs as well as most classification methods in terms of accuracy, and its performance is nearly comparable to that of the gradient boosting algorithms that win many machine learning competitions.
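
The core idea described in the abstract, training per-class quadratic "member" functions with Adam and classifying each sample by the smallest value, can be sketched as follows. This is a minimal illustration only: the parameterization, the QuadraticForms and train names, and the cross-entropy surrogate loss are assumptions, since the QMS-specific loss of Fan et al. (2019) is not reproduced in the abstract.

# Hedged sketch: one quadratic form per class, trained with Adam.
# The exact QMS loss is not given in the abstract, so a hypothetical
# cross-entropy surrogate over negated quadratic scores is used here
# purely for illustration.
import torch


class QuadraticForms(torch.nn.Module):
    # Assumed parameterization: q_k(x) = x^T A_k x + b_k^T x + c_k for each class k.
    def __init__(self, n_features: int, n_classes: int):
        super().__init__()
        self.A = torch.nn.Parameter(torch.randn(n_classes, n_features, n_features) * 0.01)
        self.b = torch.nn.Parameter(torch.zeros(n_classes, n_features))
        self.c = torch.nn.Parameter(torch.zeros(n_classes))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Returns a (batch, n_classes) matrix of quadratic values q_k(x).
        quad = torch.einsum("bi,kij,bj->bk", x, self.A, x)
        lin = x @ self.b.T
        return quad + lin + self.c


def train(X: torch.Tensor, y: torch.Tensor, n_classes: int, epochs: int = 200, lr: float = 1e-2):
    model = QuadraticForms(X.shape[1], n_classes)
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        q = model(X)
        # Placeholder surrogate: a smaller q_k for the true class is better,
        # so negated values are fed to cross-entropy. Not the QMS loss itself.
        loss = torch.nn.functional.cross_entropy(-q, y)
        loss.backward()
        opt.step()
    return model


# Usage (hypothetical tensors): predictions are the argmin of the per-class values.
# X_train: (n, d) float tensor, y_train: (n,) long tensor of class labels.
# model = train(X_train, y_train, n_classes=3)
# preds = model(X_test).argmin(dim=1)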
