bibtype |
J - Journal Article |
ARLID |
0551866 |
utime |
20230418204528.9 |
mtime |
20220117235959.9 |
SCOPUS |
85121331364 |
WOS |
000728657100001 |
DOI |
10.1080/10556788.2021.1965601 |
title (primary) (eng) |
General framework for binary classification on top samples |
specification |
page_count |
32 s. |
media_type |
P |
|
serial |
ARLID |
cav_un_epca*0254588 |
ISSN |
1055-6788 |
title |
Optimization Methods & Software |
volume_id |
37 |
volume |
5 (2022) |
page_num |
1636-1667 |
publisher |
Taylor & Francis |
keyword |
general framework |
keyword |
classification |
keyword |
ranking |
keyword |
accuracy at the top |
keyword |
Neyman–Pearson |
keyword |
Pat&Mat |
author (primary) |
ARLID |
cav_un_auth*0313213 |
name1 |
Adam |
name2 |
L. |
country |
CZ |
|
author |
ARLID |
cav_un_auth*0422257 |
name1 |
Mácha |
name2 |
V. |
country |
CZ |
|
author |
ARLID |
cav_un_auth*0101207 |
name1 |
Šmídl |
name2 |
Václav |
institution |
UTIA-B |
full_dept (cz) |
Adaptivní systémy |
full_dept |
Department of Adaptive Systems |
department (cz) |
AS |
department |
AS |
fullinstit |
Ústav teorie informace a automatizace AV ČR, v. v. i. |
|
author |
ARLID |
cav_un_auth*0307300 |
name1 |
Pevný |
name2 |
T. |
country |
CZ |
|
source |
|
cas_special |
project |
project_id |
GA18-21409S |
agency |
GA ČR |
ARLID |
cav_un_auth*0374053 |
|
abstract (eng) |
Many binary classification problems minimize misclassification above (or below) a threshold. We show that instances of ranking problems, accuracy at the top, and hypothesis testing may be written in this form. We propose a general framework that handles these classes of problems and show which formulations, both known and newly proposed, fall into it. We provide a theoretical analysis of this framework and discuss selected pitfalls the formulations may encounter. We show the convergence of stochastic gradient descent for selected formulations even though the gradient estimate is inherently biased. We suggest several numerical improvements, including the implicit derivative and stochastic gradient descent. We conclude with an extensive numerical study. |
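A minimal NumPy sketch of the "accuracy at the top" idea the abstract describes: the decision threshold is taken as the top-tau quantile of all classifier scores, and the loss penalizes positive samples that fall below it. The function name, the quantile-based threshold, and the hinge surrogate are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def top_sample_loss(scores, labels, tau=0.1):
    # Threshold t is the top-tau quantile of all scores (illustrative choice).
    t = np.quantile(scores, 1.0 - tau)
    # Positive samples below t are false negatives among the top samples.
    pos = scores[labels == 1]
    # Hinge surrogate of the 0/1 indicator [score <= t], averaged over positives.
    return np.mean(np.maximum(0.0, 1.0 + t - pos))
```

Because the threshold t itself depends on the scores, a stochastic gradient of this loss is biased when t is estimated from a minibatch, which is the difficulty the abstract's convergence analysis addresses.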
result_subspec |
WOS |
RIV |
BC |
FORD0 |
10000 |
FORD1 |
10100 |
FORD2 |
10102 |
reportyear |
2023 |
num_of_auth |
4 |
inst_support |
RVO:67985556 |
permalink |
https://hdl.handle.net/11104/0337818 |
confidential |
S |
mrcbC91 |
A |
mrcbT16-e |
COMPUTER SCIENCE, SOFTWARE ENGINEERING|MATHEMATICS, APPLIED|OPERATIONS RESEARCH & MANAGEMENT SCIENCE |
mrcbT16-j |
1.039 |
mrcbT16-s |
1.079 |
mrcbT16-D |
Q1 |
mrcbT16-E |
Q2 |
arlyear |
2022 |
mrcbU14 |
85121331364 SCOPUS |
mrcbU24 |
PUBMED |
mrcbU34 |
000728657100001 WOS |
mrcbU63 |
cav_un_epca*0254588 Optimization Methods & Software 1055-6788 1029-4937 Vol. 37 No. 5 2022 1636 1667 Taylor & Francis |
|