Permalink
https://hdl.handle.net/2142/130254
Description
Title
Soft Training: Method for Neural Network Learning
Author(s)
Al-Subaie, Roda A.
Al-Qahtani, Shaikha S.
Boutros, Milan J.
Boutros, Joseph J.
Issue Date
2025-09-17
Keyword(s)
Artificial neural networks
Regression
Classification
Soft-decision decoding
Differential entropy
Abstract
Inspired by soft-decision decoding in coding theory and by the assumptions of the universal approximation theorems on the continuity of a function to be approximated over a compact region, we introduce a novel training paradigm, termed soft training, that leverages the continuous nature of neural networks by training them to approximate a continuous function (regression) before deploying them for classification tasks. By using soft labels, in contrast to the traditional “hard labels”, we incorporate properties such as differentiability and differential entropy to construct a new labeling framework. Training and inference over different datasets demonstrate that soft training improves test accuracy while significantly reducing classification errors in regions far from the dataset points.
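The paradigm described in the abstract can be illustrated with a minimal sketch: hard class indices are replaced by smooth probability vectors, a model is fit to those vectors as a regression target, and classification is performed by taking the argmax of the regressed output. The soft-label construction below (a softmax over scaled one-hot scores) and the linear least-squares "network" are hypothetical stand-ins chosen for brevity; the paper's actual labeling framework and architectures are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def soft_labels(y, num_classes, sharpness=2.0):
    """Map hard class indices to smooth probability vectors via a
    softmax over scaled one-hot scores (a stand-in for the paper's
    differentiable labeling framework)."""
    scores = sharpness * np.eye(num_classes)[y]          # (n, k)
    e = np.exp(scores - scores.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

# Toy 2-class data: two Gaussian blobs in the plane.
n = 200
X = np.vstack([rng.normal(-2, 1, (n, 2)), rng.normal(2, 1, (n, 2))])
y = np.array([0] * n + [1] * n)
T = soft_labels(y, num_classes=2)

# Regression phase: fit a map to the continuous soft labels by least
# squares (a deliberately minimal substitute for MSE-trained networks).
Xb = np.hstack([X, np.ones((2 * n, 1))])                 # bias column
W, *_ = np.linalg.lstsq(Xb, T, rcond=None)

# Classification phase: argmax of the regressed soft-label estimate.
pred = (Xb @ W).argmax(axis=1)
print("train accuracy:", (pred == y).mean())
```

Each soft-label row is a valid probability vector, so the regression target stays in a compact region where the universal-approximation assumptions mentioned in the abstract apply.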
Publisher
Allerton Conference on Communication, Control, and Computing
Series/Report Name or Number
2025 61st Allerton Conference on Communication, Control, and Computing Proceedings