
Neural Network Learning for Time-Series Predictions Using Constrained Formulations


Files in this item

Neural Network ... nstrained Formulations.pdf (PDF, 4MB, no description provided)
Title: Neural Network Learning for Time-Series Predictions Using Constrained Formulations
Author(s): Qian, Minglun
Subject(s): Machine Learning
Abstract: In this thesis, we propose a recurrent FIR neural network, develop a constrained formulation for neural network learning, study an efficient violation-guided backpropagation algorithm for solving the constrained formulation based on the theory of extended saddle points, and apply neural network learning to the prediction of both noise-free and noisy time series. The recurrent FIR neural network architecture combines a recurrent structure and a memory-based FIR structure in order to provide more powerful modeling ability. The constrained formulation for neural network learning incorporates the error of each learning pattern as a constraint, a new cross-validation scheme that allows multiple validation sets to be considered in learning, and new constraints that can be expressed in procedural form. The violation-guided backpropagation algorithm first transforms the constrained formulation into an l1-penalty function and then searches for a saddle point of that penalty function. When the constrained formulation and violation-guided backpropagation are applied to neural network learning on near-noiseless time-series benchmarks, we achieve much better prediction performance than previous work while using fewer parameters. For noisy time series, such as financial time series, we have systematically studied the trade-offs between denoising and information preservation, and have proposed three preprocessing techniques for time series with high-frequency noise. In particular, we have proposed a novel approach that first decomposes a noisy time series into different frequency channels and then preprocesses each channel adaptively according to its level of noise. When a low-pass filter is employed to denoise a band, we incorporate constraints on predicting the low-pass data in the filter's lag period. These new constraints enable active training in the lag period, which greatly improves prediction accuracy there. Extensive prediction experiments on financial time series have been conducted to exploit the modeling ability of neural networks, and promising results have been obtained.
Issue Date: 2005-04
Genre: Technical Report
Type: Text
URI: http://hdl.handle.net/2142/10982
Other Identifier(s): UIUCDCS-R-2005-2437
Rights Information: You are granted permission for the non-commercial reproduction, distribution, display, and performance of this technical report in any format, BUT this permission is only for a period of 45 (forty-five) days from the most recent time that you verified that this technical report is still available from the University of Illinois at Urbana-Champaign Computer Science Department under terms that include this permission. All other rights are reserved by the author(s).
Date Available in IDEALS: 2009-04-17
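
The abstract describes casting neural network learning as a constrained problem, with the error of each learning pattern as a constraint, transforming that problem into an l1-penalty function, and searching for a saddle point of the penalty function. The sketch below illustrates only that general mechanism, not the thesis's violation-guided backpropagation algorithm or its recurrent FIR architecture: it trains a plain one-hidden-layer network on a toy sine series, uses numerical gradients instead of backpropagation, and picks the tolerance and step sizes arbitrarily. All helper names (predict, pattern_errors, penalty, num_grad) are invented for the illustration.

import numpy as np

rng = np.random.default_rng(0)

# Toy data: one-step-ahead prediction of a noise-free sine wave.
series = np.sin(np.linspace(0, 6 * np.pi, 120))
WINDOW, HIDDEN = 5, 4
X = np.array([series[t:t + WINDOW] for t in range(len(series) - WINDOW)])
y = series[WINDOW:]

N_PARAMS = WINDOW * HIDDEN + HIDDEN + HIDDEN + 1  # W1, b1, W2, b2, flattened into one vector

def predict(w, X):
    """One-hidden-layer feedforward predictor (a stand-in for the recurrent FIR network)."""
    W1 = w[:WINDOW * HIDDEN].reshape(WINDOW, HIDDEN)
    b1 = w[WINDOW * HIDDEN:WINDOW * HIDDEN + HIDDEN]
    W2 = w[WINDOW * HIDDEN + HIDDEN:WINDOW * HIDDEN + 2 * HIDDEN]
    b2 = w[-1]
    return np.tanh(X @ W1 + b1) @ W2 + b2

def pattern_errors(w):
    """e_i(w): squared error of each training pattern; each is treated as a constraint e_i(w) <= EPS."""
    return (predict(w, X) - y) ** 2

EPS = 1e-3  # per-pattern error tolerance, chosen arbitrarily for this sketch

def penalty(w, lam):
    """l1-penalty function: training objective plus multiplier-weighted constraint violations.
    The violation term is averaged over patterns so the step sizes below stay scale-free."""
    e = pattern_errors(w)
    return e.mean() + np.mean(lam * np.maximum(0.0, e - EPS))

def num_grad(f, w, h=1e-5):
    """Central-difference gradient; adequate for a 29-parameter toy model."""
    g = np.zeros_like(w)
    for i in range(len(w)):
        d = np.zeros_like(w)
        d[i] = h
        g[i] = (f(w + d) - f(w - d)) / (2 * h)
    return g

# Saddle-point search: descend in the weights, ascend in the penalty multipliers
# of whichever constraints are currently violated.
w = 0.1 * rng.standard_normal(N_PARAMS)
lam = np.zeros(len(X))
for step in range(501):
    w -= 0.05 * num_grad(lambda v: penalty(v, lam), w)
    lam += 0.05 * np.maximum(0.0, pattern_errors(w) - EPS)
    if step % 100 == 0:
        e = pattern_errors(w)
        print(f"step {step:3d}  mean error {e.mean():.5f}  violated constraints {int((e > EPS).sum())}")

Ascending on the multipliers of violated constraints gives hard-to-fit patterns increasing weight as training proceeds, which matches the abstract's description of guiding the search by constraint violations.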
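
The abstract also proposes decomposing a noisy time series into frequency channels and preprocessing each channel adaptively according to its level of noise, with special handling of the lag period introduced by low-pass filtering. The sketch below illustrates only the decomposition-and-adaptive-denoising idea, using simple moving-average filters, an ad hoc noise estimate, and a synthetic signal; it does not reproduce the thesis's three preprocessing techniques or its lag-period constraints. The function names (moving_average, decompose, adaptive_denoise) are invented for the example.

import numpy as np

def moving_average(x, width):
    """Causal moving average; its output lags the input by roughly width / 2 samples."""
    kernel = np.ones(width) / width
    return np.convolve(x, kernel, mode="full")[:len(x)]

def decompose(x, widths=(16, 4)):
    """Split x into channels, low to high frequency, with nested moving-average low-pass filters.
    The unprocessed channels sum back to x exactly."""
    channels, residual = [], x
    for w in widths:
        low = moving_average(residual, w)
        channels.append(low)          # smoother, lower-frequency content
        residual = residual - low     # whatever this filter did not capture
    channels.append(residual)         # the highest-frequency, noisiest channel
    return channels

def adaptive_denoise(channels):
    """Smooth each channel in proportion to a crude noise estimate (std of its first differences)."""
    cleaned, widths = [], []
    for ch in channels:
        roughness = np.std(np.diff(ch)) / (np.std(ch) + 1e-12)
        width = max(1, int(round(8 * roughness)))   # ad hoc rule: the rougher the channel, the wider the smoother
        widths.append(width)
        cleaned.append(moving_average(ch, width) if width > 1 else ch)
    return cleaned, widths

rng = np.random.default_rng(1)
t = np.linspace(0, 4 * np.pi, 400)
clean = np.sin(t)                                   # underlying signal
noisy = clean + 0.2 * rng.standard_normal(len(t))   # white noise stands in for high-frequency noise

channels = decompose(noisy)
cleaned, widths = adaptive_denoise(channels)
denoised = sum(cleaned)
print("smoothing width chosen per channel (low to high frequency):", widths)
print("mse vs underlying signal, noisy series:   ", round(float(np.mean((noisy - clean) ** 2)), 4))
print("mse vs underlying signal, denoised series:", round(float(np.mean((denoised - clean) ** 2)), 4))

Because a causal moving average delays the low-pass channel by roughly half its window, any scheme built on such filters must decide how to treat the most recent, lagged samples; that is the issue the abstract's lag-period constraints address.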
 
