Enhanced Nadaraya-Watson kernel surface approximation for extremely small samples


Bibliographic Details
Main Authors: Shapiai @ Abd. R., Mohd. Ibrahim, Ibrahim, Zuwairie, Khalid, Marzuki, Lee, Jau Wen, Pavlovich, Vladimir
Format: Conference or Workshop Item
Published: 2011
Online Access: http://eprints.utm.my/id/eprint/45824/
http://dx.doi.org/10.1109/AMS.2011.13
Description
Summary: The function approximation problem is to find an appropriate relationship between dependent and independent variables. Function approximation algorithms generally require sufficient samples to approximate a function, and insufficient samples may cause any approximation algorithm to produce unsatisfactory predictions. To address this problem, a function approximation algorithm called Weighted Kernel Regression (WKR), based on Nadaraya-Watson kernel regression, is proposed. In the proposed framework, the original Nadaraya-Watson kernel regression algorithm is enhanced by expressing the observed samples as a square kernel matrix. The WKR is trained to estimate the weights used in the testing phase; the weights are estimated iteratively, governed by an error function, to find a good approximation model. Two experiments are conducted to show the capability of the WKR. The results show that the proposed WKR model is effective in cases where the target surface function is non-linear and the available training samples are small. The performance of the WKR is also compared with existing function approximation algorithms such as artificial neural networks (ANN).
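
As a concrete illustration of the approach described in the summary, the Python/NumPy sketch below contrasts classical Nadaraya-Watson kernel regression with a WKR-style variant in which the training samples form a square kernel matrix and the weights are fitted iteratively. This is a minimal sketch, not the authors' exact method: the Gaussian kernel, bandwidth h, learning rate, squared-error objective, and gradient update are illustrative assumptions.

# Sketch contrasting Nadaraya-Watson kernel regression with a WKR-style
# estimator (square kernel matrix over the training samples, weights fitted
# iteratively). Kernel choice, bandwidth, and update rule are assumptions.
import numpy as np

def gaussian_kernel(x, xi, h):
    # Gaussian kernel between a query point x and a training sample xi.
    return np.exp(-np.sum((x - xi) ** 2) / (2.0 * h ** 2))

def nadaraya_watson(x, X_train, y_train, h=0.5):
    # Classical Nadaraya-Watson estimate: kernel-weighted average of y.
    k = np.array([gaussian_kernel(x, xi, h) for xi in X_train])
    return np.dot(k, y_train) / (np.sum(k) + 1e-12)

def fit_wkr_weights(X_train, y_train, h=0.5, lr=0.1, n_iter=2000):
    # Iteratively estimate weights w so that K @ w approximates y,
    # where K is the square kernel matrix over the training samples.
    n = len(X_train)
    K = np.array([[gaussian_kernel(xi, xj, h) for xj in X_train]
                  for xi in X_train])
    w = np.zeros(n)
    for _ in range(n_iter):
        error = K @ w - y_train           # residual of the current model
        w -= lr * (K.T @ error) / n       # gradient step on the squared error
    return w

def predict_wkr(x, X_train, w, h=0.5):
    # Prediction: kernel values at x combined with the learned weights.
    k = np.array([gaussian_kernel(x, xi, h) for xi in X_train])
    return np.dot(k, w) / (np.sum(k) + 1e-12)

if __name__ == "__main__":
    # Tiny training set to mimic the "extremely small samples" setting.
    X = np.linspace(0, 1, 6).reshape(-1, 1)
    y = np.sin(2 * np.pi * X).ravel()
    w = fit_wkr_weights(X, y)
    x_new = np.array([0.35])
    print("Nadaraya-Watson:", nadaraya_watson(x_new, X, y))
    print("WKR-style      :", predict_wkr(x_new, X, w))

With only a handful of samples, the iteratively fitted weights let the model track the training targets more closely than the plain kernel average, which is the small-sample setting the record describes.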