
A MODIFIED ERROR BACKPROPAGATION ALGORITHM FOR COMPLEX-VALUE NEURAL NETWORKS

    The complex-valued backpropagation algorithm has been widely used in fields such as telecommunications, speech recognition, and image processing with the Fourier transform. However, learning frequently becomes trapped in local minima. To address this problem and to speed up learning, we propose a modified error function obtained by adding to the conventional error function a term corresponding to the hidden-layer error. Simulation results show that the proposed algorithm prevents learning from becoming stuck in local minima and accelerates convergence.
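    To make the idea concrete, the following is a minimal sketch (not taken from the paper) of a split-complex single-hidden-layer network trained by gradient descent on a modified error of the form E = E_out + lambda * E_hidden. The split tanh activation, the quadratic hidden-layer term, the penalty weight lambda, the toy data, and the use of finite-difference gradients in place of analytic complex-valued backpropagation are all illustrative assumptions.

# Sketch: complex-valued network with a modified error function
# E = E_out + lambda * E_hidden (hidden-layer term is an assumed, illustrative form).
import numpy as np

rng = np.random.default_rng(0)

def split_tanh(z):
    # Split-type activation: tanh applied separately to real and imaginary parts.
    return np.tanh(z.real) + 1j * np.tanh(z.imag)

def forward(W1, W2, x):
    h = split_tanh(W1 @ x)   # hidden activations (complex)
    y = split_tanh(W2 @ h)   # output activations (complex)
    return h, y

def modified_error(W1, W2, x, d, lam=0.1):
    h, y = forward(W1, W2, x)
    e_out = 0.5 * np.sum(np.abs(d - y) ** 2)      # conventional squared error
    e_hidden = 0.5 * np.sum(np.abs(h) ** 2)       # assumed hidden-layer term
    return e_out + lam * e_hidden

def numerical_gradient(f, W, eps=1e-6):
    # Finite-difference gradient over the real and imaginary parts of W.
    g = np.zeros_like(W)
    for idx in np.ndindex(W.shape):
        for delta in (eps, 1j * eps):
            W[idx] += delta
            f_plus = f()
            W[idx] -= 2 * delta
            f_minus = f()
            W[idx] += delta
            g[idx] += (f_plus - f_minus) / (2 * eps) * (1 if delta.real else 1j)
    return g

# Tiny two-pattern task with complex inputs/targets (illustrative data only).
X = [np.array([1 + 1j, -1 - 1j]), np.array([-1 + 1j, 1 - 1j])]
D = [np.array([1 + 0j]), np.array([-1 + 0j])]

W1 = 0.5 * (rng.standard_normal((3, 2)) + 1j * rng.standard_normal((3, 2)))
W2 = 0.5 * (rng.standard_normal((1, 3)) + 1j * rng.standard_normal((1, 3)))

eta = 0.1
for epoch in range(200):
    for x, d in zip(X, D):
        gW1 = numerical_gradient(lambda: modified_error(W1, W2, x, d), W1)
        gW2 = numerical_gradient(lambda: modified_error(W1, W2, x, d), W2)
        W1 -= eta * gW1
        W2 -= eta * gW2

print("final modified error:",
      sum(modified_error(W1, W2, x, d) for x, d in zip(X, D)))

    In the paper's algorithm the weight updates are derived analytically from the modified error with complex-valued backpropagation; the finite-difference step above is used only to keep the sketch short and easy to verify.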
