Temporal Data Processing Using Genetic Programming   [GP]

by

Iba, H., de Garis, H. and Sato, T.


Info: Genetic Algorithms: Proceedings of the Sixth International Conference (ICGA95) (conference proceedings), 1995, pp. 279-286
Keywords: Genetic Programming, Genetic Algorithms
Abstract:
This paper reports an extension of STROGANOFF called R-STROGANOFF, which uses special memory terminal nodes to provide a form of recurrency for processing time-ordered events.

All functions are polynomials (quadratics in the examples); terminals are either inputs or memories. Each memory terminal holds the value of a function node on the previous time step.

The coefficients of the polynomials are learnt by trying to match the training data using a "Generalised Error Propagation Algorithm". This is deterministic. It seems like STROGANOFF's own procedure (but different?), time-sequence based and built on back-propagation. The coefficients are recalculated each generation (assuming the tree has changed).

The fitness function uses "minimum description length" (MDL).

Quadratic coefficients may be limited to 0<=x<=1 to avoid divergence.

Examples: a 2-step 0-1 oscillator and four Tomita languages (over a binary alphabet).

The tree could be converted to a finite state automaton, which was more general than the tree, i.e. it works in all cases, including those not in the training set.

On the Tomita language problems, "R-STROGANOFF works almost as well as (the best) recurrent networks".
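To make the memory-terminal mechanism above concrete, the following Python sketch evaluates a small recurrent polynomial tree of this kind over a time sequence. It is only an illustration of the idea described in the abstract (quadratic function nodes, memory terminals returning a function node's previous-step value, coefficients clipped to [0,1]); the class names, coefficient values and the 0.0 initial memory value are assumptions, and the coefficient-fitting step is omitted.

# Minimal sketch of an R-STROGANOFF-style recurrent polynomial tree.
# Node layout, coefficient values and the 0.0 default are assumptions,
# not taken from the paper.

class Input:
    """Terminal: returns the current external input value."""
    def evaluate(self, x, prev, current):
        return x

class Memory:
    """Terminal: returns the value that function node `source_id`
    produced on the previous time step (0.0 before the first step)."""
    def __init__(self, source_id):
        self.source_id = source_id
    def evaluate(self, x, prev, current):
        return prev.get(self.source_id, 0.0)

class Quadratic:
    """Function node: quadratic polynomial in its two children,
    z = a0 + a1*u + a2*v + a3*u*u + a4*u*v + a5*v*v,
    with coefficients clipped to [0, 1] to avoid divergence."""
    def __init__(self, node_id, left, right, coeffs):
        self.node_id = node_id
        self.left, self.right = left, right
        self.coeffs = [min(max(c, 0.0), 1.0) for c in coeffs]
    def evaluate(self, x, prev, current):
        u = self.left.evaluate(x, prev, current)
        v = self.right.evaluate(x, prev, current)
        a = self.coeffs
        z = a[0] + a[1]*u + a[2]*v + a[3]*u*u + a[4]*u*v + a[5]*v*v
        current[self.node_id] = z      # remembered for the next time step
        return z

def run(root, sequence):
    """Feed a time-ordered input sequence through the tree, step by step."""
    prev, outputs = {}, []
    for x in sequence:
        current = {}
        outputs.append(root.evaluate(x, prev, current))
        prev = current                 # memory terminals see these values next step
    return outputs

# Example: a single quadratic whose second argument is its own previous output.
root = Quadratic(node_id=0, left=Input(), right=Memory(source_id=0),
                 coeffs=[0.1, 0.5, 0.8, 0.0, 0.2, 0.1])
print(run(root, [0, 1, 0, 1, 0, 1]))

The run() helper feeds the inputs one step at a time, so each Memory terminal sees the value its source node produced on the preceding step.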

BibTex:
@InProceedings{Iba:1995:tdpGP,
  author =       "Hitoshi Iba and Hugo {de Garis} and Taisuke Sato",
  title =        "Temporal Data Processing Using Genetic Programming",
  booktitle =    "Genetic Algorithms: Proceedings of the Sixth
                 International Conference (ICGA95)",
  year =         "1995",
  editor =       "L. Eshelman",
  pages =        "279--286",
  address =      "Pittsburgh, PA, USA",
  publisher_address = "San Francisco, CA, USA",
  month =        "15-19 " # jul,
  publisher =    "Morgan Kaufmann",
  keywords =     "Genetic Programming, Genetic Algorithms",
  ISBN =         "1-55860-370-0",
  abstract =     "This paper reports an extension of STROGANOFF called
                  R-STROGANOFF which uses special memory terminal nodes
                  to provide a form of recurrency for processing
                  time-ordered events.

                  All functions are polynomials (quadratics in the
                  examples); terminals are either inputs or memories.
                  Each memory terminal holds the value of a function
                  node on the previous time step.

                  The coefficients of the polynomials are learnt by
                  trying to match the training data using a
                  {"}Generalised Error Propagation Algorithm{"}. This
                  is deterministic. It seems like STROGANOFF's own
                  procedure (but different?), time-sequence based and
                  built on back-propagation. The coefficients are
                  recalculated each generation (assuming the tree has
                  changed).

                  The fitness function uses {"}minimum description
                  length{"} (MDL).

                  Quadratic coefficients may be limited to 0<=x<=1 to
                  avoid divergence.

                  Examples: a 2-step 0-1 oscillator and four Tomita
                  languages (over a binary alphabet).

                  The tree could be converted to a finite state
                  automaton, which was more general than the tree,
                  i.e. it works in all cases, including those not in
                  the training set.

                  On the Tomita language problems, {"}R-STROGANOFF
                  works almost as well as (the best) recurrent
                  networks{"}",
}
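
On the MDL-based fitness mentioned in the abstract: this entry does not reproduce the paper's formula, so the fragment below only sketches a generic MDL-style criterion (a model-description term that grows with tree size and coefficient count, plus an error-description term under a Gaussian-residual assumption). The specific terms and weights are illustrative assumptions, not R-STROGANOFF's definition.

import math

def mdl_style_fitness(n_coeffs, tree_size, squared_errors):
    """Generic MDL-style score (lower is better): cost of describing the
    model plus cost of describing the residual errors.  The exact terms
    used by R-STROGANOFF may differ; this form assumes Gaussian residuals."""
    n = len(squared_errors)
    mse = max(sum(squared_errors) / n, 1e-12)         # guard against log(0)
    model_cost = 0.5 * n_coeffs * math.log(n) + tree_size
    error_cost = 0.5 * n * math.log(mse)
    return model_cost + error_cost

# Example: a tree with 6 coefficients and 3 nodes, scored on 8 residuals.
print(mdl_style_fitness(6, 3, [0.01, 0.02, 0.0, 0.05, 0.01, 0.03, 0.0, 0.02]))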