Evolving Visual Routines

by

Johnson, M. P., Maes, P. and Darrell, T.


Info: ARTIFICIAL LIFE IV, Proceedings of the fourth International Workshop on the Synthesis and Simulation of Living Systems (Conference proceedings), 1994, p. 198-209
Keywords: genetic algorithms, genetic programming
Abstract:
Traditional machine vision assumes that the vision system recovers a complete, labeled description of the world [Marr]. Recently, several researchers have criticized this model and proposed an alternative model which considers perception as a distributed collection of task-specific, task-driven visual routines [Aloimonos, Ullman]. Some of these researchers have argued that in natural living systems these visual routines are the product of natural selection [ramachandran]. So far, researchers have hand-coded task-specific visual routines for actual implementations (e.g. [Chapman]). In this paper we propose an alternative approach in which visual routines for simple tasks are evolved using an artificial evolution approach. We present results from a series of runs on actual camera images, in which simple routines were evolved using Genetic Programming techniques [Koza]. The results obtained are promising: the evolved routines are able to correctly classify up to 93% of the images, which is better than the best algorithm we were able to write by hand.
Notes:
alife-4
URL(s): (G)zipped postscript (ftp://media.mit.edu/pub/agents/autonomous-agents/alife-iv.ps.Z)
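
Example (Python): For a concrete picture of the approach summarized in the abstract, the sketch below evolves tiny image-classification programs with a tree-based genetic-programming loop. It is a minimal, self-contained illustration, not the authors' implementation: the primitive set (mean intensity, top/bottom crops), the toy "bright-on-top" task, and all function names are hypothetical stand-ins for the paper's visual-routine primitives and real camera-image data.

import random

# Primitives operate on a grayscale image given as a 2D list of floats in [0, 1].
def mean(img):          # average intensity over the whole image
    return sum(sum(r) for r in img) / (len(img) * len(img[0]))

def top_half(img):      # crop to the upper half of the image
    return img[: max(1, len(img) // 2)]

def bottom_half(img):   # crop to the lower half of the image
    return img[len(img) // 2 :] or img

FUNCS = {"add": 2, "sub": 2, "mean": 1, "top": 1, "bot": 1}

def random_tree(depth=3):
    """Grow a random program tree; leaves are the input image 'x' or a constant."""
    if depth == 0 or random.random() < 0.3:
        return random.choice(["x", round(random.uniform(0.0, 1.0), 2)])
    op = random.choice(list(FUNCS))
    return [op] + [random_tree(depth - 1) for _ in range(FUNCS[op])]

def evaluate(tree, img):
    """Interpret a tree; image-valued arguments to arithmetic are reduced to their mean."""
    if tree == "x":
        return img
    if isinstance(tree, float):
        return tree
    op, args = tree[0], [evaluate(a, img) for a in tree[1:]]
    as_num = lambda v: mean(v) if isinstance(v, list) else v
    if op == "add":
        return as_num(args[0]) + as_num(args[1])
    if op == "sub":
        return as_num(args[0]) - as_num(args[1])
    if op == "mean":
        return as_num(args[0])
    if op == "top":
        return top_half(args[0]) if isinstance(args[0], list) else args[0]
    return bottom_half(args[0]) if isinstance(args[0], list) else args[0]  # "bot"

def classify(tree, img):
    out = evaluate(tree, img)
    return (mean(out) if isinstance(out, list) else out) > 0.5

def fitness(tree, dataset):
    """Fraction of (image, label) pairs classified correctly."""
    return sum(classify(tree, img) == label for img, label in dataset) / len(dataset)

def mutate(tree, p=0.15):
    """Replace random subtrees with freshly grown ones."""
    if random.random() < p:
        return random_tree(2)
    if isinstance(tree, list):
        return [tree[0]] + [mutate(a, p) for a in tree[1:]]
    return tree

def evolve(dataset, pop_size=60, generations=30):
    """Truncation selection plus mutation; crossover is omitted to keep the sketch short."""
    pop = [random_tree() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda t: fitness(t, dataset), reverse=True)
        parents = pop[: pop_size // 4]
        pop = parents + [mutate(random.choice(parents)) for _ in range(pop_size - len(parents))]
    return max(pop, key=lambda t: fitness(t, dataset))

if __name__ == "__main__":
    # Toy task: images bright on top are positives, bright on the bottom are negatives.
    bright_top = [[0.9] * 8] * 4 + [[0.1] * 8] * 4
    bright_bot = [[0.1] * 8] * 4 + [[0.9] * 8] * 4
    data = [(bright_top, True), (bright_bot, False)]
    best = evolve(data)
    print("best routine:", best)
    print("training accuracy:", fitness(best, data))

A routine such as ["mean", ["top", "x"]] solves this toy task; the paper's actual runs use a richer primitive set over real camera images and report up to 93% correct classification.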

BibTeX:
@InProceedings{johnson:1994:EVR,
  author =       "Michael Patrick Johnson and Pattie Maes and Trevor
                 Darrell",
  title =        "Evolving Visual Routines",
  booktitle =    "ARTIFICIAL LIFE IV, Proceedings of the fourth
                 International Workshop on the Synthesis and Simulation
                 of Living Systems",
  year =         "1994",
  editor =       "Rodney A. Brooks and Pattie Maes",
  pages =        "198--209",
  address =      "MIT, Cambridge, MA, USA",
  month =        "6-8 " # jul,
  publisher =    "MIT Press",
  keywords =     "genetic algorithms, genetic programming",
  URL =          "ftp://media.mit.edu/pub/agents/autonomous-agents/alife-iv.ps.Z",
  abstract =     "Traditional machine vision assumes that the vision
                 system recovers a complete, labeled description of the
                 world [Marr]. Recently, several researchers have
                 criticized this model and proposed an alternative model
                 which considers perception as a distributed collection
                 of task-specific, task-driven visual routines
                 [Aloimonos, Ullman]. Some of these researchers have
                 argued that in natural living systems these visual
                 routines are the product of natural selection
                 [ramachandran]. So far, researchers have hand-coded
                 task-specific visual routines for actual
                 implementations (e.g. [Chapman]). In this paper we
                 propose an alternative approach in which visual
                 routines for simple tasks are evolved using an
                 artificial evolution approach. We present results from
                 a series of runs on actual camera images, in which
                 simple routines were evolved using Genetic Programming
                 techniques [Koza]. The results obtained are promising:
                 the evolved routines are able to correctly classify up
                 to 93% of the images, which is better than the best
                 algorithm we were able to write by hand.",
  notes =        "alife-4",
}