[Illinois]: Error Gradient Estimations Due to Parallel Perturbation of Weights

By AbderRahman N Sobh

University of Illinois at Urbana-Champaign

This tool trains two-layered networks of sigmoidal units to associate patterns using simultaneous perturbation of weights.

Version 1.0a - published on 19 Aug 2013

doi:10.4231/D3GX44V05

This tool is closed source.

Category

Tools

Abstract

From Tutorial on Neural Systems Modeling, Chapter 7:

This tool trains two-layered, feedforward networks of sigmoidal units on pattern association tasks by estimating the network error gradient using parallel weight perturbation and updating all network weights simultaneously. It is similar to the one-weight-at-a-time tool pertGradient1By1, except that all weights are perturbed and updated at once, and the network is not limited to a single output unit.
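
The tool itself is closed source, but the idea behind a parallel-perturbation gradient estimate can be sketched. Below is a minimal NumPy illustration, assuming a simultaneous random +/- delta perturbation of every weight and a summed-squared-error measure; the function and parameter names (train_parallel_perturbation, n_hidden, lr, delta) are illustrative assumptions, not the tool's actual code.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forward(W_hid, W_out, X):
    # Two-layered feedforward pass with sigmoidal hidden and output units.
    H = sigmoid(X @ W_hid)
    return sigmoid(H @ W_out)

def sse(W_hid, W_out, X, T):
    # Summed squared error over all patterns and all output units.
    return np.sum((T - forward(W_hid, W_out, X)) ** 2)

def train_parallel_perturbation(X, T, n_hidden=4, lr=0.1, delta=0.01,
                                n_epochs=5000, seed=0):
    # Estimate the error gradient by perturbing ALL weights at once and
    # updating every weight on each training step.
    rng = np.random.default_rng(seed)
    W_hid = rng.normal(0.0, 0.5, (X.shape[1], n_hidden))
    W_out = rng.normal(0.0, 0.5, (n_hidden, T.shape[1]))
    for _ in range(n_epochs):
        e0 = sse(W_hid, W_out, X, T)
        # Small random +/- delta perturbation applied to every weight simultaneously.
        P_hid = delta * rng.choice([-1.0, 1.0], W_hid.shape)
        P_out = delta * rng.choice([-1.0, 1.0], W_out.shape)
        e1 = sse(W_hid + P_hid, W_out + P_out, X, T)
        # Per-weight gradient estimate: error change divided by that weight's perturbation.
        W_hid -= lr * (e1 - e0) / P_hid
        W_out -= lr * (e1 - e0) / P_out
    return W_hid, W_out

# Example pattern-association task: first input column is a bias;
# two output units, so the network is not limited to a single output.
X = np.array([[1, 0, 0], [1, 0, 1], [1, 1, 0], [1, 1, 1]], dtype=float)
T = np.array([[0, 1], [0, 1], [1, 0], [1, 0]], dtype=float)
W_hid, W_out = train_parallel_perturbation(X, T)
print(np.round(forward(W_hid, W_out, X), 2))

Because a single error change reflects every perturbation at once, each per-weight ratio is only a noisy estimate of that weight's partial derivative; over many small steps the updates still tend to move the error downhill. This is the main contrast with the one-weight-at-a-time estimate used by pertGradient1By1.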

Sponsored by

NanoBio Node, University of Illinois Urbana-Champaign

Cite this work

Researchers should cite this work as follows:

  • Tutorial on Neural Systems Modeling, Copyright 2010 Sinauer Associates Inc. Author: Thomas J. Anastasio
  • AbderRahman N Sobh (2013), "[Illinois]: Error Gradient Estimations Due to Parallel Perturbation of Weights," http://nanohub.org/resources/pertgradll. (DOI: 10.4231/D3GX44V05).

nanoHUB.org, a resource for nanoscience and nanotechnology, is supported by the National Science Foundation and other funding agencies. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.