Exploring Weight Agnostic Neural Networks

By Boris Landoni on August 31, 2019

When training a neural network to accomplish a given task, be it image classification or reinforcement learning, one typically refines a set of weights associated with each connection in the network. Another approach that has shown substantial progress is neural architecture search, which constructs network architectures out of hand-engineered components such as convolutional layers or transformer blocks. Architectures built from these components, such as deep convolutional networks, have been shown to carry strong inductive biases for image processing tasks, and can even perform them when their weights are randomly initialized. Yet while neural architecture search produces new ways of arranging hand-engineered components whose inductive biases are already known to suit the task domain at hand, there has been little progress in the automated discovery of new neural network architectures with such inductive biases for various task domains.
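The idea of judging an architecture by how it performs with untrained weights can be sketched in a few lines of code: hold the topology fixed and measure performance when the weights are randomly sampled rather than learned (here, for simplicity, every connection shares one sampled value). The snippet below is a minimal illustrative sketch, not the original authors' implementation; the tiny two-layer topology, the XOR-style toy task, and all function names are assumptions made for this example.

```python
# Minimal sketch (assumed, not the authors' code): evaluate a fixed architecture
# with randomly sampled weights instead of trained ones. Every connection shares
# a single weight value, so the score reflects the topology, not weight tuning.
import numpy as np

def forward(x, shared_w, hidden_units=4):
    """Run a fixed two-layer topology in which every connection uses the same weight."""
    w1 = np.full((x.shape[1], hidden_units), shared_w)  # input -> hidden, all weights equal
    w2 = np.full((hidden_units, 1), shared_w)           # hidden -> output, all weights equal
    h = np.tanh(x @ w1)
    return np.tanh(h @ w2)

def evaluate_weight_agnostic(x, y, trials=20, seed=0):
    """Average task accuracy over several randomly sampled shared weight values."""
    rng = np.random.default_rng(seed)
    scores = []
    for _ in range(trials):
        shared_w = rng.uniform(-2.0, 2.0)             # one random weight for the whole network
        pred = forward(x, shared_w) > 0.0             # threshold the tanh output
        scores.append(np.mean(pred.ravel() == y))     # accuracy with this shared weight
    return float(np.mean(scores))

if __name__ == "__main__":
    # XOR-like toy task; the data here is purely illustrative.
    x = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
    y = np.array([0, 1, 1, 0], dtype=bool)
    print("Mean accuracy over random shared weights:", evaluate_weight_agnostic(x, y))
```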


About Boris Landoni

Boris Landoni is the technical manager of Open-Electronics.org. Skilled in the GSM field, he embraces the Open Source philosophy, and his projects are available to the community.
