Title: Global optimization methods applied to neural learning

  • Objective: test different global optimization methods by applying them to neural networks
  • Participants: Włodzisław Duch, Jerzy Korczak
  • Dates: started in July 1997 with a one-month visit to LPU Strasbourg
  • Result(s): a technical report written in July/August 1997
  • Problems: implement, test, and develop these methods; try some global methods for the optimization of logical rules.
  • Working log:
  • Neural networks are usually trained with gradient-based procedures, but these methods often find suboptimal solutions because they become trapped in local minima. Recently, genetic algorithms have also been used to optimize neural architectures. Other global optimization methods suitable for neural networks are competitive with genetic algorithms and are worth trying. A review, "Optimization and global minimization methods for neural networks", has been written, but few implementations have been done so far; a minimal sketch of the basic idea follows below.
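
    For illustration only (this is not the project's code, and all names and parameter values below are assumptions): a tiny 2-2-1 network can be trained on XOR by a plain simulated-annealing search over its flat weight vector, replacing gradient descent with a global method that can escape local minima.

        # Sketch: simulated annealing over the weights of a 2-2-1 network on XOR.
        # Illustrative assumption only; hyperparameters are arbitrary.
        import numpy as np

        rng = np.random.default_rng(0)

        # XOR data: a classic small problem where tiny networks have bad local minima.
        X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
        y = np.array([0, 1, 1, 0], dtype=float)

        N_HIDDEN = 2
        N_WEIGHTS = 2 * N_HIDDEN + N_HIDDEN + N_HIDDEN + 1  # W1 (2x2) + b1 (2) + W2 (2) + b2 (1)

        def unpack(w):
            """Split the flat weight vector into layer parameters."""
            W1 = w[:4].reshape(2, N_HIDDEN)
            b1 = w[4:6]
            W2 = w[6:8]
            b2 = w[8]
            return W1, b1, W2, b2

        def mse(w):
            """Mean squared error of the network on the XOR data."""
            W1, b1, W2, b2 = unpack(w)
            h = np.tanh(X @ W1 + b1)                       # hidden layer
            out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))     # sigmoid output
            return np.mean((out - y) ** 2)

        def anneal(n_steps=20000, t0=1.0, t_min=1e-3, step=0.5):
            """Very plain simulated annealing over the flat weight vector."""
            w = rng.normal(scale=1.0, size=N_WEIGHTS)
            e = mse(w)
            best_w, best_e = w.copy(), e
            for k in range(n_steps):
                t = max(t_min, t0 * (1.0 - k / n_steps))   # linear cooling schedule
                cand = w + rng.normal(scale=step * t, size=N_WEIGHTS)
                ec = mse(cand)
                # Accept improvements always; worse moves with Boltzmann probability.
                if ec < e or rng.random() < np.exp((e - ec) / t):
                    w, e = cand, ec
                    if e < best_e:
                        best_w, best_e = w.copy(), e
            return best_w, best_e

        if __name__ == "__main__":
            w, err = anneal()
            print("final MSE on XOR:", err)

    The same search loop works for any error function of a flat parameter vector, so genetic algorithms or other global methods can be swapped in for the annealing step without changing the network code.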

    Good topic for a PhD - still open!