Adaptive grey wolf optimizer

    Author(s)
    Meidani, Kazem
    Hemmasian, AmirPouya
    Mirjalili, Seyedali
    Farimani, Amir Barati
    Griffith University Author(s)
    Mirjalili, Seyedali
    Year published
    2022
    Abstract
    Swarm-based metaheuristic optimization algorithms have demonstrated outstanding performance on a wide range of optimization problems in both science and industry. Despite their merits, a major limitation of such techniques originates from non-automated parameter tuning and lack of systematic stopping criteria that typically leads to inefficient use of computational resources. In this work, we propose an improved version of grey wolf optimizer (GWO) named adaptive GWO which addresses these issues by adaptive tuning of the exploration/exploitation parameters based on the fitness history of the candidate solutions during the optimization. By controlling the stopping criteria based on the significance of fitness improvement in the optimization, AGWO can automatically converge to a sufficiently good optimum in the shortest time. Moreover, we propose an extended adaptive GWO (AGWO Δ) that adjusts the convergence parameters based on a three-point fitness history. In a thorough comparative study, we show that AGWO is a more efficient optimization algorithm than GWO by decreasing the number of iterations required for reaching statistically the same solutions as GWO and outperforming a number of existing GWO variants.
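    The abstract describes AGWO only at a high level. The sketch below shows the standard GWO position update together with a hypothetical fitness-history-based decay of the exploration/exploitation parameter a and a simple relative-improvement stopping test; the decay factors, thresholds, and the name agwo_sketch are illustrative assumptions, not the formulas from the paper (see the DOI below for those). In plain GWO, a instead decreases linearly from 2 to 0 over the iteration budget.

    ```python
    import numpy as np

    def agwo_sketch(f, dim, bounds, pop=30, max_iter=500, tol=1e-8):
        """Grey wolf optimizer with an illustrative adaptive 'a' parameter.

        The adaptation rule and stopping test are simplified stand-ins for the
        fitness-history-based scheme the abstract describes, not the paper's
        exact method.
        """
        lo, hi = bounds
        X = np.random.uniform(lo, hi, (pop, dim))   # wolf positions
        fit = np.apply_along_axis(f, 1, X)          # fitness of each wolf
        a = 2.0                                     # exploration/exploitation parameter
        best_hist = []                              # best-so-far fitness history

        for t in range(max_iter):
            order = np.argsort(fit)
            leaders = X[order[:3]]                  # alpha, beta, delta wolves
            cur_best = fit[order[0]]
            if best_hist:
                cur_best = min(cur_best, best_hist[-1])
            best_hist.append(cur_best)

            # Hypothetical adaptation: decay 'a' faster once the relative
            # improvement stalls, and stop when it becomes insignificant.
            if len(best_hist) > 1:
                prev, cur = best_hist[-2], best_hist[-1]
                rel_impr = (prev - cur) / (abs(prev) + 1e-12)
                a *= 0.90 if rel_impr < 1e-3 else 0.99
                if t > 20 and rel_impr < tol:
                    break

            # Standard GWO update: move each wolf toward the three leaders.
            new_X = np.zeros_like(X)
            for L in leaders:
                r1 = np.random.rand(pop, dim)
                r2 = np.random.rand(pop, dim)
                A = 2 * a * r1 - a
                C = 2 * r2
                D = np.abs(C * L - X)
                new_X += (L - A * D) / 3.0
            X = np.clip(new_X, lo, hi)
            fit = np.apply_along_axis(f, 1, X)

        i = int(np.argmin(fit))
        return X[i], fit[i]

    # Example: minimise the 10-dimensional sphere function.
    x_best, f_best = agwo_sketch(lambda x: float(np.sum(x ** 2)),
                                 dim=10, bounds=(-5.0, 5.0))
    ```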
    Journal Title
    Neural Computing and Applications
    DOI
    https://doi.org/10.1007/s00521-021-06885-9
    Note
    This publication has been entered in Griffith Research Online as an advance online version.
    Subject
    Cognitive and computational psychology
    Science & Technology
    Computer Science, Artificial Intelligence
    Metaheuristic optimization
    Publication URI
    http://hdl.handle.net/10072/411873
    Collection
    • Journal articles
