Plenary Speaker:

When Evolutionary Computation Meets Large Language Models

Kay Chen Tan

The Hong Kong Polytechnic University

Abstract:

Large Language Models (LLMs) have revolutionized natural language processing, achieving remarkable success across a wide range of applications. This presentation explores the compelling synergy between evolutionary computation and LLMs, investigating how these advanced models can transform traditional optimization and search methods. We will begin by examining the development and inherent capabilities of LLMs, emphasizing their potential to strengthen and refine evolutionary computation. We will then discuss various methods of embedding LLMs within evolutionary computation frameworks, addressing both simple and complex optimization challenges. We will outline the distinctive benefits and potential pitfalls of using LLMs in these settings, supported by case studies from our research in areas such as automated machine learning, causal discovery, materials science, and logistics optimization. These studies illustrate how LLMs can enhance the efficiency and effectiveness of evolutionary algorithms, opening new avenues for addressing complex optimization problems. Lastly, we will explore the broader implications of this integration, offering insights into future research directions and applications.

Biography:

Kay Chen Tan is a Chair Professor (Computational Intelligence) in the Department of Data Science and Artificial Intelligence at The Hong Kong Polytechnic University. He has co-authored eight books and published over 300 peer-reviewed journal articles. Prof. Tan currently serves as Vice-President (Publications) of the IEEE Computational Intelligence Society, USA. He was Editor-in-Chief of IEEE Transactions on Evolutionary Computation from 2015 to 2020 and of IEEE Computational Intelligence Magazine from 2010 to 2013. Prof. Tan is an IEEE Fellow and an Honorary Professor at the University of Nottingham, UK. He is also Chief Co-Editor of the Springer book series Machine Learning: Foundations, Methodologies, and Applications.