Home

Awesome

LLM4MOEA

Large Language Model (LLM) for Multiobjective Evolutionary Algorithm (MOEA)

This repository contains the implementations of MOEA/D-LLM and MOEA/D-LO:

Below is an illustration of the general framework for integrating an LLM into MOEA/D. For more information and results, please refer to the manuscript LLM4MOEA.

If you find the code helpful, please cite: Fei Liu, Xi Lin, Zhenkun Wang, Shunyu Yao, Xialiang Tong, Mingxuan Yuan, and Qingfu Zhang. "Large language model for multi-objective evolutionary optimization." arXiv preprint arXiv:2310.12541 (2023).

ArXiv paper link

If you are interested in LLM for algorithm design, we recommend:

<img src='./figures/Framework.JPG' alt='image' width='500' height='auto'>

Usage

MOEA/D-LLM

Implemented in Pymoo

Set your LLM endpoint, key, and model before starting!

cd MOEAD-LLM

python run.py
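Conceptually, MOEA/D-LLM replaces the classical reproduction operator with a call to the LLM: parent solutions and their fitness values are serialized into a text prompt, and the reply is parsed back into a numeric offspring. The sketch below illustrates that round trip; the function names, prompt wording, and reply format are illustrative assumptions, not the exact code in `run.py`.

```python
import re

def build_prompt(parents, fitnesses):
    """Format parent solutions and their fitness values into a text
    prompt asking the LLM to propose a new candidate solution."""
    lines = [f"point: {p} fitness: {f:.4f}" for p, f in zip(parents, fitnesses)]
    return (
        "Given the following points and fitness values (lower is better):\n"
        + "\n".join(lines)
        + "\nPropose one new point likely to have lower fitness. "
        "Reply with comma-separated numbers only."
    )

def parse_offspring(reply, n_var):
    """Extract the first n_var floating-point numbers from the LLM reply."""
    nums = re.findall(r"-?\d+\.?\d*(?:[eE]-?\d+)?", reply)
    if len(nums) < n_var:
        raise ValueError("LLM reply did not contain enough numbers")
    return [float(x) for x in nums[:n_var]]

# Example: parse a typical reply for a 3-variable problem.
offspring = parse_offspring("0.12, 0.50, 0.88", 3)
```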

MOEA/D-LO

Implemented in PlatEMO

Copy [MOEAD-LO.m] to the folder "Algorithms/Multi-objective optimization/MOEA-D-LO10" in PlatEMO

Copy [OperatorLO.m] to the folder "Algorithms/Utility functions" in PlatEMO

Test it using the PlatEMO GUI or a terminal command
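The idea behind MOEA/D-LO is to distill the LLM's search behavior into a cheap linear operator: offspring are a weighted combination of parents, with weights depending on each parent's fitness rank. A minimal Python sketch of that recombination rule follows (the actual operator lives in `OperatorLO.m`; the weight values here are toy numbers, not the regressed ones):

```python
import numpy as np

def linear_operator(parents, fitnesses, weights):
    """Linear recombination: sort parents by fitness (best first) and
    combine them with rank-dependent weights."""
    order = np.argsort(fitnesses)              # best (lowest) fitness first
    sorted_parents = np.asarray(parents)[order]
    w = np.asarray(weights).reshape(-1, 1)     # one weight per rank
    return (w * sorted_parents).sum(axis=0)

# Toy example: 3 parents in 2 dimensions; weights are illustrative only.
parents = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
fitnesses = [0.9, 0.1, 0.5]                    # the second parent is best
weights = [0.7, 0.2, 0.1]                      # better rank, larger weight
child = linear_operator(parents, fitnesses, weights)
```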

LLM

API for LLM

In our implementation, we use the API2D API to request responses from GPT-3.5.

Step 1: Create your key from API2D

Step 2: Copy your own API2D API Key to run.py. No additional settings are required.
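API2D exposes an OpenAI-style chat-completions interface, so the request amounts to a bearer-token header plus a JSON body. The sketch below shows how such a request could be assembled; the endpoint URL is an assumption (check the API2D documentation), and `YOUR_API2D_KEY` is a placeholder for the key from Step 1.

```python
import json

API_URL = "https://oa.api2d.net/v1/chat/completions"  # assumed endpoint; verify in API2D docs
API_KEY = "YOUR_API2D_KEY"                            # placeholder for the key from Step 1

def build_request(prompt, model="gpt-3.5-turbo"):
    """Assemble headers and a JSON body for an OpenAI-style chat request."""
    headers = {
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })
    return headers, body

headers, body = build_request("Propose a new candidate solution.")
# Send with e.g. requests.post(API_URL, headers=headers, data=body)
```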

You can also use another OpenAI-compatible LLM service by setting your own endpoint, key, and model in run.py.

From LLM Results to Linear Operator

cd LLM2LO

python LinearReg.py

python PolyReg.py

where the two scripts fit regressions at two levels: a linear fit (LinearReg.py) and a polynomial fit (PolyReg.py)

The resulting regression of weight vs. rank should look as follows (slightly different from the results in the manuscript due to different random seeds):

<img src='./figures/WeightVsRank.png' alt='image' width='500' height='auto'>