

LOW-RANK FACTORIZATIONS ARE INDIRECT ENCODINGS FOR DEEP NEUROEVOLUTION

My latest paper is available on arXiv: Low-Rank Factorizations are Indirect Encodings for Deep Neuroevolution.

The general idea is that we can search for stronger neural networks in a gradient-free fashion by restricting the search to low-rank networks (a minimal sketch of the idea is included at the end of this post). We show that this works well for language modeling and reinforcement learning tasks. It’s essentially a crossover between the following papers:

I’ll be presenting it virtually for the Neuroevolution@Work workshop at GECCO 2025.
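To make the core idea concrete, here is a minimal sketch of gradient-free search over low-rank factors. This is not the paper’s actual algorithm or hyperparameters: the layer sizes, rank, toy regression task, and simple (1+λ) hill climber below are all illustrative placeholders. The point it shows is that the genome being mutated is the pair of low-rank factors, and the full weight matrix is only ever reconstructed (the indirect encoding) to evaluate fitness.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (not from the paper): the genome has
# (d_in + d_out) * rank parameters instead of d_in * d_out.
d_in, d_out, rank = 64, 32, 4

def decode(u, v):
    # Indirect encoding: reconstruct the full weight matrix
    # from the two low-rank factors.
    return u @ v  # (d_in, rank) @ (rank, d_out) -> (d_in, d_out)

def fitness(w, x, y):
    # Placeholder task: negative regression loss of a single linear layer.
    # In the paper this would be a language-modeling or RL objective.
    return -np.mean((x @ w - y) ** 2)

# Toy data standing in for a real task.
x = rng.normal(size=(256, d_in))
y = rng.normal(size=(256, d_out))

# The low-rank factors themselves are the evolved genome.
u = rng.normal(scale=0.1, size=(d_in, rank))
v = rng.normal(scale=0.1, size=(rank, d_out))

sigma, population = 0.02, 64
for generation in range(200):
    best_fit, best_uv = fitness(decode(u, v), x, y), (u, v)
    for _ in range(population):
        # Mutate the factors, not the full matrix: the search stays
        # gradient-free and low-dimensional.
        cu = u + sigma * rng.normal(size=u.shape)
        cv = v + sigma * rng.normal(size=v.shape)
        f = fitness(decode(cu, cv), x, y)
        if f > best_fit:
            best_fit, best_uv = f, (cu, cv)
    u, v = best_uv
```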