The paper: https://arxiv.org/abs/2304.12240
Gradient descent does strike me as the sort of problem that would work amazingly well on a quantum computer. I think the issue is qubit limitations. This paper is certainly over my head, but aren't we still struggling to factor a 256-bit number on a quantum computer, and wouldn't a 48-gigabyte language model be basically impossible with current methods?
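
Just to put the scale gap in numbers, here's a rough back-of-envelope sketch. The 256-bit and 48 GB figures come from the comment above; the comparison is purely illustrative and says nothing about how a quantum algorithm would actually encode the model.

```python
# Back-of-envelope scale comparison (illustrative only):
# a 256-bit integer vs. the raw bit count of a 48 GB language model.

bits_in_256_bit_number = 256
bits_in_48_gb_model = 48 * 10**9 * 8  # 48 gigabytes expressed in bits

print(bits_in_48_gb_model)                            # 384_000_000_000
print(bits_in_48_gb_model / bits_in_256_bit_number)   # ~1.5e9 times larger
```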