I wouldn't be surprised to see the M3 put an emphasis on AI. The current M2 Ultra is about half as fast as a 4090 for workloads like LLMs, but because its memory is unified, the GPU can draw on far more of the system's RAM, and plenty of fast RAM is where things really take off. 36GB already beats the 24GB available on top-end consumer AI-capable GPUs, and that's on Apple's mainstream offering. 36GB is also suspiciously more than the meagre amounts Apple usually includes, and there isn't much else that would call for more than quadrupling the RAM.
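For a rough sense of why the 24GB vs 36GB line matters, here's a back-of-envelope sketch (my own numbers, weights only, ignoring KV cache and runtime overhead):

    # Rough LLM weight footprint: params * bits-per-weight / 8.
    # Ignores KV cache and activations, which add several more GB.
    def weight_gb(params_billion: float, bits_per_weight: float) -> float:
        return params_billion * bits_per_weight / 8

    for params in (13, 34, 70):
        print(f"{params}B @ 4-bit: ~{weight_gb(params, 4):.0f} GB")

    # 13B @ 4-bit: ~7 GB   -> fits almost anywhere
    # 34B @ 4-bit: ~17 GB  -> fits in a 24GB card
    # 70B @ 4-bit: ~35 GB  -> needs ~36GB+, i.e. unified memory territory

So even a 4-bit 70B model squeaks past what a 24GB consumer card can hold, but lands right inside 36GB of unified memory.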
Apple is apparently pouring billions into AI development, and being able to run decent *standalone* AI would be a compelling feature. On the PC side, RAM may be cheap, but Intel's VPUs (and probably AMD's equivalents) are only around 1660 Super level; anything more capable means add-on GPUs that are either very expensive or limited by their VRAM.