RTX 3000 cards (especially their tensor cores) are well suited for machine learning and its applications whenever the VRAM is sufficient. If each sample is small (for example, a Go position), 4GB can be enough for inference with an already trained net, and all that matters is the number of cores and the memory bandwidth. More demanding ML, such as image recognition, can need more VRAM when training deep neural networks; then an RTX 3090 with 24GB or an A-series card with even more VRAM may be required. Only specific ML software (Igor's Lab has mentioned Siemens) benefits from the drivers etc. of A-series cards or their predecessors. Much ML software just needs plenty of tensor cores and CUDA cores, and a few applications even run faster on RTX 3000 than on mid- to high-end A-series cards. Some ML software benefits from SLI, while other software simply scales roughly proportionally with more GPUs because of the increased total number of cores.
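To make the "4GB can be enough for inference" point concrete, here is a minimal back-of-the-envelope sketch, assuming PyTorch; the function name, the headroom factor, and the bytes-per-parameter figures are illustrative assumptions, not measurements on specific hardware.

```python
import torch

def fits_in_vram(model: torch.nn.Module, device_index: int = 0,
                 bytes_per_param: int = 2, overhead: float = 1.5) -> bool:
    # Rough estimate: weight memory only, times a headroom factor for
    # activations, CUDA context, etc. (the 1.5x headroom is an assumption,
    # tune it per workload). bytes_per_param: 2 for fp16 inference, 4 for fp32.
    n_params = sum(p.numel() for p in model.parameters())
    needed = n_params * bytes_per_param * overhead
    total = torch.cuda.get_device_properties(device_index).total_memory
    return needed < total

# Example: a Go policy/value net with a few tens of millions of parameters
# needs only tens of MB for its weights, so it fits comfortably into 4GB.
```

Training is a different matter, because gradients, optimizer state and stored activations add several times the weight footprint, which is why the training case above points toward 24GB-class cards.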
On German Amazon, I have never seen Amazon itself selling RTX 3000 cards, only third-party scalpers. RTX 3080s have gone for 3x to 5x MSRP there, even while some other retailers offer 2x MSRP, which is still far too high unless one joins the mining Ponzi scheme or absolutely needs a card that is roughly in the price range of A-series cards anyway. As somebody who merely wants to run already trained ML models, these prices (even the lowest, 1.5x MSRP, available for about two weeks some 10 months after launch) are far too high, so I skip RTX 3000. I will not join the Ponzi scheme, and if the same situation repeats with RTX 4000, I might just get an APU instead.