Hi there! I need some advice. I'm starting a project at my school and I'd like to purchase a graphics card to use GPUs in my research. Which one would you recommend for learning Transformers and working with small models at a lower cost?

In the future, I would also like to port or test this project on a 100% ARM architecture using an Ampere CPU-based environment, so that I can compare solutions with and without a GPU.

Thank you in advance for your answers!

Regards,
Sounds like a fun project! Just curious: are you considering something lightweight like llama.cpp for local inference? I've been testing it for smaller setups and I'm wondering how it performs in real AI applications. Would love to know which direction you're taking!