# graphics cards for the future -> AI-ready, 16 GB+ VRAM

## GRID

- passive cooling
- https://forums.developer.nvidia.com/t/using-grid-cards-in-a-render-farm/162255/8

## Tesla K40/K80

- https://github.com/openai/jukebox/issues/142
  - > You need a stack of VRAM to run this. I didn't bother with it for this reason. 12 GB of VRAM minimum. Cards that are suitable: K80 (24 GB/32 GB of VRAM), RTX 5000 (16 GB), RTX 6000 / RTX 8000. Not even the 2080 Ti is appropriate at 11 GB. The just-released NVIDIA 3090 is the latest, with 24 GB of VRAM. AMD cards will not work. Wait another month and get one or two 3090 cards; there's a limit on how many are being released. This card is better than the one below. (Don't get the 3080: it can't be joined together with NVLink.) You'll be in a better position to get anywhere. You'll need a 750 W power supply. I recommend an HP workstation.
- https://github.com/openai/jukebox/issues/136
- https://www.reddit.com/r/gpumining/comments/mtx5qs/tesla_k80_in_2021/
  - > The main disadvantage to Tesla cards is they're made for floating-point processing (AI/ML use) while gaming cards are for integer processing. Most crypto algos are integer-based. Single- vs. floating-point processing speed specs are known, so ROI is gonna favor gaming cards. I guess you could look up K80 single-point processing speed and find a gaming card of similar spec as a pseudo-equivalent; haven't done it, so don't hold me to that lol
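
The VRAM figures quoted above can be sanity-checked with a rough back-of-envelope estimate: model weights take parameter count × bytes per parameter, plus some multiple of that for activations and buffers. A minimal sketch follows; the `overhead_factor` and the parameter counts in the usage lines are illustrative assumptions, not measured Jukebox numbers.

```python
def fits_in_vram(num_params, vram_gib, bytes_per_param=4, overhead_factor=2.0):
    """Rough check whether a model fits on a given card.

    num_params      -- parameter count of the model
    vram_gib        -- card memory in GiB (e.g. 11 for a 2080 Ti, 24 for a 3090)
    bytes_per_param -- 4 for fp32 weights, 2 for fp16
    overhead_factor -- crude multiplier for activations/buffers (an assumption,
                       not a measured value)
    """
    needed_gib = num_params * bytes_per_param * overhead_factor / 1024**3
    return needed_gib <= vram_gib

# Under these assumptions, a hypothetical 1B-parameter fp32 model needs
# roughly 7.5 GiB and fits on a 16 GiB card, while a 5B-parameter one does not:
print(fits_in_vram(1_000_000_000, 16))  # True
print(fits_in_vram(5_000_000_000, 16))  # False
```

This kind of estimate is only a lower bound; real frameworks also allocate CUDA context, workspace, and fragmentation overhead on top of it.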