

For example here:
https://timdettmers.com/2023/01/30/which-gpu-for-deep-learning/


The problem, though, is that it seems to describe cards for the training itself, not end-level consumer cards for running already-trained software, which Photoshop will be (or already is). There I have no overview at all.




drakh      28.04.2023 - 09:55:03, level: 1
Well yes, training has somewhat different requirements than sampling.

If we look at Karpathy's GitHub: https://github.com/karpathy/nanoGPT

..."8X A100 40GB node".. ..."This will run for about 4 days"

There you really feel the difference between having low-consumption cards and power-hungry gamer ones (which don't even tend to have that much VRAM).
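The training-vs-inference VRAM gap can be sketched with a common rule of thumb (an assumption, not from the thread): inference in fp16 needs roughly 2 bytes per parameter for the weights, while full fp32 training with Adam needs about 16 bytes per parameter (weights + gradients + two optimizer moment estimates), ignoring activations and batch size entirely:

```python
def estimate_vram_bytes(n_params: int, training: bool = False) -> int:
    """Very rough weights-only VRAM estimate; ignores activations.

    Inference (fp16): 2 bytes per parameter.
    Training (fp32 + Adam): 4 (weights) + 4 (gradients)
    + 8 (Adam first and second moments) = 16 bytes per parameter.
    """
    return n_params * (16 if training else 2)

# GPT-2 small, the model nanoGPT reproduces, has ~124M parameters.
n = 124_000_000
print(f"inference: {estimate_vram_bytes(n) / 1e9:.2f} GB")
print(f"training:  {estimate_vram_bytes(n, training=True) / 1e9:.2f} GB")
```

Even under this crude estimate, training needs about 8x the memory of sampling before activations are counted, which is why multi-GPU A100 nodes show up for training while a consumer card can serve the finished model.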