LLaMA


LLaMA (Large Language Model Meta AI) is a family of large language models (LLMs) released by Meta AI beginning in February 2023.

For the first version of LLaMA, four model sizes were trained: 7, 13, 33, and 65 billion parameters. LLaMA's developers reported that the 13-billion-parameter model outperformed the much larger GPT-3 (175 billion parameters) on most NLP benchmarks, and that the largest model was competitive with state-of-the-art models such as PaLM and Chinchilla.[1] Whereas the most powerful LLMs had generally been accessible only through limited APIs (if at all), Meta released LLaMA's model weights to the research community under a non-commercial license.[2] Within a week of LLaMA's release, its weights were leaked to the public on 4chan via BitTorrent.[3]

LLaMA 2

External links

References

  1. Touvron, Hugo; Lavril, Thibaut; Izacard, Gautier; Martinet, Xavier; Lachaux, Marie-Anne; Lacroix, Timothée; Rozière, Baptiste; Goyal, Naman; Hambro, Eric; Azhar, Faisal; Rodriguez, Aurelien; Joulin, Armand; Grave, Edouard; Lample, Guillaume (2023). "LLaMA: Open and Efficient Foundation Language Models". arXiv:2302.13971.
  2. "Introducing LLaMA: A foundational, 65-billion-parameter large language model". Meta AI. February 24, 2023.
  3. Vincent, James (March 8, 2023). "Meta's powerful AI language model has leaked online — what happens now?". The Verge.
Retrieved from "https://qu.wikipedia.org/w/index.php?title=LLaMA&oldid=660680" (Wikipedia, Quechua)