๋ฐ˜์‘ํ˜•

All posts (174)

์นด์นด์˜ค๋ธŒ๋ ˆ์ธ Multimodal LLM Honeybee ๋…ผ๋ฌธ ๋ฆฌ๋ทฐ

์นด์นด์˜ค๋ธŒ๋ ˆ์ธ์—์„œ ์ž‘๋…„ ๋ง Multimodal LLM์ธ Honeybee๋ฅผ ๋ฐœํ‘œํ–ˆ๋‹ค. ์•„์‰ฝ๊ฒŒ๋„ ํ•œ๊ตญ์–ด ๋ชจ๋ธ์€ ์•„๋‹ˆ๊ณ  ์˜์–ด ๋ชจ๋ธ์ด๊ณ , 5๊ฐœ์˜ ๋ฒค์น˜๋งˆํฌ์—์„œ SoTA๋ฅผ ๋‹ฌ์„ฑํ–ˆ๋‹ค๊ณ  ํ•ด์„œ ๋‰ด์Šค๊ฐ€ ์—„์ฒญ ๋งŽ์ด ๋‚˜์™”๋‹ค. ๋…ผ๋ฌธ: https://arxiv.org/pdf/2312.06742.pdf ๊นƒํ—™: https://github.com/kakaobrain/honeybee GitHub - kakaobrain/honeybee: The official implementation of project "Honeybee" The official implementation of project "Honeybee". Contribute to kakaobrain/honeybee development by creating an account o..

[๋”ฅ๋Ÿฌ๋‹ ๋…ผ๋ฌธ๋ฆฌ๋ทฐ] MeZO: Fine-Tuning Language Models with Just Forward Passes (NeurIPS 2023)

๋…ผ๋ฌธ ๋งํฌ: https://arxiv.org/pdf/2305.17333.pdf ๋ฐœํ‘œ ์˜์ƒ: https://neurips.cc/virtual/2023/poster/71437 ์ฝ”๋“œ: https://github.com/princeton-nlp/MeZO NeurIPS 2023 Abstract: Fine-tuning language models (LMs) has yielded success on diverse downstream tasks, but as LMs grow in size, backpropagation requires a prohibitively large amount of memory. Zeroth-order (ZO) methods can in principle estimate gradients us..

[๋”ฅ๋Ÿฌ๋‹ ๋…ผ๋ฌธ๋ฆฌ๋ทฐ] AIM: Scalable Pre-training of Large Autoregressive Image Models (Apple, 2024)

Apple์—์„œ 2024๋…„ 1์›” large pretrained image model์ธ AIM(Autoregressive Image Models)์„ ๋ฐœํ‘œํ–ˆ๋‹ค. ์ฝ”๋“œ์™€ model weight์ด Github์— ๊ณต๊ฐœ๋˜์–ด ์žˆ๋‹ค. ๋…ผ๋ฌธ ๋งํฌ: https://arxiv.org/pdf/2401.08541.pdf GitHub: https://github.com/apple/ml-aim/tree/main AIM์€ LLM์— ์˜๊ฐ์„ ๋ฐ›์•„ ๋งŒ๋“ค์–ด์ง„ ๋Œ€๊ทœ๋ชจ vision ๋ชจ๋ธ์ด๋‹ค. BEiT (2021), Masked autoencoder(MAE) (2021) ๋“ฑ์ด masked language modeling (MLM)์„ ํ†ตํ•ด ์‚ฌ์ „ํ•™์Šต ์‹œํ‚จ ๊ฒƒ๊ณผ ๋‹ค๋ฅด๊ฒŒ, ์ฃผ์–ด์ง„ ํŒจ์น˜๋กœ ๋‹ค์Œ ํŒจ์น˜๋ฅผ ์˜ˆ์ธกํ•˜๋Š” autoregressive object๋ฅผ ์ด์šฉ..

A Collection of Korean Open-Source Multimodal Models (image-text)

ํ˜น์€ awesome-korean-multimodal ๊ฐ™์€๊ฒƒ ์‚ฌ์‹ค ํ•œ๊ตญ์–ด LLM๋„ ๋งŽ์ด ์—†๊ฑฐ๋‹ˆ์™€, ์˜คํ”ˆ์†Œ์Šค๋กœ ๊ณต๊ฐœ๋œ ํ•œ๊ตญ์–ด ๋ฉ€ํ‹ฐ๋ชจ๋‹ฌ LLM(MLLM)์€ ์ •๋ง ์–ผ๋งˆ ์•ˆ๋˜๋Š”๋“ฏ ํ•˜๋‹ค. (์ฐธ๊ณ : ํ•œ๊ตญ์–ด LLM ๋ชจ๋ธ ๋ชจ์Œ - awesome-korean-llm) GitHub - NomaDamas/awesome-korean-llm: Awesome list of Korean Large Language Models. Awesome list of Korean Large Language Models. Contribute to NomaDamas/awesome-korean-llm development by creating an account on GitHub. github.com ํ•œ๊ตญ์–ด multimodal llm ๋ฟ๋งŒ ์•„๋‹ˆ๋ผ m..

Apple์˜ Multimodal LLM Ferret ๋…ผ๋ฌธ ๋ฆฌ๋ทฐ

Apple์—์„œ 2023๋…„ 10์›” ๋‚ด๋†“์€ Multimodal LLM์ธ Ferret์˜ ๋…ผ๋ฌธ์ด๋‹ค. ๋ชจ๋ธ ํฌ๊ธฐ๋Š” 7B, 13B ๋‘๊ฐ€์ง€์ด๋ฉฐ Github์— ์ฝ”๋“œ์™€ checkpoint๊ฐ€ ๊ณต๊ฐœ๋˜์–ด ์žˆ๊ณ , ๋น„์ƒ์—…์  ์šฉ๋„๋กœ ์‚ฌ์šฉ๊ฐ€๋Šฅํ•˜๋‹ค. ๋…ผ๋ฌธ ๋งํฌ: https://arxiv.org/pdf/2310.07704.pdf Github: https://github.com/apple/ml-ferret GitHub - apple/ml-ferret Contribute to apple/ml-ferret development by creating an account on GitHub. github.com Introduction Vision-language learning ๋ชจ๋ธ์˜ ์ฃผ์š”ํ•œ ๋‘ capability๋Š” referring๊ณผ groun..

[Optuna] Optimizing Deep Learning Hyperparameters

Optuna๋Š” ํŒŒ์ด์ฌ ๊ธฐ๋ฐ˜์˜ ํ•˜์ดํผํŒŒ๋ผ๋ฏธํ„ฐ ์ตœ์ ํ™” (hyperparameter optimization) ํ”„๋ ˆ์ž„์›Œํฌ๋กœ, ์‹ฌํ”Œํ•˜๊ณ  ์œ ์—ฐํ•œ API๋ฅผ ์ œ๊ณตํ•œ๋‹ค. ๋ณธ ๊ธ€์—์„œ๋Š” Optuna์˜ ์ฃผ์š” ๊ธฐ๋Šฅ๊ณผ ์‚ฌ์šฉ๋ฐฉ๋ฒ•์„ ๊ฐ„๋‹จํžˆ ์†Œ๊ฐœํ•˜๊ณ ์ž ํ•œ๋‹ค. ๊ณต์‹ Docs: https://optuna.readthedocs.io/en/stable/index.html Optuna: A hyperparameter optimization framework โ€” Optuna 3.4.0 documentation ยฉ Copyright 2018, Optuna Contributors. Revision 4ea580fc. optuna.readthedocs.io Basic concepts Optuna๋Š” study์™€ trial์„ ๋‹ค์Œ๊ณผ ๊ฐ™์ด ์ •์˜ํ•œ๋‹ค. Stu..

[PyTorch] How Autograd Works

์œ„ ๋™์˜์ƒ์—์„œ PyTorch Autograd๋ฅผ ์ดํ•ดํ•˜๊ธฐ ์‰ฝ๊ฒŒ ์„ค๋ช…ํ•ด์ฃผ๊ณ  ์žˆ๋‹ค. ๋‹ค์Œ์€ ์œ„ ๋™์˜์ƒ์„ ๊ฐ„๋‹จํžˆ ์ •๋ฆฌํ•œ ๊ธ€์ด๋‹ค. 1. torch.Tensor ๊ฐ tensor์€ ๋‹ค์Œ์˜ attr์„ ๊ฐ–๋Š”๋‹ค data: tensor์˜ ๊ฐ’ grad: tensor์˜ gradient ๊ฐ’. is_leaf์ธ ๊ฒฝ์šฐ์—๋งŒ gradient๊ฐ€ ์ž๋™์œผ๋กœ ์ €์žฅ๋œ๋‹ค. grad_fn: gradient function. ํ•ด๋‹น tensor๊ฐ€ ์–ด๋–ค ์—ฐ์‚ฐ์„ ํ†ตํ•ด forward๋˜์—ˆ๋Š”์ง€์— ๋”ฐ๋ผ ๊ฒฐ์ •๋œ๋‹ค. ex) a * b = c ์ธ ๊ฒฝ์šฐ c์˜ grad_fn์€ MulBackward์ด๋‹ค. is_leaf์ธ ๊ฒฝ์šฐ None is_leaf: (backward ๊ธฐ์ค€) ๊ฐ€์žฅ ๋งˆ์ง€๋ง‰ tensor์ธ์ง€ requires_grad: ๊ณ„์‚ฐ ๊ทธ๋ž˜ํ”„์˜ ์ผ๋ถ€๋กœ ๋“ค์–ด๊ฐˆ ๊ฒƒ์ธ์ง€ 2. gr..

[๋Ÿฌ๋‹ ์ŠคํŒŒํฌ] ๋ฐ์ดํ„ฐํ”„๋ ˆ์ž„ ์—ฐ์‚ฐ๊ณผ ์ „์ฒ˜๋ฆฌ

spark์˜ ๋ฐ์ดํ„ฐํ”„๋ ˆ์ž„ ์—ฐ์‚ฐ๋“ค์„ ์ด์šฉํ•ด ๋ฐ์ดํ„ฐ ์ „์ฒ˜๋ฆฌ, ๋ณ€ํ™˜, ํ†ต๊ณ„ ๋“ฑ ๋‹ค์–‘ํ•œ ์ผ์„ ์ˆ˜ํ–‰ํ•  ์ˆ˜ ์žˆ๋‹ค. ๋‹ค์Œ์€ ๋ช‡๊ฐ€์ง€ ์—ฐ์‚ฐ๋“ค๊ณผ ํ™œ์šฉ ์˜ˆ์‹œ์ด๋‹ค. ํ”„๋กœ์ ์…˜๊ณผ ํ•„ํ„ฐ df = df.select(df.colA, df.colB) # ํ”„๋กœ์ ์…˜ (colA์™€ colB๋งŒ ์„ ํƒ) df = df.where(df.colB 10000")) # colA์˜ ๊ฐ’์ด 10000์ด์ƒ์ด๋ฉด True๋ฅผ ๊ฐ–๋Š” column largeA๋ฅผ ์ถ”๊ฐ€ df = df.drop("colA") # colA ์‚ญ์ œ ์ฐธ๊ณ ) alias์™€..

[๋Ÿฌ๋‹ ์ŠคํŒŒํฌ] ๋ฐ์ดํ„ฐํ”„๋ ˆ์ž„ ์ฝ๊ณ  ๋‚ด๋ณด๋‚ด๊ธฐ

๊ตฌ์กฐํ™”๋œ ์™ธ๋ถ€ ๋ฐ์ดํ„ฐ ์†Œ์Šค์—์„œ ๋ฐ์ดํ„ฐ๋ฅผ ์ฝ์–ด Spark ๋ฐ์ดํ„ฐํ”„๋ ˆ์ž„์œผ๋กœ ๋กœ๋“œํ•˜๊ณ , ํŠน์ • ํฌ๋งท์œผ๋กœ ๋ฐ์ดํ„ฐํ”„๋ ˆ์ž„์˜ ๋ฐ์ดํ„ฐ๋ฅผ ์จ์„œ ๋‚ด๋ณด๋‚ด๊ธฐ ์œ„ํ•ด DataFrameReader์™€ DataFrameWriter ์ธํ„ฐํŽ˜์ด์Šค๋ฅผ ์‚ฌ์šฉํ•  ์ˆ˜ ์žˆ๋‹ค. pyspark.sql.DataFrameReader โ€” PySpark 3.5.0 documentation Interface used to load a DataFrame from external storage systems (e.g. file systems, key-value stores, etc). Use SparkSession.read to access this. Changed in version 3.4.0: Supports Spark Connect. spark.apache.or..

[๋Ÿฌ๋‹ ์ŠคํŒŒํฌ] Column๊ณผ Row

์ปฌ๋Ÿผ Column ์ŠคํŒŒํฌ ๋ฐ์ดํ„ฐํ”„๋ ˆ์ž„์—์„œ๋Š” Column์˜ ์ด๋ฆ„์„ ์ด์šฉํ•ด ๋‹ค์–‘ํ•œ ์—ฐ์‚ฐ์„ ์ˆ˜ํ–‰ํ•  ์ˆ˜ ์žˆ๋‹ค. pyspark.sql.Column โ€” PySpark 3.5.0 documentation A column in a DataFrame. Changed in version 3.4.0: Supports Spark Connect. Select a column out of a DataFrame >>> df.name Column >>> df[โ€œnameโ€] Column spark.apache.org Pyspark์—์„œ column์— ์ ‘๊ทผํ•˜๋Š” ๋ฐฉ์‹์€ ์—ฌ๋Ÿฌ ๊ฐ€์ง€๊ฐ€ ์žˆ๋Š”๋ฐ, ํ•˜๋‚˜๋Š” col("columnName") ํ•จ์ˆ˜๋ฅผ ์‚ฌ์šฉํ•˜๋Š” ๊ฒƒ,๋‹ค๋ฅธ ํ•˜๋‚˜๋Š” df.columnName์„ ์‚ฌ์šฉํ•˜๋Š” ๊ฒƒ์ด๋‹ค. ๋‹ค์Œ์€ Column์„ ์ด์šฉํ•œ ์—ฐ์‚ฐ์˜..

๋ฐ˜์‘ํ˜•