Trend 현자의 돌

현자의 돌

[Convergence Research Team Museong Kim] Recently, researchers at Stanford published a paper titled "On the Opportunities and Risks of Foundation Models" [1]. Here, a foundation model refers to a deep learning model, such as GPT-3, that is pretrained on large-scale data and then adapted to downstream tasks through transfer learning. Natural language processing…

Interaction, Trend 현자의 돌

Instruction tuning – FLAN

[Convergence Research Team Hongmae Shim] If we had to pick the top 10 NLP keywords of 2020, GPT-3 ("Language Models are Few-Shot Learners") would certainly be among them. To this day, the enormous parameter count and excellent performance of GPT-3 are still, in the field of NLP…
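The core idea behind instruction tuning (as in FLAN) is to reformat ordinary supervised examples into natural-language instructions before fine-tuning. A minimal sketch of that reformatting step, with illustrative templates and task names that are not FLAN's actual ones:

```python
# Sketch of instruction-style prompt construction, as done conceptually in
# instruction tuning. The templates and task names below are illustrative
# assumptions, not the actual FLAN templates.

def to_instruction_example(task: str, text: str, label: str) -> dict:
    """Wrap a plain (input, label) pair in a natural-language instruction."""
    templates = {
        "sentiment": "Is the sentiment of the following review positive or negative?\n\n{text}",
        "nli": "Does the premise entail the hypothesis?\n\n{text}",
    }
    prompt = templates[task].format(text=text)
    return {"prompt": prompt, "target": label}

example = to_instruction_example("sentiment", "The movie was wonderful.", "positive")
print(example["prompt"])
print(example["target"])
```

A model fine-tuned on many tasks rewritten this way can then be prompted with unseen instructions at inference time, which is what gives instruction tuning its zero-shot appeal.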

Interaction, Trend 현자의 돌

GPT-3 based game NPC demo

Comparing games from 10 years ago with today's games, there are many differences, especially in terms of graphics: for example, 4K or higher resolutions, sophisticated graphic textures, natural 3D model animations, physics engines, lighting engines, etc...

Interaction, Code, Data 현자의 돌

GPT-Neo: Open Source GPT-3 Project

OpenAI's GPT-3 is a large language model with up to 175B parameters. Despite GPT-3's surprising results, it is not open source, so if you want to try it, you have to go through a site such as AI Dungeon or Philosopher AI…
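GPT-Neo's checkpoints are published by EleutherAI on the Hugging Face Hub, so they can be loaded with the `transformers` library. A minimal sketch, assuming `transformers` is installed; the generation settings are illustrative:

```python
# Sketch: text generation with the open-source GPT-Neo model via the
# Hugging Face `transformers` pipeline API. The 1.3B checkpoint is shown;
# 125M and 2.7B variants also exist on the Hub.
from transformers import pipeline

MODEL_NAME = "EleutherAI/gpt-neo-1.3B"

def generate(prompt: str, max_new_tokens: int = 40) -> str:
    """Download the model on first use and continue the given prompt."""
    generator = pipeline("text-generation", model=MODEL_NAME)
    out = generator(prompt, max_new_tokens=max_new_tokens, do_sample=True)
    return out[0]["generated_text"]

# Example usage (downloads several GB of weights on first run):
# print(generate("The philosopher's stone is"))
```

Unlike the hosted demos above, this runs entirely locally, at the cost of the model download and the GPU or CPU memory needed to hold the weights.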

Visual, Interaction 현자의 돌

OpenAI DALL-E: Creating images from text

DALL-E, released by OpenAI, is a technology that generates images from natural language text. Earlier technologies such as StackGAN and OP-GAN pursued the same goal, but DALL-E has the advantage that the quality of the final result is remarkably good, because it is built on GPT-3, a very large-scale language model.

Interaction, Trend 현자의 돌

Can BERTology understand language?

Large-scale deep learning language models, represented by BERT, show excellent performance in various natural language tasks such as question answering, document summarization, document generation, and dialogue. In particular, the recently released GPT-3 is an artificial general intelligence (AGI)…