[Prior Research Team Sung-Hyun Kim]
Causal language models (e.g., GPT-3), which made a big impact on natural language processing, have now been applied beyond natural language to programming languages! 🤗
Last June, GitHub and OpenAI revealed Copilot, which they developed jointly.
Just as GPT-3 does with natural language, Copilot analyzes and understands the context of the given code, and then generates appropriate code.
You can apply for access to Copilot through the link above.
Once granted access, you can try it either through the API or through the VS Code plugin.
So let's do a test, shall we?
First, I had it write code that generates natural language using Hugging Face's GPT-2!
Since I asked it to use a specific library by name, this would be an impossible task without knowledge of that library, right?
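For reference, here is a minimal sketch of that kind of GPT-2 generation script, written by hand with the Hugging Face `transformers` pipeline API (the prompt is an illustrative placeholder, and this is not Copilot's exact output):

```python
# Sketch of a GPT-2 text-generation script using the Hugging Face
# transformers pipeline API -- illustrative, not Copilot's exact output.

PROMPT = "Deep learning is"  # hypothetical example prompt


def generate_text(prompt: str, max_length: int = 50) -> str:
    # Import inside the function so the file can be read and imported
    # even on a machine without transformers installed.
    from transformers import pipeline

    generator = pipeline("text-generation", model="gpt2")
    outputs = generator(prompt, max_length=max_length, num_return_sequences=1)
    return outputs[0]["generated_text"]


if __name__ == "__main__":
    print(generate_text(PROMPT))
```

Running it downloads the `gpt2` weights on first use and prints a continuation of the prompt.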
Very well created! 🙂
Next, on to algorithms: let's have it create sorting algorithms!
It has some errors, but it still generates code very close to what you want!
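For comparison, this is the kind of sorting algorithm one would expect for such a prompt — a hand-written quicksort for illustration, not Copilot's (slightly buggy) output:

```python
# A hand-written quicksort, shown for reference -- the kind of sorting
# algorithm such a prompt asks for, not Copilot's actual output.
def quicksort(items):
    if len(items) <= 1:
        return items                      # already sorted
    pivot = items[len(items) // 2]        # middle element as pivot
    left = [x for x in items if x < pivot]
    middle = [x for x in items if x == pivot]
    right = [x for x in items if x > pivot]
    return quicksort(left) + middle + quicksort(right)


print(quicksort([3, 6, 1, 8, 2, 9, 4]))  # → [1, 2, 3, 4, 6, 8, 9]
```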
In addition, let's test regular expressions, the crown jewel of natural language preprocessing.
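As an example of the kind of regex task you might hand to Copilot, here is a small hand-written sketch (the cleaning task is chosen for illustration): stripping URLs from text and collapsing whitespace, a common preprocessing step.

```python
import re

# Illustrative preprocessing task: remove URLs, then collapse runs of
# whitespace -- the kind of regex one might ask Copilot to write.
URL_PATTERN = re.compile(r"https?://\S+")


def clean_text(text: str) -> str:
    text = URL_PATTERN.sub("", text)          # drop http(s) URLs
    return re.sub(r"\s+", " ", text).strip()  # collapse whitespace


print(clean_text("see  https://example.com   for details"))  # → "see for details"
```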
Actually, these days I code with Copilot enabled all the time.
It feels like pair programming with a partner who is very good at coding 🙂
Copilot is based on Codex, developed by OpenAI, which is said to have been trained on several terabytes of GitHub code.
A demo of Codex was recently released, and it is said to show great results, as in the video below.