Monday, 20 July 2020

OpenAI GPT-3: Impressive web development on demand


Dictation instead of programming: Developer Sharif Shameem takes advantage of how quickly OpenAI's language AI GPT-3 catches on and teaches it rudimentary web development with just two examples.
In early 2019, OpenAI introduced the language AI GPT-2, on the basis of which developer Jacob Jackson built the code-completion tool Deep TabNine: it offers AI-based auto-completion for programming code.

Jackson only had to train GPT-2 on source code from GitHub, and the AI could then predict, for 22 programming languages, which piece of code would most likely come next in a line.

Deep TabNine showed that language AIs like GPT-2 are more than text generators. Programmers were enthusiastic and described the AI add-on as "amazing", "crazy" and "incredible".

Programming with AI - level two


In May 2020, OpenAI launched GPT-3. The successor to GPT-2 is a hundred times larger, writes even better text and, most importantly, is suited to so-called few-shot learning: GPT-3 masters new tasks with just a few examples, whereas Deep TabNine required two million code excerpts for its training.

This adaptability also stems from the comprehensive basic training: while GPT-2 was trained on 40 gigabytes of text gathered from links shared on Reddit, GPT-3 was trained on a total of 570 gigabytes of text.

Part of the huge data set comes from the Common Crawl text archive. The non-profit organization has been systematically collecting Internet texts since 2011 - regardless of whether it's a blog post, Reddit comment or just a code example.

Websites on demand: GPT-3 as a web developer
Computer scientist Sharif Shameem has now built a kind of AI front-end developer via the newly available commercial GPT-3 interface. Front-end developers mainly build user interfaces for apps and websites. Shameem needed only two code examples to prime the AI, which he fed to GPT-3.
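
Shameem has not published the exact prompt he used, but a few-shot prompt for this kind of task plausibly looks like the sketch below: two hand-written pairs of description and JSX, followed by a new description that GPT-3 is asked to complete. The example descriptions and snippets here are invented for illustration.

description: a button that says "Subscribe"
code: <button>Subscribe</button>

description: a heading that says "Welcome to my site"
code: <h1>Welcome to my site</h1>

description: one button for each color of the rainbow
code:
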
Following these two simple examples, GPT-3 can generate JSX code from plain-English descriptions. For example, given the input "One button for each color of the rainbow", the AI writes the appropriate JSX code for seven differently colored buttons.
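
The article does not reproduce the generated code verbatim; as a rough sketch, GPT-3's answer to the rainbow prompt could look something like the following JSX (the component name and structure are assumptions):

// One <button> per rainbow color, each with its color as background.
const colors = ["red", "orange", "yellow", "green", "blue", "indigo", "violet"];

const RainbowButtons = () => (
  <div>
    {colors.map(color => (
      <button key={color} style={{ backgroundColor: color }}>
        {color}
      </button>
    ))}
  </div>
);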

The instruction "A table of the world's richest countries with the columns Name and GDP" generates a corresponding table. The sentence "A button the color of Donald Trump's hair" produces a yellow button.
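
Again, the exact output is not shown; a plausible JSX sketch for the table prompt, with rows invented for illustration rather than taken from the model, might be:

// Rows are illustrative; GPT-3 fills in its own values.
const countries = [
  { name: "United States", gdp: "21.4 trillion USD" },
  { name: "China", gdp: "14.3 trillion USD" },
  { name: "Japan", gdp: "5.1 trillion USD" },
];

const RichestCountries = () => (
  <table>
    <thead>
      <tr><th>Name</th><th>GDP</th></tr>
    </thead>
    <tbody>
      {countries.map(c => (
        <tr key={c.name}>
          <td>{c.name}</td>
          <td>{c.gdp}</td>
        </tr>
      ))}
    </tbody>
  </table>
);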

The same results are also possible for pure HTML and CSS code, Shameem writes. All that was needed to prime the AI were two corresponding code examples.

Fast-learning AI: the trick is diverse pre-training
GPT-3's extensive pre-training is what makes Shameem's programming AI possible. While Deep TabNine had to be trained with a large number of code examples, GPT-3 already brings a basic understanding of code and language.

A few examples are enough to point the AI in the right direction so that it converts text instructions into suitable code. Thanks to the extensive pre-training, the neural network also carries more contextual knowledge, such as the color of Trump's hair.

No developer needs to fear for their job yet
Shameem's example shows that OpenAI's GPT-3 brings us closer to a future in which developers can tell the computer, even literally, what code to write instead of typing it out line by line. Deep TabNine inventor Jackson describes AI-assisted programming as his vision of the future.

For the time being, an AI tool like Shameem's will not replace jobs, but it will simplify and speed up developers' work by quickly generating rudimentary code.

In the long term, AI tools could revolutionize developer work wherever repetitive programming tasks are commonplace. In addition to web front-end professionals, this also applies to back-end developers, according to Shameem. They would have to "prepare for a wild ride".

Based on his GPT-3 experience, the developer now even expects that a general, versatile AI could be reached in less than ten years; previously he had predicted more than 50 years.
