
EDMONTON, March 23, 2021 – After further experimentation, I’m getting the feeling that GPT-3 knows something about everything. Any fact that has a related Wikipedia page can be accessed by GPT-3, and it does a great job of rewriting the result so that it looks new. But GPT-3 goes well beyond Wikipedia: almost any question you can think of can be answered by GPT-3, not always correctly, but correctly most of the time. In the information-intensive world in which we live, a know-it-all assistant like GPT-3 would be a valuable add-on to any application.
In a previous article, I wrote about my experience getting GPT-3 to generate health tips (https://link.medium.com/2WQkZmOaw9). In that experiment, I wanted GPT-3 to generate its own health tips without being prompted on a specific subject, and it performed remarkably well. The input to GPT-3 was a collection of tips from a health website. From this input alone, GPT-3 was able to invent its own health tips with corresponding explanations.
For these new experiments, I was interested in the ability of GPT-3 to write about a specified subject. The first examples I gave it were of the following form: “Input => write: Output”. Below are some examples in Python.
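The add_example calls below are in the style of the open-source gpt3-sandbox wrapper around the OpenAI API. For context, here is a minimal setup sketch; treat the import path, engine name, and sampling settings as assumptions you may need to adjust for your own environment.

# Minimal setup sketch, assuming the gpt3-sandbox wrapper (GPT and Example classes)
from api import GPT, Example, set_openai_key

set_openai_key('YOUR_OPENAI_API_KEY')  # placeholder key
gpt = GPT(engine='davinci',            # illustrative engine and settings
          temperature=0.5,
          max_tokens=200)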
gpt.add_example(Example('Write on the topic of soup. => write: ', 'Soup is a primarily liquid food, generally served warm or hot, that is made by combining ingredients of meat or vegetables with stock, or water. Hot soups are additionally characterized by boiling solid ingredients in liquids in a pot until the flavors are extracted, forming a broth.'))

gpt.add_example(Example('Write on the topic of meat. => write: ', 'Meat is animal flesh that is eaten as food. Humans have hunted and killed animals for meat since prehistoric times. The advent of civilization allowed the domestication of animals such as chickens, sheep, rabbits, pigs and cattle. Many religions have rules about which meat may or may not be eaten. Vegetarians and vegans may abstain from eating meat because of concerns about the ethics of eating meat, environmental effects of meat production or nutritional effects of consumption.'))
Both of these are from Wikipedia. I added a few more examples in this style and ran GPT-3. Surprisingly, when I started asking, I found that GPT-3 was able to answer questions on absolutely any topic! Here are some examples of questions and GPT-3’s responses:
Input: “Write on the topic of disease. => write:”

GPT-3 Output: “A disease is a particular abnormal condition that negatively affects the structure or function of part or all of an organism. It may be caused by external factors such as pathogens, or it may be caused by internal dysfunctions, such as autoimmune disorders. In humans, “disease” is often used more broadly to refer to any condition that causes pain, dysfunction, distress, social problems, or death to the person afflicted, or similar problems for those in contact with the person. …”
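For reference, a prompt like the one above can be submitted through the same wrapper roughly as follows; get_top_reply is the gpt3-sandbox helper that returns just the completion text, and the call would look different against the raw OpenAI API.

# Sketch: submitting a single "write on the topic of ..." prompt
prompt = 'Write on the topic of disease. => write:'
output = gpt.get_top_reply(prompt)  # text of the top completion
print(output)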
Other topics explored included sports, honey, animals, bears, poop, marriage, god and much more. There doesn’t seem to be a limit.
At this point, the only examples provided to GPT-3 were of the style “Write on the topic of … => write:”. To test the intelligence of GPT-3 I decided to ask a more general question in a completely different format.
Input: “Find the athlete with the best time. => write:”

GPT-3 Output: “Usain Bolt is a Jamaican sprinter. He is the first person to hold both the 100 metres and 200 metres world records since fully automatic time became mandatory. …”
It wrote a whole paragraph. I decided to try for more detail.
Input: “Find the swim athlete with the best time. => write:”

GPT-3 Output: “Michael Phelps has set many world records in long course (50 m) and short course (25 m) swimming events. …”
Again a whole paragraph was produced.
GPT-3 Knows Everything
After more experiments, I found that the limited input I had provided GPT-3 as a starting point affected its ability to answer questions of the type “who is” and “what is”; sometimes the answer was incorrect. Simply adding a few “who is” and “what is” questions as examples made GPT-3 better at answering these types of questions. After making this change and testing for several hours, I found that many of the “who/what/when/where/how” questions I asked returned more complete answers.
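For illustration, the extra examples looked roughly like the sketch below; the specific question/answer pairs are stand-ins of my own, not the exact ones used in the experiments.

# Hypothetical "who is" / "what is" examples added to the prompt
gpt.add_example(Example('Who is Marie Curie? => write: ',
                        'Marie Curie was a Polish-born physicist and chemist who carried out pioneering research on radioactivity.'))
gpt.add_example(Example('What is photosynthesis? => write: ',
                        'Photosynthesis is the process by which green plants use sunlight, water and carbon dioxide to produce oxygen and sugars.'))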
With only eight simple examples, GPT-3 seemed able to figure out how to answer most questions, including simple math questions and questions about places, times, people, and countries. It can even answer questions about import levels, exports, economic statistics, sports and more.
Questions included “What is the age of the earth?”, “How many planets are there?”, “Where is God?”, “What is fusion?”, “What is 2 times 2 times 2?”, “Who wrote Alice in Wonderland?” and many others.
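Put together, running a batch of these questions through the same session looks something like the sketch below, again assuming the gpt3-sandbox wrapper.

# Sketch: querying a list of questions in the "... => write:" format
questions = [
    'What is the age of the earth?',
    'How many planets are there?',
    'What is fusion?',
    'What is 2 times 2 times 2?',
    'Who wrote Alice in Wonderland?',
]
for q in questions:
    print(q)
    print(gpt.get_top_reply(q + ' => write:'))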
Asking the Right Question
As when talking to people, sometimes you have to rephrase your question to get a good answer. If you ask “What are the rules of Go?” GPT-3 may or may not answer fully. Change the question to something more explicit and GPT-3 will happily list all of the rules of Go or any other game. Lists were also problematic at the current level of training. For example, the question “List all the records of the Beatles?” did not produce a complete list, but the question “find the list of all the Beatles albums?” returned the complete list.
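In code, trying both phrasings side by side is straightforward; the sketch below simply replays the two Beatles prompts from above.

# Sketch: two phrasings of the same request
vague = 'List all the records of the Beatles? => write:'
explicit = 'Find the list of all the Beatles albums? => write:'
print(gpt.get_top_reply(vague))     # tended to be incomplete
print(gpt.get_top_reply(explicit))  # returned the complete list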
GPT-3’s Future
GPT-3 is like a savant in that there is no emotion or context attached to what it knows; it simply responds to the question provided. Nevertheless, GPT-3 is extremely intelligent. In some ways it is much smarter than people: it knows about many things, more than any normal person could possibly recall. But it’s not perfect. It doesn’t know the answer to everything, and it can’t do complex reasoning. It forgets your query as soon as it generates a response. It holds a limited window of context and has a limited budget for each exchange (currently 2048 tokens for the combination of input and output).
Despite its limitations, GPT-3 can function as the heart of an intelligent system, and its limits can be overcome with additional programming. Techniques could include multiple answering agents, more training, hybrid AI combining more than one system, and more. With these techniques, super-intelligent agents could become a base component of almost any system. They could be embedded into companions, used in games to simulate non-player characters, or function as help agents guiding users through applications or complex processes.
Conclusion
GPT-3 knows a lot, and its abilities go beyond regurgitating information. GPT-3 seems to have some flexibility in the way it provides its answers based on how questions are asked and the input examples that are provided. This makes GPT-3 very useful as a programming tool for building intelligent applications that require language as a form of input or output. I predict that in the near future we will see an explosion of intelligent interfaces that rely on general AI systems like GPT-3 to navigate the increasingly complex world of information in which we live.