
EDMONTON, January 12, 2021 – I am completely fascinated with a new technology recently released by OpenAI, the organization originally founded by Elon Musk (Tesla CEO) and Sam Altman (Y Combinator co-founder). OpenAI has produced some amazing inventions over the last few years, but nothing comes close to its recent creation of GPT-3, a natural language processing AI system. There are many possible applications for GPT-3 in health. I’ve been experimenting, and I’m impressed.
If you haven’t noticed, AI is everywhere and is being infused into everything. While it’s a bit of a sales gimmick in some cases, in many areas AI is completely altering the product landscape. It’s impossible to buy a product, search for a movie, or listen to music without some sort of AI system observing your choices and optimizing your future experience. AI is driving cars, protecting customers, controlling buildings, running financial systems, optimizing operations, and generally running the planet. Not without people, of course, but with emerging so-called general AI (GAI) systems like GPT-3, that may change.
While not fully autonomous, AI can significantly boost productivity when it’s built into tools. This is especially true for professionals like lawyers, engineers, architects, and artists, so there is a significant incentive for developers to incorporate AI into their applications. AI development, however, has been hampered by the expense and expertise required to build AI into applications. General AI tools address this issue by simplifying the AI development process and reducing the coding requirements.
In the past, building AI systems was incredibly hard and expensive: it required experts with deep knowledge of how AI systems work, along with extensive computer systems to run the AI “training”. For example, IBM’s Deep Blue, the system that defeated chess Grandmaster Garry Kasparov, was built by a large team of experts, on a huge collection of ultra-fast computers, over several years of development. Clearly, this is not an effort possible for the average developer; however, general AI like GPT-3 has the potential to completely change the way AI systems are made. GPT-3 can do amazingly intelligent things without any programming or AI expertise.
Health Tips
I want to build a kind of “Health Advisor” that can provide some health tips. Normally, if I were going to build a health tips database, I would gather up hundreds of tips from various sources and put them into one database. Then I would write some code that could pull up a random tip from the database on demand. With GPT-3 this is not necessary, because GPT-3 can generate health tips for you. Simply by giving GPT-3 some sample health tips, it is able to generalize from its vast knowledge of language to provide novel health tips on demand.
The site https://www.healthline.com/nutrition/27-health-and-nutrition-tips has some great starting tips. My thinking was, if GPT-3 can generalize from these, that would be a great example of general AI. Programming GPT-3 simply requires providing some examples of what you want to use for input and what you want to use for output. GPT-3, as a general AI system, should be able to generalize. Here are a few examples implemented in Python:
gpt.add_example(Example("health => tip:", "Don't drink sugar calories. Sugary drinks are among the most fattening items you can put into your body."))
gpt.add_example(Example("health => tip:", "Eat nuts. Despite being high in fat, nuts are incredibly nutritious and healthy."))
gpt.add_example(Example("health => tip:", "Avoid processed junk food (eat real food instead). Processed junk food is incredibly unhealthy."))
gpt.add_example(Example("health => tip:", "Don't fear coffee. Coffee is very healthy."))
gpt.add_example(Example("health => tip:", "Eat fatty fish. Fish is a great source of high-quality protein and healthy fat."))
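Under the hood, toolkits like gpt3-sandbox concatenate these examples into a single few-shot prompt that is sent to the GPT-3 completion endpoint. Here is a minimal sketch of that idea using my own hypothetical stand-in classes, so it runs with no API key:

```python
class Example:
    """One input/output demonstration pair."""
    def __init__(self, inp, out):
        self.inp = inp
        self.out = out


class FewShotPrompt:
    """Stand-in for the gpt3-sandbox GPT class: collects examples
    and crafts the few-shot prompt text a completion call would receive."""
    def __init__(self):
        self.examples = []

    def add_example(self, ex):
        self.examples.append(ex)

    def craft_prompt(self, query):
        # Each example becomes an input/output pair; the new query is
        # left with an open "output:" for the model to complete.
        parts = [f"input: {ex.inp}\noutput: {ex.out}" for ex in self.examples]
        parts.append(f"input: {query}\noutput:")
        return "\n\n".join(parts)


gpt = FewShotPrompt()
gpt.add_example(Example("health => tip:",
                        "Eat nuts. Despite being high in fat, nuts are incredibly nutritious and healthy."))
prompt = gpt.craft_prompt("health => tip:")
print(prompt.count("health => tip:"))  # prints 2: one example plus the new query
```

The model never sees code, only this text; that is why “programming” GPT-3 amounts to choosing good examples.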
In all, I provided 26 examples like this. Some were complete copies from the web site, while others were only partial copies. Then I ran queries through GPT-3. Each query I send is identical, so what I’m hoping is that GPT-3 will realize I’m looking for unique responses. It did not disappoint.
Input: health => tip:
GPT-3 Response: Don’t eat a lot of processed grains. Grains are nutritious, but they’re also heavily processed. The refining process removes many nutrients, and grains are often refined in unhealthy ways. For this reason, it’s best to avoid processed grains and stick to whole grains instead.
Where did this piece of knowledge come from? I didn’t provide this example. GPT-3 came up with this insight all on its own. What is remarkable about this insight is that the response is both factually correct and grammatically correct. It’s certainly possible that this is an exact copy of some other health tips database, but the fact remains that GPT-3 figured out that this is exactly the kind of output that I was looking for and generated it without me writing any code to get the AI to work.
Input: health => tip:
GPT-3 Response: Don’t eat too much sugar. Eating too much sugar is one of the worst things you can do for your health. It’s strongly linked to numerous metabolic diseases, including obesity, type 2 diabetes, and heart disease.
This response is very interesting because it is close to one of the examples that I gave GPT-3, but not exactly the same. Here is my example: “Minimize your sugar intake. Added sugar is one of the worst ingredients in the modern diet, as large amounts can harm your metabolic health. High sugar intake is linked to numerous ailments, including obesity, type 2 diabetes, heart disease, and many forms of cancer.”
Clearly, GPT-3 understands both that this example is a good response and that it can be summarized.
Input: health => tip:
GPT-3 Response: Eat plenty of whole, unprocessed foods. Whole, unprocessed foods are the healthiest foods you can eat. They’re loaded with fiber, vitamins, minerals, and antioxidants. They’re also low in calories and high in satiating protein and fat.
This tip is unlike any of the tips in the list. GPT-3 made this one up on its own! GPT-3 seems to be indicating that protein and fat are good for you if they come from whole, unprocessed foods. I had to look up “whole foods”, which turns out to be a well-researched area of healthy eating. GPT-3 figured out, on its own, that this entire collection of words is exactly what I was looking for in a “health tip”, clearly demonstrating a kind of intelligence that is human-like.
Mixing Parsing Styles
A health chat bot needs to be able to listen to a user and respond with tips or advice that relate to the user’s needs. Language understanding is a vast field in AI, and GPT-3 is already having a huge impact on it. One example of language understanding is sorting out the nouns from the verbs in a sentence. This is called “part-of-speech” tagging and is usually the first step in a natural language understanding system. What I need GPT-3 to do is convert the input sentence into an output sentence written in a specific syntax that will be easy for an interpreter to understand. Amazingly, GPT-3 can do this without requiring any code.
Here is an example of the kind of interaction I’m looking for:
Input: Today is a lovely day => pos:
Output: noun<Today> verb<is> noun<day>
Note the syntax. If I had to write a program to do this, it would not be particularly hard, but it would require specialized coding skill and time; however, simply by giving GPT-3 several examples of this input and output style, GPT-3 is able to perform the task, correctly interpreting the input and formatting the output, without a single line of code required.
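Because the output follows a strict tag<word> syntax, the downstream interpreter is easy to write. Here is a minimal sketch of such a parser (my own code, not part of any toolkit):

```python
import re

def parse_pos(output):
    """Parse GPT-3's part-of-speech output, e.g. 'noun<Today> verb<is> noun<day>',
    into a list of (tag, word) pairs."""
    return re.findall(r"(\w+)<([^>]+)>", output)

print(parse_pos("noun<Today> verb<is> noun<day>"))
# [('noun', 'Today'), ('verb', 'is'), ('noun', 'day')]
```

The hard part, turning free-form English into that syntax, is exactly what GPT-3 handles for us.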
After creating a part-of-speech program and a health tips program, I was curious to see if GPT-3 could sort out the difference. It turns out that it can. I simply mixed the part-of-speech training examples in with the health tip training examples and, to my amazement, GPT-3 was easily able to distinguish between the two styles of question and respond accordingly.
Here is an example combining two different intelligent formats into one recognizer.
gpt.add_example(Example("A bird is a kind of animal that flies => pos:", "noun<bird> noun<animal> verb<is> verb<flies>"))
gpt.add_example(Example("This car has an engine that runs really well => pos:", "noun<car> noun<engine> verb<has> verb<runs>"))
gpt.add_example(Example("The boy fell to the floor and cried => pos:", "noun<boy> noun<floor> verb<fell> verb<cried>"))
gpt.add_example(Example("health => tip:", "Do some cardio. Doing aerobic exercise, also called cardio, is one of the best things you can do for your mental and physical health."))
gpt.add_example(Example("health => tip:", "Don't smoke or do drugs, and only drink in moderation. If you smoke or abuse drugs, tackle those problems first. Diet and exercise can wait. If you drink alcohol, do so in moderation and consider avoiding it completely if you tend to drink too much."))
gpt.add_example(Example("health => tip:", "Use extra virgin olive oil. Extra virgin olive oil is one of the healthiest vegetable oils. It's loaded with heart-healthy monounsaturated fats and powerful antioxidants that can fight inflammation."))
To use this, you simply append the appropriate key at the end of the statement. You can mix and match your use. For example, for parts of speech, just end the statement with “=> pos:”:

The barn door was opened and the cows walked out => pos:

Or, for health tips, just use:

health => tip:
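On the application side, that same appended key makes it easy to route a user’s query, or GPT-3’s reply, to the right handler. A small hypothetical dispatcher sketch (my own code):

```python
def pick_handler(statement):
    """Route a mixed-format query by the key appended to the statement."""
    s = statement.rstrip()
    if s.endswith("=> pos:"):
        return "part_of_speech"
    if s.endswith("=> tip:"):
        return "health_tip"
    return "unknown"

print(pick_handler("The barn door was opened and the cows walked out => pos:"))  # part_of_speech
print(pick_handler("health => tip:"))  # health_tip
```

GPT-3 does the same kind of routing internally, but from the examples alone, with no dispatch code at all.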
Limitations
With all of its language intelligence, something I found intriguing was GPT-3’s lack of self-awareness about its own input. For example, given a list of words or sentences, GPT-3 has difficulty returning them in sequence. If you request the first sentence, you might get the first one, but you might also get the last one.
gpt.add_example(Example("Bob walked to the car. Bob slipped. Bob fell. Bob hurt himself. => 1:", "Bob walked to the car."))
gpt.add_example(Example("Bob walked to the car. Bob slipped. Bob fell. Bob hurt himself. => 2:", "Bob slipped."))
gpt.add_example(Example("Bob walked to the car. Bob slipped. Bob fell. Bob hurt himself. => 3:", "Bob fell."))
With this input “Bob walked to the car. Bob slipped. Bob fell. Bob hurt himself. => 1:” you might get the first element, but you might not. The point here is that GPT-3 knows a lot, but it doesn’t know everything.
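For a task like sentence indexing, a few lines of ordinary code are far more reliable than the model. A minimal sketch of the kind of external processing I mean (naive splitting on periods, for illustration only):

```python
def nth_sentence(text, n):
    """Return the n-th sentence (1-based) deterministically,
    rather than asking GPT-3 to count for us."""
    sentences = [s.strip() for s in text.split(".") if s.strip()]
    return sentences[n - 1] + "." if 1 <= n <= len(sentences) else ""

story = "Bob walked to the car. Bob slipped. Bob fell. Bob hurt himself."
print(nth_sentence(story, 1))  # Bob walked to the car.
print(nth_sentence(story, 3))  # Bob fell.
```

A practical agent can hand counting and sequencing to code like this and reserve GPT-3 for the genuinely linguistic work.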
Building GPT-3 agents will require pre-processing and external processing to get a fully functional system. My guess is that there are quite a few researchers working with GPT-3 and that we will see even more profound systems emerging over the coming weeks. Stay tuned.
If you are interested in trying out GPT-3, and you have access to a GPT-3 key, you can get the tool kit from Shreya Shankar and friends at (https://github.com/shreyashankar/gpt3-sandbox). I used this toolkit for some experimentation.
Conclusion
General AI seems to be around the corner. GPT-3 training can be generalized to many kinds of patterns, and even though GPT-3 is still not as smart as a human, it’s closer than anything we’ve had. GPT-3 demonstrates an entirely new type of AI that will clearly have a major impact on many different industries in many different ways.