Artificial intelligence is one of the most remarkable technologies of recent years. One of its most popular products is the ChatGPT chatbot. Used by millions of people around the world, the application has already caused significant changes in many areas. What's more, ChatGPT's reach looks set to expand even further in the coming days, because OpenAI, the tech company that developed ChatGPT, recently introduced a language model called "GPT-4" that makes the app much more "intelligent and humanlike".

With GPT-4, ChatGPT is claimed to offer more advanced features than before. Thanks to the new language model, the AI chatbot is said to give much more accurate answers and to have become more reliable. Moreover, ChatGPT can now also process visual input, and instead of directly answering some questions, it can "guide" the user toward learning about the relevant topic. Let's look at the details together.

ChatGPT, which has millions of users worldwide, previously used a language model called GPT-3.5.

Although this language model is considered one of the most advanced artificial intelligence technologies, it had several drawbacks. It sometimes gave wrong answers, and, like other language models in use, it could produce offensive or harmful text.

OpenAI recently announced the availability of a new language model called GPT-4.

ChatGPT is claimed to be much more useful and secure with GPT-4. The company said, "We have created GPT-4, the latest milestone in deep learning scaling efforts." According to OpenAI, the AI app has become "more creative and collaborative than ever before" and will "solve complex problems with greater accuracy" compared to previous versions.

With the new language model, ChatGPT's likelihood of responding to requests for disallowed content has decreased by 82 percent, according to a statement from OpenAI.

In addition, the company claims that the application is 40 percent more likely to give accurate, factual answers. The new language model has also brought the application many new features.

ChatGPT can also process visual input thanks to GPT-4.

In earlier versions of ChatGPT, you could communicate only through text. With the GPT-4 language model, however, the AI chatbot can recognize shapes and text in images. This innovation is expected to pave the way for further advances in artificial intelligence technologies.

It is stated that with GPT-4, ChatGPT has become more "humanlike"

According to the announcement, ChatGPT can now perform at a "human level" in a variety of professional and academic fields. The company reports that the model was given various exams and scored among the top 10 percent of test takers on some of them.

ChatGPT can act "like a teacher"

In an image posted by OpenAI, the app can be seen giving the user various directions for learning the subject instead of directly answering a math question.

GPT-4 is currently available only to ChatGPT Plus subscribers.

A ChatGPT Plus subscription costs $20 per month. In addition, applications such as Duolingo, Be My Eyes and Khan Academy are stated to have integrated the GPT-4 model. Tech giant Microsoft, meanwhile, has announced that the Bing search engine's chat feature is built on GPT-4.
