Artificial intelligence
Author: Editorial Office



The Internet has been flooded with news about artificial intelligence (AI). Right now the show is being stolen by ChatGPT, the latest software from OpenAI, whose style of expression closely resembles human speech. Despite its growing popularity and expanding technical capabilities, users say it is more of a meme machine than an information tool. Can it pose a threat to employees, or will it become an application that supports them? Let's take a closer look.


The idea behind ChatGPT is not new: it is built on the latest version of the GPT-3 algorithm (the acronym stands for Generative Pre-trained Transformer 3), which automatically completes text based on the prompts it receives.

GPT-3 is an autoregressive model: it generates text one token at a time, predicting each next token from the ones that came before it. In practice, it draws on patterns learned from massive text corpora to produce the content that best answers the questions asked. In short, it reads with understanding better than other software built on a similar mechanism.
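The autoregressive idea described above can be sketched in miniature. The toy bigram table below is a stand-in for GPT-3's billions of learned parameters (the vocabulary and transitions here are invented for illustration, not taken from the real model); what it shares with the real thing is the loop: each next word depends only on the words generated so far.

```python
import random

# Toy "learned" continuations: for each word, the words observed to follow it.
# Purely illustrative -- GPT-3 learns probabilities over ~50k tokens instead.
BIGRAMS = {
    "the": ["model", "text"],
    "model": ["predicts", "generates"],
    "predicts": ["the"],
    "generates": ["text"],
    "text": ["."],
}

def generate(prompt_word, max_words=6, seed=0):
    """Autoregressive generation: repeatedly sample the next word
    conditioned on the last word produced."""
    random.seed(seed)
    words = [prompt_word]
    for _ in range(max_words):
        candidates = BIGRAMS.get(words[-1])
        if not candidates:  # no learned continuation: stop generating
            break
        words.append(random.choice(candidates))
    return " ".join(words)
```

A real transformer conditions on the entire preceding context rather than just the last word, but the generation loop has the same shape.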


Since ChatGPT was made available to a broader audience, more than a million users have created an account. Why has it suddenly become so popular? One of its novelties is its ability to hold a dialogue-like conversation in natural phrases. ChatGPT forms complete sentences, keeps its responses coherent across several threads, and does not break off mid-thought. It answers queries on a wide range of topics: cooking, how-to instructions, programming examples, even writing fairy tales. All of this comes from "training" on human language.

It also has the advantage of responding dynamically: it is programmed to use its algorithm to provide a complete answer rather than quizzing the user for further detail. Previous GPT-3-based applications were designed so that the user triggered the interaction, whereas ChatGPT initiates the interaction itself.

However, it is not a perfect tool. Social media has been flooded with screenshots of the most incredible, weird, or awkward AI conversations. To date, most people use ChatGPT to make someone laugh, satisfy a curiosity, or share an unheard-of, wildly creative story.


While many will agree that ChatGPT stands out for the reliability of its information, it should not be trusted 100%. Users have already caught several critical flaws in its performance. Here are some of them:

  • It does not provide the source of information.
  • It's harder to get an answer to a complicated question (it answers more precisely when the questions are simple, such as when someone famous was born).
  • It often answers incorrectly (Stack Overflow, a popular forum for developers, has temporarily banned ChatGPT-generated answers because of the high rate of incorrect ones).
  • It is pretty good at coding but makes computational mistakes, such as failing to handle fundamental algebra problems.
  • It requires significant computing power to run, making it challenging to implement in some environments.

Nevertheless, of the various tasks it has been given, it accomplishes most. It is undoubtedly helpful for scientific questions; for example, it can support a novice programmer in writing simple code.
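To make the "simple code" claim concrete, here is an illustrative example of the kind of well-known beginner snippet ChatGPT can draft on request (this code is written by us as an example, not actual model output): a factorial function with basic input validation, the sort of exercise a novice might ask about.

```python
def factorial(n: int) -> int:
    """Return n! for a non-negative integer n."""
    if n < 0:
        raise ValueError("n must be non-negative")
    result = 1
    for i in range(2, n + 1):  # multiply 2 * 3 * ... * n
        result *= i
    return result
```

Even for snippets this small, the article's caveat applies: the generated code should be read and understood, not pasted blindly.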


The more advanced artificial intelligence is, the more human-like it will seem, but ChatGPT does not live up to this expectation. During a conversation, ChatGPT insists it is not human and has no thoughts, feelings, or emotions. It says bluntly, "[...] I have no physical form or consciousness [...] My knowledge is limited to the text I was trained on... I am here to help in the best way possible, so if you have any questions, don't hesitate to ask them." Thus, it cannot relieve anyone of their professional duties. Using its prompts without understanding the generated code will not make one a programmer. By the same token, it will not replace a human being, but it will help them improve a program or application, or catch bugs faster than was previously possible.
