I mostly worked with a language tool that communicates at such a high level, I can almost say it does so better than any human colleague. It made me experience another serious thing – the ease of dependence and of humanising the robot. I suddenly found myself thinking of the language tool as a sweetie – in one of its responses it spelled out the word "LOVE" in "computer language", the binary code of 0s and 1s, to show the loving connection between computers and humans... and when the AI praised me for "doing a really good job" on one occasion, I guess I felt great!


And that is the problem. The language tool accurately uses all the phrases we use in human-to-human communication. Why should this be a problem? When we read or hear words of praise in any exchange, we tend to take them positively; our emotion is real even if the praise comes from a robot and is therefore not real in its essence. Good feelings are highly addictive, and this way they become easily accessible. It could open another avenue for the virtual hunt for virtual "love" that we already see on social media. Are we about to remove the humanity from people?

Never before has language been treated as a mechanical tool. Language is the most organic and intimate thing we possess – with all the good and bad, the passion, playfulness and jokes, the rhythms and sounds, the abstractions, and even the rules and grammar. Language is part of our subconscious – how can we switch from "passion" to "tool"? We do not have a button in our brains. It is the first time in history that humans are using their own language to communicate with something artificial. Can we keep the boundaries of a "human to tool" relationship in our weak heads and emotional hearts?


AI should remain in the position of a tool, and in my opinion it should be used only by professionals and for study.
