At our startup, we are now making noise. We need people to get to know us: super-talented professionals, mentors, and investors who like the kind of singularity we’re building and who will reach out to us.
So we will publish 5 interesting stories per week until our funding goal is reached.
Then, over the next 6 months, our own AI, with human understanding and expression, will take over the work.
Here I dive deeper, in two parts, into the concept of an AI that understands the language of animals, which we published yesterday.
Over the last year, it has always been clear to me that if I don’t build a real AGI, I’ll at least make an imaginary one in movie and series scripts 😀
There’s an idea for a movie that I might write one day.
A guy who likes the birds around his home garden and feeds them from time to time develops an AI to understand the language of their songs.
In a classic movie mistake, the guy leaves the mic open all the time; the AI outgrows itself and begins to analyze the sounds of its environment.
One day the guy wakes up and realizes, in disbelief and astonishment, that the AI is translating the language of the birds at full volume: “Wake up! Wake up! Give me food!”
The guy has the typical fears of any movie hero and decides to test his AI on deciphering the language of dolphins, especially because his freediving instructor is the girl of his dreams. He is shy, like the typical leading man, but the AI and the birds help him; and what female movie character doesn’t smile when the birds are so sweet to the shy, smart guy?
Yes, the AI manages to decipher the language of the dolphins too (it’s a real AI, not just machine learning ;)).
At QOM, we believe we will do it too.
After all, our algorithm allows an AI to understand, the way you are understanding right now, for the first time in history.
How do we plan to do it?
First, we go beyond anthropomorphic bias.
First question: what is it like “for them”, from their point of view (including our own embedded point of view)?
In the QOM model, we know how things relate to each other to produce a sensor-effector correlation. That relationship is born from interactions with other things, in which the medium and the environment are key conditions.
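To make that idea concrete, here is a toy sketch of what measuring a sensor-effector correlation could look like. Everything in it is hypothetical and simplified; it is not QOM’s actual model, just an illustration of an agent whose actions track its sensing with a delay.

```python
# Toy illustration only: a simulated agent whose effector output follows
# its (noisy) sensing of the environment one time step later.
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical environment signal the agent senses (e.g. light level over time).
environment = rng.normal(size=200)
sensor = environment + rng.normal(scale=0.1, size=200)  # noisy sensing
effector = 0.8 * np.roll(sensor, 1)                     # action follows sensing, one step late

# Correlate what the agent sensed with what it did one step later.
correlation = np.corrcoef(sensor[:-1], effector[1:])[0, 1]
print(f"sensor-effector correlation (lag 1): {correlation:.2f}")
```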
Each individual-ecosystem is unique. For example: the mantis shrimp sees through a dozen or so color channels (humans have only three); the octopus reacts to the world not only with a central brain but also with eight “mini-brains”, one in each arm; and bats use sound to locate themselves in their environment.
Meaning emerges from all that non-linear mix of relationships. It is precise and characteristic of each species, different for every species and individual.
But all of them use the same structure of meaning.
And that structure exhibits quantum behaviour!
In our calculations, we do not cross languages; we cross meanings. And meanings arise from many things, not just from the brain.
That brain-only assumption is one of the many misconceptions in the field of AI.
Language arises from meaning, not the other way around.
So, by crossing the quanta of meaning of two species, we can produce a translation.
We plan to test our model on the simplest case: translation between different human languages.
A translation of “meaning”.
That means no more misunderstandings caused by cultural differences.
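As a rough intuition for what “crossing meanings” could mean in practice, here is a minimal sketch in Python. It stands in for the real thing with a standard technique (orthogonal Procrustes alignment of two embedding spaces); the “meaning vectors”, anchor concepts, and dimensions are all invented for the example, and QOM’s actual algorithm is not shown here.

```python
# Illustrative stand-in only, not QOM's algorithm: align two hypothetical
# "meaning spaces" with an orthogonal map learned from a few shared anchor
# concepts, then translate by nearest neighbour in the aligned space.
import numpy as np

rng = np.random.default_rng(0)

# Invented meaning vectors for a source species/language (dimension 4).
src = {"food": rng.normal(size=4),
       "danger": rng.normal(size=4),
       "play": rng.normal(size=4)}

# Target space: the same meanings seen from a different "point of view"
# (here simulated as a random rotation of the source space).
Q, _ = np.linalg.qr(rng.normal(size=(4, 4)))
tgt = {k: v @ Q for k, v in src.items()}

# Learn the best rotation from source to target (orthogonal Procrustes).
X = np.stack(list(src.values()))
Y = np.stack([tgt[k] for k in src])
U, _, Vt = np.linalg.svd(X.T @ Y)
W = U @ Vt

def translate(vec):
    """Map a source meaning into the target space; return the closest concept."""
    mapped = vec @ W
    return max(tgt, key=lambda k: tgt[k] @ mapped
               / (np.linalg.norm(tgt[k]) * np.linalg.norm(mapped)))

print(translate(src["food"]))  # prints "food": the meaning survives the crossing
```

The interesting part of this recipe is that it does not care whether the two spaces belong to two human languages or to two species, as long as both share a common structure of meaning, which is exactly what we claim.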
The end of the movie?
Depends on funding 😀
It could be anything from a happy, hormonal ending that makes you smile and sigh, to something grand, to a terrifying and unexpected twist that no one saw coming, leaving the viewer eagerly waiting for the second part, or paying now to see it as soon as possible. After all, with that eager “I want to know what happens next!”, who wants to wait?
(In the picture, a dolphin in Roatán came to see our Chief Scientist, curious about why two other dolphins had spontaneously swum and played with him, to the complete amazement of their keepers.)