Sharing our perspectives on timely topics

Marketing, money and technology: behind the scenes of GPT-3
Over the past six months, GPT-3, a language model that uses deep learning to produce human-like text, has hit the headlines. Some of those articles were even written by GPT-3 itself. Among other terms, the machine has been described as “stunning”, a “better writer than most humans”, but also a bit “frightening”. From poetry to human-like conversation, its capabilities appear limitless… but are they really? How does GPT-3 work, and what does it say about the future of artificial intelligence?
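Under the hood, GPT-3 is an autoregressive language model: it produces text one token at a time, each new token predicted from everything written so far. The sketch below illustrates that principle with the openly released GPT-2 model (a smaller predecessor) through the Hugging Face transformers library, since GPT-3 itself is reachable only via OpenAI's API; the prompt string and sampling settings are illustrative assumptions, not the setup behind any of the articles mentioned above.

```python
# Minimal sketch of autoregressive text generation, the principle behind GPT-3.
# GPT-2 stands in here because GPT-3 is only available through OpenAI's API;
# the prompt and sampling settings are arbitrary choices for illustration.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "Language is"
inputs = tokenizer(prompt, return_tensors="pt")

# Generate up to 40 new tokens, sampled one at a time,
# each conditioned on the prompt plus the tokens produced so far.
outputs = model.generate(
    **inputs,
    max_new_tokens=40,
    do_sample=True,
    top_p=0.9,
    pad_token_id=tokenizer.eos_token_id,
)

print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Each run yields a different continuation because the model samples from its predicted probability distribution over the vocabulary rather than always picking the single most likely token.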
Want more? Discover some papers "in a nutshell" below.
New EEG evidence on sentence production
Some languages require less neural activity than others, but they are not necessarily the ones we would expect. In a study published today in the journal PLOS Biology, researchers at the University of Zurich have shown that languages often considered “easy” actually demand an enormous amount of work from our brains.
Bonobos are sensitive to joint commitments
When bonobos are abruptly interrupted during a social activity with another bonobo, they resume it with the same partner as soon as the interruption is over.
Dwarf mongooses may combine call units to create new meanings
Dwarf mongooses appear to produce a complex call that combines distinct call units.