Sharing our perspectives on timely topics
Over the past six months, GPT-3, a language model that uses deep learning to produce human-like text, has hit the headlines. Some of the articles have even been written by GPT-3 itself. Among other terms, the machine has been described as “stunning”, a “better writer than most humans”, but also a bit “frightening”. From poetry to human-like conversation, its capabilities appear infinite… but are they really? How does GPT-3 work, and what does it say about the future of artificial intelligence?
Want more? Discover some papers “in a nutshell” below.
Some languages require less neural activity than others, but they are not necessarily the ones we would imagine. In a study published today in the journal PLOS Biology, researchers at the University of Zurich show that languages often considered “easy” actually demand an enormous amount of work from our brains.
Dwarf mongooses appear to produce a complex call that may be a combination of distinct call units.