'Stochastic parrot' is an oxymoron: parrots are notoriously deterministic,
but perhaps the epithet's effectiveness lies precisely in that
juxtaposition.
Nobody knows what really goes on in those kitchens. I like to think they
are busy trying to avoid 'verbatim' sequences and having to pay the price
to the NYT and everyone else.
G.

On Thu, 22 Feb 2024 at 20:37, Daniela Tafani <daniela.taf...@unipi.it>
wrote:

> On February 20, 2024, an optimization to the user experience introduced a
> bug with how the model processes language.
>
> LLMs generate responses by randomly sampling words based in part on
> probabilities. Their “language” consists of numbers that map to tokens.
>
> In this case, the bug was in the step where the model chooses these
> numbers. Akin to being lost in translation, the model chose slightly wrong
> numbers, which produced word sequences that made no sense. More
> technically, inference kernels produced incorrect results when used in
> certain GPU configurations.
>
> Upon identifying the cause of this incident, we rolled out a fix and
> confirmed that the incident was resolved.
> Posted 18 hours ago. Feb 21, 2024 - 17:03 PST
>
>
> https://openai.statuspage.io/incidents/ssg8fh7sfyz3
>
> As many have observed, the definition of ChatGPT given by OpenAI's own
> engineers is exactly what is meant by "stochastic
> parrot".
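For anyone wondering what "randomly sampling words based in part on
probabilities" over "numbers that map to tokens" looks like in practice,
here is a minimal sketch in Python, assuming a toy vocabulary and a plain
temperature-softmax sampler (every name and number below is invented for
illustration; OpenAI's actual inference kernels are of course far more
involved). It also hints at why "slightly wrong numbers" at that step
produce word sequences that make no sense.

# A toy illustration of probability-based token sampling -- not OpenAI's code.
import math
import random

# Each token id (a number) maps to a piece of text.
VOCAB = {0: "the", 1: "model", 2: "speaks", 3: "nonsense", 4: "clearly"}

def softmax(logits, temperature=1.0):
    """Turn raw scores (logits) into a probability distribution."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)                      # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def sample_token(logits, temperature=1.0):
    """Randomly pick a token id according to its probability."""
    probs = softmax(logits, temperature)
    return random.choices(range(len(probs)), weights=probs, k=1)[0]

random.seed(0)

# Suppose the model strongly favours token 4 ("clearly") at this step.
logits = [0.1, 0.2, 0.3, 1.0, 4.0]
print(VOCAB[sample_token(logits)])       # usually "clearly"

# If the kernel hands the sampler slightly wrong numbers, it still draws
# happily from the corrupted distribution -- and out comes gibberish.
corrupted = [x + random.uniform(-3.0, 3.0) for x in logits]
print(VOCAB[sample_token(corrupted)])    # may well be "nonsense" or "the"

None of this is meant to describe what actually broke on the 20th, only
the step in which the incident report locates the bug.
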
_______________________________________________
nexa mailing list
nexa@server-nexa.polito.it
https://server-nexa.polito.it/cgi-bin/mailman/listinfo/nexa
