I'm having a hard time grasping how a system could store and index so much information locally; the sheer volume of data is daunting in itself. The other thing I'm curious about is how much of the response is plagiarized or reworded from existing text, versus genuine synthesis: collated from various sources, checked for consistency, and then distilled to the best information without being redundant or missing important tidbits.
On Tuesday, January 24, 2023 at 11:03:01 AM UTC-8 wyager wrote:
> All of GPT-3's world-knowledge is embedded into its parameters. Whatever
> it tells you is the equivalent of a human telling you something "from
> memory". It takes only a few seconds for it to respond to a prompt.
>
> Connecting AIs to external data sources is, I imagine, under active
> research, but none of the current LLMs use external data.
>
> On Jan 24, 2023, at 13:24, gregebert <[email protected]> wrote:
> >
> > John - How long did it take for 'HAL' to compose the response? That
> > would give a lot of clues about how far-and-wide it searches for info. I'm
> > guessing it took at least several minutes.

--
You received this message because you are subscribed to the Google Groups "neonixie-l" group.
To unsubscribe from this group and stop receiving emails from it, send an email to [email protected].
To view this discussion on the web, visit https://groups.google.com/d/msgid/neonixie-l/360c505b-7b50-4230-879d-0a31b1bf6b85n%40googlegroups.com.
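For anyone wondering how "all the knowledge fits in the parameters" squares with storing it locally, here's a rough back-of-the-envelope sketch. The 175 billion figure is GPT-3's published parameter count; the bytes-per-parameter values are common storage precisions, not anything specific to how OpenAI actually serves the model:

```python
# Back-of-the-envelope: raw storage needed just to hold GPT-3-scale
# parameters, at a few common numeric precisions.
n_params = 175_000_000_000  # GPT-3's published parameter count

for name, bytes_per_param in [("float32", 4), ("float16", 2), ("int8", 1)]:
    gb = n_params * bytes_per_param / 1e9
    print(f"{name}: {gb:,.0f} GB")
```

So even at full 32-bit precision the weights are on the order of hundreds of gigabytes, not petabytes: the training data is compressed into the parameters rather than stored or indexed verbatim, which is why responses come back in seconds with no searching.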
