On 10/01/2024 16:35, Stephen Loosley wrote:
> The job of a prompt engineer is to skillfully write queries that can be given 
> to a generative AI so that it can provide accurate answers to complex 
> questions. And that is a lot more difficult than it sounds, since AIs have no 
> real understanding of context or subtlety. If a prompt or question is not 
> detailed enough, the AI might return the wrong answer or even a nonsensical 
> one. Even worse, most generative AIs will never say that they don’t know an 
> answer, so if a prompt does not provide the AI with enough context, it will 
> very likely hallucinate, i.e. make something up.
>
> Yet another problem: if you ask a generative AI the same question twice, it 
> will sometimes give different answers. Occasionally that is the fault of the 
> AI itself, but more often it is because the prompt was not detailed enough 
> to lock in a specific answer.

The absurdity of this is worthy of Franz Kafka!!

Companies are (allegedly) employing people on high salaries, who are presumably 
not experts in any particular subject, to interpret an abstract requirement and 
phrase a suitable question to ask a machine, which certainly isn't an expert, 
and then interpret the result!!

Why not just employ a real expert in the first place and dispense with the 
man/machine/man in the middle?  If a problem is so obscure, difficult to 
express semantically, and poorly understood by the responsible executives, what 
value can a machine add?

The legal implications might be worth watching.

I think there is a place for neural networks, but this isn't it.
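As an aside on the quoted "different answers" point: much of that run-to-run 
variation is a decoding setting rather than a prompt-wording problem. These 
models sample each reply from a probability distribution, and turning the 
sampling temperature down to zero makes repeated runs (near-)deterministic. 
A minimal sketch, assuming the OpenAI Python client and an illustrative model 
name:

    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    def ask(prompt: str, temperature: float) -> str:
        # temperature=0 makes repeated calls (near-)deterministic;
        # higher values sample more freely and vary from run to run.
        resp = client.chat.completions.create(
            model="gpt-4o-mini",  # illustrative model name, an assumption
            messages=[{"role": "user", "content": prompt}],
            temperature=temperature,
        )
        return resp.choices[0].message.content

    # The same question asked twice at temperature 1.0 may well differ;
    # at temperature 0.0 the answers should largely agree.
    print(ask("Name one cause of the First World War.", 1.0))
    print(ask("Name one cause of the First World War.", 0.0))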

David Lochrin
