

I guess it would be more precise, and they could sell it to game developers.
Your critique, as I read it, wasn’t about the outcome of the results but about how they were achieved. LLMs hallucinating make computers commit “human errors”, which makes them less deterministic; determinism is the key reason I prefer doing some things on a computer.
It’s called the heuristic method, and those using it know its limitations. LLMs, on the other hand, will just confidently put out garbage and claim it’s true.
Dam. Stack rocks to block water, use leaking water to turn propeller.
Do you work in the band named Pink Floyd?