A directory of what people actually want. Classified, clustered, ranked, and updated daily.
AI · 1 mention
#1992534694362255594
If it knows, when interrogated, that it's lying, why is it so hard to program in an "interrogate yourself and make sure you're not lying" function?