to be clear: i don't think we should program robots to be intelligent, to be self-aware, or to have personalities, even if we can. i see no practical use for such a thing. robots should be dumb slaves that are too stupid to question the futility of their existence. i don't want existentialist robots; that defeats the purpose of having robots. and i don't want likeable or lovable robots either, as that just blurs the necessary class division.
thankfully, i don't think it's truly possible to build these kinds of decision trees.
it's like a "random number generator". if you know how it works, you know it's not actually random: given a relatively small amount of information about its internal state, you can predict the next number. likewise, any sort of personality a robot might be able to demonstrate would necessarily be an illusion.
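to sketch the point, here's a minimal pseudo-random number generator (a linear congruential generator; the constants are from the C standard's example rand(), though any would do). to an outsider the outputs look random, but anyone who knows the internal state can predict every number before it's generated:

```python
class LCG:
    """a minimal linear congruential generator. looks random from
    the outside, but is fully determined by its internal state."""

    def __init__(self, seed):
        self.state = seed

    def next(self):
        # constants from the C standard's example rand() implementation
        self.state = (1103515245 * self.state + 12345) % 2**31
        return self.state

gen = LCG(seed=42)
gen.next()  # an outsider sees a "random" number

# an observer who knows the algorithm and the seed predicts
# the rest of the sequence exactly -- the randomness is an illusion
predictor = LCG(seed=42)
predictor.next()
for _ in range(10):
    assert predictor.next() == gen.next()
```

the same argument transfers directly: a "personality" produced by a known program is predictable in exactly this way.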
if you can predict what a robot will do, it's not demonstrating personality; it's just demonstrating a complicated program.