Really this all comes down to a very interesting metaphysical question: is function identical to substance? Personally I don't think so, and thus feel that even if we can make a fully functional AI, it still wouldn't be sapient; it would simply mimic sapience, a la the Chinese room thought experiment. Thus I think that AI can't qualify for any rights, except maybe the right not to be abused by people who have no idea how computers work.
I think the metaphysical question is actually whether there is substance at all. Perhaps souls are real things that exist, and humans just have some special je ne sais quoi that computers lack, so an AI can never be truly alive. Or perhaps humans are nothing more than machines built of meat, and the illusion shared by all people of being aware of themselves and their environment is just a natural process that can ultimately be understood and replicated.
It occurs to me, however, that if souls are real things that exist, then they too are a natural process. If the thing that makes people different from robots exists, it can be observed, and possibly even manipulated. It is just another interchangeable part in the machine of logic and emotion that makes up a mind. If human consciousness is something special, would it be possible to move it into a machine? To destroy it without damaging the brain, creating a human with no sense of self? To weaponise it into a gun that shoots ghosts? The science fiction possibilities are endless.

That's right, I just turned your AI civil rights discussion into necromancy.