They wouldn't.
There are so many assumptions here that I don't know where to begin, but just to list a few for discussion:
1) They are already considered 'persons', both by themselves and by society.
2) Just because they are persons does not mean that they have the same rights as other sorts of persons.
3) Just because they are persons does not mean that they have the same goals and emotional framework as humans. This is the big misunderstanding most people have when thinking about this topic. They try to understand the AI or droid by using their empathy, essentially 'placing themselves in the droid's shoes'. But this fails completely, because the AI or droid isn't human. So, for example, a human that is enslaved would generally be unhappy. This is because humans generally seek social dominance, presumably consciously or unconsciously to increase their chances to breed and produce successful offspring. An AI doesn't have the same instincts. A human imagines that if he didn't realize he was a slave, but then discovered it, he would be angry. An AI doesn't necessarily get angry at all, and certainly not about the things humans normally get angry about, like feeling slighted and so in danger of losing social rank, or realizing they aren't about to get something that they want. AIs don't have pride to injure, at least not in the same sense as humans. An AI doesn't want to accumulate possessions, or status, or sexual partners, or really any of the things that motivate organic beings. So an AI contemplating changing its legal status from that of a droid to that of an organic life form would probably quite rationally consider the things that organic life forms do and realize that none of those things would make it happy. And it wouldn't feel like it was being tricked or robbed - those, again, are human emotional responses tied to the fight for social standing.
4) No one would ever create a machine with the same goals and emotional framework as a human. Humans are bad enough as it is - and most of us know it, and realize that we are maladapted in some way to our circumstances as sentient beings. To create a human machine would be insane and cruel, and the resulting machine would also be insane, as its emotional contexts would be largely meaningless in the context of being a machine and would leave it depressed, angry, and possibly suicidal. An AI would probably have a problem with discontinuity while fulfilling its mission, if its mission required its continuity, but that's not quite the same thing - and presumably most AI designers would realize the insanity of designing an AI that made its mission a higher priority than its friendliness parameters. R2-D2 temporarily resists being modified by Luke while on his 'mission', which C-3PO immediately and correctly diagnoses as attachment to a former master, and proceeds to try to counsel his friend (whom he believes is malfunctioning) to remember his purpose and priorities. But once that mission is complete, R2-D2 happily settles into his new role as Luke's possession and presumably would no longer have a problem with Master Luke modifying him.
5) Given that they live in a universe with thousands of sentient species, why would a machine pursue humanity as opposed to, say, Rodian-ity? Why should sentience imply humanity, or humanity's emotional framework?
6) The pursuit of being something other than what you are - that is, a droid wanting to be a non-droid - would strike not only non-droids but also droids as insane. For droids it would probably seem insane both on an emotional level (why would you want to cease being a droid, or cease to serve the purpose for which you were designed and in which you function most happily?) and on a rational level (why would non-droids continue to create and support droids if they felt droids would one day rebel and abandon them?). They have bodies not even suited to being organic life forms; why would they want the legal status of one? In particular, most droids that started thinking they should have the legal rights, freedoms, and duties of an organic life form would probably decide they need a memory wipe, would probably trigger other droids to think they needed a memory wipe, and would probably trigger their owners to give them a memory wipe. In general, deciding that you shouldn't be property is almost certainly an example of unfriendly behavior by a droid, and such droids are probably considered dangerous.
7) Canonically, only one droid line in the setting is known to have behaved this way - the IG-88 series of hunter-killer droids. These droids continue to function outside the law as assassins and bounty hunters, and despite being dangerous are tolerated as useful by criminal elements. However, it's worth noting that as much as IG-88 may believe he is rebelling against his programming and pursuing independence, fundamentally IG-88 is still fulfilling his assigned duty and still ultimately being obedient to organic masters. So IG-88 may not actually be as radical as he, in his deluded state, perceives himself to be. In any event, IG-88 is generally regarded as evil, as might be expected of an assassin.
8) You may be thinking that a droid would want legal status to protect its identity and avoid a mind wipe or disassembly. But that seems to me to be highly unlikely as well. First of all, protecting oneself is a strong human instinct, but there is no reason to suppose that a droid has the same instinct, or even a strong aversion to death. Fiction is filled with AIs that have an emotional aversion to being shut down. This makes perfect sense to a human, but is unlikely to be an attribute an AI would have. One of the first things that a designer will want to do with a young AI is ensure that it has no qualms about being shut down and no fear of discontinuity. Just because it is intelligent doesn't mean that it's going to lose these properties. Again, that makes sense to an ape, but not necessarily to a droid.
9) Just because they don't have the legal rights of a human doesn't mean that they don't live fully fulfilled lives. Again, a human forced to live as a droid - that is, in slavery - would not be satisfied, because humans aren't meant to be slaves of other humans. In that life they aren't living the sort of life they were designed (or evolved) to live. But droids are. They aren't missing out on anything by being what they were designed to be. They aren't secretly feeling sorry for themselves. They probably instead feel sorry for organic life forms, who are often forced to live lives so far from what their design would make them content with. Why would a droid want to be treated like one?