I didn't read through the whole thread, so some of this may be redundant.
There are five issues with drones, none of which really have to do with their droniness (that is, with being remote-control weapons platforms). They have to do with the secondary effects of drone use. Specifically:
~ Civilian Casualties: Drone strikes kill roughly ten civilians[footnote]Wikipedia states the ratio [https://en.wikipedia.org/wiki/Drone_attacks_in_Pakistan#Statistics] is ten or so civilians per one high-value target, which is still abysmal but not as terrible as the fifty I stated. It's possible that the figure I gave was accidentally pulled by the press from a specific incident in which 50 civilians were killed in a single strike.[/footnote] per every one high-value target, which is ridiculous for post-Cold-War action. This statistic came up following the Sandy Hook Elementary School massacre, when the industrialized world freaked out over the deaths of twenty-odd children yet ignored the high toll of child deaths by drone strikes (implying, harshly, that we only care about children if they're white and in the first world). The Obama Administration's position has been to doubt the reports of high civilian casualty counts, though there has been a stated intention to phase drone strikes out of the CIA's hands and keep them with the military, which holds a much higher standard of restraint in the use of force.
~ Executions and Assassinations: The targeting of US citizen and Yemeni imam Anwar al-Awlaki [https://en.wikipedia.org/wiki/Anwar_al-Awlaki] on September 30, 2011 has raised concerns about the legality of extrajudicial executions of American citizens. al-Awlaki was given no trial or even a tribunal, yet was specifically targeted. His sixteen-year-old son was slain in a separate strike two weeks later, and there have been two other American casualties of drone strikes, though those were not specifically targeted. This also raises the issue of targeting specific people at all (as opposed to units or commanding officers). In international law this falls into the purview of assassination and is regarded as really bad form. During the Cold War, both the US and the USSR had policies against targeting specific individuals, adopted after our failed attempts on Fidel Castro. That the US now demonstrates a willingness to target specific VIPs shows the degree to which our ethics and commitment to just-warfare doctrine have deteriorated.
~ Killbots: As things are, drones are controlled by an active human being who has to authorize each strike on a target. But a number of unmanned weapons platforms are being developed with the capability to identify valid targets and attack autonomously. A good portion of science fiction denotes this as the first step toward a cybernetic revolt [https://en.wikipedia.org/wiki/Cybernetic_revolt]. And we already have precedent for automated targeting gone wrong: in 1988 the USS Vincennes' Aegis combat system misidentified Iran Air Flight 655 as a hostile aircraft, and 290 civilians died as a result. The circumstances were complicated, but incidents like this show that such systems are not infallible. Similarly, a system of automated turrets has been developed in Australia that will open fire on anything that moves in a designated area (meant to be an area-denial system, an alternative to landmines). So we still need to nail down what kind of autonomy is or isn't allowed in warfare.
~ Security: One of the more obvious problems with remote-control weapons is that they're prone to getting hacked. It was believed when we armed Predators with missiles that they would use a robust encryption system, but this proved questionable when college students managed to hack and commandeer police prototype drones that used the same protocol as armed, military Predators. This does not bode well for our weapons presently out in the field. Obviously we need to improve the encryption we use for our drones, but stronger encryption requires more data bandwidth, which reduces the performance of communications, especially when weather or countermeasures interfere. We're more likely to send up insecure drones and risk their being hacked and captured than to wait until the technology can guarantee a stable, secure communications line.
~ Domestic Use: Now that drones have been so successful as a military tool, local law enforcement wants a piece of the action, less to strike at targets (though some crowd-control ideas have come to mind) and more to better police their precincts. The problem is, our police are already overenthusiastic about using privacy-invasive tools without reasonable oversight or respect for civil liberties, and they've already proven they're more interested in getting collars than in keeping the peace or seeing justice done. We're now in an era where charges are made up to serve corporate interests, and divisions of law enforcement are being used as mercenary forces for corporate enforcement. Why would we want to trust them with technology we know will be used to peep through our windows?
238U