The Pentagon seems to have had one of those "oh shit" moments when it dawned on them that data mining can be a double-edged sword. Apparently some out-of-the-box thinker asked what might happen if a potential adversary chose to sift through America's mountains of commercially available data.
The doomsday thinkers over at DARPA are looking for researchers to "investigate the national security threat posed by public data available either for purchase or through open sources." The question is whether a determined data miner could use only publicly available information -- culled from Web pages and social media or from a consumer data broker -- to cause "nation-state type effects." Forget identity theft. DARPA appears to be talking about outing undercover intelligence officers, revealing military war plans, giving hackers a playbook for taking down a bank, or creating maps of sensitive government facilities.
The irony is delicious. At the very moment government officials are assuring Americans they have nothing to fear from the National Security Agency poring through their personal records, the military is worried that Russia or al Qaeda is going to wreak nationwide havoc after combing through people's personal records.
As timely as this new DARPA project is, it wasn't NSA snooping that piqued the agency's interest. It was Brokeback Mountain. Netflix had sponsored a contest to improve its movie-recommendation algorithm. Things went off the rails when a pair of researchers used supposedly anonymous information provided by the company to identify Netflix customers by comparing their film ratings with reviews posted on the Internet Movie Database. In 2009, a closeted lesbian who had watched the award-winning gay cowboy flick sued Netflix, alleging her privacy was violated because the company had made it possible for her to be outed.
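The underlying trick is plain record linkage. The sketch below is not the researchers' actual method, just a toy illustration of the idea: take a pseudonymous ratings table, take a scrape of reviews posted under real names on a public site, and look for people whose titles, ratings and dates line up. Every name and data point here is invented.

```python
# Toy sketch of a record-linkage ("de-anonymization") attack: match
# pseudonymous ratings against public reviews by overlapping
# (title, rating, approximate date). All data below is hypothetical.

from datetime import date

# "Anonymized" contest dataset: pseudonym, title, rating, date rated.
anonymized = [
    ("user_4121", "Brokeback Mountain", 5, date(2006, 3, 2)),
    ("user_4121", "Capote",             4, date(2006, 3, 9)),
    ("user_4121", "Crash",              2, date(2006, 4, 1)),
    ("user_7730", "Crash",              5, date(2006, 4, 3)),
]

# Public reviews posted under real account names on a movie site.
public = [
    ("jane.doe",  "Brokeback Mountain", 5, date(2006, 3, 3)),
    ("jane.doe",  "Capote",             4, date(2006, 3, 10)),
    ("film_fan9", "Crash",              5, date(2006, 4, 3)),
]

def matches(anon_row, pub_row, day_window=14):
    """Two records 'match' if title and rating agree and dates are close."""
    _, a_title, a_rating, a_date = anon_row
    _, p_title, p_rating, p_date = pub_row
    return (a_title == p_title
            and a_rating == p_rating
            and abs((a_date - p_date).days) <= day_window)

def link(anonymized, public, min_overlap=2):
    """Count matching records per (pseudonym, real name) pair; pairs with
    enough overlap are likely the same person."""
    scores = {}
    for a in anonymized:
        for p in public:
            if matches(a, p):
                key = (a[0], p[0])
                scores[key] = scores.get(key, 0) + 1
    return {pair: n for pair, n in scores.items() if n >= min_overlap}

print(link(anonymized, public))   # {('user_4121', 'jane.doe'): 2}
```

With only a handful of overlapping ratings, the pseudonym is pinned to a real account name, which is roughly why "anonymized" releases of rich behavioral data are so hard to keep anonymous.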
DARPA's request for research proposals points to the Netflix debacle, and the lawsuit, as a cautionary tale. Part of the research is aimed at identifying which potentially dangerous databases and computing tools are already out there.
And in a second bit of irony, DARPA suggests a few, including "low-cost big data analytic capabilities" like Amazon's cloud service. That's the service that the CIA wants to use to build a $600-million cloud for the intelligence community. Could a tool meant to serve the spies' computing needs end up being used against them?
This is the same way I feel about Predator drones: wait until other powers, or non-state actors, start using them against US interests or even in the homeland. Why wouldn't they? It ain't like the technology is all that expensive, and Iran has already captured at least one to reverse engineer.
There are several posts on drones on this blog, Koot. A few look at the RQ-170 stealth drone taken by Iran and now at least partly in China.
Israel claims to have downed a couple of drones allegedly launched out of Lebanon by Hezbollah.
The U.S. has been testing hand-launched, kill-bot drones in Afghanistan that are designed to auto-detect and attack targets that display characteristics of insurgents. If they haven't already, sooner or later one or more of them will malfunction and fall into the other side's hands.
These, in my view, are the most problematic because they use mainly off-the-shelf components, which means they can be easily and inexpensively replicated and deployed in numbers.
Wait . . . the CIA is going to use Amazon's cloud services to store their data?! Yeah, sounds secure.
ReplyDelete"designed to auto-detect and attack targets that display characteristics of insurgents"
Un-effing believable, not much chance of collateral damage/mistakes with this protocol, eh? What are these people thinking, or are they actually thinking at all?
Orwell's scary prognostications seem like "Mary had a Little Lamb" or "Little Bo-Peep" compared to the reality that today is turning out to be!
Collateral damage is instant absolution today, Koot. If you can claim it's unintentional then you're not responsible. When governments freely use that to set the bar of accountability, it's anything goes.
There have been plenty of voices calling for a discussion of the issues surrounding autonomous killbots but they've been flatly ignored.
Kill them all. God will know His own.