Edward Snowden Used a Simple Web Crawler To Gather NSA Data
Snowden used a tool similar to the one Google uses to index websites for its search engine to gather his controversial NSA data.
It would be fun to imagine Edward Snowden speed-hacking his way into the NSA's servers like Hugh Jackman in Swordfish, or physically stealing a briefcase full of hard disks and making a daring escape from NSA headquarters, but the truth is much simpler. Speaking to the New York Times [http://www.nytimes.com/2014/02/09/us/snowden-used-low-cost-tool-to-best-nsa.html?pagewanted=all&_r=2], a senior intelligence official said that Snowden used nothing more than a simple web crawler to gather his controversial data.
Using the web crawler, Snowden "scraped data out of our systems" while he went about his day job, the official said. "We do not believe this was an individual sitting at a machine and downloading this much material in sequence," he said, adding that the process was "quite automated." To collect the information he wanted automatically, Snowden needed only the right login credentials to get past whatever internal defenses were in place.
What makes this so damning is that part of the NSA's mission is to "protect the nation's most sensitive military and intelligence computer systems from cyberattacks," which is quite embarrassing given the simplicity of Snowden's technique: investigators found that his methods were hardly sophisticated and should have been easily detected.
Agency officials insist that had Snowden been working at the NSA's headquarters at Fort Meade, he would have been caught, but the Hawaii office where he was employed lacked the activity monitors that would have flagged his bot.
Web crawlers are commonly used by search engines like Google to index websites.
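For context, a crawler of this kind is conceptually simple: start from a page, pull out its links, and keep following them until you run out of pages or hit a limit. The sketch below is a minimal Python illustration of that loop, not a reconstruction of Snowden's tool; the seed URL and page limit are placeholders chosen for this example.

# Minimal breadth-first crawler sketch: fetch a page, extract its links,
# and queue any same-host link that has not been visited yet.
# The seed URL and max_pages value are illustrative placeholders.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collects the href value of every <a> tag on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(seed_url, max_pages=50):
    """Breadth-first crawl limited to the seed URL's host."""
    host = urlparse(seed_url).netloc
    queue = deque([seed_url])
    visited = set()

    while queue and len(visited) < max_pages:
        url = queue.popleft()
        if url in visited:
            continue
        visited.add(url)

        try:
            with urlopen(url, timeout=10) as response:
                html = response.read().decode("utf-8", errors="replace")
        except OSError:
            continue  # skip pages that fail to load

        parser = LinkExtractor()
        parser.feed(html)

        for link in parser.links:
            absolute = urljoin(url, link)
            # Stay on the same host and ignore pages we have already seen.
            if urlparse(absolute).netloc == host and absolute not in visited:
                queue.append(absolute)

        print(f"indexed {url} ({len(parser.links)} links found)")

    return visited


if __name__ == "__main__":
    crawl("https://example.com")

A real search-engine crawler adds politeness rules (robots.txt, rate limits) and a far larger index, but the core fetch-parse-queue cycle is the same.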
Whether you think Snowden was right or wrong, you have to admit the NSA is partly to blame for not protecting its own systems properly.
Source: Engadget [http://www.nytimes.com/2014/02/09/us/snowden-used-low-cost-tool-to-best-nsa.html?pagewanted=all&_r=2]