Snowden used a Web Crawler to get the files from the NSA

The basic method Snowden used to gather the files he took from the NSA was a web crawler.

The method is cheap and automated: a simple web crawler, software that searches for and saves every page or file it deems interesting, according to the New York Times.


The investigation, carried out by internal NSA staff, concluded that Snowden's attack was not as sophisticated as first thought and that it should have been detected by the agency's special security screens.

A web crawler can be programmed to go from web page to web page, follow the links contained in documents, and copy and save files and pages. It is a method frequently used by Internet companies, and more generally by search sites, to download content and archive it so they can offer fast search results.
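The crawl-and-save loop described above can be sketched in a few lines. The snippet below is an illustrative toy, not Snowden's actual tool: it walks an in-memory "site" (a dict standing in for pages fetched over HTTP) breadth-first, following every link it finds and saving each page it visits. The page names and contents are invented for the example.

```python
import re
from collections import deque

# Toy "intranet": page name -> HTML content. Stands in for the pages a
# real crawler would fetch over the network. (Illustrative data only.)
SITE = {
    "index":   '<a href="reports">reports</a> <a href="memos">memos</a>',
    "reports": 'Quarterly figures. <a href="archive">archive</a>',
    "memos":   'Internal memos. <a href="index">home</a>',
    "archive": 'Old reports, no further links.',
}

LINK_RE = re.compile(r'href="([^"]+)"')

def crawl(start):
    """Breadth-first crawl: visit a page, save its content, then queue
    every link it contains that has not been seen yet."""
    saved = {}
    queue = deque([start])
    seen = {start}
    while queue:
        page = queue.popleft()
        content = SITE.get(page)
        if content is None:
            continue  # dead link, nothing to save
        saved[page] = content  # "save every page it deems interesting"
        for link in LINK_RE.findall(content):
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return saved

pages = crawl("index")
```

Starting from `index`, the loop reaches all four pages even though `archive` is two links deep; the `seen` set is what keeps the crawler from looping forever on the `memos` → `index` back-link.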

Officials involved in the investigation believe that Snowden gained access to 1.7 million files.

iGuRu.gr The Best Technology Site in Greece


Written by giorgos

George still wonders what he's doing here ...
