The basic method Snowden used to gather the files he took from the NSA was a web crawler.
This method is cheap and automated: according to the New York Times, it was a simple web crawler, software that searches for and saves every page or file it deems interesting.
The investigation, carried out by internal NSA staff, concluded that Snowden's attack was not as sophisticated as first thought, and that it should have been detected by the agency's special security screening systems.
A web crawler can be programmed to move from web page to web page, follow links contained in documents, and copy and save files and pages. It is a tool frequently used by Internet companies such as Google, and more generally by search websites, to download content and archive it so that they can serve fast search results.
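To make the mechanism concrete, here is a minimal sketch of such a crawler in Python, using only the standard library. The starting URL, the page limit, and the same-site restriction are illustrative assumptions for the example, not details of the tool Snowden actually ran.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Collects the href target of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, max_pages=10):
    """Breadth-first crawl: fetch a page, save it, queue its links."""
    seen = {start_url}
    queue = deque([start_url])
    pages = {}  # url -> raw HTML; this dict is the crawler's "archive"
    while queue and len(pages) < max_pages:
        url = queue.popleft()
        try:
            with urlopen(url, timeout=5) as resp:
                html = resp.read().decode("utf-8", errors="replace")
        except OSError:
            continue  # skip pages that fail to load
        pages[url] = html  # save the page, as a crawler archives content
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            absolute = urljoin(url, link)
            # stay on the starting site and avoid revisiting pages
            if (urlparse(absolute).netloc == urlparse(start_url).netloc
                    and absolute not in seen):
                seen.add(absolute)
                queue.append(absolute)
    return pages

if __name__ == "__main__":
    archive = crawl("https://example.com")  # hypothetical starting point
    print(f"Saved {len(archive)} pages")
```

The breadth-first queue and the set of already-seen URLs are what let such a tool exhaustively and automatically visit everything reachable from its starting point, which is the property the investigators described.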
Officials involved in the investigation believe that Snowden gained access to 1.7 million files.