Abstract | ||
---|---|---|
Malicious JavaScript frequently serves as the starting point of web-based attacks, in particular cross-site scripting. Detecting malicious JavaScript before execution can therefore protect users from attacks such as malware infection and drive-by downloads, and even from unwittingly participating in denial-of-service attacks as part of a botnet. A large collection of malicious JavaScript would aid detector development, but by the time a crawler reaches blacklisted domains, the attackers and their malicious scripts are often long gone. We use classifiers to direct a web crawler toward the more likely locations of malicious scripts, and we show how this targeted web crawler performs compared to a crawler seeded with blacklisted domains. |
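The abstract's core idea, a crawler whose frontier is ordered by a classifier's estimate of how likely each URL is to host malicious scripts, can be sketched as a priority queue. This is an illustrative sketch only, not the paper's implementation: `malicious_likelihood` below is a hypothetical stand-in (a toy keyword heuristic) for a trained classifier over page and script features.

```python
import heapq

def malicious_likelihood(url):
    """Hypothetical stand-in for a trained classifier: scores a URL in [0, 1]
    by counting suspicious script-related terms. The real system would use
    learned features instead of this toy heuristic."""
    suspicious_terms = ("eval", "unescape", "iframe", "download")
    return sum(term in url for term in suspicious_terms) / len(suspicious_terms)

class TargetedCrawler:
    """Crawl frontier that always pops the most promising URL first."""

    def __init__(self, seeds):
        self._heap = []
        for url in seeds:
            self.push(url)

    def push(self, url):
        # heapq is a min-heap, so negate the score to get max-first ordering.
        heapq.heappush(self._heap, (-malicious_likelihood(url), url))

    def pop(self):
        _, url = heapq.heappop(self._heap)
        return url

crawler = TargetedCrawler([
    "http://example.com/news",
    "http://example.com/eval-unescape-pack",
    "http://example.com/iframe-gallery",
])
print(crawler.pop())  # highest-scoring URL comes out first
```

In contrast, a blacklist-seeded crawler would visit known-bad domains in arbitrary order; the targeted version spends its crawl budget on pages the classifier scores highest.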
Year | DOI | Venue |
---|---|---|
2009 | 10.1145/1651309.1651317 | CIKM-DSMM |
Keywords | Field | DocType
---|---|---|
blacklisted domains, detector development, targeted web crawler, malicious JavaScript, malicious JavaScript collection, web crawler, denial-of-service attack, drive-by downloads, security, web crawling | World Wide Web, Botnet, Computer science, Computer security, Unobtrusive JavaScript, Malware, Web crawler, Scripting language, JavaScript | Conference
Citations | PageRank | References
---|---|---|
6 | 0.50 | 6
Authors | ||
---|---|---|
2
Name | Order | Citations | PageRank |
---|---|---|---
Peter Likarish | 1 | 92 | 6.86 |
Eunjin Jung | 2 | 125 | 13.06 |