commit 546188fb7d
2018-05-22 19:24:19 +02:00
5 changed files with 180 additions and 0 deletions

README.md (new file, 7 lines)

@@ -0,0 +1,7 @@
# crawl
`crawl` is a simple web crawler restricted to a single domain. It uses
goroutines to parallelize fetching.
The generated sitemap is exceedingly simple: we do not keep track of a real
link graph, but rather just a shallow list of links for every page.
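
The README only sketches the design, so here is a minimal, hypothetical Go sketch of that approach, not the code from this commit: a mutex-guarded visited set, one goroutine per discovered link, and a `map[string][]string` sitemap. The `Crawl` function, the regexp-based link extraction, and the example URL are illustrative assumptions.

```go
// Hypothetical sketch of the crawler design described above;
// not the implementation added by this commit.
package main

import (
	"fmt"
	"io"
	"net/http"
	"net/url"
	"regexp"
	"sync"
)

// hrefRe is a crude way to pull href attributes out of HTML; a real
// crawler would use a proper parser such as golang.org/x/net/html.
var hrefRe = regexp.MustCompile(`href="([^"]+)"`)

// Crawl fetches pages starting at start, never leaving start's host, and
// returns a shallow sitemap: each visited page mapped to the links on it.
func Crawl(start string) map[string][]string {
	base, err := url.Parse(start)
	if err != nil {
		return nil
	}

	var (
		mu      sync.Mutex
		wg      sync.WaitGroup
		visited = map[string]bool{}
		sitemap = map[string][]string{}
	)

	var visit func(page string)
	visit = func(page string) {
		defer wg.Done()

		mu.Lock()
		if visited[page] {
			mu.Unlock()
			return
		}
		visited[page] = true
		mu.Unlock()

		pageURL, err := url.Parse(page)
		if err != nil {
			return
		}
		resp, err := http.Get(page)
		if err != nil {
			return
		}
		body, err := io.ReadAll(resp.Body)
		resp.Body.Close()
		if err != nil {
			return
		}

		var links []string
		for _, m := range hrefRe.FindAllStringSubmatch(string(body), -1) {
			ref, err := url.Parse(m[1])
			if err != nil {
				continue
			}
			abs := pageURL.ResolveReference(ref)
			if abs.Host != base.Host { // stay local to one domain
				continue
			}
			abs.Fragment = ""
			links = append(links, abs.String())
		}

		mu.Lock()
		sitemap[page] = links
		mu.Unlock()

		// One goroutine per discovered link parallelizes the crawl;
		// the visited set above keeps the recursion from looping.
		for _, l := range links {
			wg.Add(1)
			go visit(l)
		}
	}

	wg.Add(1)
	go visit(start)
	wg.Wait()
	return sitemap
}

func main() {
	for page, links := range Crawl("https://example.com/") {
		fmt.Println(page, "->", len(links), "links")
	}
}
```

Because the sitemap is just page-to-links, printing it is a flat loop as in `main` above; no graph traversal is needed to consume the result.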