Commit: update ci
indrajithi committed Jun 13, 2024
1 parent c2e9b88 commit 8f68644
Showing 3 changed files with 8 additions and 8 deletions.
3 changes: 1 addition & 2 deletions .github/workflows/ci.yml
@@ -17,8 +17,7 @@ jobs:
       run: curl -sSL https://install.python-poetry.org | python3 -
     - name: Install dependencies
       run: |
-        poetry install
-        mypy --install-types
+        poetry install && mypy --install-types
     - name: Run linter :pylint
       run: |
         poetry run pylint tiny_web_crawler
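The ci.yml change merges two `run` lines into a single command chained with `&&`, so the type-stub install only runs if the dependency install succeeded. A minimal sketch of that fail-fast behavior, using stand-in functions rather than the real `poetry` and `mypy` tools:

```shell
# Stand-ins for the real CI commands (illustrative only, not the real tools).
poetry_install() { echo "deps installed"; }
mypy_install_types() { echo "stub types installed"; }

# '&&' short-circuits: the second command runs only when the first
# exits with status 0, so a failed install stops the step immediately.
poetry_install && mypy_install_types
```

Whether two separate lines under `run: |` would also stop after a failure depends on the shell options the CI runner applies; chaining with `&&` makes the dependency between the two commands explicit either way.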
11 changes: 6 additions & 5 deletions README.md
@@ -27,14 +27,15 @@ from tiny_web_crawler.crawler import Spider
 root_url = 'http://github.com'
 max_links = 2
 
-spider = Spider(root_url, max_links)
-spider.start()
+crawl = Spider(root_url, max_links)
+crawl.start()
 
 
-# set workers and delay (default: delay is 0.5 sec and verbose is True)
+# Set workers and delay (default: delay is 0.5 sec and verbose is True)
 # If you do not want delay, set delay=0
 
-crawl = Spider('https://github.com', 5, max_workers=5, delay=1, verbose=False)
-spider.start()
+crawl = Spider(root_url='https://github.com', max_links=5, max_workers=5, delay=1, verbose=False)
+crawl.start()
 
 ```
2 changes: 1 addition & 1 deletion tiny_web_crawler/crawler.py
@@ -209,7 +209,7 @@ def main() -> None:
 
     crawler = Spider(root_url, max_links, save_to_file='out.json')
     print(Fore.GREEN + f"Crawling: {root_url}")
-    print(crawler.start().keys())
+    crawler.start()
 
 
 if __name__ == '__main__':