
linkchecker doesn't detect broken links on the same domain that miss the library prefix #50

Open
OnixGH opened this issue Jul 20, 2016 · 2 comments

Comments

@OnixGH (Contributor) commented Jul 20, 2016

When checking http://domain/library/*, all links to http://domain/somethingelse are treated as external and ignored. However, this hides a specific error case: url_for links that point to a non-existent destination.

For example:
url_for('/install/valid.html') ==> http://domain/library/install/valid.html
url_for('/install/invalid.html') ==> http://domain/install/invalid.html <-- not reported

@OnixGH (Contributor, Author) commented Jul 20, 2016

Linkchecker offers --check-extern (to also check "external" links), and with either --ignore-url REGEX or --no-follow-url REGEX we should be able to restrict the check to the same domain.

It's probably a good idea to start checking external links (without following them) as well.
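
A rough sketch of such an invocation (the regex and domain are illustrative assumptions, untested): check "external" links too, but ignore anything not on http://domain/, so a mis-prefixed URL like http://domain/install/invalid.html would be reported rather than skipped:

linkchecker --check-extern --ignore-url='^(?!http://domain/)' http://domain/library/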

@OnixGH (Contributor, Author) commented Jul 20, 2016

For now, a simple Nginx workaround solves the immediate issue (it allows running linkchecker http://localhost/index.html):

# Redirect the entry point to the subdirectory's index page.
location /index.html { return 302 sub/index.html; }
# Everything else returns 404, so mis-prefixed links show up as broken.
location / { deny all; return 404; }

OnixGH pushed a commit that referenced this issue Jul 20, 2016
These were not detected before due to a linkchecker vs. subdomain gotcha
(GH-50).