Description
The docker daemon was restarted on a subset of nodes, which caused the crawler to die with the following error.
How to Reproduce
While the crawler is running, restart the docker daemon. Vary the time between the shutdown and the restart so that the crawler attempts to list containers while the daemon is still down (see the sketch below for one way to automate this).
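A small helper along these lines can bounce the daemon with a configurable gap. This is only a sketch, not part of the crawler, and it assumes an upstart-managed docker service (matching the /var/log/upstart log path below), so the exact service commands may differ on other nodes.

# repro_bounce_docker.py -- hypothetical helper for reproducing the failure.
# Stops the docker daemon, waits a configurable number of seconds, and starts
# it again, so the crawler's dockerps call can land while the daemon is down.
import subprocess
import sys
import time

def bounce_docker(gap_seconds):
    subprocess.check_call(['service', 'docker', 'stop'])
    time.sleep(gap_seconds)  # vary this to hit different points in the crawl loop
    subprocess.check_call(['service', 'docker', 'start'])

if __name__ == '__main__':
    bounce_docker(float(sys.argv[1]) if len(sys.argv) > 1 else 5.0)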
Log Output
From /var/log/upstart/csf-crawler-docker.log.1.gz:
    for inspect in exec_dockerps():
  File "/opt/cloudsight/collector/crawler/dockerutils.py", line 37, in exec_dockerps
    raise DockerutilsException('Failed to exec dockerps')
DockerutilsException: Failed to exec dockerps
Process crawler-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/multiprocessing/process.py", line 258, in _bootstrap
    self.run()
  File "/usr/lib/python2.7/multiprocessing/process.py", line 114, in run
    self._target(*self._args, **self._kwargs)
  File "/opt/cloudsight/collector/crawler/crawler.py", line 61, in crawler_worker
    crawlutils.snapshot(**params)
  File "/opt/cloudsight/collector/crawler/crawlutils.py", line 258, in snapshot
    containers = get_filtered_list_of_containers(options, namespace)
  File "/opt/cloudsight/collector/crawler/containers.py", line 111, in get_filtered_list_of_containers
    for container in containers_list:
  File "/opt/cloudsight/collector/crawler/containers.py", line 37, in list_all_containers
    for container in all_docker_containers:
  File "/opt/cloudsight/collector/crawler/dockercontainer.py", line 35, in list_docker_containers
    for inspect in exec_dockerps():
  File "/opt/cloudsight/collector/crawler/dockerutils.py", line 37, in exec_dockerps
    raise DockerutilsException('Failed to exec dockerps')
DockerutilsException: Failed to exec dockerps
Traceback (most recent call last):
  File "/opt/cloudsight/collector/crawler/crawler.py", line 419, in <module>
    main()
  File "/opt/cloudsight/collector/crawler/crawler.py", line 415, in main
    start_autonomous_crawler(args.numprocesses, args.logfile, params, options)
  File "/opt/cloudsight/collector/crawler/crawler.py", line 116, in start_autonomous_crawler
    (pname, exitcode))
RuntimeError: crawler-0 terminated unexpectedly with errorcode 1
Debugging Commands Output
This can be addressed by catching and passing on docker-related exceptions, but with the current plugin design that would require changes in several plugins so that they skip crawling on iterations where this (passed) exception occurs.
If it is urgent, we can bump this issue up the todo queue.
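As a rough illustration of that idea, the sketch below has a container-listing wrapper swallow the exception and skip the current iteration instead of dying. It assumes exec_dockerps() and DockerutilsException from dockerutils, as seen in the traceback above; list_docker_containers_safe and the logging are illustrative, not existing crawler code.

# Sketch only: let a docker outage skip the current crawl iteration instead of
# killing the worker process.
import logging

from dockerutils import exec_dockerps, DockerutilsException

logger = logging.getLogger(__name__)

def list_docker_containers_safe():
    try:
        inspects = list(exec_dockerps())
    except DockerutilsException:
        # Daemon is likely restarting; log it and yield nothing this iteration.
        logger.warning('docker daemon unreachable, skipping this crawl iteration')
        return
    for inspect in inspects:
        yield inspect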