Problems with installation #1

moonlight2397 opened this issue Jan 6, 2021 · 4 comments

I get an error while installing - The command '/bin/bash -c apt-get install -y python-dev python-pip graphviz graphviz-dev libxml2-dev libxslt-dev && rm -rf /var/lib/apt/lists/* && pip2 install virtualenvwrapper==4.8.2 && echo "export WORKON_HOME=$HOME/.virtualenvs" > ~/.profile && echo "source /usr/local/bin/virtualenvwrapper.sh" > ~/.profile && source ~/.profile && mkvirtualenv -p python2.7 discoursegraphs' returned a non-zero code: 1

The full output is given below

`Collecting scandir; python_version < "3.5" (from pathlib2<3,>=2.3.3; python_version < "3.4" and sys_platform != "win32"->virtualenv->virtualenvwrapper==4.8.2)
Downloading https://files.pythonhosted.org/packages/df/f5/9c052db7bd54d0cbf1bc0bb6554362bba1012d03e5888950a4f5c5dadc4e/scandir-1.10.0.tar.gz
Building wheels for collected packages: stevedore, unknown, unknown, filelock, unknown, scandir
Running setup.py bdist_wheel for stevedore: started
Running setup.py bdist_wheel for stevedore: finished with status 'done'
Stored in directory: /root/.cache/pip/wheels/f2/da/da/c4d8e81b611d95cc588f65ab4f8997f0a4b51e66df071e11f0
Running setup.py bdist_wheel for unknown: started
Running setup.py bdist_wheel for unknown: finished with status 'done'
Stored in directory: /root/.cache/pip/wheels/27/c0/26/4a2f63cf61535cc65dcfbdfe1aaaefe5bc956a3eeef44037ea
Running setup.py bdist_wheel for unknown: started
Running setup.py bdist_wheel for unknown: finished with status 'done'
Stored in directory: /root/.cache/pip/wheels/26/71/44/c8fdeed2c1a7b49783e1d1435d94c564fd6bfe7f1e3eeba14b
Running setup.py bdist_wheel for filelock: started
Running setup.py bdist_wheel for filelock: finished with status 'done'
Stored in directory: /root/.cache/pip/wheels/66/13/60/ef107438d90e4aad6320e3424e50cfce5e16d1e9aad6d38294
Running setup.py bdist_wheel for unknown: started
Running setup.py bdist_wheel for unknown: finished with status 'error'
Complete output from command /usr/bin/python -u -c "import setuptools, tokenize;__file__='/tmp/pip-build-714n8d/unknown/setup.py';exec(compile(getattr(tokenize, 'open', open)(__file__).read().replace('\r\n', '\n'), __file__, 'exec'))" bdist_wheel -d /tmp/tmpRKYvJXpip-wheel- --python-tag cp27:
Traceback (most recent call last):
File "<string>", line 1, in <module>
IOError: [Errno 2] No such file or directory: '/tmp/pip-build-714n8d/unknown/setup.py'


Failed building wheel for unknown
Running setup.py clean for unknown
Complete output from command /usr/bin/python -u -c "import setuptools, tokenize;__file__='/tmp/pip-build-714n8d/unknown/setup.py';exec(compile(getattr(tokenize, 'open', open)(__file__).read().replace('\r\n', '\n'), __file__, 'exec'))" clean --all:
Traceback (most recent call last):
File "<string>", line 1, in <module>
IOError: [Errno 2] No such file or directory: '/tmp/pip-build-714n8d/unknown/setup.py'


Failed cleaning build dir for unknown
Running setup.py bdist_wheel for scandir: started
Running setup.py bdist_wheel for scandir: finished with status 'done'
Stored in directory: /root/.cache/pip/wheels/91/95/75/19c98a91239878abbc7c59970abd3b4e0438a7dd5b61778335
Successfully built stevedore unknown unknown filelock scandir
Failed to build unknown
Installing collected packages: unknown, appdirs, distlib, unknown, filelock, scandir, pathlib2, virtualenv, pbr, stevedore, virtualenv-clone, virtualenvwrapper
Successfully installed appdirs-1.4.4 distlib-0.3.1 filelock-3.0.12 pathlib2-2.3.5 pbr-5.5.1 scandir-1.10.0 stevedore-3.3.0 unknown-0.0.0 unknown-0.0.0 virtualenv-20.2.2 virtualenv-clone-0.5.4 virtualenvwrapper-4.8.2
You are using pip version 8.1.1, however version 20.3.3 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
Traceback (most recent call last):
File "/usr/lib/python2.7/runpy.py", line 174, in _run_module_as_main
"__main__", fname, loader, pkg_name)
File "/usr/lib/python2.7/runpy.py", line 72, in _run_code
exec code in run_globals
File "/usr/local/lib/python2.7/dist-packages/virtualenvwrapper/hook_loader.py", line 16, in <module>
from stevedore import ExtensionManager
File "/usr/local/lib/python2.7/dist-packages/stevedore/__init__.py", line 11, in <module>
from .extension import ExtensionManager
File "/usr/local/lib/python2.7/dist-packages/stevedore/extension.py", line 19, in <module>
from . import _cache
File "/usr/local/lib/python2.7/dist-packages/stevedore/_cache.py", line 31, in <module>
import importlib_metadata
ImportError: No module named importlib_metadata
virtualenvwrapper.sh: There was a problem running the initialization hooks.

If Python could not import the module virtualenvwrapper.hook_loader,
check that virtualenvwrapper has been installed for
VIRTUALENVWRAPPER_PYTHON=/usr/bin/python and that PATH is
set properly.
Traceback (most recent call last):
File "/usr/local/bin/virtualenv", line 7, in <module>
from virtualenv.__main__ import run_with_catch
File "/usr/local/lib/python2.7/dist-packages/virtualenv/__init__.py", line 3, in <module>
from .run import cli_run, session_via_cli
File "/usr/local/lib/python2.7/dist-packages/virtualenv/run/__init__.py", line 12, in <module>
from .plugin.activators import ActivationSelector
File "/usr/local/lib/python2.7/dist-packages/virtualenv/run/plugin/activators.py", line 6, in <module>
from .base import ComponentBuilder
File "/usr/local/lib/python2.7/dist-packages/virtualenv/run/plugin/base.py", line 9, in <module>
from importlib_metadata import entry_points
ImportError: No module named importlib_metadata
The command '/bin/bash -c apt-get install -y python-dev python-pip graphviz graphviz-dev libxml2-dev libxslt-dev && rm -rf /var/lib/apt/lists/* && pip2 install virtualenvwrapper==4.8.2 && echo "export WORKON_HOME=$HOME/.virtualenvs" > ~/.profile && echo "source /usr/local/bin/virtualenvwrapper.sh" > ~/.profile && source ~/.profile && mkvirtualenv -p python2.7 discoursegraphs' returned a non-zero code: 1`
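
For context: the tracebacks above show both stevedore 3.3.0 and virtualenv 20.2.2 trying to import importlib_metadata, which is not installed for the image's Python 2.7 interpreter; the bundled pip 8.1.1 does not respect python_requires metadata, so it picks package releases that no longer support Python 2. A minimal sketch of a possible workaround, assuming the Dockerfile uses a RUN instruction like the one quoted above (the version pins below are illustrative, not taken from the repository), is to upgrade pip first and pin the Python-2-compatible releases:

RUN pip2 install --upgrade "pip<21" \
 && pip2 install "virtualenv<20" "stevedore<2" virtualenvwrapper==4.8.2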

arne-cl commented Jan 7, 2021

Hi @moonlight2397,

thank you for bringing this up!
Until I find the time to fix this, you can still use the pre-built Docker image from Docker Hub like this:

/tmp$ docker run -p 9000:9000 nlpbox/corenlp:3.9.2

/tmp$ cat input.txt 
Although they didn't like it, they accepted the offer.

/tmp$ docker run --net host -v /tmp:/tmp -ti nlpbox/dplp /tmp/input.txt
0       1       Although        although        IN      mark    3       O        (ROOT (SBAR (IN Although)      1
0       2       they    they    PRP     nsubj   3       O        (S (NP (PRP they))     1
0       3       didn't  didn't  VBP     root    0       O        (VP (VBP didn't)       1
0       4       like    like    IN      case    5       O        (PP (IN like)  1
0       5       it,     it,     NN      nmod    3       O        (NP (NP (NN it,))      1
0       6       they    they    PRP     nsubj   7       O        (SBAR (S (NP (PRP they))       2
0       7       accepted        accept  VBD     acl:relcl       5       O        (VP (VBD accepted)     2
0       8       the     the     DT      det     9       O        (NP (DT the)   2
0       9       offer.  offer.  NN      dobj    7       O        (NN offer.)))))))))))  2

@swords-fyx

I got the same problem as @moonlight2397. The command docker run -p 9000:9000 nlpbox/corenlp:3.9.2 stops at:
docker run -p 9000:9000 nlpbox/corenlp:3.9.2
[main] INFO CoreNLP - --- StanfordCoreNLPServer#main() called ---
[main] INFO CoreNLP - setting default constituency parser
[main] INFO CoreNLP - using SR parser: edu/stanford/nlp/models/srparser/englishSR.ser.gz
[main] INFO CoreNLP - Threads: 16
[main] INFO CoreNLP - Starting server...
[main] INFO CoreNLP - StanfordCoreNLPServer listening at /0.0.0.0:9000

The response to the command docker run --net host -v /tmp:/tmp -ti nlpbox/dplp /tmp/input.txt is:
Could not parse input file ...
Calling CoreNLP resulted in this error: [Errno 2] No such file or directory: '.xml'

It seems that the CoreNLP module is causing the problem?
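
One way to narrow this down, assuming the CoreNLP container is still running with -p 9000:9000, might be to check from the host that the server actually answers before invoking the dplp container, e.g.:

curl -s http://localhost:9000/ | head

If that returns nothing, the '.xml' error probably just means the dplp wrapper never got a parse back from CoreNLP.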

@moonlight2397

> I got the same problem as @moonlight2397. [...] It seems that the CoreNLP module is causing the problem?

I am not sure where the problem happens, but I do have a few solutions that worked for me. You can use the Docker image for dplp, or run it through rst-workbench. You can also use the original DPLP repo with CoreNLP locally and copy the code from arne-cl's dplp repo, which gives the output in a format that is compatible with other NLPBox services.

arne-cl commented Apr 12, 2021

Hi @swords-fyx,
your problem looks very different from what @moonlight2397 described.

When you run docker run -p 9000:9000 nlpbox/corenlp:3.9.2, can you go to
http://localhost:9000/ in your browser to verify that CoreNLP is running and parse your
/tmp/input.txt there manually?
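
A rough command-line equivalent of that manual check, assuming the server is listening on localhost:9000 as in your log (the annotator list and output format here are only illustrative), would be:

curl --data "$(cat /tmp/input.txt)" \
  'http://localhost:9000/?properties={"annotators":"tokenize,ssplit,pos,lemma,ner,parse","outputFormat":"xml"}'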

If that works, please post the output of cat /tmp/input.txt and docker run --net host -v /tmp:/tmp -ti nlpbox/dplp /tmp/input.txt.
Mine looks like this:

arne@t470:/tmp$ cat /tmp/input.txt
Although they didn't like her, they accepted the offer.
arne@t470:/tmp$ docker run --net host -v /tmp:/tmp -ti nlpbox/dplp /tmp/input.txt
Unable to find image 'nlpbox/dplp:latest' locally
latest: Pulling from nlpbox/dplp
18d680d61657: Already exists
0addb6fece63: Already exists
78e58219b215: Already exists
eb6959a66df2: Already exists
b8a55676c2c9: Pull complete
da406a51cfb4: Pull complete
28879b653754: Pull complete
ac771ea77e59: Pull complete
c1c78a2e677b: Pull complete
9ec920018318: Pull complete
14fe02bb7751: Pull complete
Digest: sha256:378fc31e02cd82d68b43014a7600fb6b99265e1c4893ecb0fb174e3ea63c397f
Status: Downloaded newer image for nlpbox/dplp:latest
0       1       Although        although        IN      mark    3       O        (ROOT (SBAR (IN Although)      1
0       2       they    they    PRP     nsubj   3       O        (S (NP (PRP they))     1
0       3       didn't  didn't  VBP     root    0       O        (VP (VBP didn't)       1
0       4       like    like    IN      case    5       O        (PP (IN like)  1
0       5       her,    her,    NN      nmod    3       O        (NP (NP (NN her,))     1
0       6       they    they    PRP     nsubj   7       O        (SBAR (S (NP (PRP they))       2
0       7       accepted        accept  VBD     acl:relcl       5       O        (VP (VBD accepted)     2
0       8       the     the     DT      det     9       O        (NP (DT the)   2
0       9       offer.  offer.  NN      dobj    7       O        (NN offer.)))))))))))  2

ParentedTree('NS-elaboration', [ParentedTree('EDU', ['1']), ParentedTree('EDU', ['2'])])

Please also post the output from the other window where CoreNLP is running.
It looks like this for me:

arne@t470:~$ docker run -p 9000:9000 nlpbox/corenlp:3.9.2
[main] INFO CoreNLP - --- StanfordCoreNLPServer#main() called ---
[main] INFO CoreNLP - setting default constituency parser
[main] INFO CoreNLP - using SR parser: edu/stanford/nlp/models/srparser/englishSR.ser.gz
[main] INFO CoreNLP -     Threads: 4
[main] INFO CoreNLP - Starting server...
[main] INFO CoreNLP - StanfordCoreNLPServer listening at /0.0.0.0:9000

That is where the log stopped after I started CoreNLP; when I then ran the input file against DPLP, CoreNLP's output looked like this:

[pool-1-thread-2] INFO CoreNLP - [/172.17.0.1:57694] API call w/annotators tokenize,ssplit,pos,lemma,ner,parse
Although they didn't like her, they accepted the offer.
[pool-1-thread-2] INFO edu.stanford.nlp.pipeline.StanfordCoreNLP - Adding annotator tokenize
[pool-1-thread-2] INFO edu.stanford.nlp.pipeline.StanfordCoreNLP - Adding annotator ssplit
[pool-1-thread-2] INFO edu.stanford.nlp.pipeline.StanfordCoreNLP - Adding annotator pos
[pool-1-thread-2] INFO edu.stanford.nlp.tagger.maxent.MaxentTagger - Loading POS tagger from edu/stanford/nlp/models/pos-tagger/english-left3words/english-left3words-distsim.tagger ... done [1.2 sec].
[pool-1-thread-2] INFO edu.stanford.nlp.pipeline.StanfordCoreNLP - Adding annotator lemma
[pool-1-thread-2] INFO edu.stanford.nlp.pipeline.StanfordCoreNLP - Adding annotator ner
[pool-1-thread-2] INFO edu.stanford.nlp.ie.AbstractSequenceClassifier - Loading classifier from edu/stanford/nlp/models/ner/english.all.3class.distsim.crf.ser.gz ... done [5.0 sec].
[pool-1-thread-2] INFO edu.stanford.nlp.ie.AbstractSequenceClassifier - Loading classifier from edu/stanford/nlp/models/ner/english.muc.7class.distsim.crf.ser.gz ... done [0.9 sec].
[pool-1-thread-2] INFO edu.stanford.nlp.ie.AbstractSequenceClassifier - Loading classifier from edu/stanford/nlp/models/ner/english.conll.4class.distsim.crf.ser.gz ... done [0.8 sec].
[pool-1-thread-2] INFO edu.stanford.nlp.time.JollyDayHolidays - Initializing JollyDayHoliday for SUTime from classpath edu/stanford/nlp/models/sutime/jollyday/Holidays_sutime.xml as sutime.binder.1.
[pool-1-thread-2] INFO edu.stanford.nlp.time.TimeExpressionExtractorImpl - Using following SUTime rules: edu/stanford/nlp/models/sutime/defs.sutime.txt,edu/stanford/nlp/models/sutime/english.sutime.txt,edu/stanford/nlp/models/sutime/english.holidays.sutime.txt
[pool-1-thread-2] INFO edu.stanford.nlp.pipeline.TokensRegexNERAnnotator - ner.fine.regexner: Read 580704 unique entries out of 581863 from edu/stanford/nlp/models/kbp/english/gazetteers/regexner_caseless.tab, 0 TokensRegex patterns.
[pool-1-thread-2] INFO edu.stanford.nlp.pipeline.TokensRegexNERAnnotator - ner.fine.regexner: Read 4869 unique entries out of 4869 from edu/stanford/nlp/models/kbp/english/gazetteers/regexner_cased.tab, 0 TokensRegex patterns.
[pool-1-thread-2] INFO edu.stanford.nlp.pipeline.TokensRegexNERAnnotator - ner.fine.regexner: Read 585573 unique entries from 2 files
[pool-1-thread-2] INFO edu.stanford.nlp.pipeline.StanfordCoreNLP - Adding annotator parse
[pool-1-thread-2] INFO edu.stanford.nlp.parser.common.ParserGrammar - Loading parser from serialized file edu/stanford/nlp/models/srparser/englishSR.ser.gz ... done [24.5 sec].

I would also recommend you try https://github.com/NLPbox/dplp-service/ (built on top of dplp-docker) as it has a simpler user interface.
