Commit

Merge pull request #60 from yschneiderTEKLIA/fix-requirements
Fix torch-related requirements and update github badges
starride-teklia authored Mar 27, 2023
2 parents b1f4df2 + 85d6d16 commit 36c1f99
Showing 6 changed files with 8 additions and 12 deletions.
6 changes: 3 additions & 3 deletions .pre-commit-config.yaml
@@ -1,16 +1,16 @@
 repos:
   - repo: https://github.com/pre-commit/pre-commit-hooks
-    rev: v3.4.0
+    rev: v4.4.0
     hooks:
       - id: check-yaml
       - id: trailing-whitespace
       - id: end-of-file-fixer
   - repo: https://github.com/timothycrosley/isort
-    rev: 5.6.4
+    rev: 5.12.0
     hooks:
       - id: isort
         args: [--profile, black]
   - repo: https://github.com/psf/black
-    rev: 22.3.0
+    rev: 23.1.0
     hooks:
       - id: black
4 changes: 2 additions & 2 deletions README.md
@@ -10,8 +10,8 @@
 [![Coverage](https://img.shields.io/codecov/c/github/jpuigcerver/PyLaia?&label=Coverage&logo=Codecov&logoColor=ffffff&labelColor=f01f7a)](https://codecov.io/gh/jpuigcerver/PyLaia)
 [![Code quality](https://img.shields.io/codefactor/grade/github/jpuigcerver/PyLaia?&label=CodeFactor&logo=CodeFactor&labelColor=2782f7)](https://www.codefactor.io/repository/github/jpuigcerver/PyLaia)

-[![Python: 3.6+](https://img.shields.io/badge/Python-3.6%2B-FFD43B.svg?&logo=Python&logoColor=white&labelColor=306998)](https://www.python.org/)
-[![PyTorch: 1.4.0+](https://img.shields.io/badge/PyTorch-1.4.0%2B-8628d5.svg?&logo=PyTorch&logoColor=white&labelColor=%23ee4c2c)](https://pytorch.org/)
+[![Python: 3.8+](https://img.shields.io/badge/Python-3.8%2B-FFD43B.svg?&logo=Python&logoColor=white&labelColor=306998)](https://www.python.org/)
+[![PyTorch: 1.13.0+](https://img.shields.io/badge/PyTorch-1.13.0%2B-8628d5.svg?&logo=PyTorch&logoColor=white&labelColor=%23ee4c2c)](https://pytorch.org/)
 [![pre-commit: enabled](https://img.shields.io/badge/pre--commit-enabled-76877c?&logo=pre-commit&labelColor=1f2d23)](https://github.com/pre-commit/pre-commit)
 [![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg?)](https://github.com/ambv/black)
2 changes: 0 additions & 2 deletions laia/callbacks/decode.py
@@ -84,7 +84,6 @@ def on_test_batch_end(self, trainer, pl_module, outputs, batch, *args):
         word_probs = []

         for i, (img_id, hyp) in enumerate(zip(img_ids, hyps)):
-
             if self.use_symbols:
                 hyp = [self.syms[v] for v in hyp]
                 if self.convert_spaces:
@@ -96,7 +95,6 @@ def on_test_batch_end(self, trainer, pl_module, outputs, batch, *args):
             hyp = self.join_string.join(str(x) for x in hyp).strip()

             if self.print_confidence_scores:
-
                 if self.print_word_confidence_scores:
                     word_prob = [f"{prob:.2f}" for prob in word_probs[i]]
                     self.write(
1 change: 0 additions & 1 deletion laia/decoders/ctc_language_decoder.py
@@ -33,7 +33,6 @@ def __init__(
         unk_token: str = "<unk>",
         sil_token: str = "<space>",
     ):
-
         self.decoder = ctc_decoder(
             lm=language_model_path,
             lexicon=lexicon_path,
1 change: 0 additions & 1 deletion laia/nn/resnet.py
@@ -128,7 +128,6 @@ def __init__(
         width_per_group: int = 64,
         norm_layer: Optional[Type[nn.Module]] = None,
     ):
-
         if len(layers) != 4:
             raise ValueError("The length of layers should be 4")

6 changes: 3 additions & 3 deletions requirements.txt
@@ -6,8 +6,8 @@ matplotlib
 # cpu version: nnutils-pytorch
 nnutils-pytorch-cuda
 pytorch-lightning==1.1.0
-torch>=1.13<1.14
-torchvision>=0.14<0.15
-torchaudio>=0.13<0.14
+torch>=1.13,<1.14
+torchvision>=0.14,<0.15
+torchaudio>=0.13,<0.14
 jsonargparse[signatures]==4.7
 dataclasses; python_version < '3.7'
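The substance of the fix above is the missing comma: PEP 440 defines a specifier set as a comma-separated list of clauses, so `torch>=1.13<1.14` is a single malformed clause rather than the intended range `>=1.13` and `<1.14`. A minimal stdlib sketch of the comma semantics (an illustration only, not pip's actual parser, which lives in the third-party `packaging` library):

```python
import re

def parse_specifier_set(spec: str):
    """Split a PEP 440-style specifier set on commas and validate each clause."""
    # Simplified clause grammar: one operator followed by a version string.
    clause = re.compile(r"^(==|!=|<=|>=|<|>|~=|===)\s*([\w.*+!-]+)$")
    parts = [p.strip() for p in spec.split(",") if p.strip()]
    parsed = []
    for part in parts:
        m = clause.match(part)
        if not m:
            raise ValueError(f"invalid specifier clause: {part!r}")
        parsed.append((m.group(1), m.group(2)))
    return parsed

# The corrected form splits into two clauses:
print(parse_specifier_set(">=1.13,<1.14"))  # [('>=', '1.13'), ('<', '1.14')]

# The broken form is a single clause whose "version" contains '<',
# so it does not express an upper bound at all:
# parse_specifier_set(">=1.13<1.14")  -> raises ValueError
```

This is why the comma-less pins never constrained torch below 1.14: the whole string is one clause, not two.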
