Remove unused input port in Input layer. #318

Open · wants to merge 2 commits into base: main
Conversation

@rileywhite-noblis commented May 22, 2024

Issue Number: #298

Objective of pull request: Remove unused input port.

Pull request checklist

Your PR fulfills the following requirements:

  • Issue created that explains the change and why it's needed
  • Tests are part of the PR (for bug fixes / features)
  • Docs reviewed and added / updated if needed (for bug fixes / features)
  • PR conforms to Coding Conventions
  • PR applies BSD 3-clause or LGPL2.1+ Licenses to all code files
  • Lint (flakeheaven lint src/lava tests/) and (bandit -r src/lava/.) pass locally
  • Build tests (pytest) pass locally - I was unable to run a few tests due to a missing file (mnist.net) and two others I could not open (/gts/ntidigits/input.npy, /gts/ntidigits/ntidigits.net). These failed before I made any modifications, so this may be a configuration issue on my end; I am relying on the CI to check them, since that environment appears to have these files.

Pull request type

  • Bugfix
  • Feature
  • Code style update (formatting, renaming)
  • Refactoring (no functional changes, no api changes)
  • Build related changes
  • Documentation changes
  • Other (please describe):

What is the current behavior?

In the latest release, the in port of the Input AbstractBlock for netx is not connected to its neuron's in port.

To reproduce, load any model with an input layer via netx and connect it to data:

from lava.lib.dl import netx

# Load the trained network and wire it between a source and a sink.
net = netx.hdf5.Network(trained_folder + '/network.net')
data.out.connect(net.inp)   # net.inp is never forwarded to the neuron
net.out.connect(sink.a_in)

If you then try to run this network, it will hang indefinitely.
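For completeness, a minimal run sketch (assuming the standard Lava RunSteps condition and Loihi2SimCfg run configuration; num_steps is arbitrary):

from lava.magma.core.run_conditions import RunSteps
from lava.magma.core.run_configs import Loihi2SimCfg

# Because net.inp is never forwarded to the Input layer's neuron,
# execution stalls waiting for data that can never arrive.
net.run(condition=RunSteps(num_steps=16), run_cfg=Loihi2SimCfg())
net.stop()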

What is the new behavior?

The input port is no longer available in the Input layer, forcing usage of its neuron's input port.
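For example, after this change the connection from the repro above would be made directly to the neuron. This is a sketch, assuming the Input layer's neuron exposes the standard Lava a_in port and that in_layer is the attribute used elsewhere in the test suite:

# Connect the source straight to the input layer's neuron port,
# since net.inp no longer exists.
data.out.connect(net.in_layer.neuron.a_in)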

Does this introduce a breaking change?

  • Yes
  • No

Supplemental information

I will admit this change is a bit of a hack, but a better fix is likely beyond my available time and my knowledge of the compiler. A better fix would probably be to connect the block's input port to its neuron's a_in port (a rough sketch of this idea follows the traceback below), but this causes compilation issues that I was unable to sort out. Specifically:

/Users/m32005/lava-dl/tests/lava/lib/dl/netx/test_hdf5.py::TestHdf5Netx::test_tinynet failed: self = <tests.lava.lib.dl.netx.test_hdf5.TestHdf5Netx testMethod=test_tinynet>

    def test_tinynet(self) -> None:
        """Tests the output of three layer CNN."""
        steps_per_sample = 17
        net = netx.hdf5.Network(net_config=root + '/tiny.net')
    
        num_steps = steps_per_sample + len(net)
        sink = io.sink.RingBuffer(
            shape=net.out_layer.out.shape, buffer=num_steps
        )
        net.out_layer.out.connect(sink.a_in)
    
        # layer reset mechanism
        for i, l in enumerate(net.layers):
            u_resetter = io.reset.Reset(
                interval=steps_per_sample, offset=i - 1)
            v_resetter = io.reset.Reset(
                interval=steps_per_sample, offset=i - 1)
            u_resetter.connect_var(l.neuron.u)
            v_resetter.connect_var(l.neuron.v)
    
        if verbose:
            print(f'Network created from {net.filename}')
            print(net)
            print(f'{len(net) = }')
    
        # set input bias for the ground truth sample
        net.in_layer.neuron.bias_mant.init = np.load(
            root + '/gts/tinynet/input_bias.npy'
        )
    
        run_condition = RunSteps(num_steps=num_steps)
        run_config = TestRunConfig(select_tag='fixed_pt')
>       net.run(condition=run_condition, run_cfg=run_config)

tests/lava/lib/dl/netx/test_hdf5.py:112: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
../anaconda3/envs/lava/src/lava/src/lava/magma/core/process/process.py:364: in run
    self.create_runtime(run_cfg=run_cfg, compile_config=compile_config)
../anaconda3/envs/lava/src/lava/src/lava/magma/core/process/process.py:388: in create_runtime
    executable = self.compile(run_cfg, compile_config)
../anaconda3/envs/lava/src/lava/src/lava/magma/core/process/process.py:412: in compile
    return compiler.compile(self, run_cfg)
../anaconda3/envs/lava/src/lava/src/lava/magma/compiler/compiler.py:140: in compile
    proc_builders, channel_map = self._compile_proc_groups(
../anaconda3/envs/lava/src/lava/src/lava/magma/compiler/compiler.py:247: in _compile_proc_groups
    proc_builders, channel_map = self._extract_proc_builders(
../anaconda3/envs/lava/src/lava/src/lava/magma/compiler/compiler.py:410: in _extract_proc_builders
    builders, channel_map = subcompiler.get_builders(channel_map)
../anaconda3/envs/lava/src/lava/src/lava/magma/compiler/subcompilers/py/pyproc_compiler.py:112: in get_builders
    builders[process] = self._create_builder_for_process(process)
../anaconda3/envs/lava/src/lava/src/lava/magma/compiler/subcompilers/py/pyproc_compiler.py:128: in _create_builder_for_process
    inport_initializers = self._create_inport_initializers(process)
../anaconda3/envs/lava/src/lava/src/lava/magma/compiler/subcompilers/py/pyproc_compiler.py:199: in _create_inport_initializers
    ChannelBuildersFactory.get_port_dtype(port),
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

port = <lava.magma.core.process.ports.ports.InPort object at 0x302879a90>

    @staticmethod
    def get_port_dtype(port: AbstractPort) -> ty.Any:
        """Returns the d_type of a Process Port, as specified in the
        corresponding PortImplementation of the ProcessModel implementing the
        Process"""
    
        port_pm_class = ChannelBuildersFactory._get_port_process_model_class(
            port
        )
        if hasattr(port_pm_class, port.name):
            if isinstance(port, VarPort):
                return getattr(port_pm_class, port.var.name).d_type
            return getattr(port_pm_class, port.name).d_type
        elif isinstance(port, ImplicitVarPort):
            return getattr(port_pm_class, port.var.name).d_type
        # Port has different name in Process and ProcessModel
        else:
>           raise AssertionError(
                "Port {!r} not found in "
                "ProcessModel {!r}".format(port, port_pm_class)
            )
E           AssertionError: Port <lava.magma.core.process.ports.ports.InPort object at 0x302879a90> not found in ProcessModel <class 'lava.lib.dl.netx.blocks.models.PyInputModel'>

../anaconda3/envs/lava/src/lava/src/lava/magma/compiler/subcompilers/channel_builders_factory.py:225: AssertionError
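For reference, the better fix described above might look roughly like the following inside the Input block's constructor. This is purely a hypothetical sketch: the attribute names inp and neuron.a_in are assumptions about the block's internals, and this is exactly the variant that triggers the AssertionError shown above.

from lava.magma.core.process.ports.ports import InPort

# Hypothetical: forward the block-level InPort to the neuron's a_in
# instead of leaving it dangling.
self.inp = InPort(shape=self.neuron.a_in.shape)
self.inp.connect(self.neuron.a_in)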

rileywhite-noblis and others added 2 commits May 22, 2024 10:46
It was unused because the neuron's input is what should be used instead. Removed because it was causing confusion.

A better fix would probably be to connect its input port to its neuron's a_in port, but this causes compilation issues that I was unable to sort out.