
Getting "A plugin had an unrecoverable error" #41

Open
rpasche opened this issue Jun 10, 2022 · 14 comments

rpasche commented Jun 10, 2022

Logstash information:

logstash 8.4.0-SNAPSHOT

  1. Logstash installation source (e.g. built from source, with a package manager: DEB/RPM, expanded from tar or zip archive, docker)
Logstash was built from source, based on the branch referenced at elastic/logstash#14212.
  2. How is Logstash being run (e.g. as a service/service manager: systemd, upstart, etc. Via command line, docker/kubernetes)
Logstash runs inside a Kubernetes container.
  3. How was the Logstash plugin installed?
Default plugin installation.

JVM (e.g. java -version):
bundled JDK version

Using bundled JDK: /opt/logstash/jdk
Sending Logstash logs to /opt/logstash/logs which is now configured via log4j2.properties
{"level":"INFO","loggerName":"logstash.runner","timeMillis":1654859295754,"thread":"main","logEvent":{"message":"Log4j configuration path used is: /opt/logstash/config/log4j2.properties"}}
{"level":"INFO","loggerName":"logstash.runner","timeMillis":1654859295781,"thread":"main","logEvent":{"message":"Starting Logstash","logstash.version":"8.4.0","jruby.version":"jruby 9.3.4.0 (2.6.8) 2022-03-23 eff48c1ebf OpenJDK 64-Bit Server VM 11.0.15+10 on 11.0.15+10 +indy +jit [x86_64-linux]"}}
{"level":"INFO","loggerName":"logstash.runner","timeMillis":1654859295783,"thread":"main","logEvent":{"message":"JVM bootstrap flags: [-Xms4096m, -Xmx4096m, -XX:+UseG1GC, -XX:G1ReservePercent=25, -XX:InitiatingHeapOccupancyPercent=30, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -Djruby.jit.threshold=0, -Djruby.regexp.interruptible=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Dcom.sun.management.jmxremote.port=2000, -Dcom.sun.management.jmxremote.authenticate=false, -Dcom.sun.management.jmxremote.ssl=false, -Dio.netty.native.workdir=/opt/logstash/temp, -Dio.netty.tmpdir=/opt/logstash/temp, -Dlog4j2.formatMsgNoLookups=true, -Dnetworkaddress.cache.ttl=60, -Djdk.io.File.enableADS=true, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED]"}}

Steps to reproduce:

I rebuilt Logstash from the referenced branch with the following steps:

  1. Checked out the referenced Logstash branch
  2. Installed OpenJDK 11
$ echo $JAVA_HOME
/usr/lib/jvm/java-1.11.0-openjdk-amd64
$ java -version
openjdk version "11.0.15" 2022-04-19
OpenJDK Runtime Environment (build 11.0.15+10-Ubuntu-0ubuntu0.20.04.1)
OpenJDK 64-Bit Server VM (build 11.0.15+10-Ubuntu-0ubuntu0.20.04.1, mixed mode, sharing)
$ rbenv versions
  system
* jruby-9.1.12.0 (set by /home/ccloud/git/logstash/.ruby-version)
  jruby-9.2.20.1
  jruby-9.3.4.0
$
$ # system ruby version (outside of logstash repository)
$ ruby -v
jruby 9.3.4.0 (2.6.8) 2022-03-23 eff48c1ebf OpenJDK 64-Bit Server VM 11.0.15+10-Ubuntu-0ubuntu0.20.04.1 on 11.0.15+10-Ubuntu-0ubuntu0.20.04.1 +jit [x86_64-linux]
$
  3. Installed rake and bundler as described in the Logstash README
gem install rake
gem install bundler
$ rake --version
unsupported Java version "11", defaulting to 1.7
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.jruby.util.io.FilenoUtil to method sun.nio.ch.SelChImpl.getFD()
WARNING: Please consider reporting this to the maintainers of org.jruby.util.io.FilenoUtil
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
rake, version 13.0.6
$ bundler --version
unsupported Java version "11", defaulting to 1.7
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.jruby.util.io.FilenoUtil to method sun.nio.ch.SelChImpl.getFD()
WARNING: Please consider reporting this to the maintainers of org.jruby.util.io.FilenoUtil
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
/home/ccloud/.rbenv/versions/jruby-9.1.12.0/lib/ruby/gems/shared/gems/bundler-2.3.15/lib/bundler.rb:40: warning: toplevel constant Mutex referenced by Thread::Mutex
Your RubyGems version (2.6.11) has a bug that prevents `required_ruby_version` from working for Bundler. Any scripts that use `gem install bundler` will break as soon as Bundler drops support for your Ruby version. Please upgrade RubyGems to avoid future breakage and silence this warning by running `gem update --system 3.2.3`
Bundler version 2.3.15
$
  4. Built the archives via
rake artifact:archives
  5. Built a docker image using this new archive

Provide logs (if relevant):

logstash.yml
log.format: json
log.level: info
slowlog.threshold.warn: 5s
slowlog.threshold.info: 1s
slowlog.threshold.debug: 500ms
slowlog.threshold.trace: 100ms
config.debug: false
http.host: "0.0.0.0"
path.queue: "/opt/logstash/queue"
queue.type: "persisted"
dead_letter_queue.enable: true
dead_letter_queue.max_bytes: 1073741824
pipeline.ecs_compatibility: disabled

pipelines.yml

- pipeline.id: main
  pipeline.workers: 8
  pipeline.batch.size: 125
  path.config: "/opt/logstash/config/configurations/*.conf_main"
  queue.max_bytes: 49392123904
  queue.drain: "true"
- pipeline.id: deadletter
  path.config: "/opt/logstash/config/configurations/*.conf_deadletter"
  queue.type: "memory"
  queue.max_bytes: 1073741824
...
...

deadletter configuration

~/config/configurations$ cat *deadletter
# code to catch mapping errors to elasticsearch
filter {
  ...
  ...
  }
}
input {
  dead_letter_queue {
    path => "/opt/logstash/data/dead_letter_queue"
    pipeline_id => "main"
  }
}
output {
  elasticsearch {
    ...
    ...
  }
}

The error we get:

{"level":"ERROR","loggerName":"logstash.javapipeline","timeMillis":1654859314612,"thread":"[deadletter]<dead_letter_queue","logEvent":{"message":"A plugin had an unrecoverable error. Will restart this plugin.\n  Pipeline_id:deadletter\n  Plugin: <LogStash::Inputs::DeadLetterQueue pipeline_id=>\"main\", path=>\"/opt/logstash/data/dead_letter_queue\", id=>\"f22c07e1f107d3ee3a975c6ec7e2ce8cbf33c0a9381a287e6beb9187a286fb5e\", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>\"plain_d2758f57-b46b-45ea-820c-de791352a628\", enable_metric=>true, charset=>\"UTF-8\">, commit_offsets=>true>\n  Error: DLQ sub-path /opt/logstash/data/dead_letter_queue/main does not exist\n  Exception: Java::JavaNioFile::NoSuchFileException\n  Stack: org.logstash.input.DeadLetterQueueInputPlugin.lazyInitQueueReader(org/logstash/input/DeadLetterQueueInputPlugin.java:67)\norg.logstash.input.DeadLetterQueueInputPlugin.run(org/logstash/input/DeadLetterQueueInputPlugin.java:109)\njdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native 
Method)\njdk.internal.reflect.NativeMethodAccessorImpl.invoke(jdk/internal/reflect/NativeMethodAccessorImpl.java:62)\njdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(jdk/internal/reflect/DelegatingMethodAccessorImpl.java:43)\njava.lang.reflect.Method.invoke(java/lang/reflect/Method.java:566)\norg.jruby.javasupport.JavaMethod.invokeDirectWithExceptionHandling(org/jruby/javasupport/JavaMethod.java:427)\norg.jruby.javasupport.JavaMethod.invokeDirect(org/jruby/javasupport/JavaMethod.java:294)\nopt.logstash_minus_8_dot_4_dot_0_minus_SNAPSHOT.vendor.bundle.jruby.$2_dot_6_dot_0.gems.logstash_minus_input_minus_dead_letter_queue_minus_1_dot_1_dot_11.lib.logstash.inputs.dead_letter_queue.run(/opt/logstash-8.4.0-SNAPSHOT/vendor/bundle/jruby/2.6.0/gems/logstash-input-dead_letter_queue-1.1.11/lib/logstash/inputs/dead_letter_queue.rb:74)\nopt.logstash_minus_8_dot_4_dot_0_minus_SNAPSHOT.logstash_minus_core.lib.logstash.java_pipeline.inputworker(/opt/logstash-8.4.0-SNAPSHOT/logstash-core/lib/logstash/java_pipeline.rb:410)\nopt.logstash_minus_8_dot_4_dot_0_minus_SNAPSHOT.logstash_minus_core.lib.logstash.java_pipeline.start_input(/opt/logstash-8.4.0-SNAPSHOT/logstash-core/lib/logstash/java_pipeline.rb:401)\norg.jruby.RubyProc.call(org/jruby/RubyProc.java:309)\njava.lang.Thread.run(java/lang/Thread.java:829)"}}

Then we constantly see messages like the following:

{"level":"ERROR","loggerName":"logstash.javapipeline","timeMillis":1654862058107,"thread":"[deadletter]<dead_letter_queue","logEvent":{"message":"A plugin had an unrecoverable error. Will restart this plugin.\n  Pipeline_id:deadletter\n  Plugin: <LogStash::Inputs::DeadLetterQueue pipeline_id=>\"main", path=>\"/opt/logstash/data/dead_letter_queue\", id=>\"f22c07e1f107d3ee3a975c6ec7e2ce8cbf33c0a9381a287e6beb9187a286fb5e\", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>\"plain_aa1d7fba-b6c4-4e0b-a866-c699f71064b8\", enable_metric=>true, charset=>\"UTF-8\">, commit_offsets=>true>\n  Error: wrong number of arguments (given 0, expected 1)\n  Exception: ArgumentError\n  Stack: /opt/logstash-8.4.0-SNAPSHOT/vendor/bundle/jruby/2.6.0/gems/logstash-input-dead_letter_queue-1.1.11/lib/logstash/inputs/dead_letter_queue.rb:74:in `run'\n/opt/logstash-8.4.0-SNAPSHOT/logstash-core/lib/logstash/java_pipeline.rb:410:in `inputworker'\n/opt/logstash-8.4.0-SNAPSHOT/logstash-core/lib/logstash/java_pipeline.rb:401:in `block in start_input'"}}

Looking into the now-running container, we see that the path does exist:

$ ls -l /opt/logstash/data/dead_letter_queue/main/
total 8
-rw-r--r--. 1 logstash logstash 1 Jun 10 11:10 1.log.tmp
$

In the past, we sometimes also saw issues that looked like race conditions. Our "main" pipeline is quite big, and to us it looked like the deadletter pipeline was started before the monitored main pipeline had fully started up (and before the directories within the "dead_letter_queue" folder had been created). We used to be able to work around this by sending a SIGHUP to Logstash (to reload its pipelines), but that no longer works.
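The startup-ordering workaround described above could be sketched as a poll-and-wait on the DLQ sub-path before the reader is opened. This is a hypothetical Python sketch of the idea only, not code from Logstash or the plugin; all names are invented:

```python
import os
import time


def wait_for_dlq_subpath(base, pipeline_id, timeout_s=60.0, poll_s=0.5):
    """Poll until <base>/<pipeline_id> exists, or give up after timeout_s.

    Mirrors the startup race described above: the deadletter pipeline
    should not open its reader before the monitored main pipeline has
    created its dead_letter_queue sub-directory.
    """
    subpath = os.path.join(base, pipeline_id)
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if os.path.isdir(subpath):
            return subpath
        time.sleep(poll_s)
    raise FileNotFoundError(
        f"DLQ sub-path {subpath} did not appear within {timeout_s}s")
```

With such a grace period, a slow-starting main pipeline would delay the deadletter pipeline instead of crash-looping it.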

See the Logstash _node/pipelines API:

$ curl localhost:9600/_node/pipelines?pretty
{
  "host" : "test-container-0",
  "version" : "8.4.0",
  "http_address" : "0.0.0.0:9600",
  "id" : "35567748-4f5c-4e22-804b-7b2323613daa",
  "name" : "test-container-0",
  "ephemeral_id" : "aef08794-aa99-4a01-a210-b7a2b68fc6c9",
  "status" : "green",
  "snapshot" : true,
  "pipeline" : {
    "workers" : 16,
    "batch_size" : 125,
    "batch_delay" : 50
  },
  "pipelines" : {
    "deadletter" : {
      "ephemeral_id" : "5a541ead-34c9-4ef3-891b-3450c82a82d5",
      "hash" : "a1957004a2b54d5e57165a673359952873f77e5e5416adb683d70ccadf490ba0",
      "workers" : 16,
      "batch_size" : 125,
      "batch_delay" : 50,
      "config_reload_automatic" : false,
      "config_reload_interval" : 3000000000,
      "dead_letter_queue_enabled" : true,
      "dead_letter_queue_path" : "/opt/logstash/data/dead_letter_queue/deadletter"
    },
    "main" : {
      "ephemeral_id" : "c1b6b758-b035-4e20-9a3d-e2a7859d7041",
      "hash" : "de838b948484413cf86a7892edd304232f65824ad214587a99031e20216bffc3",
      "workers" : 8,
      "batch_size" : 125,
      "batch_delay" : 50,
      "config_reload_automatic" : false,
      "config_reload_interval" : 3000000000,
      "dead_letter_queue_enabled" : true,
      "dead_letter_queue_path" : "/opt/logstash/data/dead_letter_queue/main"
    },
    ...
    ...
  }
}
@rpasche rpasche added the bug label Jun 10, 2022
rpasche (Author) commented Jun 10, 2022

I am wondering why the plugin is not already failing earlier.

At least, if I read it correctly, this is the line where the register method of the Java plugin should get triggered, which itself should then fail.
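The fail-fast idea raised here (validating the path at register time instead of lazily in run) can be illustrated with a simplified Python analogue; the names are hypothetical, and this is not the plugin's actual Java/Ruby code:

```python
import os


class DeadLetterQueueInputSketch:
    """Simplified analogue of an input plugin life cycle: register, then run."""

    def __init__(self, path, pipeline_id):
        self.path = path
        self.pipeline_id = pipeline_id
        self._subpath = None

    def register(self):
        # Fail fast here, at registration time, rather than lazily in run().
        # A bad path then surfaces once at startup instead of crash-looping.
        subpath = os.path.join(self.path, self.pipeline_id)
        if not os.path.isdir(subpath):
            raise FileNotFoundError(f"DLQ sub-path {subpath} does not exist")
        self._subpath = subpath

    def run(self):
        # By the time run() is called, register() has validated the path.
        return sorted(os.listdir(self._subpath))
```

The observed behavior suggests the real plugin does this check lazily, which is why the error appears in the run loop rather than at registration.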

kares (Contributor) commented Jun 14, 2022

Hey Robert, this does not seem like a new issue:

Error: DLQ sub-path /opt/logstash/data/dead_letter_queue/main does not exist\n  Exception: Java::JavaNioFile::NoSuchFileException

... given this is a known issue, does it persist after a restart or two?

Your concern seems to be that this is somehow caused by a change in the 8.4.0-SNAPSHOT build you're using? If that is the case, we could try a different SNAPSHOT that contains the patch you're after but is based not off main but off a more stable Logstash branch such as 8.2 or 8.3.

mashhurs commented

Hi @rpasche,
Thanks again for the deep dive, and sorry about the PQ issue you are trying to solve.

The PR for the symbolic link is under review, and we are prioritizing getting that change in.

I assume you are building Logstash from the source containing the PQ symbolic-link change. I have also tried it locally, and the issue does not seem to be related to those changes. Let me know if I can help or if my understanding is not correct.

Thanks.

rpasche (Author) commented Jun 27, 2022

Hey Robert, this does not seem like a new issue:

Error: DLQ sub-path /opt/logstash/data/dead_letter_queue/main does not exist\n  Exception: Java::JavaNioFile::NoSuchFileException

... given this is a known issue, does it persist after a restart or two?

Your concern seems to be that this is somehow caused by a change in the 8.4.0-SNAPSHOT build you're using? If that is the case, we could try a different SNAPSHOT that contains the patch you're after but is based not off main but off a more stable Logstash branch such as 8.2 or 8.3.

Hi @kares, I am back from vacation now. I would need to test this again, but I heard that 8.3.0 should be released on June 28th, so we will test that release within our environment anyway.

rpasche (Author) commented Jun 29, 2022

Hi @kares @jsvd @richardgilm,

yesterday, version 8.3.0 was officially released. We have tested this version within our stack, and we again get "A plugin had an unrecoverable error...", but with a different error message.

This is what we see in our logs:

{"level":"ERROR","loggerName":"logstash.javapipeline","timeMillis":1656485427851,"thread":"[deadletter]<dead_letter_queue","logEvent":{"message":"A plugin had an unrecoverable error. Will restart this plugin.\n  Pipeline_id:deadletter\n  Plugin: <LogStash::Inputs::DeadLetterQueue pipeline_id=>\"in-https\", path=>\"/opt/logstash/data/dead_letter_queue\", id=>\"0b6cd156bd02d597addab476308a1e811786c47e231d05a6902564983c580525\", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>\"plain_17bb2b8f-d426-4206-a621-ac596c26aeb7\", enable_metric=>true, charset=>\"UTF-8\">, commit_offsets=>true>\n  Error: Could not resolve type id 'java.lang.Long' as a subtype of `java.lang.Object`: Configured `PolymorphicTypeValidator` (of type `com.fasterxml.jackson.databind.jsontype.BasicPolymorphicTypeValidator`) denied resolution\n at [Source: (byte[])[5161 bytes]; byte offset: #146] (through reference chain: java.util.HashMap[\"DATA\"]->org.logstash.ConvertedMap[\"process\"]->org.logstash.ConvertedMap[\"ppid\"])\n  Exception: Java::ComFasterxmlJacksonDatabindExc::InvalidTypeIdException\n  Stack: 
com.fasterxml.jackson.databind.exc.InvalidTypeIdException.from(com/fasterxml/jackson/databind/exc/InvalidTypeIdException.java:43)\ncom.fasterxml.jackson.databind.DeserializationContext.invalidTypeIdException(com/fasterxml/jackson/databind/DeserializationContext.java:2073)\ncom.fasterxml.jackson.databind.DatabindContext._throwSubtypeClassNotAllowed(com/fasterxml/jackson/databind/DatabindContext.java:287)\ncom.fasterxml.jackson.databind.DatabindContext.resolveAndValidateSubType(com/fasterxml/jackson/databind/DatabindContext.java:244)\ncom.fasterxml.jackson.databind.jsontype.impl.ClassNameIdResolver._typeFromId(com/fasterxml/jackson/databind/jsontype/impl/ClassNameIdResolver.java:72)\ncom.fasterxml.jackson.databind.jsontype.impl.ClassNameIdResolver.typeFromId(com/fasterxml/jackson/databind/jsontype/impl/ClassNameIdResolver.java:66)\ncom.fasterxml.jackson.databind.jsontype.impl.TypeDeserializerBase._findDeserializer(com/fasterxml/jackson/databind/jsontype/impl/TypeDeserializerBase.java:159)\ncom.fasterxml.jackson.databind.jsontype.impl.AsArrayTypeDeserializer._deserialize(com/fasterxml/jackson/databind/jsontype/impl/AsArrayTypeDeserializer.java:97)\ncom.fasterxml.jackson.databind.jsontype.impl.AsArrayTypeDeserializer.deserializeTypedFromAny(com/fasterxml/jackson/databind/jsontype/impl/AsArrayTypeDeserializer.java:71)\ncom.fasterxml.jackson.databind.deser.std.UntypedObjectDeserializer$Vanilla.deserializeWithType(com/fasterxml/jackson/databind/deser/std/UntypedObjectDeserializer.java:781)\ncom.fasterxml.jackson.databind.deser.std.MapDeserializer._readAndBindStringKeyMap(com/fasterxml/jackson/databind/deser/std/MapDeserializer.java:611)\ncom.fasterxml.jackson.databind.deser.std.MapDeserializer.deserialize(com/fasterxml/jackson/databind/deser/std/MapDeserializer.java:437)\ncom.fasterxml.jackson.databind.deser.std.MapDeserializer.deserialize(com/fasterxml/jackson/databind/deser/std/MapDeserializer.java:32)\ncom.fasterxml.jackson.databind.jsontype.impl.AsArrayTypeDeserializer
._deserialize(com/fasterxml/jackson/databind/jsontype/impl/AsArrayTypeDeserializer.java:120)\ncom.fasterxml.jackson.databind.jsontype.impl.AsArrayTypeDeserializer.deserializeTypedFromAny(com/fasterxml/jackson/databind/jsontype/impl/AsArrayTypeDeserializer.java:71)\ncom.fasterxml.jackson.databind.deser.std.UntypedObjectDeserializer$Vanilla.deserializeWithType(com/fasterxml/jackson/databind/deser/std/UntypedObjectDeserializer.java:781)\ncom.fasterxml.jackson.databind.deser.std.MapDeserializer._readAndBindStringKeyMap(com/fasterxml/jackson/databind/deser/std/MapDeserializer.java:611)\ncom.fasterxml.jackson.databind.deser.std.MapDeserializer.deserialize(com/fasterxml/jackson/databind/deser/std/MapDeserializer.java:437)\ncom.fasterxml.jackson.databind.deser.std.MapDeserializer.deserialize(com/fasterxml/jackson/databind/deser/std/MapDeserializer.java:32)\ncom.fasterxml.jackson.databind.jsontype.impl.AsArrayTypeDeserializer._deserialize(com/fasterxml/jackson/databind/jsontype/impl/AsArrayTypeDeserializer.java:120)\ncom.fasterxml.jackson.databind.jsontype.impl.AsArrayTypeDeserializer.deserializeTypedFromAny(com/fasterxml/jackson/databind/jsontype/impl/AsArrayTypeDeserializer.java:71)\ncom.fasterxml.jackson.databind.deser.std.UntypedObjectDeserializer$Vanilla.deserializeWithType(com/fasterxml/jackson/databind/deser/std/UntypedObjectDeserializer.java:781)\ncom.fasterxml.jackson.databind.deser.std.MapDeserializer._readAndBindStringKeyMap(com/fasterxml/jackson/databind/deser/std/MapDeserializer.java:611)\ncom.fasterxml.jackson.databind.deser.std.MapDeserializer.deserialize(com/fasterxml/jackson/databind/deser/std/MapDeserializer.java:437)\ncom.fasterxml.jackson.databind.deser.std.MapDeserializer.deserialize(com/fasterxml/jackson/databind/deser/std/MapDeserializer.java:32)\ncom.fasterxml.jackson.databind.jsontype.impl.AsArrayTypeDeserializer._deserialize(com/fasterxml/jackson/databind/jsontype/impl/AsArrayTypeDeserializer.java:120)\ncom.fasterxml.jackson.databind.jsontype.impl.A
sArrayTypeDeserializer.deserializeTypedFromObject(com/fasterxml/jackson/databind/jsontype/impl/AsArrayTypeDeserializer.java:61)\ncom.fasterxml.jackson.databind.deser.std.MapDeserializer.deserializeWithType(com/fasterxml/jackson/databind/deser/std/MapDeserializer.java:482)\ncom.fasterxml.jackson.databind.deser.impl.TypeWrappedDeserializer.deserialize(com/fasterxml/jackson/databind/deser/impl/TypeWrappedDeserializer.java:74)\ncom.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(com/fasterxml/jackson/databind/deser/DefaultDeserializationContext.java:323)\ncom.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(com/fasterxml/jackson/databind/ObjectMapper.java:4674)\ncom.fasterxml.jackson.databind.ObjectMapper.readValue(com/fasterxml/jackson/databind/ObjectMapper.java:3723)\norg.logstash.Event.fromSerializableMap(org/logstash/Event.java:234)\norg.logstash.Event.deserialize(org/logstash/Event.java:510)\norg.logstash.DLQEntry.deserialize(org/logstash/DLQEntry.java:96)\norg.logstash.common.io.DeadLetterQueueReader.pollEntry(org/logstash/common/io/DeadLetterQueueReader.java:172)\norg.logstash.input.DeadLetterQueueInputPlugin.run(org/logstash/input/DeadLetterQueueInputPlugin.java:104)\njdk.internal.reflect.GeneratedMethodAccessor155.invoke(jdk/internal/reflect/GeneratedMethodAccessor155)\njdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(jdk/internal/reflect/DelegatingMethodAccessorImpl.java:43)\njava.lang.reflect.Method.invoke(java/lang/reflect/Method.java:566)\norg.jruby.javasupport.JavaMethod.invokeDirectWithExceptionHandling(org/jruby/javasupport/JavaMethod.java:456)\norg.jruby.javasupport.JavaMethod.invokeDirect(org/jruby/javasupport/JavaMethod.java:317)\nopt.logstash_minus_8_dot_3_dot_0.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_input_minus_dead_letter_queue_minus_1_dot_1_dot_12.lib.logstash.inputs.dead_letter_queue.run(/opt/logstash-8.3.0/vendor/bundle/jruby/2.5.0/gems/logstash-input-dead_letter_queue-1.1.12/lib/logst
ash/inputs/dead_letter_queue.rb:74)\nopt.logstash_minus_8_dot_3_dot_0.logstash_minus_core.lib.logstash.java_pipeline.inputworker(/opt/logstash-8.3.0/logstash-core/lib/logstash/java_pipeline.rb:410)\nopt.logstash_minus_8_dot_3_dot_0.logstash_minus_core.lib.logstash.java_pipeline.start_input(/opt/logstash-8.3.0/logstash-core/lib/logstash/java_pipeline.rb:401)\norg.jruby.RubyProc.call(org/jruby/RubyProc.java:318)\njava.lang.Thread.run(java/lang/Thread.java:829)"}}
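For context on what "denied resolution" means in the stack trace above: Jackson's BasicPolymorphicTypeValidator refuses to deserialize a serialized type tag unless that type is explicitly allowed, so an embedded java.lang.Long tag in a DLQ entry gets rejected. A simplified Python analogue of that allow-list behavior (this is not Jackson's API, just an illustration of the concept with invented names):

```python
class InvalidTypeIdError(Exception):
    """Analogue of Jackson's InvalidTypeIdException."""


class AllowListDecoder:
    """Simplified analogue of an allow-list polymorphic type validator:
    a (type_tag, raw_value) pair only decodes if type_tag is allowed."""

    # Known decoders for illustration; real Jackson resolves Java classes.
    DECODERS = {"java.lang.Long": int, "java.lang.String": str}

    def __init__(self, allowed):
        self.allowed = set(allowed)

    def decode(self, type_tag, raw):
        if type_tag not in self.allowed:
            # Mirrors the error message seen in the Logstash log above.
            raise InvalidTypeIdError(
                f"Could not resolve type id '{type_tag}': denied resolution")
        return self.DECODERS[type_tag](raw)
```

Under this reading, the failure is about which types the validator permits during DLQ entry deserialization, not about the data itself being corrupt.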

Further information that might be helpful: we forced the dead letter queue to trigger by changing a field's type from an int to a string in the data.

The version (from the API):

$ curl localhost:9600?pretty
{
  "host" : "container",
  "version" : "8.3.0",
  "http_address" : "0.0.0.0:9600",
  "id" : "24a6d1f6-37c1-4a21-995d-0cafcba724a8",
  "name" : "container",
  "ephemeral_id" : "6710e4ec-57d8-4680-808b-14f385702ef6",
  "status" : "green",
  "snapshot" : false,
  "pipeline" : {
    "workers" : 16,
    "batch_size" : 125,
    "batch_delay" : 50
  },
  "build_date" : "2022-06-23T22:50:24Z",
  "build_sha" : "e6ba1107a07b0468ca11d2765a4ab99a2c974eef",
  "build_snapshot" : false
}
$

jsvd (Member) commented Jun 29, 2022 via email

rpasche (Author) commented Jun 29, 2022

Hi @jsvd,

I just wanted to provide a very simple example for you. Strangely enough, in this case the deadletter plugin worked as expected.

I need to ask my colleague what exactly he tested.

At least, this is what I did:

curl -k -H "Content-Type: application/json" https://localhost -d '{"testfield_as_number": 12345678}'

This created the event as expected in the target index. Additionally, I explicitly set the type to long for the testfield_as_number.

Then, forcing a mapping conflict with

curl -k -H "Content-Type: application/json" https://localhost -d '{"testfield_as_number": "giving a string to force mapping conflict"}'

And this got correctly processed by our deadletter pipeline (which we are writing to a dedicated index).

jsvd (Member) commented Jun 29, 2022

Yes, this is what I'm seeing as well; there are code paths where java.lang.Long is deserialized correctly:

irb(main):025:0> Java::org.logstash.Event.deserialize(Java::org.logstash.Event.new("data" => Java::java.lang.Long.new(1000000)).serialize).to_json
=> "{\"@timestamp\":\"2022-06-29T08:07:29.881204Z\",\"@version\":\"1\",\"data\":1000000}"

jsvd (Member) commented Jun 29, 2022

@rpasche if there's any way y'all can pass us the DLQ files that trigger this issue it'd facilitate debugging. In the meantime we'll change the type validation to be "allow all", while we identify the issue. elastic/logstash#14312

rpasche (Author) commented Jun 29, 2022

@jsvd Where should I paste it? It's ~8k

jsvd (Member) commented Jun 29, 2022

https://upload.elastic.co/u/92d7a242-6feb-49e3-8b2e-2204e8d3b736 (expires in 1 day)
I suggest scrubbing any potential sensitive information before uploading.

rpasche (Author) commented Jun 29, 2022

@jsvd Uploaded the file

jsvd (Member) commented Jul 5, 2022

The Could not resolve type id 'java.lang.Long' as a subtype of... issue has been resolved in 8.3.1, which was released last week: https://www.elastic.co/guide/en/logstash/8.3/logstash-8-3-1.html

rpasche (Author) commented Jul 5, 2022

Hi @jsvd, sounds great. We'll test this soon. Thanks for the input (and the fix)!
