
log4j system initialization problem #26

Open
yomansk8 opened this issue Jan 5, 2017 · 8 comments

yomansk8 commented Jan 5, 2017

Hello, I have an issue when using this plugin to output data to ElasticSearch.

My Logstash config is the following:

input {
  dynamodb {
    _type => "dev_organizations"
    endpoint => "http://dynamodb:8000"
    streams_endpoint => "http://dynamodb:8000"
    view_type => "new_and_old_images"
    aws_access_key_id => "MY_ACCESS_KEY"
    aws_secret_access_key => "MY_SECRET"
    table_name => "dev_organizations"
  }
  dynamodb {
    _type => "dev_automations"
    endpoint => "http://dynamodb:8000"
    streams_endpoint => "http://dynamodb:8000"
    view_type => "new_and_old_images"
    aws_access_key_id => "MY_ACCESS_KEY"
    aws_secret_access_key => "MY_SECRET"
    table_name => "dev_automations"
  }
  dynamodb {
    _type => "dev_bags"
    endpoint => "http://dynamodb:8000"
    streams_endpoint => "http://dynamodb:8000"
    view_type => "new_and_old_images"
    aws_access_key_id => "MY_ACCESS_KEY"
    aws_secret_access_key => "MY_SECRET"
    table_name => "dev_bags"
  }
  dynamodb {
    _type => "dev_billings"
    endpoint => "http://dynamodb:8000"
    streams_endpoint => "http://dynamodb:8000"
    view_type => "new_and_old_images"
    aws_access_key_id => "MY_ACCESS_KEY"
    aws_secret_access_key => "MY_SECRET"
    table_name => "dev_billings"
  }
  dynamodb {
    _type => "dev_boxes"
    endpoint => "http://dynamodb:8000"
    streams_endpoint => "http://dynamodb:8000"
    view_type => "new_and_old_images"
    aws_access_key_id => "MY_ACCESS_KEY"
    aws_secret_access_key => "MY_SECRET"
    table_name => "dev_boxes"
  }
  dynamodb {
    _type => "dev_collects"
    endpoint => "http://dynamodb:8000"
    streams_endpoint => "http://dynamodb:8000"
    view_type => "new_and_old_images"
    aws_access_key_id => "MY_ACCESS_KEY"
    aws_secret_access_key => "MY_SECRET"
    table_name => "dev_collects"
  }
  dynamodb {
    _type => "dev_items"
    endpoint => "http://dynamodb:8000"
    streams_endpoint => "http://dynamodb:8000"
    view_type => "new_and_old_images"
    aws_access_key_id => "MY_ACCESS_KEY"
    aws_secret_access_key => "MY_SECRET"
    table_name => "dev_items"
  }
  dynamodb {
    _type => "dev_locations"
    endpoint => "http://dynamodb:8000"
    streams_endpoint => "http://dynamodb:8000"
    view_type => "new_and_old_images"
    aws_access_key_id => "MY_ACCESS_KEY"
    aws_secret_access_key => "MY_SECRET"
    table_name => "dev_locations"
  }
  dynamodb {
    _type => "dev_notifications"
    endpoint => "http://dynamodb:8000"
    streams_endpoint => "http://dynamodb:8000"
    view_type => "new_and_old_images"
    aws_access_key_id => "MY_ACCESS_KEY"
    aws_secret_access_key => "MY_SECRET"
    table_name => "dev_notifications"
  }
  dynamodb {
    _type => "dev_organizations"
    endpoint => "http://dynamodb:8000"
    streams_endpoint => "http://dynamodb:8000"
    view_type => "new_and_old_images"
    aws_access_key_id => "MY_ACCESS_KEY"
    aws_secret_access_key => "MY_SECRET"
    table_name => "dev_organizations"
  }
  dynamodb {
    _type => "dev_packers"
    endpoint => "http://dynamodb:8000"
    streams_endpoint => "http://dynamodb:8000"
    view_type => "new_and_old_images"
    aws_access_key_id => "MY_ACCESS_KEY"
    aws_secret_access_key => "MY_SECRET"
    table_name => "dev_packers"
  }
  dynamodb {
    _type => "dev_parcels"
    endpoint => "http://dynamodb:8000"
    streams_endpoint => "http://dynamodb:8000"
    view_type => "new_and_old_images"
    aws_access_key_id => "MY_ACCESS_KEY"
    aws_secret_access_key => "MY_SECRET"
    table_name => "dev_parcels"
  }
  dynamodb {
    _type => "dev_settings"
    endpoint => "http://dynamodb:8000"
    streams_endpoint => "http://dynamodb:8000"
    view_type => "new_and_old_images"
    aws_access_key_id => "MY_ACCESS_KEY"
    aws_secret_access_key => "MY_SECRET"
    table_name => "dev_settings"
  }
  dynamodb {
    _type => "dev_templates"
    endpoint => "http://dynamodb:8000"
    streams_endpoint => "http://dynamodb:8000"
    view_type => "new_and_old_images"
    aws_access_key_id => "MY_ACCESS_KEY"
    aws_secret_access_key => "MY_SECRET"
    table_name => "dev_templates"
  }
  dynamodb {
    _type => "dev_transactions"
    endpoint => "http://dynamodb:8000"
    streams_endpoint => "http://dynamodb:8000"
    view_type => "new_and_old_images"
    aws_access_key_id => "MY_ACCESS_KEY"
    aws_secret_access_key => "MY_SECRET"
    table_name => "dev_transactions"
  }
  dynamodb {
    _type => "dev_users"
    endpoint => "http://dynamodb:8000"
    streams_endpoint => "http://dynamodb:8000"
    view_type => "new_and_old_images"
    aws_access_key_id => "MY_ACCESS_KEY"
    aws_secret_access_key => "MY_SECRET"
    table_name => "dev_users"
  }
}
filter {
  dynamodb {}
  mutate {
    remove_field => ["message", "host", "eventName", "keys"]
  }
}
output {
  elasticsearch {
    hosts => ["http://elasticsearch:9200"]
    index => "%{[_type]}"
    document_id => "%{[id]}"
  }
  stdout {}
}

When I run Logstash, I get the following output:

Settings: Default pipeline workers: 4
log4j:WARN No appenders could be found for logger (org.apache.http.client.protocol.RequestAddCookies).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Pipeline main started
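
As far as I can tell, the log4j warnings are a separate issue from the streaming one: the plugin's Java libraries simply find no log4j 1.2 configuration. A minimal log4j.properties visible on Logstash's classpath should silence them; the snippet below is only a sketch, and where exactly to drop the file is an assumption on my side:

# hypothetical log4j.properties, only to quiet the "No appenders could be found" warnings
log4j.rootLogger=WARN, stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{ISO8601} %-5p %c - %m%n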

Now if I try to insert an item into one of my tables, Logstash doesn't do anything (it doesn't capture the streams?)...
But if I restart Logstash, the initial scan actually finds the changes in the tables and outputs them to ElasticSearch.

Any idea why I get this behaviour?

@marcosnils

@yomansk8 run logstash in debug mode (-D) and paste the output here


yomansk8 commented Jan 5, 2017

@marcosnils This is what it gives me:

Reading config file {:config_file=>"/usr/conf/logstash.conf", :level=>:debug, :file=>"logstash/config/loader.rb", :line=>"69", :method=>"local_config"}
Plugin not defined in namespace, checking for plugin file {:type=>"input", :name=>"dynamodb", :path=>"logstash/inputs/dynamodb", :level=>:debug, :file=>"logstash/plugin.rb", :line=>"86", :method=>"lookup"}
Unknown setting '_type' for dynamodb {:level=>:error, :file=>"logstash/config/mixin.rb", :line=>"330", :method=>"validate_check_invalid_parameter_names"}
fetched an invalid config {:config=>"...MYCONFIGFILEHERE...", :reason=>"Something is wrong with your configuration.", :level=>:error, :file=>"logstash/agent.rb", :line=>"448", :method=>"create_pipeline"}
starting agent {:level=>:info, :file=>"logstash/agent.rb", :line=>"213", :method=>"execute"}

Do you think it could be caused by the unknown parameter _type?
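
If it is, I suppose the fix would be to use the standard type option of Logstash inputs instead of _type; a single input block would then look something like this (just a sketch, other settings unchanged):

dynamodb {
  type => "dev_organizations"
  endpoint => "http://dynamodb:8000"
  streams_endpoint => "http://dynamodb:8000"
  view_type => "new_and_old_images"
  aws_access_key_id => "MY_ACCESS_KEY"
  aws_secret_access_key => "MY_SECRET"
  table_name => "dev_organizations"
}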

@marcosnils

@yomansk8 IDK, but as the log states, something is wrong in the config


yomansk8 commented Jan 5, 2017

@marcosnils I've tried changing the _type parameter to type. No more errors in the config file, but I get the same behavior as before... Nothing is triggered by Logstash and nothing is output to ElasticSearch until I restart Logstash.

@marcosnils

Try inserting a record in dynamo after that. Verbose logging should show a lot of info.
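
For example, something like this against your local endpoint should do (the item content is just a placeholder, adjust it to your schema):

aws dynamodb put-item --table-name dev_organizations --item '{"id": {"S": "123456789"}}' --endpoint-url http://dynamodb:8000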


yomansk8 commented Jan 5, 2017

I've tried, but nothing is logged when I insert a record in dynamo.
The only thing I see is this, in a loop:

Flushing buffer at interval {:instance=>"#<LogStash::Outputs::ElasticSearch::Buffer:0x43f049 @operations_mutex=#<Mutex:0xe840e31>, @max_size=500, @operations_lock=#<Java::JavaUtilConcurrentLocks::ReentrantLock:0xabb9b39>, @submit_proc=#<Proc:0x5067640c@/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.1-java/lib/logstash/outputs/elasticsearch/common.rb:57>, @logger=#<Cabin::Channel:0x7e3c0db7 @metrics=#<Cabin::Metrics:0x2808331a @metrics_lock=#<Mutex:0x2f96fb45>, @metrics={}, @channel=#<Cabin::Channel:0x7e3c0db7 ...>>, @subscriber_lock=#<Mutex:0x443abdbf>, @level=:debug, @subscribers={13568=>#<Cabin::Subscriber:0x330c399d @output=#<Cabin::Outputs::IO:0x2bd8a06a @io=#<IO:fd 1>, @lock=#<Mutex:0x3a1e27eb>>, @options={}>}, @data={}>, @last_flush=2017-01-05 13:33:22 +0000, @flush_interval=1, @stopping=#<Concurrent::AtomicBoolean:0x4a125d03>, @buffer=[], @flush_thread=#<Thread:0x1880450d run>>", :interval=>1, :level=>:debug, :file=>"logstash/outputs/elasticsearch/buffer.rb", :line=>"90", :method=>"interval_flush"}
Pushing flush onto pipeline {:level=>:debug, :file=>"logstash/pipeline.rb", :line=>"458", :method=>"flush"}

And when I restart Logstash, it finds my previously created records and outputs them to ElasticSearch, producing the following logs:

filter received {:event=>{"message"=>"{\"eventID\":\"0\",\"eventName\":\"INSERT\",\"eventVersion\":\"1.0\",\"eventSource\":\"aws:dynamodb\",\"awsRegion\":{},\"dynamodb\":{\"keys\":{\"id\":{\"S\":\"123456789\"}},\"newImage\":{\"signature\":{\"BOOL\":false},\"printer\":{\"S\":\"A6\"},\"active\":{\"BOOL\":true},\"typ\":{\"S\":\"ORGA\"},\"services\":{\"S\":\"PickShip\"},\"logoUrl\":{\"S\":\"mulogourl\"},\"billing\":{\"S\":\"987654321\"},\"receptionMode\":{\"S\":\"BAG\"},\"updateBy\":{\"S\":\"456789123\"},\"name\":{\"S\":\"My awesome company\"},\"logo\":{\"S\":\"mylogo.png\"},\"id\":{\"S\":\"123456789\"},\"logoKey\":{\"S\":\"mylogokey\"},\"crn\":{\"S\":\"crn-123\"},\"updateWhen\":{\"S\":\"1483546492043\"},\"tva\":{\"S\":\"tva-123\"}},\"sequenceNumber\":\"0\",\"sizeBytes\":231,\"streamViewType\":\"NEW_AND_OLD_IMAGES\"}}", "host"=>"8023a429e412", "@version"=>"1", "@timestamp"=>"2017-01-05T13:33:13.088Z", "type"=>"dev_organizations"}, :level=>:debug, :file=>"(eval)", :line=>"130", :method=>"filter_func"}
filters/LogStash::Filters::Mutate: removing field {:field=>"message", :level=>:debug, :file=>"logstash/filters/base.rb", :line=>"175", :method=>"filter_matched"}
filters/LogStash::Filters::Mutate: removing field {:field=>"host", :level=>:debug, :file=>"logstash/filters/base.rb", :line=>"175", :method=>"filter_matched"}
filters/LogStash::Filters::Mutate: removing field {:field=>"eventName", :level=>:debug, :file=>"logstash/filters/base.rb", :line=>"175", :method=>"filter_matched"}
filters/LogStash::Filters::Mutate: removing field {:field=>"keys", :level=>:debug, :file=>"logstash/filters/base.rb", :line=>"175", :method=>"filter_matched"}
output received {:event=>{"@version"=>"1", "@timestamp"=>"2017-01-05T13:33:13.088Z", "type"=>"dev_organizations", "signature"=>false, "printer"=>"A6", "active"=>true, "typ"=>"ORGA", "services"=>"PickShip", "logoUrl"=>"mulogourl", "billing"=>"987654321", "receptionMode"=>"BAG", "updateBy"=>"456789123", "name"=>"My awesome company", "logo"=>"mylogo.png", "id"=>"123456789", "logoKey"=>"mylogokey", "crn"=>"crn-123", "updateWhen"=>"1483546492043", "tva"=>"tva-123"}, :level=>:debug, :file=>"(eval)", :line=>"138", :method=>"output_func"}
2017-01-05T13:33:13.088Z %{host} %{message}

@marcosnils

@yomansk8 Seems like Logstash is working, since it scans your tables when it starts. When you created your DynamoDB tables, did you enable the stream option for those tables?


yomansk8 commented Jan 5, 2017

Yes, the stream option is enabled on those tables.
I've created all of them with the following parameters:

"StreamSpecification": {
  "StreamEnabled": true,
  "StreamViewType": "NEW_AND_OLD_IMAGES"
}

I agree that it seems like a streams problem, but I can't figure out where!
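
The only other check I can think of is describing a table against the same local endpoint and looking at the StreamSpecification and LatestStreamArn it reports (assuming the AWS CLI is pointed at http://dynamodb:8000):

aws dynamodb describe-table --table-name dev_organizations --endpoint-url http://dynamodb:8000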
