Replies: 3 comments
-
The table widget is generic so it can have any column/type combination, but Elastic/Splunk have a schema. So we can't just upload any table to the index; we need to match the table schema up with the relevant index. This is best done as a VQL query in a cell, I think.
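For example, something like this in a notebook cell could push one artifact's results into an existing index (just a sketch, assuming the `elastic_upload()` VQL plugin; the index name and artifact are placeholders):

```
// Upload the results of one artifact from this flow into an
// existing Elastic index. "velociraptor-pslist" and the artifact
// name are placeholders, adjust to match your index mapping.
SELECT * FROM elastic_upload(
    query={ SELECT * FROM source(artifact="Windows.System.Pslist") },
    addresses=["http://localhost:9200"],
    index="velociraptor-pslist")
```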
-
Splunk doesn't really have a schema; if it's JSON it can be analysed. I thought the same could be configured for Elastic, so that it's able to ingest different formats as long as they're JSON (with https://www.elastic.co/guide/en/logstash/current/plugins-codecs-json.html), but I may be wrong since I don't work with Elastic that often... But as you mentioned, one can always run a VQL query which does it, so that's fine as well (or configure the Server to do so for specific Artifacts etc., see the sketch below).
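Something along these lines as a server monitoring artifact might cover the "configure the Server for specific Artifacts" case: watch for completed collections of a given artifact and forward them (rough sketch only, assuming the `elastic_upload()` and `watch_monitoring()` plugins; artifact and index names are placeholders):

```
// Watch for completed collections of one specific artifact and
// forward their results to Elastic. Artifact and index names
// below are placeholders.
SELECT * FROM elastic_upload(
    query={
        SELECT * FROM foreach(
            row={
                SELECT ClientId, FlowId
                FROM watch_monitoring(artifact="System.Flow.Completion")
                WHERE Flow.artifacts_with_results =~ "Windows.System.Pslist"
            },
            query={
                SELECT *
                FROM source(client_id=ClientId,
                            flow_id=FlowId,
                            artifact="Windows.System.Pslist")
            })
    },
    addresses=["http://localhost:9200"],
    index="velociraptor")
```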
-
Elastic has a schema (called a mapping in Elastic language) which defines how to index each field. You can always just upload the whole thing as a JSON blob and full-text index that, but I am not really sure how useful it would be. Likewise, I have never used Splunk, but according to the instructions in the Splunk upload artifact, some preparation is needed.
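For reference, the VQL side of that artifact is essentially a `splunk_upload()` call against Splunk's HTTP Event Collector, roughly like this (sketch only; the URL, token and index are placeholders and the parameter names may differ slightly between versions, so double-check against your build):

```
// Push results to Splunk via the HTTP Event Collector.
// URL, token, index and artifact name are placeholders.
SELECT * FROM splunk_upload(
    query={ SELECT * FROM source(artifact="Windows.System.Pslist") },
    url="https://splunk.example.com:8088",
    token="00000000-0000-0000-0000-000000000000",
    index="velociraptor",
    sourcetype="_json")
```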
-
It would be awesome if there was a button next to the "download csv" button to upload results directly to Splunk/Elastic/...