If you're open to using Logstash, you can do this easily by querying with the elasticsearch input and dumping the results to a CSV file with the csv output. The configuration looks like this:
```
input {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "your_index"
    query => '{"query": {"match_all": {}}}'
  }
}
output {
  csv {
    fields => ["field1", "field2", "field3"]
    path => "/path/to/file.csv"
  }
}
```

UPDATE
If you need to call this dynamically, you can generate the Logstash configuration on the fly from a query passed as input to a shell script:
```
#!/bin/sh
if [ -z "$LOGSTASH_HOME" ]; then
  echo "WARNING: The LOGSTASH_HOME environment variable is not set!"
  exit 1
fi

# $1: the query string
# $2: a comma-separated list of quoted field names
LS_CONF="input {
  elasticsearch {
    hosts => [\"localhost:9200\"]
    index => 'megacorp'
    query => '{\"query\":{\"query_string\": {\"query\": \"$1\"}}}'
  }
}
output {
  csv {
    fields => [$2]
    path => \"/path/to/file.csv\"
  }
}"

"$LOGSTASH_HOME/bin/logstash" -e "$LS_CONF"
```

You can then call the script with a query like `my_field:123456`:

```
./es_to_csv.sh "my_field:123456" '"field1","field2","field3"'
```
This will have the same effect as calling `{{elasticUrl}}/_search?q=my_field:123456` and will produce a CSV file with the columns field1,field2,field3.
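Since shell quoting inside the generated config is easy to get wrong, it can help to preview the config string before handing it to Logstash. A minimal sketch, with `QUERY` and `FIELDS` standing in for the script's `$1` and `$2` (hypothetical variable names, no Logstash required):

```shell
# Preview the generated Logstash config without running Logstash.
# QUERY and FIELDS mimic the script's $1 and $2 arguments.
QUERY='my_field:123456'
FIELDS='"field1","field2","field3"'

# Escaped double quotes (\") survive into the config; $QUERY and
# $FIELDS are expanded by the shell before Logstash ever sees them.
LS_CONF="input {
  elasticsearch {
    hosts => [\"localhost:9200\"]
    index => 'megacorp'
    query => '{\"query\":{\"query_string\": {\"query\": \"$QUERY\"}}}'
  }
}
output {
  csv {
    fields => [$FIELDS]
    path => \"/path/to/file.csv\"
  }
}"

printf '%s\n' "$LS_CONF"
```

Once the printed config looks right, the same string can be passed to `logstash -e` as in the script above.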