elk purge indices delete clean windows curator

ELK tutorial - periodically purging Elasticsearch data: https://blog.johnwu.cc/article/elk-purge-elasticsearch-index.html

https://www.elastic.co/guide/en/elasticsearch/client/curator/current/configfile.html
https://www.elastic.co/guide/en/elasticsearch/client/curator/current/ex_delete_indices.html
https://anjia0532.github.io/2017/04/06/elasticsearch-delete-indices-by-date/

config.yml

```yaml
# Remember, leave a key empty if there is no value. None will be a string,
# not a Python "NoneType"
client:
  hosts:
    - xxx.xxx.xxx.xxx
  port: 9200
  url_prefix:
  use_ssl: False
  certificate:
  client_cert:
  client_key:
  ssl_no_validate: False
  http_auth:
  timeout: 30
  #timeout: 60
  master_only: False

logging:
  loglevel: INFO
  logfile:
  logformat: default
  blacklist: ['elasticsearch', 'urllib3']
```

curator_filebeat.yml

```yaml
# Remember, leave a key empty if there is no value. None will be a string,
# not a Python "NoneType"
#
# Also remember that all examples have 'disable_action' set to True. If you
# want to use this action as a template, be sure to set this to False after
# copying it.
actions:
  1:
    action: delete_indices
    description: >-
      Delete indices older than 30 days (based on index name), for filebeat-
      prefixed indices. Ignore the error if the filter does not result in an
      actionable list of indices (ignore_empty_list) and exit cleanly.
    options:
      ignore_empty_list: True
      disable_action: False
    filters:
    - filtertype: pattern
      kind: prefix
      value: filebeat-
    - filtertype: age
      source: name
      direction: older
      timestring: '%Y.%m.%d'
      unit: days
      unit_count: 30
```

curator_heartbeat.yml ...
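The pattern + age filter pair above can be sketched in plain Python to make the selection logic concrete. This is only an illustration, not how curator is implemented; the function name `indices_to_delete` and the simplified index-name format `filebeat-YYYY.MM.dd` (no Beats version segment) are assumptions for the sketch.

```python
from datetime import datetime, timedelta

def indices_to_delete(names, prefix="filebeat-", days=30, today=None):
    """Mimic the curator filters: keep only indices whose name starts with
    `prefix` (pattern filter) and whose name-embedded %Y.%m.%d date is
    older than `days` days (age filter, source: name, direction: older)."""
    today = today or datetime.utcnow()
    cutoff = today - timedelta(days=days)
    doomed = []
    for name in names:
        if not name.startswith(prefix):
            continue  # pattern filter: wrong prefix, leave it alone
        try:
            stamp = datetime.strptime(name[len(prefix):], "%Y.%m.%d")
        except ValueError:
            continue  # no parseable date in the name
        if stamp < cutoff:
            doomed.append(name)
    return doomed

names = ["filebeat-2018.12.01", "filebeat-2019.01.07", "heartbeat-2018.12.01"]
print(indices_to_delete(names, today=datetime(2019, 1, 8)))
# -> ['filebeat-2018.12.01']
```

With `ignore_empty_list: True`, curator simply exits cleanly when this list comes back empty instead of raising an error.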

2019-01-08 · 2 min · 337 words · Me

ethereum-etl ethereumetl elk logstash kibana

all output columns with logstash

```conf
filter {
  if [srctype] == "etl" {   # [fields][srctype]
    csv {
      columns => [
        "number", "hash", "parent_hash", "nonce", "sha3_uncles", "logs_bloom",
        "transactions_root", "state_root", "receipts_root", "timestamp",
        "extra_data", "transaction_count", "gas_limit", "size",
        "total_difficulty", "difficulty", "miner", "block_hash", "block_number",
        "transaction_index", "from_address", "to_address", "value", "gas",
        "gas_price", "input", "address", "bytecode", "function_sighashes",
        "is_erc20", "is_erc721", "log_index", "transaction_hash", "data",
        "topics", "cumulative_gas_used", "gas_used", "contract_address",
        "root", "status"
      ]
      separator => ","
      remove_field => ["message"]
      #autodetect_column_names => true    # has problems
      #autogenerate_column_names => true  # has problems
      skip_empty_columns => true
      skip_empty_rows => true
    }
  }
}
```
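What the csv filter does to each line can be sketched in a few lines of Python: zip the fixed column list with the comma-separated fields and drop the empty ones (`skip_empty_columns`). The column subset and the sample line below are made up for illustration.

```python
import csv
import io

# Illustrative subset of the ethereum-etl columns listed above
columns = ["number", "hash", "miner", "gas_used", "status"]
line = "1934700,0x3f9161,0xminer,,1"  # note the empty gas_used field

row = next(csv.reader(io.StringIO(line)))
# Pair columns with values, skipping empties like skip_empty_columns does
event = {k: v for k, v in zip(columns, row) if v != ""}
print(event)
# -> {'number': '1934700', 'hash': '0x3f9161', 'miner': '0xminer', 'status': '1'}
```

Because ethereum-etl emits several record types (blocks, transactions, contracts, receipts, logs) into one stream, listing every column and relying on `skip_empty_columns` is what lets a single csv filter handle all of them.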

2018-12-17 · 1 min · 82 words · Me

logstash fields if

https://sueboy.blogspot.com/2018/11/elk60filebeatdocumenttype.html

filebeat.yml

```yaml
- type: log
  paths:
    - /var/log/geth.log
  exclude_files: ['.gz$']
  fields:
    srctype: "geth"
```

pipeline logstash.conf

```conf
if [fields][srctype] == "geth" {
```

BUT with fields_under_root: true

```yaml
- type: log
  paths:
    - /var/log/geth.log
  exclude_files: ['.gz$']
  fields:
    srctype: "geth"
  fields_under_root: true
```

```conf
if [srctype] == "geth" {
```
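The difference between the two conditionals comes down to where filebeat puts the custom fields in the event. A minimal sketch of that behavior (the helper `apply_fields` is invented for illustration, not part of filebeat):

```python
def apply_fields(event, fields, under_root=False):
    """Sketch of filebeat's `fields` option: nested under "fields" by
    default, merged into the event root when fields_under_root is true."""
    event = dict(event)
    if under_root:
        event.update(fields)            # reachable as [srctype]
    else:
        event["fields"] = dict(fields)  # reachable as [fields][srctype]
    return event

base = {"message": "geth log line"}
nested = apply_fields(base, {"srctype": "geth"})
rooted = apply_fields(base, {"srctype": "geth"}, under_root=True)
print(nested["fields"]["srctype"], rooted["srctype"])
# -> geth geth
```

So the logstash conditional must match the shape filebeat actually ships: `[fields][srctype]` without `fields_under_root`, plain `[srctype]` with it.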

2018-12-17 · 1 min · 41 words · Me

kibana geo_point How to Part 5

Original:

```conf
geoip {
  source => "filebeatserverip"
  target => "filebeatserveripgeoip"
  add_field => [ "[filebeatserveripgeoip][coordinates]", "%{[filebeatserveripgeoip][longitude]}" ]
  add_field => [ "[filebeatserveripgeoip][coordinates]", "%{[filebeatserveripgeoip][latitude]}" ]
}
mutate {
  convert => ["[filebeatserveripgeoip][coordinates]", "float"]
}
```

Delete the two add_field lines and the convert line:

```conf
geoip {
  source => "filebeatserverip"
  target => "filebeatserveripgeoip"
}
mutate {
}
```

=====

```json
{
  "index_patterns": ["filebeat*", "heartbeat*"],
  "settings": {
    "number_of_shards": 1
  },
  "mappings": {
    "doc": {
      "properties": {
        "filebeatserveripgeoip.coordinates": {
          "type": "geo_point"
        }
      }
    }
  }
}
```

Change filebeatserveripgeoip.coordinates -> filebeatserveripgeoip.location ...
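One detail worth keeping in mind with the array form that the two `add_field` lines were building: a geo_point given as an array is ordered [lon, lat], while the object form names the keys explicitly. A small sketch (the `to_geo_point` helper and the sample Taipei coordinates are invented for illustration):

```python
def to_geo_point(geoip):
    """Two equivalent geo_point shapes built from a geoip result:
    array form is [lon, lat] (longitude FIRST), object form is explicit."""
    return {
        "coordinates": [geoip["longitude"], geoip["latitude"]],
        "coordinates_obj": {"lat": geoip["latitude"], "lon": geoip["longitude"]},
    }

g = to_geo_point({"latitude": 25.04, "longitude": 121.53})
print(g["coordinates"])
# -> [121.53, 25.04]
```

That ordering is why the original config pushed longitude before latitude; getting it backwards puts points on the wrong side of the world without any mapping error.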

2018-12-07 · 1 min · 177 words · Me

geth log No Year

geth log mined:

    INFO [12-07|13:04:44] 🔨 mined potential block  number=1934700 hash=3f9161…88da7d

only month-day, no year ...

```conf
grok {
  # the named-capture labels were stripped by HTML in the original post;
  # "gethnumber" / "gethhash" here are restored placeholders
  match => ["message", "%{LOGLEVEL:logType} \[%{DATA:gethmm}-%{DATA:gethdd}\|%{DATA:gethtime}\] %{GREEDYDATA:tmessage} number=(?<gethnumber>\b\w+\b) hash=(?<gethhash>\b\w+...\w+\b)"]
  add_field => ["gethdate", "%{[gethmm]}-%{[gethdd]} %{[gethtime]}"]
}
ruby {
  code => "
    tstamp = event.get('@timestamp').to_i
    event.set('epoch', tstamp)
    event.set('gethdate', Time.at(tstamp).strftime('%Y')+'-'+event.get('gethdate'))
  "
}
date {
  match => [ "gethdate", "YYYY-MM-dd HH:mm:ss" ]
  target => "gethdate"
  timezone => "Asia/Taipei"
}
```

Recreate index:

```
GET _cat/indices?v
GET _cat/indices?v&s=index
GET filebeat-6.5.1-2018.12.06
DELETE filebeat-6.5.1-2018.12.06
GET _cat/indices?v
```
...
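The year-borrowing trick in the ruby filter above can be sketched in Python: take the year from the event's `@timestamp` epoch and prepend it to the month-day string grok extracted. The function name `add_year` is invented for the sketch, and it uses UTC where the ruby `Time.at` uses the server's local zone.

```python
from datetime import datetime, timezone

def add_year(geth_mmdd_time, epoch):
    """Geth logs carry only MM-dd|HH:mm:ss, so borrow the year from the
    event's @timestamp (as epoch seconds) to build a full date string."""
    year = datetime.fromtimestamp(epoch, tz=timezone.utc).strftime("%Y")
    return f"{year}-{geth_mmdd_time}"

# Epoch for 2018-12-07 13:04:44 UTC, standing in for @timestamp
ts = int(datetime(2018, 12, 7, 13, 4, 44, tzinfo=timezone.utc).timestamp())
print(add_year("12-07 13:04:44", ts))
# -> 2018-12-07 13:04:44
```

Note the edge case this shares with the original filter: a log line written just before New Year but ingested just after it gets the wrong year, since the year comes from ingestion time, not the log itself.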

2018-12-07 · 1 min · 82 words · Me