jdk-8u121-linux-x64.tar.gz
kibana-5.0.0-linux-x86_64.tar.gz
elasticsearch-5.0.0.tar.gz
logstash-5.3.1.zip
Download and install Logstash. Installation is just a matter of extracting the archive into a target directory, e.g. /usr/local:
Logstash 5.0.0 download:
https://www.elastic.co/downloads/past-releases/logstash-5-0-0
# tar -zxf logstash-5.0.0.tar.gz -C /usr/local/
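Logstash 5.x requires a Java 8 runtime (hence the JDK tarball in the download list above). A quick sanity check before starting; the fallback message is only for machines without Java installed:

```shell
# Logstash 5.x needs Java 8 on the PATH; print the version or a hint.
java -version 2>&1 || echo "java not found - install JDK 8 first"
```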
After installation, run the following command:
shell> bin/logstash -e 'input { stdin {} } output { stdout {} }'
Logstash starts and prints:
Sending Logstash logs to /usr/local/elasticsearch/files/logstash/logs which is now configured via log4j2.properties.
The stdin plugin is now waiting for input:
[2017-01-14T20:27:54,232][INFO ][logstash.pipeline ] Starting pipeline {"id"=>"main", "pipeline.workers"=>1, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>125}
[2017-01-14T20:27:54,429][INFO ][logstash.pipeline ] Pipeline main started
[2017-01-14T20:27:54,603][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
2017-01-14T12:27:54.427Z 0.0.0.0
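The last log line shows that Logstash also exposes a monitoring API (port 9600 here). Assuming Logstash is running on the local machine, you can query it with curl; the fallback echo simply keeps the command from failing when nothing is listening:

```shell
# Query the Logstash monitoring API reported in the startup log.
curl -s --connect-timeout 2 'http://localhost:9600/?pretty' \
  || echo "Logstash API not reachable"
```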
Press Ctrl+C to exit. Next, start Logstash with a configuration file. Create the file (this time with a different output format: codec => rubydebug):
shell> vi config/logstashtest.conf
input {
  stdin {}
}
output {
  stdout {
    codec => rubydebug
  }
}
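The same file can also be written non-interactively with a heredoc instead of vi (same path as above):

```shell
# Write the test pipeline config without an interactive editor.
mkdir -p config
cat > config/logstashtest.conf <<'EOF'
input {
  stdin {}
}
output {
  stdout {
    codec => rubydebug
  }
}
EOF
cat config/logstashtest.conf
```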
Run Logstash with the configuration file:
shell> bin/logstash -f config/logstashtest.conf
Sending Logstash logs to /usr/local/elasticsearch/files/logstash/logs which is now configured via log4j2.properties.
The stdin plugin is now waiting for input:
[2017-01-14T22:26:15,834][INFO ][logstash.pipeline ] Starting pipeline {"id"=>"main", "pipeline.workers"=>1, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>125}
[2017-01-14T22:26:15,926][INFO ][logstash.pipeline ] Pipeline main started
[2017-01-14T22:26:16,404][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9601}
{
    "@timestamp" => 2017-01-14T14:26:15.952Z,
    "@version" => "1",
    "host" => "0.0.0.0",
    "message" => ""
}
Type a line of text and the processed event is printed:
hello logstash!
{
    "@timestamp" => 2017-01-14T14:26:47.163Z,
    "@version" => "1",
    "host" => "0.0.0.0",
    "message" => "hello logstash!"
}
Now configure the output to go to Elasticsearch while keeping the stdout output. (Note the option name: hosts, plural!)
shell> vi config/logstashtest.conf
input {
  stdin {}
}
output {
  elasticsearch {
    hosts => ["192.168.1.222:9200"]
    index => "test"
  }
  stdout {
    codec => rubydebug
  }
}
Run Logstash with the configuration file:
shell> bin/logstash -f config/logstashtest.conf
Sending Logstash logs to /usr/local/elasticsearch/files/logstash/logs which is now configured via log4j2.properties.
The stdin plugin is now waiting for input:
[2017-01-14T23:10:11,512][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>["http://192.168.1.222:9200"]}}
[2017-01-14T23:10:11,521][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2017-01-14T23:10:11,785][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword"}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2017-01-14T23:10:17,071][INFO ][logstash.outputs.elasticsearch] Installing elasticsearch template to _template/logstash
[2017-01-14T23:10:33,701][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["192.168.1.222:9200"]}
[2017-01-14T23:10:33,714][INFO ][logstash.pipeline ] Starting pipeline {"id"=>"main", "pipeline.workers"=>1, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>125}
[2017-01-14T23:10:33,715][INFO ][logstash.pipeline ] Pipeline main started
[2017-01-14T23:10:35,171][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9601}
The startup installed an index template. Typing "hello logstash!" again prints the event as before, but was it actually saved to Elasticsearch?
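Before opening Kibana you can already check from the shell whether the index exists, using the Elasticsearch cat API. The host below is the one from the pipeline config; adjust it for your setup, and the fallback echo covers an unreachable node:

```shell
# List indices on the Elasticsearch node from the config; a "test"
# index with a non-zero docs.count confirms the event was stored.
curl -s --connect-timeout 2 'http://192.168.1.222:9200/_cat/indices?v' \
  || echo "Elasticsearch not reachable"
```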
Open a browser and go to Kibana at http://192.168.1.222:5601/
Click "Dev Tools" and search for the phrase you typed:
GET _search
{
  "query": {
    "match_phrase": {
      "message": "hello logstash!"
    }
  }
}
The result looks like this:
{
  "took": 48,
  "timed_out": false,
  "_shards": {
    "total": 31,
    "successful": 31,
    "failed": 0
  },
  "hits": {
    "total": 1,
    "max_score": 0.51623213,
    "hits": [
      {
        "_index": "test",
        "_type": "logs",
        "_id": "AVmdjOMfxMlXNHIPQXkG",
        "_score": 0.51623213,
        "_source": {
          "@timestamp": "2017-01-14T15:16:05.916Z",
          "@version": "1",
          "host": "0.0.0.0",
          "message": "hello logstash!"
        }
      }
    ]
  }
}
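For reference, the same match_phrase query can be sent straight to Elasticsearch with curl, bypassing Kibana entirely (host and index name are the ones used throughout this walkthrough; the fallback echo covers an unreachable node):

```shell
# Same search as in Dev Tools, issued directly against the "test" index.
curl -s --connect-timeout 2 -H 'Content-Type: application/json' \
  'http://192.168.1.222:9200/test/_search?pretty' -d '{
    "query": { "match_phrase": { "message": "hello logstash!" } }
  }' || echo "Elasticsearch not reachable"
```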