
elasticsearch - Data type conversion using logstash grok

Problem description:

Basic is a float field. The mentioned index does not yet exist in Elasticsearch. When I run the config file with logstash -f, I get no exception. Yet the data indexed into Elasticsearch shows the mapping of Basic as a string. How do I rectify this? And how do I do this for multiple fields?

input {
    file {
        path => "/home/sagnik/work/logstash-1.4.2/bin/promosms_dec15.csv"
        type => "promosms_dec15"
        start_position => "beginning"
        sincedb_path => "/dev/null"
    }
}

filter {
    grok {
        match => [
            "Basic", " %{NUMBER:Basic:float}"
        ]
    }
    csv {
        columns => ["Generation_Date","Basic"]
        separator => ","
    }
    ruby {
        code => "event['Generation_Date'] = Date.parse(event['Generation_Date']);"
    }
}

output {
    elasticsearch {
        action => "index"
        host => "localhost"
        index => "promosms-%{+dd.MM.YYYY}"
        workers => 1
    }
}

Answer:

You have two problems. First, your grok filter is listed prior to the csv filter and because filters are applied in order there won't be a "Basic" field to convert when the grok filter is applied.
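For illustration, a sketch of the same filter section with csv moved ahead of grok, so that the Basic field already exists when the grok filter runs (on its own this still won't convert anything, for the reason described next):

filter {
    csv {
        columns => ["Generation_Date","Basic"]
        separator => ","
    }
    grok {
        # pattern from the question; the leading space before %{NUMBER} is dropped here,
        # assuming the field holds a bare number
        match => [
            "Basic", "%{NUMBER:Basic:float}"
        ]
    }
}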

Secondly, unless you explicitly allow it, grok won't overwrite existing fields. In other words,

grok{
    match => [
        "Basic", " %{NUMBER:Basic:float}"
    ]
}

will always be a no-op. Either specify overwrite => ["Basic"] or, preferably, use mutate's type conversion feature:

mutate {
    convert => ["Basic", "float"]
}
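Putting both fixes together, a minimal sketch of the corrected filter section (column names taken from the question's config; the ruby date parsing is left unchanged):

filter {
    csv {
        columns => ["Generation_Date","Basic"]
        separator => ","
    }
    mutate {
        # convert the parsed string value to a float before it is indexed
        convert => ["Basic", "float"]
    }
    ruby {
        code => "event['Generation_Date'] = Date.parse(event['Generation_Date']);"
    }
}

Since the index does not exist yet, the float value will be picked up by dynamic mapping when the index is first created; if the index had already been created with Basic mapped as a string, the existing mapping would not change and the data would need to be reindexed.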