logstash tab separator not escaping


Tags: elasticsearch, logstash

I have tab-separated data which I want to input into Logstash. Here is my configuration file:

input {
    file {
        path => "/*.csv"
        type => "testSet"
        start_position => "beginning"
    }
}

filter {
    csv {
        separator => "\t"
    }
}

output {
    stdout {
        codec => rubydebug
    }
}
It simply looks for all .csv files and splits each line on tabs. For an input like this:

col1    col2
data1    data2

logstash output is (for the two rows):

column1 => "col1\tcol2"
column1 => "data1\tdata2"

Obviously it is not correctly parsing it. I saw that this issue was brought up a while ago here but there was no solution. Does anyone know if this problem has been resolved or maybe there's another way to do it? Thanks!
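
With the tab actually being treated as the separator, I would expect output more like the following for the two rows (assuming the csv filter's default column naming of column1, column2, ...):

column1 => "col1"
column2 => "col2"
column1 => "data1"
column2 => "data2"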


Instead of using "\t" as the separator, insert an actual tab character, like this:

filter {
    csv {
        # the value between the quotes is a literal tab character, not the two characters \ and t
        separator => "	"
    }
}
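
The underlying issue is that the logstash config parser does not interpret "\t" as an escape sequence, so the csv filter receives a literal backslash-t string and never splits on real tabs. For reference, a minimal end-to-end sketch of the whole pipeline with the literal-tab approach (same path and type as in the question; the separator quotes contain a real tab character, which may display as blank space here):

input {
    file {
        path => "/*.csv"
        type => "testSet"
        start_position => "beginning"
    }
}

filter {
    csv {
        # literal tab between the quotes
        separator => "	"
    }
}

output {
    stdout {
        codec => rubydebug
    }
}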


Elasticsearch - Order search results ASC

Having a problem with my Elasticsearch. Setup: a Company class with the data field "companyName". My search should find and return all companies matching the searched term. If I try to sort via .Sort(x => x.OnField(x => x.CompanyName).Descending()) the data aren't sorted correctly. Referencing Stack Overflow, I tried the given solution,...

How to define a bucket aggregation where buckets are defined by arbitrary filters on a field (GROUP BY CASE equivalent)

ElasticSearch enables us to filter a set of documents by regex on any given field, and also to group the resulting documents by the terms in a given (same or different) field, using "bucket aggregations". For example, on an index that contains a "Url" field and a "UserAgent" field (some...

Bad scoring due to different maxDocs of IDF

I have two documents with a title field of "News" and "New Website". If I search for the term new website the score for the News document is much higher than the other one, which is obviously not what I want. I wrapped an explain around it and got: 'hits': [{'_explanation':...

Docker container http requests limit

I'm new to Docker so, most likely, I'm missing something. I'm running a container with Elasticsearch, using this image. I'm able to set up everything correctly. After that I was using a script developed by a colleague in order to insert some data, basically querying a MySQL database and making...

Elasticsearch - Query document missing an array value

I would like to query my elasticsearch index in order to retrieve the documents that don't contain a specific value in an array. For instance, if my query is : { "query": { "bool": { "must": [ { "match_all": {} } ], "must_not": [], "should": [] } }, "from": 0,...

Elasticsearch geospatial search, problems with index setup

I'm trying to search for documents previously added to an index, which has been configured to allow geospatial queries (or so I think). My elasticsearch instance is hosted on qbox.io. This is the code I wrote to create an index from the command line curl -XPOST username:[email protected]/events -d '{ "settings"...

Elasticsearch standard analyser stopwords

I am trying to figure out what the default stopwords list in the standard analyzer in elasticsearch is. I run version 1.3.1, and it seems to me that the English list is used, because running a wildcard query like this { "wildcard" : { "name" : { "wildcard" : "*in*" } }...

ElasticSearch REST - insert JSON string without using class

I am looking for an example where we can push the sample JSON string below to ElasticSearch without using classes in the REST api. { "UserID":1, "Username": "Test", "EmailID": "[email protected]" } We get the input as XML and we convert it to a JSON string using the NewtonSoft.JSON dll. I know the REST api is...

Get document on some condition in elastic search java API

As I know, we can parse a document in Elasticsearch, and when we search for a keyword, it will return the document using this Java API code: org.elasticsearch.action.search.SearchResponse searchHits = node.client() .prepareSearch() .setIndices("indices") .setQuery(qb) .setFrom(0).setSize(1000) .addHighlightedField("file.filename") .addHighlightedField("content") .addHighlightedField("meta.title") .setHighlighterPreTags("<span class='badge badge-info'>") .setHighlighterPostTags("</span>") .addFields("*", "_source")...

ElasticSearch Multiple Scrolls Java API

I want to get all data from an index. Since the number of items is too large for memory I use the Scroll (nice function): client.prepareSearch(index) .setTypes(myType).setSearchType(SearchType.SCAN) .setScroll(new TimeValue(60000)) .setSize(amountPerCall) .setQuery(MatchAll()) .execute().actionGet(); which works nicely when calling: client.prepareSearchScroll(scrollId) .setScroll(new TimeValue(600000)) .execute().actionGet() But, when I call the former method multiple times,...

Elasticsearch NumberFormatException when running two consecutive java tests

I have two tests in a class, each of them containing the following query: SearchQuery searchQuery = new NativeSearchQueryBuilder().withQuery(matchAllQuery()).withFilter(rangeFilter("publishDate").lt(date)).build(); In one of the tests, the number of results is counted with elasticsearchTemplate.count(searchQuery, Article.class); in the other one the returned values are verified with elasticsearchTemplate.queryForPage(searchQuery, Article.class). If I run either of these two tests separately,...

indexing names in json using elasticsearch in couchdb

I am trying to implement full-text query for my json documents. I want to search by title. My json is as follows: { "release":{ "genres":{ "genre":"Electronic" }, "identifiers":{ "identifier":[ { "description":"A-Side", "value":"MPO SK 032 A1 G PHRUPMASTERGENERAL T27 LONDON", "type":"Matrix / Runout" }, { "description":"B-Side", "value":"MPO SK 032 B1", "type":"Matrix...

ElasticSearch- “No query registered for…”

ElasticSearch returns a "No query registered for [likes_count]" error when I try to look up entries using the following query. The field likes_count is a new field on the documents and does not exist in every document. The same query works without the sort part. Why does this error appear? Thanks {...

How to compute the scores based on field data in elasticsearch

I have the following fields in documents { name: "Pearl", age : 43, weight: 54, bodyWeight : 103, height : 1.8 } Now I want to get scores for the documents based on the bodyWeight to height ratio of the documents. How can I do that?...

Elasticsearch and C# - query to find exact matches over strings

I need a way to search documents using a plain exact match over two or multiple fields which are of type "string" and "integer". I'd like to avoid standard query as I don't care about scoring or best match, just a yes/no outcome if both the fields match or not....

How to write search queries in kibana using Query DSL for Elasticsearch aggregation

I am working on the ELK stack to process Apache access logs. I spent a lot of time understanding the Query DSL format so that more complex queries can be written. Currently I am facing issues with running the queries in the Kibana interface, but the same queries work just fine when posted using curl...

NEST ElasticSearch.NET Escape Special Characters

I have been experimenting with the use of the NEST client for Elastic Search, but seem to have hit a barrier when filtering on a term which contains special/reserved characters such as '/'. Below is a JSON representation of my model: "categories": { "count": 1, "default": "root/Hello/World/Category", } When submitting...

How to use arrays in lambda expressions?

I am writing a program with the NEST library of ElasticSearch. I want to write a lambda expression for a function with this argument: HighlighDescriptor<parentdocument> HighlighDescriptor.onFields (param Action<HighlightFieldDescriptor<ParentDocument>>[] fieldHighlighters) What is the array in the function argument?...

How to get duplicate field values in elastic search by field name without knowing its value

I have a field "EmployeeName" in an elastic search index - and I would like to execute a query that will return me all the cases where there are duplicate values of "EmployeeName". Can this be done? I found more_like_this but this requires field value for "like_text". But my requirement...

Strange behaviour of limit in Elasticsearch

I tried two queries. First one looks like this (it simply lists all data): # listing 1 from elasticsearch import Elasticsearch from elasticsearch_dsl import Search, Q, F .... .... connection etc s = Search(using=db,index="reestr") rows = s.execute() for r in rows: print(r) listing 1 prints out all documents from the...

elastic search sort in aggs by column

I am trying to sort in aggs in elastic search, the equivalent of MySQL's "ORDER BY Title ASC/DESC". Here is the index structure: 'body' => array( 'mappings' => array( 'test_type' => array( '_source' => array( 'enabled' => true ), 'properties' => array( 'ProductId' => array( 'type' => 'integer', 'index' => 'not_analyzed'...

Logstash. Get fields by position number

Background I have the following scheme: logs from my app go through rsyslog to a central log server, then to Logstash and Elasticsearch. Logs from the app are pure JSON, but rsyslog adds "timestamp", "app name" and "server name" fields to the log. And the log becomes this: timestamp app-name server-name [JSON] Question...

Parsing Google Custom Search API for Elasticsearch Documents

After retrieving results from the Google Custom Search API and writing them to JSON, I want to parse that JSON to make valid Elasticsearch documents. You can configure a parent-child relationship for nested results. However, this relationship seems not to be inferred from the data structure itself. I've...

Elasticsearch: How to query using partial phrases in quotation marks

I am trying to implement a search behavior that supports partial phrases. A possible search input could look like this: example "hello world" elasticsearch Now I want to get all documents, that contain the words example and elasticsearch as well as the phrase hello world. As this is a very...

Elasticsearch boost per field with function score

I have a query with different query data for different fields and ORed results. I also want to favor hits with certain fields. Ideally this would only increase ranking but would not cause results that did not contain some of the terms in the other fields. This would skew results...

Get elasticsearch result based on two keys

I want to get all docs whose "PayerAccountId" equals "123" and whose "UsageStartDate" is in the range [2015-05-01 TO 2015-05-10]. I am expecting something to run like this, curl -X GET -d '{"query" : {"match" : { "PayerAccountId:\"156023466485\" AND UsageStartDate:[2015-01-01 TO 2015-01-10]" }}}' Obviously it's not working any...

Elasticsearch aggregations over regex matching in a list

My documents in elasticsearch are of the form { ... dimensions : list[string] ... } I'd like to find all dimensions over all the documents that match a regex. I feel like an aggregation would probably do the trick, but I'm having trouble formulating it. For example, suppose I have...

NEST - Using GET instead of POST/PUT for searching

Is there a way to tell NEST to use GET instead of POST when performing searches? Similar to how the ElasticSearch documentation shows CURL using GET I'd like to use GET when using NEST instead of using POST as it currently does.

Operator '??' cannot be applied to operands of type IQueryContainer and lambda expression

I am trying to create a method to process a certain query. I follow an example posted on the Nest repository (line 60), but still the MatchAll is not recognized by the compiler and if I try to build the solution, the error that shows is: Operator '??' cannot be...

ElasticSearch: How to search on different fields that are not related that are arrays of objects

I want to search on different fields that are not related and that are arrays of objects. I cannot figure out how. Given the following mapping and data entry: I want to give the user the ability to search all possible fields in any combination. The user would use a form...

How to have multiple regex based on or condition in elasticsearch?

I want to get all 000ANT and 0BBNTA from id; is there something similar to terms which works with regexp, or is there any other way? Otherwise I will have to query elasticsearch for each item, say 000ANT and 0BBNTA. Please help. Below is something that I am trying out...

Logstash exec input plugin - Remove command run from @message

I'm using logstash 1.5.1 on a Windows machine. I have to make a REST call that delivers me JSON output, so I'm using exec. The result is no longer JSON :-(. The @message of this event will be the entire stdout of the command as one event. https://www.elastic.co/guide/en/logstash/current/plugins-inputs-exec.html My logstash...

ElasticSearch (Nest) Terms sub aggregation of Terms - Not working as intended

Taking the following mapping in account : { "person": { "properties": { "id": { "type": "string" }, "name": { "type": "string" }, ... "trainings": { "properties": { "attendanceDate": { "type": "date", "format": "dateOptionalTime" }, "providerId": { "type": "string", "index": "not_analyzed" }, "trainingId": { "type": "string", "index": "not_analyzed" } ... }...

MultiMatch query with Nest and Field Suffix

Using Elasticsearch, I have a string field with a .english suffix that has an English analyser on it, as shown in the following mapping ... "valueString": { "type": "string", "fields": { "english": { "type": "string", "analyzer": "english" } } } ... The following query snippet won't...

Creating Index in Elasticsearch using Java API giving NoClassFoundException

I'm trying to create a node based client using Java API and index a JSON document. Here's the code : import java.util.Date; import java.util.HashMap; import java.util.Map; import org.elasticsearch.action.deletebyquery.DeleteByQueryResponse; import org.elasticsearch.client.Client; import org.elasticsearch.node.Node; import static org.elasticsearch.node.NodeBuilder.*; public class Els { public static void main (String args[]){ Els p = new Els();...

get buckets count in elasticsearch aggregations

I am using elasticsearch to search a database with a lot of duplicates. I am using field collapse and it works, however it returns the number of hits (including duplicates) and not the number of buckets. "aggs": { "uniques": { "terms": { "field": "guid" }, "aggs": { "jobs": { "top_hits":...

Not able to access Kibana running in a Docker container on port 5601

I have built a docker image with the following Dockerfile. # gunicorn-flask FROM devdb/kibana MAINTAINER John Doe <[email protected]> ENV DEBIAN_FRONTEND noninteractive RUN apt-get update RUN apt-get install -y python python-pip python-virtualenv gunicorn # Setup flask application RUN mkdir -p /deploy/app COPY gunicorn_config.py /deploy/gunicorn_config.py COPY app /deploy/app RUN pip install...

ElasticSearch - Configuration to Analyse a document on Indexing

In a single request, I want to retrieve documents from a SOR, store them in ElasticSearch, and then search those documents using the ES search API. There seems to be some lag between the time the document is indexed and the time it is analyzed and ready to be searched....

Query returns both documents instead of just one

var res = esclient.Search<MyClass>(q => q .Query(fq => fq .Filtered(fqq => fqq .Query(qq => qq.MatchAll()) .Filter(ff => ff .Bool(b => b .Must(m1 => m1.Term("macaddress", "mac")) .Must(m2 => m2.Term("another_field", 123)) ) ) ) ) ); As far as I can understand the bool and must together are the equivalent of the...

ElasticSearch - how to get the auto generated id from an insert query

On my ElasticSearch database I need to get the autogenerated id from my insert query (I'm using .NET C#). How can I do that? I tried debugging the readRecords response but I didn't find such an id. Basically I need the equivalent of the MySQL LAST_INSERT_ID() command. var readRecords = elasticClient.Search<HistoryRecord>(s =>...

elasticsearch aggregation group by null key

Here is the data in my elasticsearch server: {"system": "aaa"}, {"system": "bbb"}, {"system": null} I want to get the statistics for system, so I ran the query: { "aggs" : { "myAggrs" : { "terms" : { "field" : "system" } } } It gives me the result: { "key":...

Javascript: Altering an object where dot notation is used [duplicate]

This question already has an answer here: How to access object properties containing special characters? I'm building an Elasticsearch search interface. My method is to build the initial query object, and then alter it depending on the user input. In the filter part of my object, I...

ElasticSearch asynchronous post

I'm posting data on my ElasticSearch database. I've noticed that data is not immediately available, it requires some milliseconds to show up in a GET request. I can live with that (after all, the calls are asynchronous so this behavior is expected) but in my test code I need to...

How to read data in logs using logstash?

I have just started with Logstash. I have log files, and in those log files a whole object is printed. Since my object is huge I can't write grok patterns for the whole object, and I'm only expecting two values out of those objects. Can you please...

Passing Elasticsearch and Kibana config file to docker containers

I have found a docker image devdb/kibana which runs Elasticsearch 1.5.2 and Kibana 4.0.2. However I would like to pass into this docker container the configuration files for both Elasticsearch (i.e. elasticsearch.yml) and Kibana (i.e. config.js). Can I do that with this image itself? Or for that would I have...

Re-index object with new fields

It seems like as long as the id field is maintained, it's super easy to re-index a document by simply calling Index(), but is there a way, given that an object was updated and new fields were added, to have it include these new fields in the index? I'm not...

Sending logs every 2 hours using logstash-forwarder without using cronjob

Is there a way I can send data using the logstash-forwarder every 2 hours or more without using a cronjob script to start and stop the forwarder every time I want to send the data?