Refactor API, add aggregations and custom queries
This commit introduces a refactor of the codebase and the API to make it more user-friendly. Queries can now be executed directly via the `Run()` method. Internally, the library no longer uses JSON generation as its main mechanism; instead, all types implement a `Mappable` interface that simply turns each type into a `map[string]interface{}`, which is what the ElasticSearch client expects. This makes the code easier to write and makes writing tests less error prone, as JSON no longer needs to be written directly. Support for metrics aggregations is also added; bucket, pipeline, and matrix aggregations are not supported yet. To make the library more useful in its current state, support is added for running custom queries and aggregations via the `CustomQuery()` and `CustomAgg()` functions, both of which accept an arbitrary `map[string]interface{}`.
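As a rough sketch of the new design (not part of the commit itself): any type that renders itself as a `map[string]interface{}` satisfies `Mappable` and composes with the built-in builders. The `existsQuery` type and the index name below are hypothetical illustrations; the module path is taken from this commit's `go.mod`.

```go
package main

import (
	"log"

	"bitbucket.org/scalock/esquery"
	"github.com/elastic/go-elasticsearch/v7"
)

// existsQuery is a hypothetical user-defined type; implementing Map() is all
// that is needed to plug it into the library's builders.
type existsQuery struct{ field string }

func (q existsQuery) Map() map[string]interface{} {
	return map[string]interface{}{
		"exists": map[string]interface{}{"field": q.field},
	}
}

func main() {
	es, err := elasticsearch.NewDefaultClient()
	if err != nil {
		log.Fatalf("Failed creating client: %s", err)
	}

	// built-in and custom Mappable values compose in the same query;
	// Run() executes it through the official client.
	res, err := esquery.Query(
		esquery.Bool().
			Must(esquery.Term("tag", "tech")).
			Filter(existsQuery{field: "author"}),
	).Run(es, es.Search.WithIndex("test"))
	if err != nil {
		log.Fatalf("Failed searching: %s", err)
	}
	defer res.Body.Close()
}
```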
This commit is contained in:
parent
55000abc77
commit
1dd88421a2
65
README.md
|
@ -1,10 +1,14 @@
|
|||
# esquery
|
||||
|
||||
`esquery` is an idiomatic, easy-to-use query builder for the [official Go client](https://github.com/elastic/go-elasticsearch) for [ElasticSearch](https://www.elastic.co/products/elasticsearch). It alleviates the need to use extremely nested maps of empty interfaces and serializing queries to JSON manually. It also helps eliminating common mistakes such as misspelling query types, as everything is statically typed.
|
||||
**esquery** is a non-obtrusive, idiomatic, and easy-to-use query and aggregation builder for the [official Go client](https://github.com/elastic/go-elasticsearch) for [ElasticSearch](https://www.elastic.co/products/elasticsearch). It alleviates the need to use extremely nested maps (`map[string]interface{}`) and to serialize queries to JSON manually. It also helps eliminate common mistakes such as misspelling query types, as everything is statically typed.
|
||||
|
||||
Save yourself some joint aches and many lines of code by switching from maps to `esquery`. Wanna know how much code you'll save? Just read this project's tests.
|
||||
|
||||
## Usage
|
||||
|
||||
`esquery` can be used directly to build queries, with no need for external dependencies. It can execute the queries against an existing instance of `*esapi.API`, but the queries can also be manually converted to JSON if necessary.
|
||||
esquery provides a [method chaining](https://en.wikipedia.org/wiki/Method_chaining)-style API for building and executing queries and aggregations. It does not wrap the official Go client nor does it require you to change your existing code in order to integrate the library. Queries can be directly built with `esquery`, and executed by passing an `*elasticsearch.Client` instance (with optional search parameters). Results are returned as-is from the official client (e.g. `*esapi.Response` objects).
|
||||
|
||||
Getting started is extremely simple:
|
||||
|
||||
```go
|
||||
package main
|
||||
|
@ -18,17 +22,20 @@ import (
|
|||
)
|
||||
|
||||
func main() {
|
||||
// connect to an ElasticSearch instance
|
||||
es, err := elasticsearch.NewDefaultClient()
|
||||
if err != nil {
|
||||
log.Fatalf("Failed creating client: %s", err)
|
||||
}
|
||||
|
||||
res, err := esquery.Search(
|
||||
es,
|
||||
// run a boolean search query
|
||||
qRes, err := esquery.Query(
|
||||
esquery.
|
||||
Bool().
|
||||
Must(esquery.Term("title", "Go and Stuff")).
|
||||
Filter(esquery.Term("tag", "tech")),
|
||||
).Run(
|
||||
es,
|
||||
es.Search.WithContext(context.TODO()),
|
||||
es.Search.WithIndex("test"),
|
||||
)
|
||||
|
@ -36,15 +43,30 @@ func main() {
|
|||
log.Fatalf("Failed searching for stuff: %s", err)
|
||||
}
|
||||
|
||||
defer res.Body.Close()
|
||||
defer qRes.Body.Close()
|
||||
|
||||
// ...
|
||||
// run an aggregation
|
||||
aRes, err := esquery.Aggregate(
|
||||
esquery.Avg("average_score", "score"),
|
||||
esquery.Max("max_score", "score"),
|
||||
).Run(
|
||||
es,
|
||||
es.Search.WithContext(context.TODO()),
|
||||
es.Search.WithIndex("test"),
|
||||
)
|
||||
if err != nil {
|
||||
log.Fatalf("Failed searching for stuff: %s", err)
|
||||
}
|
||||
|
||||
defer aRes.Body.Close()
|
||||
|
||||
// ...
|
||||
}
|
||||
```
|
||||
|
||||
## Notes
|
||||
|
||||
* Library currently supports v7 of the ElasticSearch Go client.
|
||||
* `esquery` currently supports version 7 of the ElasticSearch Go client.
|
||||
* The library cannot currently generate "short queries". For example, whereas
|
||||
ElasticSearch can accept this:
|
||||
|
||||
|
@ -62,11 +84,11 @@ func main() {
|
|||
either receive one query object, or an array of query objects. `esquery` will
|
||||
generate an array even if there's only one query object.
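As a minimal illustration of this note (a sketch, not the README's own example), the single-clause query below still serializes `must` as a one-element array:

```go
func shortQueryExample() map[string]interface{} {
	// renders as {"bool":{"must":[{"term":{"tag":{"value":"tech"}}}]}};
	// ElasticSearch would also accept "must" holding the single object
	// directly, but esquery always emits the array form.
	return esquery.Bool().Must(esquery.Term("tag", "tech")).Map()
}
```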
|
||||
|
||||
## Supported queries
|
||||
## Supported Queries
|
||||
|
||||
The following queries are currently supported:
|
||||
|
||||
| Query | `esquery` Function |
|
||||
| ElasticSearch DSL | `esquery` Function |
|
||||
| ------------------------|---------------------- |
|
||||
| `"match"` | `Match()` |
|
||||
| `"match_bool_prefix"` | `MatchBoolPrefix()` |
|
||||
|
@ -88,3 +110,28 @@ The following queries are currently supported:
|
|||
| `"boosting"` | `Boosting()` |
|
||||
| `"constant_score"` | `ConstantScore()` |
|
||||
| `"dis_max"` | `DisMax()` |
|
||||
|
||||
### Custom Queries
|
||||
|
||||
To execute an arbitrary query, or any query that is not natively supported by the library yet, use the `CustomQuery()` function, which accepts any `map[string]interface{}` value.
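For example, a raw `geo_distance` query (the same map used in this commit's `custom_test.go`) can be wrapped and executed like any other query; the sketch below assumes an existing `*elasticsearch.Client` named `es`:

```go
// CustomQuery passes the raw map through unchanged; wrapping it in Query()
// adds the {"query": ...} envelope and enables Run().
res, err := esquery.Query(
	esquery.CustomQuery(map[string]interface{}{
		"geo_distance": map[string]interface{}{
			"distance": "200km",
			"pin.location": map[string]interface{}{
				"lat": 40,
				"lon": -70,
			},
		},
	}),
).Run(es, es.Search.WithIndex("test"))
```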
|
||||
|
||||
## Supported Aggregations
|
||||
|
||||
The following aggregations are currently supported:
|
||||
|
||||
| ElasticSearch DSL | `esquery` Function |
|
||||
| ------------------------|---------------------- |
|
||||
| `"avg"` | `Avg()` |
|
||||
| `"weighted_avg"` | `WeightedAvg()` |
|
||||
| `"cardinality"` | `Cardinality()` |
|
||||
| `"max"` | `Max()` |
|
||||
| `"min"` | `Min()` |
|
||||
| `"sum"` | `Sum()` |
|
||||
| `"value_count"` | `ValueCount()` |
|
||||
| `"percentiles"` | `Percentiles()` |
|
||||
| `"stats"` | `Stats()` |
|
||||
| `"string_stats"` | `StringStats()` |
|
||||
|
||||
### Custom Aggregations
|
||||
|
||||
To execute an arbitrary aggregation, or any aggregation that is not natively supported by the library yet, use the `CustomAgg()` function, which accepts any `map[string]interface{}` value.
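For instance, the raw `terms` aggregation below (shape borrowed from this commit's `custom_agg_test.go`) is passed through verbatim. Note that in this commit `CustomAgg` implements only `Map()`; unlike the typed aggregations it carries no name, so the sketch just shows the generated body:

```go
// CustomAgg wraps the raw aggregation map as-is; Map() returns it unchanged,
// ready to be serialized into a search request body.
agg := esquery.CustomAgg(map[string]interface{}{
	"genres": map[string]interface{}{
		"terms": map[string]interface{}{"field": "genre"},
	},
})

body, _ := json.Marshal(agg.Map())
// body == `{"genres":{"terms":{"field":"genre"}}}`
```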
|
||||
|
|
|
@ -0,0 +1,56 @@
|
|||
package esquery
|
||||
|
||||
import (
|
||||
"bytes"
|
||||
"encoding/json"
|
||||
|
||||
"github.com/elastic/go-elasticsearch/v7"
|
||||
"github.com/elastic/go-elasticsearch/v7/esapi"
|
||||
)
|
||||
|
||||
type Aggregation interface {
|
||||
Mappable
|
||||
Name() string
|
||||
}
|
||||
|
||||
type AggregationRequest struct {
|
||||
Aggs map[string]Mappable
|
||||
}
|
||||
|
||||
func Aggregate(aggs ...Aggregation) *AggregationRequest {
|
||||
req := &AggregationRequest{
|
||||
Aggs: make(map[string]Mappable),
|
||||
}
|
||||
for _, agg := range aggs {
|
||||
req.Aggs[agg.Name()] = agg
|
||||
}
|
||||
|
||||
return req
|
||||
}
|
||||
|
||||
func (req *AggregationRequest) Map() map[string]interface{} {
|
||||
m := make(map[string]interface{})
|
||||
|
||||
for name, agg := range req.Aggs {
|
||||
m[name] = agg.Map()
|
||||
}
|
||||
|
||||
return map[string]interface{}{
|
||||
"aggs": m,
|
||||
}
|
||||
}
|
||||
|
||||
func (req *AggregationRequest) Run(
|
||||
api *elasticsearch.Client,
|
||||
o ...func(*esapi.SearchRequest),
|
||||
) (res *esapi.Response, err error) {
|
||||
var b bytes.Buffer
|
||||
err = json.NewEncoder(&b).Encode(req.Map())
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
opts := append([]func(*esapi.SearchRequest){api.Search.WithBody(&b)}, o...)
|
||||
|
||||
return api.Search(opts...)
|
||||
}
|
|
@ -0,0 +1,62 @@
|
|||
package esquery
|
||||
|
||||
import (
|
||||
"testing"
|
||||
)
|
||||
|
||||
func TestAggregations(t *testing.T) {
|
||||
runMapTests(t, []mapTest{
|
||||
{
|
||||
"a simple, single aggregation",
|
||||
Aggregate(
|
||||
Avg("average_score", "score"),
|
||||
),
|
||||
map[string]interface{}{
|
||||
"aggs": map[string]interface{}{
|
||||
"average_score": map[string]interface{}{
|
||||
"avg": map[string]interface{}{
|
||||
"field": "score",
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
{
|
||||
"a complex, multi-aggregation",
|
||||
Aggregate(
|
||||
Sum("total_score", "score"),
|
||||
WeightedAvg("weighted_score").
|
||||
Value("score", 50).
|
||||
Weight("weight", 1),
|
||||
StringStats("tag_stats", "tags").ShowDistribution(true),
|
||||
),
|
||||
map[string]interface{}{
|
||||
"aggs": map[string]interface{}{
|
||||
"total_score": map[string]interface{}{
|
||||
"sum": map[string]interface{}{
|
||||
"field": "score",
|
||||
},
|
||||
},
|
||||
"weighted_score": map[string]interface{}{
|
||||
"weighted_avg": map[string]interface{}{
|
||||
"value": map[string]interface{}{
|
||||
"field": "score",
|
||||
"missing": 50,
|
||||
},
|
||||
"weight": map[string]interface{}{
|
||||
"field": "weight",
|
||||
"missing": 1,
|
||||
},
|
||||
},
|
||||
},
|
||||
"tag_stats": map[string]interface{}{
|
||||
"string_stats": map[string]interface{}{
|
||||
"field": "tags",
|
||||
"show_distribution": true,
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
})
|
||||
}
|
|
@ -0,0 +1,13 @@
|
|||
package esquery
|
||||
|
||||
type CustomAggregation struct {
|
||||
m map[string]interface{}
|
||||
}
|
||||
|
||||
func CustomAgg(m map[string]interface{}) *CustomAggregation {
|
||||
return &CustomAggregation{m}
|
||||
}
|
||||
|
||||
func (agg *CustomAggregation) Map() map[string]interface{} {
|
||||
return agg.m
|
||||
}
|
|
@ -0,0 +1,35 @@
|
|||
package esquery
|
||||
|
||||
import "testing"
|
||||
|
||||
func TestCustomAgg(t *testing.T) {
|
||||
m := map[string]interface{}{
|
||||
"genres": map[string]interface{}{
|
||||
"terms": map[string]interface{}{
|
||||
"field": "genre",
|
||||
},
|
||||
"t_shirts": map[string]interface{}{
|
||||
"filter": map[string]interface{}{
|
||||
"term": map[string]interface{}{
|
||||
"type": "t-shirt",
|
||||
},
|
||||
},
|
||||
"aggs": map[string]interface{}{
|
||||
"avg_price": map[string]interface{}{
|
||||
"avg": map[string]interface{}{
|
||||
"field": "price",
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
}
|
||||
|
||||
runMapTests(t, []mapTest{
|
||||
{
|
||||
"custom aggregation",
|
||||
CustomAgg(m),
|
||||
m,
|
||||
},
|
||||
})
|
||||
}
|
|
@ -0,0 +1,324 @@
|
|||
package esquery
|
||||
|
||||
import "github.com/fatih/structs"
|
||||
|
||||
type BaseAgg struct {
|
||||
name string
|
||||
apiName string
|
||||
*BaseAggParams `structs:",flatten"`
|
||||
}
|
||||
|
||||
type BaseAggParams struct {
|
||||
Field string `structs:"field"`
|
||||
Miss interface{} `structs:"missing,omitempty"`
|
||||
}
|
||||
|
||||
func newBaseAgg(apiName, name, field string) *BaseAgg {
|
||||
return &BaseAgg{
|
||||
name: name,
|
||||
apiName: apiName,
|
||||
BaseAggParams: &BaseAggParams{
|
||||
Field: field,
|
||||
},
|
||||
}
|
||||
}
|
||||
|
||||
func (agg *BaseAgg) Name() string {
|
||||
return agg.name
|
||||
}
|
||||
|
||||
func (agg *BaseAgg) Map() map[string]interface{} {
|
||||
return map[string]interface{}{
|
||||
agg.apiName: structs.Map(agg.BaseAggParams),
|
||||
}
|
||||
}
|
||||
|
||||
/*******************************************************************************
|
||||
* Avg Aggregation
|
||||
* https://www.elastic.co/guide/en/elasticsearch/reference/
|
||||
* current/search-aggregations-metrics-avg-aggregation.html
|
||||
******************************************************************************/
|
||||
|
||||
type AvgAgg struct {
|
||||
*BaseAgg `structs:",flatten"`
|
||||
}
|
||||
|
||||
func Avg(name, field string) *AvgAgg {
|
||||
return &AvgAgg{
|
||||
BaseAgg: newBaseAgg("avg", name, field),
|
||||
}
|
||||
}
|
||||
|
||||
func (agg *AvgAgg) Missing(val interface{}) *AvgAgg {
|
||||
agg.Miss = val
|
||||
return agg
|
||||
}
|
||||
|
||||
/*******************************************************************************
|
||||
* Weighted Avg Aggregation
|
||||
* https://www.elastic.co/guide/en/elasticsearch/reference/
|
||||
* current/search-aggregations-metrics-weight-avg-aggregation.html
|
||||
******************************************************************************/
|
||||
|
||||
type WeightedAvgAgg struct {
|
||||
name string
|
||||
apiName string
|
||||
Val *BaseAggParams `structs:"value"`
|
||||
Weig *BaseAggParams `structs:"weight"`
|
||||
}
|
||||
|
||||
func WeightedAvg(name string) *WeightedAvgAgg {
|
||||
return &WeightedAvgAgg{
|
||||
name: name,
|
||||
apiName: "weighted_avg",
|
||||
}
|
||||
}
|
||||
|
||||
func (agg *WeightedAvgAgg) Name() string {
|
||||
return agg.name
|
||||
}
|
||||
|
||||
func (agg *WeightedAvgAgg) Value(field string, missing ...interface{}) *WeightedAvgAgg {
|
||||
agg.Val = new(BaseAggParams)
|
||||
agg.Val.Field = field
|
||||
if len(missing) > 0 {
|
||||
agg.Val.Miss = missing[len(missing)-1]
|
||||
}
|
||||
return agg
|
||||
}
|
||||
|
||||
func (agg *WeightedAvgAgg) Weight(field string, missing ...interface{}) *WeightedAvgAgg {
|
||||
agg.Weig = new(BaseAggParams)
|
||||
agg.Weig.Field = field
|
||||
if len(missing) > 0 {
|
||||
agg.Weig.Miss = missing[len(missing)-1]
|
||||
}
|
||||
return agg
|
||||
}
|
||||
|
||||
func (agg *WeightedAvgAgg) Map() map[string]interface{} {
|
||||
return map[string]interface{}{
|
||||
agg.apiName: structs.Map(agg),
|
||||
}
|
||||
}
|
||||
|
||||
/*******************************************************************************
|
||||
* Cardinality Aggregation
|
||||
* https://www.elastic.co/guide/en/elasticsearch/reference/
|
||||
* current/search-aggregations-metrics-cardinality-aggregation.html
|
||||
******************************************************************************/
|
||||
|
||||
type CardinalityAgg struct {
|
||||
*BaseAgg `structs:",flatten"`
|
||||
PrecisionThr uint16 `structs:"precision_threshold,omitempty"`
|
||||
}
|
||||
|
||||
func Cardinality(name, field string) *CardinalityAgg {
|
||||
return &CardinalityAgg{
|
||||
BaseAgg: newBaseAgg("cardinality", name, field),
|
||||
}
|
||||
}
|
||||
|
||||
func (agg *CardinalityAgg) Missing(val interface{}) *CardinalityAgg {
|
||||
agg.Miss = val
|
||||
return agg
|
||||
}
|
||||
|
||||
func (agg *CardinalityAgg) PrecisionThreshold(val uint16) *CardinalityAgg {
|
||||
agg.PrecisionThr = val
|
||||
return agg
|
||||
}
|
||||
|
||||
func (agg *CardinalityAgg) Map() map[string]interface{} {
|
||||
return map[string]interface{}{
|
||||
agg.apiName: structs.Map(agg),
|
||||
}
|
||||
}
|
||||
|
||||
/*******************************************************************************
|
||||
* Max Aggregation
|
||||
* https://www.elastic.co/guide/en/elasticsearch/reference/
|
||||
* current/search-aggregations-metrics-max-aggregation.html
|
||||
******************************************************************************/
|
||||
|
||||
type MaxAgg struct {
|
||||
*BaseAgg `structs:",flatten"`
|
||||
}
|
||||
|
||||
func Max(name, field string) *MaxAgg {
|
||||
return &MaxAgg{
|
||||
BaseAgg: newBaseAgg("max", name, field),
|
||||
}
|
||||
}
|
||||
|
||||
func (agg *MaxAgg) Missing(val interface{}) *MaxAgg {
|
||||
agg.Miss = val
|
||||
return agg
|
||||
}
|
||||
|
||||
/*******************************************************************************
|
||||
* Min Aggregation
|
||||
* https://www.elastic.co/guide/en/elasticsearch/reference/
|
||||
* current/search-aggregations-metrics-min-aggregation.html
|
||||
******************************************************************************/
|
||||
|
||||
type MinAgg struct {
|
||||
*BaseAgg `structs:",flatten"`
|
||||
}
|
||||
|
||||
func Min(name, field string) *MinAgg {
|
||||
return &MinAgg{
|
||||
BaseAgg: newBaseAgg("min", name, field),
|
||||
}
|
||||
}
|
||||
|
||||
func (agg *MinAgg) Missing(val interface{}) *MinAgg {
|
||||
agg.Miss = val
|
||||
return agg
|
||||
}
|
||||
|
||||
/*******************************************************************************
|
||||
* Sum Aggregation
|
||||
* https://www.elastic.co/guide/en/elasticsearch/reference/
|
||||
* current/search-aggregations-metrics-sum-aggregation.html
|
||||
******************************************************************************/
|
||||
|
||||
type SumAgg struct {
|
||||
*BaseAgg `structs:",flatten"`
|
||||
}
|
||||
|
||||
func Sum(name, field string) *SumAgg {
|
||||
return &SumAgg{
|
||||
BaseAgg: newBaseAgg("sum", name, field),
|
||||
}
|
||||
}
|
||||
|
||||
func (agg *SumAgg) Missing(val interface{}) *SumAgg {
|
||||
agg.Miss = val
|
||||
return agg
|
||||
}
|
||||
|
||||
/*******************************************************************************
|
||||
* Value Count Aggregation
|
||||
* https://www.elastic.co/guide/en/elasticsearch/reference/
|
||||
* current/search-aggregations-metrics-valuecount-aggregation.html
|
||||
******************************************************************************/
|
||||
|
||||
type ValueCountAgg struct {
|
||||
*BaseAgg `structs:",flatten"`
|
||||
}
|
||||
|
||||
func ValueCount(name, field string) *ValueCountAgg {
|
||||
return &ValueCountAgg{
|
||||
BaseAgg: newBaseAgg("value_count", name, field),
|
||||
}
|
||||
}
|
||||
|
||||
/*******************************************************************************
|
||||
* Percentiles Aggregation
|
||||
* https://www.elastic.co/guide/en/elasticsearch/reference/
|
||||
* current/search-aggregations-metrics-percentile-aggregation.html
|
||||
******************************************************************************/
|
||||
|
||||
type PercentilesAgg struct {
|
||||
*BaseAgg `structs:",flatten"`
|
||||
Prcnts []float32 `structs:"percents,omitempty"`
|
||||
Key *bool `structs:"keyed,omitempty"`
|
||||
TDigest struct {
|
||||
Compression uint16 `structs:"compression,omitempty"`
|
||||
} `structs:"tdigest,omitempty"`
|
||||
HDR struct {
|
||||
NumHistogramDigits uint8 `structs:"number_of_significant_value_digits,omitempty"`
|
||||
} `structs:"hdr,omitempty"`
|
||||
}
|
||||
|
||||
func Percentiles(name, field string) *PercentilesAgg {
|
||||
return &PercentilesAgg{
|
||||
BaseAgg: newBaseAgg("percentiles", name, field),
|
||||
}
|
||||
}
|
||||
|
||||
func (agg *PercentilesAgg) Percents(percents ...float32) *PercentilesAgg {
|
||||
agg.Prcnts = percents
|
||||
return agg
|
||||
}
|
||||
|
||||
func (agg *PercentilesAgg) Missing(val interface{}) *PercentilesAgg {
|
||||
agg.Miss = val
|
||||
return agg
|
||||
}
|
||||
|
||||
func (agg *PercentilesAgg) Keyed(b bool) *PercentilesAgg {
|
||||
agg.Key = &b
|
||||
return agg
|
||||
}
|
||||
|
||||
func (agg *PercentilesAgg) Compression(val uint16) *PercentilesAgg {
|
||||
agg.TDigest.Compression = val
|
||||
return agg
|
||||
}
|
||||
|
||||
func (agg *PercentilesAgg) NumHistogramDigits(val uint8) *PercentilesAgg {
|
||||
agg.HDR.NumHistogramDigits = val
|
||||
return agg
|
||||
}
|
||||
|
||||
func (agg *PercentilesAgg) Map() map[string]interface{} {
|
||||
return map[string]interface{}{
|
||||
agg.apiName: structs.Map(agg),
|
||||
}
|
||||
}
|
||||
|
||||
/*******************************************************************************
|
||||
* Stats Aggregation
|
||||
* https://www.elastic.co/guide/en/elasticsearch/reference/
|
||||
* current/search-aggregations-metrics-stats-aggregation.html
|
||||
******************************************************************************/
|
||||
|
||||
type StatsAgg struct {
|
||||
*BaseAgg `structs:",flatten"`
|
||||
}
|
||||
|
||||
func Stats(name, field string) *StatsAgg {
|
||||
return &StatsAgg{
|
||||
BaseAgg: newBaseAgg("stats", name, field),
|
||||
}
|
||||
}
|
||||
|
||||
func (agg *StatsAgg) Missing(val interface{}) *StatsAgg {
|
||||
agg.Miss = val
|
||||
return agg
|
||||
}
|
||||
|
||||
/*******************************************************************************
|
||||
* String Stats Aggregation
|
||||
* https://www.elastic.co/guide/en/elasticsearch/reference/
|
||||
* current/search-aggregations-metrics-string-stats-aggregation.html
|
||||
******************************************************************************/
|
||||
|
||||
type StringStatsAgg struct {
|
||||
*BaseAgg `structs:",flatten"`
|
||||
ShowDist *bool `structs:"show_distribution,omitempty"`
|
||||
}
|
||||
|
||||
func StringStats(name, field string) *StringStatsAgg {
|
||||
return &StringStatsAgg{
|
||||
BaseAgg: newBaseAgg("string_stats", name, field),
|
||||
}
|
||||
}
|
||||
|
||||
func (agg *StringStatsAgg) Missing(val interface{}) *StringStatsAgg {
|
||||
agg.Miss = val
|
||||
return agg
|
||||
}
|
||||
|
||||
func (agg *StringStatsAgg) ShowDistribution(b bool) *StringStatsAgg {
|
||||
agg.ShowDist = &b
|
||||
return agg
|
||||
}
|
||||
|
||||
func (agg *StringStatsAgg) Map() map[string]interface{} {
|
||||
return map[string]interface{}{
|
||||
agg.apiName: structs.Map(agg),
|
||||
}
|
||||
}
|
|
@ -0,0 +1,158 @@
|
|||
package esquery
|
||||
|
||||
import "testing"
|
||||
|
||||
func TestMetricAggs(t *testing.T) {
|
||||
runMapTests(t, []mapTest{
|
||||
{
|
||||
"avg agg: simple",
|
||||
Avg("average_score", "score"),
|
||||
map[string]interface{}{
|
||||
"avg": map[string]interface{}{
|
||||
"field": "score",
|
||||
},
|
||||
},
|
||||
},
|
||||
{
|
||||
"avg agg: with missing",
|
||||
Avg("average_score", "score").Missing(2),
|
||||
map[string]interface{}{
|
||||
"avg": map[string]interface{}{
|
||||
"field": "score",
|
||||
"missing": 2,
|
||||
},
|
||||
},
|
||||
},
|
||||
{
|
||||
"weighted avg",
|
||||
WeightedAvg("weighted_grade").Value("grade", 2).Weight("weight"),
|
||||
map[string]interface{}{
|
||||
"weighted_avg": map[string]interface{}{
|
||||
"value": map[string]interface{}{
|
||||
"field": "grade",
|
||||
"missing": 2,
|
||||
},
|
||||
"weight": map[string]interface{}{
|
||||
"field": "weight",
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
{
|
||||
"cardinality: no precision threshold",
|
||||
Cardinality("type_count", "type"),
|
||||
map[string]interface{}{
|
||||
"cardinality": map[string]interface{}{
|
||||
"field": "type",
|
||||
},
|
||||
},
|
||||
},
|
||||
{
|
||||
"cardinality: with precision threshold",
|
||||
Cardinality("type_count", "type").PrecisionThreshold(100),
|
||||
map[string]interface{}{
|
||||
"cardinality": map[string]interface{}{
|
||||
"field": "type",
|
||||
"precision_threshold": 100,
|
||||
},
|
||||
},
|
||||
},
|
||||
{
|
||||
"value_count agg: simple",
|
||||
ValueCount("num_values", "score"),
|
||||
map[string]interface{}{
|
||||
"value_count": map[string]interface{}{
|
||||
"field": "score",
|
||||
},
|
||||
},
|
||||
},
|
||||
{
|
||||
"sum agg: simple",
|
||||
Sum("total_score", "score").Missing(1),
|
||||
map[string]interface{}{
|
||||
"sum": map[string]interface{}{
|
||||
"field": "score",
|
||||
"missing": 1,
|
||||
},
|
||||
},
|
||||
},
|
||||
{
|
||||
"max agg: simple",
|
||||
Max("max_score", "score"),
|
||||
map[string]interface{}{
|
||||
"max": map[string]interface{}{
|
||||
"field": "score",
|
||||
},
|
||||
},
|
||||
},
|
||||
{
|
||||
"min agg: simple",
|
||||
Min("min_score", "score"),
|
||||
map[string]interface{}{
|
||||
"min": map[string]interface{}{
|
||||
"field": "score",
|
||||
},
|
||||
},
|
||||
},
|
||||
{
|
||||
"percentiles: simple",
|
||||
Percentiles("load_time_outlier", "load_time"),
|
||||
map[string]interface{}{
|
||||
"percentiles": map[string]interface{}{
|
||||
"field": "load_time",
|
||||
},
|
||||
},
|
||||
},
|
||||
{
|
||||
"percentiles: complex",
|
||||
Percentiles("load_time_outlier", "load_time").
|
||||
Keyed(true).
|
||||
Percents(95, 99, 99.9).
|
||||
Compression(200).
|
||||
NumHistogramDigits(3).
|
||||
Missing(20),
|
||||
map[string]interface{}{
|
||||
"percentiles": map[string]interface{}{
|
||||
"field": "load_time",
|
||||
"percents": []float32{95, 99, 99.9},
|
||||
"keyed": true,
|
||||
"missing": 20,
|
||||
"tdigest": map[string]interface{}{
|
||||
"compression": 200,
|
||||
},
|
||||
"hdr": map[string]interface{}{
|
||||
"number_of_significant_value_digits": 3,
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
{
|
||||
"stats agg",
|
||||
Stats("grades_stats", "grade"),
|
||||
map[string]interface{}{
|
||||
"stats": map[string]interface{}{
|
||||
"field": "grade",
|
||||
},
|
||||
},
|
||||
},
|
||||
{
|
||||
"string_stats agg: no show distribution",
|
||||
StringStats("message_stats", "message.keyword"),
|
||||
map[string]interface{}{
|
||||
"string_stats": map[string]interface{}{
|
||||
"field": "message.keyword",
|
||||
},
|
||||
},
|
||||
},
|
||||
{
|
||||
"string_stats agg: with show distribution",
|
||||
StringStats("message_stats", "message.keyword").ShowDistribution(false),
|
||||
map[string]interface{}{
|
||||
"string_stats": map[string]interface{}{
|
||||
"field": "message.keyword",
|
||||
"show_distribution": false,
|
||||
},
|
||||
},
|
||||
},
|
||||
})
|
||||
}
|
61
boolean.go
|
@ -1,61 +0,0 @@
|
|||
package esquery
|
||||
|
||||
import "encoding/json"
|
||||
|
||||
/*******************************************************************************
|
||||
* Boolean Queries
|
||||
* https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl-bool-query.html
|
||||
******************************************************************************/
|
||||
|
||||
type BoolQuery struct {
|
||||
params boolQueryParams
|
||||
}
|
||||
|
||||
type boolQueryParams struct {
|
||||
Must []json.Marshaler `json:"must,omitempty"`
|
||||
Filter []json.Marshaler `json:"filter,omitempty"`
|
||||
MustNot []json.Marshaler `json:"must_not,omitempty"`
|
||||
Should []json.Marshaler `json:"should,omitempty"`
|
||||
MinimumShouldMatch int16 `json:"minimum_should_match,omitempty"`
|
||||
Boost float32 `json:"boost,omitempty"`
|
||||
}
|
||||
|
||||
func Bool() *BoolQuery {
|
||||
return &BoolQuery{}
|
||||
}
|
||||
|
||||
func (q *BoolQuery) Must(must ...json.Marshaler) *BoolQuery {
|
||||
q.params.Must = append(q.params.Must, must...)
|
||||
return q
|
||||
}
|
||||
|
||||
func (q *BoolQuery) Filter(filter ...json.Marshaler) *BoolQuery {
|
||||
q.params.Filter = append(q.params.Filter, filter...)
|
||||
return q
|
||||
}
|
||||
|
||||
func (q *BoolQuery) MustNot(mustnot ...json.Marshaler) *BoolQuery {
|
||||
q.params.MustNot = append(q.params.MustNot, mustnot...)
|
||||
return q
|
||||
}
|
||||
|
||||
func (q *BoolQuery) Should(should ...json.Marshaler) *BoolQuery {
|
||||
q.params.Should = append(q.params.Should, should...)
|
||||
return q
|
||||
}
|
||||
|
||||
func (q *BoolQuery) MinimumShouldMatch(val int16) *BoolQuery {
|
||||
q.params.MinimumShouldMatch = val
|
||||
return q
|
||||
}
|
||||
|
||||
func (q *BoolQuery) Boost(val float32) *BoolQuery {
|
||||
q.params.Boost = val
|
||||
return q
|
||||
}
|
||||
|
||||
func (q BoolQuery) MarshalJSON() ([]byte, error) {
|
||||
return json.Marshal(map[string]boolQueryParams{
|
||||
"bool": q.params,
|
||||
})
|
||||
}
|
|
@ -1,31 +0,0 @@
|
|||
package esquery
|
||||
|
||||
import (
|
||||
"testing"
|
||||
)
|
||||
|
||||
func TestBool(t *testing.T) {
|
||||
runTests(t, []queryTest{
|
||||
{
|
||||
"bool with only a simple must",
|
||||
Bool().Must(Term("tag", "tech")),
|
||||
"{\"bool\":{\"must\":[{\"term\":{\"tag\":{\"value\":\"tech\"}}}]}}\n",
|
||||
},
|
||||
{
|
||||
"bool which must match_all and filter",
|
||||
Bool().Must(MatchAll()).Filter(Term("status", "active")),
|
||||
"{\"bool\":{\"must\":[{\"match_all\":{}}],\"filter\":[{\"term\":{\"status\":{\"value\":\"active\"}}}]}}\n",
|
||||
},
|
||||
{
|
||||
"bool with a lot of stuff",
|
||||
Bool().
|
||||
Must(Term("user", "kimchy")).
|
||||
Filter(Term("tag", "tech")).
|
||||
MustNot(Range("age").Gte(10).Lte(20)).
|
||||
Should(Term("tag", "wow"), Term("tag", "elasticsearch")).
|
||||
MinimumShouldMatch(1).
|
||||
Boost(1.1),
|
||||
"{\"bool\":{\"must\":[{\"term\":{\"user\":{\"value\":\"kimchy\"}}}],\"filter\":[{\"term\":{\"tag\":{\"value\":\"tech\"}}}],\"must_not\":[{\"range\":{\"age\":{\"gte\":10,\"lte\":20}}}],\"should\":[{\"term\":{\"tag\":{\"value\":\"wow\"}}},{\"term\":{\"tag\":{\"value\":\"elasticsearch\"}}}],\"minimum_should_match\":1,\"boost\":1.1}}\n",
|
||||
},
|
||||
})
|
||||
}
|
43
boosting.go
|
@ -1,43 +0,0 @@
|
|||
package esquery
|
||||
|
||||
import "encoding/json"
|
||||
|
||||
/*******************************************************************************
|
||||
* Boosting Queries
|
||||
* https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl-boosting-query.html
|
||||
******************************************************************************/
|
||||
|
||||
type BoostingQuery struct {
|
||||
params boostingQueryParams
|
||||
}
|
||||
|
||||
type boostingQueryParams struct {
|
||||
Positive json.Marshaler `json:"positive"`
|
||||
Negative json.Marshaler `json:"negative"`
|
||||
NegativeBoost float32 `json:"negative_boost"`
|
||||
}
|
||||
|
||||
func Boosting() *BoostingQuery {
|
||||
return &BoostingQuery{}
|
||||
}
|
||||
|
||||
func (q *BoostingQuery) Positive(p json.Marshaler) *BoostingQuery {
|
||||
q.params.Positive = p
|
||||
return q
|
||||
}
|
||||
|
||||
func (q *BoostingQuery) Negative(p json.Marshaler) *BoostingQuery {
|
||||
q.params.Negative = p
|
||||
return q
|
||||
}
|
||||
|
||||
func (q *BoostingQuery) NegativeBoost(b float32) *BoostingQuery {
|
||||
q.params.NegativeBoost = b
|
||||
return q
|
||||
}
|
||||
|
||||
func (q *BoostingQuery) MarshalJSON() ([]byte, error) {
|
||||
return json.Marshal(map[string]boostingQueryParams{
|
||||
"boosting": q.params,
|
||||
})
|
||||
}
|
|
@ -1,18 +0,0 @@
|
|||
package esquery
|
||||
|
||||
import (
|
||||
"testing"
|
||||
)
|
||||
|
||||
func TestBoost(t *testing.T) {
|
||||
runTests(t, []queryTest{
|
||||
{
|
||||
"boosting query",
|
||||
Boosting().
|
||||
Positive(Term("text", "apple")).
|
||||
Negative(Term("text", "pie tart")).
|
||||
NegativeBoost(0.5),
|
||||
"{\"boosting\":{\"positive\":{\"term\":{\"text\":{\"value\":\"apple\"}}},\"negative\":{\"term\":{\"text\":{\"value\":\"pie tart\"}}},\"negative_boost\":0.5}}\n",
|
||||
},
|
||||
})
|
||||
}
|
|
@ -1,29 +0,0 @@
|
|||
package esquery
|
||||
|
||||
import "encoding/json"
|
||||
|
||||
type ConstantScoreQuery struct {
|
||||
params constantScoreParams
|
||||
}
|
||||
|
||||
type constantScoreParams struct {
|
||||
Filter json.Marshaler `json:"filter"`
|
||||
Boost float32 `json:"boost,omitempty"`
|
||||
}
|
||||
|
||||
func ConstantScore(filter json.Marshaler) *ConstantScoreQuery {
|
||||
return &ConstantScoreQuery{
|
||||
params: constantScoreParams{Filter: filter},
|
||||
}
|
||||
}
|
||||
|
||||
func (q *ConstantScoreQuery) Boost(b float32) *ConstantScoreQuery {
|
||||
q.params.Boost = b
|
||||
return q
|
||||
}
|
||||
|
||||
func (q ConstantScoreQuery) MarshalJSON() ([]byte, error) {
|
||||
return json.Marshal(map[string]constantScoreParams{
|
||||
"constant_score": q.params,
|
||||
})
|
||||
}
|
|
@ -1,20 +0,0 @@
|
|||
package esquery
|
||||
|
||||
import (
|
||||
"testing"
|
||||
)
|
||||
|
||||
func TestConstantScore(t *testing.T) {
|
||||
runTests(t, []queryTest{
|
||||
{
|
||||
"constant_score query without boost",
|
||||
ConstantScore(Term("user", "kimchy")),
|
||||
"{\"constant_score\":{\"filter\":{\"term\":{\"user\":{\"value\":\"kimchy\"}}}}}\n",
|
||||
},
|
||||
{
|
||||
"constant_score query with boost",
|
||||
ConstantScore(Term("user", "kimchy")).Boost(2.2),
|
||||
"{\"constant_score\":{\"filter\":{\"term\":{\"user\":{\"value\":\"kimchy\"}}},\"boost\":2.2}}\n",
|
||||
},
|
||||
})
|
||||
}
|
31
dis_max.go
|
@ -1,31 +0,0 @@
|
|||
package esquery
|
||||
|
||||
import "encoding/json"
|
||||
|
||||
type DisMaxQuery struct {
|
||||
params disMaxParams
|
||||
}
|
||||
|
||||
type disMaxParams struct {
|
||||
Queries []json.Marshaler `json:"queries"`
|
||||
TieBreaker float32 `json:"tie_breaker,omitempty"`
|
||||
}
|
||||
|
||||
func DisMax(queries ...json.Marshaler) *DisMaxQuery {
|
||||
return &DisMaxQuery{
|
||||
params: disMaxParams{
|
||||
Queries: queries,
|
||||
},
|
||||
}
|
||||
}
|
||||
|
||||
func (q *DisMaxQuery) TieBreaker(b float32) *DisMaxQuery {
|
||||
q.params.TieBreaker = b
|
||||
return q
|
||||
}
|
||||
|
||||
func (q DisMaxQuery) MarshalJSON() ([]byte, error) {
|
||||
return json.Marshal(map[string]disMaxParams{
|
||||
"dis_max": q.params,
|
||||
})
|
||||
}
|
|
@ -1,15 +0,0 @@
|
|||
package esquery
|
||||
|
||||
import (
|
||||
"testing"
|
||||
)
|
||||
|
||||
func TestDisMax(t *testing.T) {
|
||||
runTests(t, []queryTest{
|
||||
{
|
||||
"dis_max",
|
||||
DisMax(Term("title", "Quick pets"), Term("body", "Quick pets")).TieBreaker(0.7),
|
||||
"{\"dis_max\":{\"queries\":[{\"term\":{\"title\":{\"value\":\"Quick pets\"}}},{\"term\":{\"body\":{\"value\":\"Quick pets\"}}}],\"tie_breaker\":0.7}}\n",
|
||||
},
|
||||
})
|
||||
}
|
45
es.go
|
@ -1,46 +1,5 @@
|
|||
package esquery
|
||||
|
||||
import (
|
||||
"bytes"
|
||||
"encoding/json"
|
||||
"fmt"
|
||||
|
||||
"github.com/elastic/go-elasticsearch/v7"
|
||||
"github.com/elastic/go-elasticsearch/v7/esapi"
|
||||
)
|
||||
|
||||
type ESQuery struct {
|
||||
Query json.Marshaler `json:"query"`
|
||||
}
|
||||
|
||||
func encode(q json.Marshaler, b *bytes.Buffer) (err error) {
|
||||
b.Reset()
|
||||
err = json.NewEncoder(b).Encode(q)
|
||||
if err != nil {
|
||||
return fmt.Errorf("failed encoding query to JSON: %w", err)
|
||||
}
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
func Search(
|
||||
api *elasticsearch.Client,
|
||||
q json.Marshaler,
|
||||
o ...func(*esapi.SearchRequest),
|
||||
) (res *esapi.Response, err error) {
|
||||
var b bytes.Buffer
|
||||
err = encode(ESQuery{q}, &b)
|
||||
if err != nil {
|
||||
return res, err
|
||||
}
|
||||
|
||||
opts := append([]func(*esapi.SearchRequest){api.Search.WithBody(&b)}, o...)
|
||||
|
||||
return api.Search(opts...)
|
||||
}
|
||||
|
||||
func (q ESQuery) MarshalJSON() ([]byte, error) {
|
||||
return json.Marshal(map[string]json.Marshaler{
|
||||
"query": q.Query,
|
||||
})
|
||||
type Mappable interface {
|
||||
Map() map[string]interface{}
|
||||
}
|
||||
|
|
37
es_test.go
|
@ -1,27 +1,40 @@
|
|||
package esquery
|
||||
|
||||
import (
|
||||
"bytes"
|
||||
"encoding/json"
|
||||
"reflect"
|
||||
"testing"
|
||||
)
|
||||
|
||||
type queryTest struct {
|
||||
name string
|
||||
q json.Marshaler
|
||||
expJSON string
|
||||
type mapTest struct {
|
||||
name string
|
||||
q Mappable
|
||||
exp map[string]interface{}
|
||||
}
|
||||
|
||||
func runTests(t *testing.T, tests []queryTest) {
|
||||
func runMapTests(t *testing.T, tests []mapTest) {
|
||||
for _, test := range tests {
|
||||
var b bytes.Buffer
|
||||
t.Run(test.name, func(t *testing.T) {
|
||||
err := encode(test.q, &b)
|
||||
if err != nil {
|
||||
t.Errorf("unexpectedly failed: %s", err)
|
||||
} else if b.String() != test.expJSON {
|
||||
t.Errorf("expected %q, got %q", test.expJSON, b.String())
|
||||
m := test.q.Map()
|
||||
|
||||
// convert both maps to JSON in order to compare them. we do not
|
||||
// use reflect.DeepEqual on the maps as this doesn't always work
|
||||
exp, got, ok := sameJSON(test.exp, m)
|
||||
if !ok {
|
||||
t.Errorf("expected %s, got %s", exp, got)
|
||||
}
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
func sameJSON(a, b map[string]interface{}) (aJSON, bJSON []byte, ok bool) {
|
||||
aJSON, aErr := json.Marshal(a)
|
||||
bJSON, bErr := json.Marshal(b)
|
||||
|
||||
if aErr != nil || bErr != nil {
|
||||
return aJSON, bJSON, false
|
||||
}
|
||||
|
||||
ok = reflect.DeepEqual(aJSON, bJSON)
|
||||
return aJSON, bJSON, ok
|
||||
}
|
||||
|
|
3
go.mod
|
@ -3,7 +3,6 @@ module bitbucket.org/scalock/esquery
|
|||
go 1.13
|
||||
|
||||
require (
|
||||
github.com/elastic/go-elasticsearch v0.0.0
|
||||
github.com/elastic/go-elasticsearch/v7 v7.6.0
|
||||
github.com/elastic/go-elasticsearch/v8 v8.0.0-20200210103600-aff00e5adfde
|
||||
github.com/fatih/structs v1.1.0
|
||||
)
|
||||
|
|
2
go.sum
|
@ -4,3 +4,5 @@ github.com/elastic/go-elasticsearch/v7 v7.6.0 h1:sYpGLpEFHgLUKLsZUBfuaVI9QgHjS3J
|
|||
github.com/elastic/go-elasticsearch/v7 v7.6.0/go.mod h1:OJ4wdbtDNk5g503kvlHLyErCgQwwzmDtaFC4XyOxXA4=
|
||||
github.com/elastic/go-elasticsearch/v8 v8.0.0-20200210103600-aff00e5adfde h1:Y9SZx8RQqFycLxi5W5eFmxMqnmijULVc3LMjBTtZQdM=
|
||||
github.com/elastic/go-elasticsearch/v8 v8.0.0-20200210103600-aff00e5adfde/go.mod h1:xe9a/L2aeOgFKKgrO3ibQTnMdpAeL0GC+5/HpGScSa4=
|
||||
github.com/fatih/structs v1.1.0 h1:Q7juDM0QtcnhCpeyLGQKyg4TOIghuNXrkL32pHAUMxo=
|
||||
github.com/fatih/structs v1.1.0/go.mod h1:9NiDSp5zOcgEDl+j00MP/WkGVPOlPRLejGD8Ga6PJ7M=
|
||||
|
|
|
@ -1,13 +0,0 @@
|
|||
package esquery
|
||||
|
||||
import (
|
||||
"testing"
|
||||
)
|
||||
|
||||
func TestMatchAll(t *testing.T) {
|
||||
runTests(t, []queryTest{
|
||||
{"match_all without a boost", MatchAll(), "{\"match_all\":{}}\n"},
|
||||
{"match_all with a boost", MatchAll().Boost(2.3), "{\"match_all\":{\"boost\":2.3}}\n"},
|
||||
{"match_none", MatchNone(), "{\"match_none\":{}}\n"},
|
||||
})
|
||||
}
|
|
@ -1,15 +0,0 @@
|
|||
package esquery
|
||||
|
||||
import (
|
||||
"testing"
|
||||
)
|
||||
|
||||
func TestMatch(t *testing.T) {
|
||||
runTests(t, []queryTest{
|
||||
{"simple match", Match("title", "sample text"), "{\"match\":{\"title\":{\"query\":\"sample text\"}}}\n"},
|
||||
{"match with more params", Match("issue_number").Query(16).Transpositions(false).MaxExpansions(32).Operator(AND), "{\"match\":{\"issue_number\":{\"query\":16,\"max_expansions\":32,\"transpositions\":false,\"operator\":\"and\"}}}\n"},
|
||||
{"match_bool_prefix", MatchBoolPrefix("title", "sample text"), "{\"match_bool_prefix\":{\"title\":{\"query\":\"sample text\"}}}\n"},
|
||||
{"match_phrase", MatchPhrase("title", "sample text"), "{\"match_phrase\":{\"title\":{\"query\":\"sample text\"}}}\n"},
|
||||
{"match_phrase_prefix", MatchPhrasePrefix("title", "sample text"), "{\"match_phrase_prefix\":{\"title\":{\"query\":\"sample text\"}}}\n"},
|
||||
})
|
||||
}
|
|
@ -0,0 +1,38 @@
|
|||
package esquery
|
||||
|
||||
import (
|
||||
"bytes"
|
||||
"encoding/json"
|
||||
|
||||
"github.com/elastic/go-elasticsearch/v7"
|
||||
"github.com/elastic/go-elasticsearch/v7/esapi"
|
||||
)
|
||||
|
||||
type QueryRequest struct {
|
||||
Query Mappable
|
||||
}
|
||||
|
||||
func Query(q Mappable) *QueryRequest {
|
||||
return &QueryRequest{q}
|
||||
}
|
||||
|
||||
func (req *QueryRequest) Map() map[string]interface{} {
|
||||
return map[string]interface{}{
|
||||
"query": req.Query.Map(),
|
||||
}
|
||||
}
|
||||
|
||||
func (req *QueryRequest) Run(
|
||||
api *elasticsearch.Client,
|
||||
o ...func(*esapi.SearchRequest),
|
||||
) (res *esapi.Response, err error) {
|
||||
var b bytes.Buffer
|
||||
err = json.NewEncoder(&b).Encode(req.Map()) // encode the full request map, including the "query" wrapper
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
opts := append([]func(*esapi.SearchRequest){api.Search.WithBody(&b)}, o...)
|
||||
|
||||
return api.Search(opts...)
|
||||
}
|
|
@ -0,0 +1,68 @@
|
|||
package esquery
|
||||
|
||||
import (
|
||||
"testing"
|
||||
)
|
||||
|
||||
func TestQueries(t *testing.T) {
|
||||
runMapTests(t, []mapTest{
|
||||
{
|
||||
"a simple match_all query",
|
||||
Query(MatchAll()),
|
||||
map[string]interface{}{
|
||||
"query": map[string]interface{}{
|
||||
"match_all": map[string]interface{}{},
|
||||
},
|
||||
},
|
||||
},
|
||||
{
|
||||
"a complex query",
|
||||
Query(
|
||||
Bool().
|
||||
Must(
|
||||
Range("date").
|
||||
Gt("some time in the past").
|
||||
Lte("now").
|
||||
Relation(CONTAINS).
|
||||
TimeZone("Asia/Jerusalem").
|
||||
Boost(2.3),
|
||||
|
||||
Match("author").
|
||||
Query("some guy").
|
||||
Analyzer("analyzer?").
|
||||
Fuzziness("fuzz"),
|
||||
).
|
||||
Boost(3.1),
|
||||
),
|
||||
map[string]interface{}{
|
||||
"query": map[string]interface{}{
|
||||
"bool": map[string]interface{}{
|
||||
"must": []map[string]interface{}{
|
||||
{
|
||||
"range": map[string]interface{}{
|
||||
"date": map[string]interface{}{
|
||||
"gt": "some time in the past",
|
||||
"lte": "now",
|
||||
"relation": "CONTAINS",
|
||||
"time_zone": "Asia/Jerusalem",
|
||||
"boost": 2.3,
|
||||
},
|
||||
},
|
||||
},
|
||||
{
|
||||
"match": map[string]interface{}{
|
||||
"author": map[string]interface{}{
|
||||
"query": "some guy",
|
||||
"analyzer": "analyzer?",
|
||||
"fuzziness": "fuzz",
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
"boost": 3.1,
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
})
|
||||
}
|
|
@ -0,0 +1,97 @@
|
|||
package esquery
|
||||
|
||||
import "github.com/fatih/structs"
|
||||
|
||||
/*******************************************************************************
|
||||
* Boolean Queries
|
||||
* https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl-bool-query.html
|
||||
******************************************************************************/
|
||||
|
||||
type BoolQuery struct {
|
||||
must []Mappable
|
||||
filter []Mappable
|
||||
mustNot []Mappable
|
||||
should []Mappable
|
||||
minimumShouldMatch int16
|
||||
boost float32
|
||||
}
|
||||
|
||||
func Bool() *BoolQuery {
|
||||
return &BoolQuery{}
|
||||
}
|
||||
|
||||
func (q *BoolQuery) Must(must ...Mappable) *BoolQuery {
|
||||
q.must = append(q.must, must...)
|
||||
return q
|
||||
}
|
||||
|
||||
func (q *BoolQuery) Filter(filter ...Mappable) *BoolQuery {
|
||||
q.filter = append(q.filter, filter...)
|
||||
return q
|
||||
}
|
||||
|
||||
func (q *BoolQuery) MustNot(mustnot ...Mappable) *BoolQuery {
|
||||
q.mustNot = append(q.mustNot, mustnot...)
|
||||
return q
|
||||
}
|
||||
|
||||
func (q *BoolQuery) Should(should ...Mappable) *BoolQuery {
|
||||
q.should = append(q.should, should...)
|
||||
return q
|
||||
}
|
||||
|
||||
func (q *BoolQuery) MinimumShouldMatch(val int16) *BoolQuery {
|
||||
q.minimumShouldMatch = val
|
||||
return q
|
||||
}
|
||||
|
||||
func (q *BoolQuery) Boost(val float32) *BoolQuery {
|
||||
q.boost = val
|
||||
return q
|
||||
}
|
||||
|
||||
func (q *BoolQuery) Map() map[string]interface{} {
|
||||
var data struct {
|
||||
Must []map[string]interface{} `structs:"must,omitempty"`
|
||||
Filter []map[string]interface{} `structs:"filter,omitempty"`
|
||||
MustNot []map[string]interface{} `structs:"must_not,omitempty"`
|
||||
Should []map[string]interface{} `structs:"should,omitempty"`
|
||||
MinimumShouldMatch int16 `structs:"minimum_should_match,omitempty"`
|
||||
Boost float32 `structs:"boost,omitempty"`
|
||||
}
|
||||
|
||||
data.MinimumShouldMatch = q.minimumShouldMatch
|
||||
data.Boost = q.boost
|
||||
|
||||
if len(q.must) > 0 {
|
||||
data.Must = make([]map[string]interface{}, len(q.must))
|
||||
for i, m := range q.must {
|
||||
data.Must[i] = m.Map()
|
||||
}
|
||||
}
|
||||
|
||||
if len(q.filter) > 0 {
|
||||
data.Filter = make([]map[string]interface{}, len(q.filter))
|
||||
for i, m := range q.filter {
|
||||
data.Filter[i] = m.Map()
|
||||
}
|
||||
}
|
||||
|
||||
if len(q.mustNot) > 0 {
|
||||
data.MustNot = make([]map[string]interface{}, len(q.mustNot))
|
||||
for i, m := range q.mustNot {
|
||||
data.MustNot[i] = m.Map()
|
||||
}
|
||||
}
|
||||
|
||||
if len(q.should) > 0 {
|
||||
data.Should = make([]map[string]interface{}, len(q.should))
|
||||
for i, m := range q.should {
|
||||
data.Should[i] = m.Map()
|
||||
}
|
||||
}
|
||||
|
||||
return map[string]interface{}{
|
||||
"bool": structs.Map(data),
|
||||
}
|
||||
}
|
|
@ -0,0 +1,107 @@
|
|||
package esquery
|
||||
|
||||
import (
|
||||
"testing"
|
||||
)
|
||||
|
||||
func TestBool(t *testing.T) {
|
||||
runMapTests(t, []mapTest{
|
||||
{
|
||||
"bool with only a simple must",
|
||||
Bool().Must(Term("tag", "tech")),
|
||||
map[string]interface{}{
|
||||
"bool": map[string]interface{}{
|
||||
"must": []map[string]interface{}{
|
||||
{
|
||||
"term": map[string]interface{}{
|
||||
"tag": map[string]interface{}{
|
||||
"value": "tech",
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
{
|
||||
"bool which must match_all and filter",
|
||||
Bool().Must(MatchAll()).Filter(Term("status", "active")),
|
||||
map[string]interface{}{
|
||||
"bool": map[string]interface{}{
|
||||
"must": []map[string]interface{}{
|
||||
{"match_all": map[string]interface{}{}},
|
||||
},
|
||||
"filter": []map[string]interface{}{
|
||||
{
|
||||
"term": map[string]interface{}{
|
||||
"status": map[string]interface{}{
|
||||
"value": "active",
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
{
|
||||
"bool with a lot of stuff",
|
||||
Bool().
|
||||
Must(Term("user", "kimchy")).
|
||||
Filter(Term("tag", "tech")).
|
||||
MustNot(Range("age").Gte(10).Lte(20)).
|
||||
Should(Term("tag", "wow"), Term("tag", "elasticsearch")).
|
||||
MinimumShouldMatch(1).
|
||||
Boost(1.1),
|
||||
map[string]interface{}{
|
||||
"bool": map[string]interface{}{
|
||||
"must": []map[string]interface{}{
|
||||
{
|
||||
"term": map[string]interface{}{
|
||||
"user": map[string]interface{}{
|
||||
"value": "kimchy",
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
"filter": []map[string]interface{}{
|
||||
{
|
||||
"term": map[string]interface{}{
|
||||
"tag": map[string]interface{}{
|
||||
"value": "tech",
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
"must_not": []map[string]interface{}{
|
||||
{
|
||||
"range": map[string]interface{}{
|
||||
"age": map[string]interface{}{
|
||||
"gte": 10,
|
||||
"lte": 20,
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
"should": []map[string]interface{}{
|
||||
{
|
||||
"term": map[string]interface{}{
|
||||
"tag": map[string]interface{}{
|
||||
"value": "wow",
|
||||
},
|
||||
},
|
||||
},
|
||||
{
|
||||
"term": map[string]interface{}{
|
||||
"tag": map[string]interface{}{
|
||||
"value": "elasticsearch",
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
"minimum_should_match": 1,
|
||||
"boost": 1.1,
|
||||
},
|
||||
},
|
||||
},
|
||||
})
|
||||
}
|
|
@ -0,0 +1,41 @@
|
|||
package esquery
|
||||
|
||||
/*******************************************************************************
|
||||
* Boosting Queries
|
||||
* https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl-boosting-query.html
|
||||
******************************************************************************/
|
||||
|
||||
type BoostingQuery struct {
|
||||
Pos Mappable
|
||||
Neg Mappable
|
||||
NegBoost float32
|
||||
}
|
||||
|
||||
func Boosting() *BoostingQuery {
|
||||
return &BoostingQuery{}
|
||||
}
|
||||
|
||||
func (q *BoostingQuery) Positive(p Mappable) *BoostingQuery {
|
||||
q.Pos = p
|
||||
return q
|
||||
}
|
||||
|
||||
func (q *BoostingQuery) Negative(p Mappable) *BoostingQuery {
|
||||
q.Neg = p
|
||||
return q
|
||||
}
|
||||
|
||||
func (q *BoostingQuery) NegativeBoost(b float32) *BoostingQuery {
|
||||
q.NegBoost = b
|
||||
return q
|
||||
}
|
||||
|
||||
func (q *BoostingQuery) Map() map[string]interface{} {
|
||||
return map[string]interface{}{
|
||||
"boosting": map[string]interface{}{
|
||||
"positive": q.Pos.Map(),
|
||||
"negative": q.Neg.Map(),
|
||||
"negative_boost": q.NegBoost,
|
||||
},
|
||||
}
|
||||
}
|
|
@ -0,0 +1,36 @@
|
|||
package esquery
|
||||
|
||||
import (
|
||||
"testing"
|
||||
)
|
||||
|
||||
func TestBoosting(t *testing.T) {
|
||||
runMapTests(t, []mapTest{
|
||||
{
|
||||
"boosting query",
|
||||
Boosting().
|
||||
Positive(Term("text", "apple")).
|
||||
Negative(Term("text", "pie tart")).
|
||||
NegativeBoost(0.5),
|
||||
map[string]interface{}{
|
||||
"boosting": map[string]interface{}{
|
||||
"positive": map[string]interface{}{
|
||||
"term": map[string]interface{}{
|
||||
"text": map[string]interface{}{
|
||||
"value": "apple",
|
||||
},
|
||||
},
|
||||
},
|
||||
"negative": map[string]interface{}{
|
||||
"term": map[string]interface{}{
|
||||
"text": map[string]interface{}{
|
||||
"value": "pie tart",
|
||||
},
|
||||
},
|
||||
},
|
||||
"negative_boost": 0.5,
|
||||
},
|
||||
},
|
||||
},
|
||||
})
|
||||
}
|
|
@ -0,0 +1,33 @@
|
|||
package esquery
|
||||
|
||||
import "github.com/fatih/structs"
|
||||
|
||||
/*******************************************************************************
|
||||
* Constant Score Queries
|
||||
* https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl-constant-score-query.html
|
||||
******************************************************************************/
|
||||
|
||||
type ConstantScoreQuery struct {
|
||||
filter Mappable
|
||||
boost float32
|
||||
}
|
||||
|
||||
func ConstantScore(filter Mappable) *ConstantScoreQuery {
|
||||
return &ConstantScoreQuery{
|
||||
filter: filter,
|
||||
}
|
||||
}
|
||||
|
||||
func (q *ConstantScoreQuery) Boost(b float32) *ConstantScoreQuery {
|
||||
q.boost = b
|
||||
return q
|
||||
}
|
||||
|
||||
func (q *ConstantScoreQuery) Map() map[string]interface{} {
|
||||
return map[string]interface{}{
|
||||
"constant_score": structs.Map(struct {
|
||||
Filter map[string]interface{} `structs:"filter"`
|
||||
Boost float32 `structs:"boost,omitempty"`
|
||||
}{q.filter.Map(), q.boost}),
|
||||
}
|
||||
}
|
|
@ -0,0 +1,41 @@
|
|||
package esquery
|
||||
|
||||
import (
|
||||
"testing"
|
||||
)
|
||||
|
||||
func TestConstantScore(t *testing.T) {
|
||||
runMapTests(t, []mapTest{
|
||||
{
|
||||
"constant_score query without boost",
|
||||
ConstantScore(Term("user", "kimchy")),
|
||||
map[string]interface{}{
|
||||
"constant_score": map[string]interface{}{
|
||||
"filter": map[string]interface{}{
|
||||
"term": map[string]interface{}{
|
||||
"user": map[string]interface{}{
|
||||
"value": "kimchy",
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
{
|
||||
"constant_score query with boost",
|
||||
ConstantScore(Term("user", "kimchy")).Boost(2.2),
|
||||
map[string]interface{}{
|
||||
"constant_score": map[string]interface{}{
|
||||
"filter": map[string]interface{}{
|
||||
"term": map[string]interface{}{
|
||||
"user": map[string]interface{}{
|
||||
"value": "kimchy",
|
||||
},
|
||||
},
|
||||
},
|
||||
"boost": 2.2,
|
||||
},
|
||||
},
|
||||
},
|
||||
})
|
||||
}
|
|
@ -0,0 +1,13 @@
|
|||
package esquery
|
||||
|
||||
type CustomQry struct {
|
||||
m map[string]interface{}
|
||||
}
|
||||
|
||||
func CustomQuery(m map[string]interface{}) *CustomQry {
|
||||
return &CustomQry{m}
|
||||
}
|
||||
|
||||
func (q *CustomQry) Map() map[string]interface{} {
|
||||
return q.m
|
||||
}
|
|
@ -0,0 +1,23 @@
|
|||
package esquery
|
||||
|
||||
import "testing"
|
||||
|
||||
func TestCustomQuery(t *testing.T) {
|
||||
m := map[string]interface{}{
|
||||
"geo_distance": map[string]interface{}{
|
||||
"distance": "200km",
|
||||
"pin.location": map[string]interface{}{
|
||||
"lat": 40,
|
||||
"lon": -70,
|
||||
},
|
||||
},
|
||||
}
|
||||
|
||||
runMapTests(t, []mapTest{
|
||||
{
|
||||
"custom query",
|
||||
CustomQuery(m),
|
||||
m,
|
||||
},
|
||||
})
|
||||
}
|
|
@ -0,0 +1,37 @@
|
|||
package esquery
|
||||
|
||||
import "github.com/fatih/structs"
|
||||
|
||||
/*******************************************************************************
|
||||
* Disjunction Max Queries
|
||||
* https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl-dis-max-query.html
|
||||
******************************************************************************/
|
||||
|
||||
type DisMaxQuery struct {
|
||||
queries []Mappable
|
||||
tieBreaker float32
|
||||
}
|
||||
|
||||
func DisMax(queries ...Mappable) *DisMaxQuery {
|
||||
return &DisMaxQuery{
|
||||
queries: queries,
|
||||
}
|
||||
}
|
||||
|
||||
func (q *DisMaxQuery) TieBreaker(b float32) *DisMaxQuery {
|
||||
q.tieBreaker = b
|
||||
return q
|
||||
}
|
||||
|
||||
func (q *DisMaxQuery) Map() map[string]interface{} {
|
||||
inner := make([]map[string]interface{}, len(q.queries))
|
||||
for i, iq := range q.queries {
|
||||
inner[i] = iq.Map()
|
||||
}
|
||||
return map[string]interface{}{
|
||||
"dis_max": structs.Map(struct {
|
||||
Queries []map[string]interface{} `structs:"queries"`
|
||||
TieBreaker float32 `structs:"tie_breaker,omitempty"`
|
||||
}{inner, q.tieBreaker}),
|
||||
}
|
||||
}
|
|
@ -0,0 +1,35 @@
|
|||
package esquery
|
||||
|
||||
import (
|
||||
"testing"
|
||||
)
|
||||
|
||||
func TestDisMax(t *testing.T) {
|
||||
runMapTests(t, []mapTest{
|
||||
{
|
||||
"dis_max",
|
||||
DisMax(Term("title", "Quick pets"), Term("body", "Quick pets")).TieBreaker(0.7),
|
||||
map[string]interface{}{
|
||||
"dis_max": map[string]interface{}{
|
||||
"queries": []map[string]interface{}{
|
||||
{
|
||||
"term": map[string]interface{}{
|
||||
"title": map[string]interface{}{
|
||||
"value": "Quick pets",
|
||||
},
|
||||
},
|
||||
},
|
||||
{
|
||||
"term": map[string]interface{}{
|
||||
"body": map[string]interface{}{
|
||||
"value": "Quick pets",
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
"tie_breaker": 0.7,
|
||||
},
|
||||
},
|
||||
},
|
||||
})
|
||||
}
|
|
@ -2,9 +2,9 @@ package esquery
|
|||
|
||||
import (
|
||||
"bytes"
|
||||
"encoding/json"
|
||||
"errors"
|
||||
"io"
|
||||
|
||||
"github.com/fatih/structs"
|
||||
)
|
||||
|
||||
/*******************************************************************************
|
||||
|
@ -29,7 +29,7 @@ type MatchQuery struct {
|
|||
params matchParams
|
||||
}
|
||||
|
||||
func (a MatchQuery) MarshalJSON() ([]byte, error) {
|
||||
func (a *MatchQuery) Map() map[string]interface{} {
|
||||
var mType string
|
||||
switch a.mType {
 		case TypeMatch:
@@ -42,27 +42,27 @@ func (a MatchQuery) MarshalJSON() ([]byte, error) {
 		mType = "match_phrase_prefix"
 	}
 
-	return json.Marshal(map[string]interface{}{
+	return map[string]interface{}{
 		mType: map[string]interface{}{
-			a.field: a.params,
+			a.field: structs.Map(a.params),
 		},
-	})
+	}
 }
 
 type matchParams struct {
-	Qry          interface{}   `json:"query"`
-	Anl          string        `json:"analyzer,omitempty"`
-	AutoGenerate *bool         `json:"auto_generate_synonyms_phrase_query,omitempty"`
-	Fuzz         string        `json:"fuzziness,omitempty"`
-	MaxExp       uint16        `json:"max_expansions,omitempty"`
-	PrefLen      uint16        `json:"prefix_length,omitempty"`
-	Trans        *bool         `json:"transpositions,omitempty"`
-	FuzzyRw      string        `json:"fuzzy_rewrite,omitempty"`
-	Lent         bool          `json:"lenient,omitempty"`
-	Op           MatchOperator `json:"operator,omitempty"`
-	MinMatch     string        `json:"minimum_should_match,omitempty"`
-	ZeroTerms    string        `json:"zero_terms_query,omitempty"`
-	Slp          uint16        `json:"slop,omitempty"` // only relevant for match_phrase query
+	Qry          interface{}   `structs:"query"`
+	Anl          string        `structs:"analyzer,omitempty"`
+	AutoGenerate *bool         `structs:"auto_generate_synonyms_phrase_query,omitempty"`
+	Fuzz         string        `structs:"fuzziness,omitempty"`
+	MaxExp       uint16        `structs:"max_expansions,omitempty"`
+	PrefLen      uint16        `structs:"prefix_length,omitempty"`
+	Trans        *bool         `structs:"transpositions,omitempty"`
+	FuzzyRw      string        `structs:"fuzzy_rewrite,omitempty"`
+	Lent         bool          `structs:"lenient,omitempty"`
+	Op           MatchOperator `structs:"operator,string,omitempty"`
+	MinMatch     string        `structs:"minimum_should_match,omitempty"`
+	ZeroTerms    ZeroTerms     `structs:"zero_terms_query,string,omitempty"`
+	Slp          uint16        `structs:"slop,omitempty"` // only relevant for match_phrase query
 }
 
 func Match(fieldName string, simpleQuery ...interface{}) *MatchQuery {
@@ -156,7 +156,7 @@ func (q *MatchQuery) Slop(n uint16) *MatchQuery {
 	return q
 }
 
-func (q *MatchQuery) ZeroTermsQuery(s string) *MatchQuery {
+func (q *MatchQuery) ZeroTermsQuery(s ZeroTerms) *MatchQuery {
 	q.params.ZeroTerms = s
 	return q
 }
@@ -173,20 +173,15 @@
 	AND
 )
 
-var ErrInvalidValue = errors.New("invalid constant value")
-
-func (a MatchOperator) MarshalJSON() ([]byte, error) {
-	var s string
+func (a MatchOperator) String() string {
 	switch a {
 	case OR:
-		s = "or"
+		return "or"
 	case AND:
-		s = "and"
+		return "and"
 	default:
-		return nil, ErrInvalidValue
+		return ""
 	}
-
-	return json.Marshal(s)
 }
 
 type ZeroTerms uint8
@@ -196,60 +191,13 @@
 	All
 )
 
-func (a ZeroTerms) MarshalJSON() ([]byte, error) {
-	var s string
+func (a ZeroTerms) String() string {
 	switch a {
 	case None:
-		s = "none"
+		return "none"
 	case All:
-		s = "all"
+		return "all"
 	default:
-		return nil, ErrInvalidValue
+		return ""
 	}
-
-	return json.Marshal(s)
 }
-
-/*******************************************************************************
- * Multi-Match Queries
- * https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl-multi-match-query.html
- * NOTE: uncommented for now, article is too long
- ******************************************************************************/
-
-//type MultiMatchQuery struct {
-//fields []string
-//mType multiMatchType
-//params multiMatchQueryParams
-//}
-
-//type multiMatchType uint8
-
-//const (
-//BestFields multiMatchType = iota
-//MostFields
-//CrossFields
-//Phrase
-//PhrasePrefix
-//BoolPrefix
-//)
-
-//func (a multiMatchType) MarshalJSON() ([]byte, error) {
-//var s string
-//switch a {
-//case BestFields:
-//s = "best_fields"
-//case MostFields:
-//s = "most_fields"
-//case CrossFields:
-//s = "cross_fields"
-//case Phrase:
-//s = "phrase"
-//case PhrasePrefix:
-//s = "phrase_prefix"
-//case BoolPrefix:
-//s = "bool_prefix"
-//default:
-//return nil, ErrInvalidValue
-//}
-//return json.Marshal(s)
-//}
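The `,string` option in the new `structs` tags above is what ties the new `String()` methods to the generated maps: with that option, fatih/structs stores the field's `fmt.Stringer` output instead of its raw integer value, which is how `operator` becomes `"and"` and `zero_terms_query` becomes `"all"`/`"none"` in the resulting map. A minimal, self-contained sketch of that behaviour follows; the `demo*` names are illustrative only and not part of esquery.

```go
package main

import (
	"fmt"

	"github.com/fatih/structs"
)

// demoOp stands in for an enum-like type such as MatchOperator: the ",string"
// tag option below makes fatih/structs store String()'s output in the map
// instead of the raw integer value.
type demoOp uint8

const (
	demoOR demoOp = iota
	demoAND
)

func (o demoOp) String() string {
	if o == demoAND {
		return "and"
	}
	return "or"
}

type demoParams struct {
	Query string `structs:"query"`
	Op    demoOp `structs:"operator,string,omitempty"`
}

func main() {
	// Prints: map[operator:and query:sample text]
	fmt.Println(structs.Map(demoParams{Query: "sample text", Op: demoAND}))
}
```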
@@ -1,20 +1,22 @@
 package esquery
 
-import (
-	"encoding/json"
-)
+import "github.com/fatih/structs"
 
+/*******************************************************************************
+ * Match All Queries
+ * https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl-match-all-query.html
+ ******************************************************************************/
 
+// https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl-match-all-query.html
 type MatchAllQuery struct {
 	all    bool
 	params matchAllParams
 }
 
 type matchAllParams struct {
-	Boost float32 `json:"boost,omitempty"`
+	Boost float32 `structs:"boost,omitempty"`
 }
 
-func (a MatchAllQuery) MarshalJSON() ([]byte, error) {
+func (a *MatchAllQuery) Map() map[string]interface{} {
 	var mType string
 	switch a.all {
 	case true:
@@ -23,7 +25,9 @@ func (a MatchAllQuery) MarshalJSON() ([]byte, error) {
 		mType = "match_none"
 	}
 
-	return json.Marshal(map[string]matchAllParams{mType: a.params})
+	return map[string]interface{}{
+		mType: structs.Map(a.params),
+	}
 }
 
 func MatchAll() *MatchAllQuery {
@@ -0,0 +1,33 @@
+package esquery
+
+import (
+	"testing"
+)
+
+func TestMatchAll(t *testing.T) {
+	runMapTests(t, []mapTest{
+		{
+			"match_all without a boost",
+			MatchAll(),
+			map[string]interface{}{
+				"match_all": map[string]interface{}{},
+			},
+		},
+		{
+			"match_all with a boost",
+			MatchAll().Boost(2.3),
+			map[string]interface{}{
+				"match_all": map[string]interface{}{
+					"boost": 2.3,
+				},
+			},
+		},
+		{
+			"match_none",
+			MatchNone(),
+			map[string]interface{}{
+				"match_none": map[string]interface{}{},
+			},
+		},
+	})
+}
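The new tests drive everything through `runMapTests` and `mapTest`, which are defined elsewhere in this commit and are not shown in these hunks. Purely as a hedged sketch of what such a helper could look like, the version below compares the output of `Map()` with the expected map via a JSON round-trip, which sidesteps numeric-type mismatches (e.g. float32 vs. float64) that a plain `reflect.DeepEqual` on the maps would flag.

```go
package esquery

import (
	"encoding/json"
	"testing"
)

// Illustrative only: the real mapTest/runMapTests are defined elsewhere in
// this commit and may be implemented differently.
type mapTest struct {
	name     string
	q        interface{ Map() map[string]interface{} }
	expected map[string]interface{}
}

func runMapTests(t *testing.T, tests []mapTest) {
	for _, test := range tests {
		t.Run(test.name, func(t *testing.T) {
			// Marshal both maps to JSON (keys are sorted), then compare strings.
			got, err := json.Marshal(test.q.Map())
			if err != nil {
				t.Fatalf("failed marshaling Map() output: %s", err)
			}
			want, err := json.Marshal(test.expected)
			if err != nil {
				t.Fatalf("failed marshaling expected map: %s", err)
			}
			if string(got) != string(want) {
				t.Errorf("got %s, want %s", got, want)
			}
		})
	}
}
```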
@@ -0,0 +1,68 @@
+package esquery
+
+import (
+	"testing"
+)
+
+func TestMatch(t *testing.T) {
+	runMapTests(t, []mapTest{
+		{
+			"simple match",
+			Match("title", "sample text"),
+			map[string]interface{}{
+				"match": map[string]interface{}{
+					"title": map[string]interface{}{
+						"query": "sample text",
+					},
+				},
+			},
+		},
+		{
+			"match with more params",
+			Match("issue_number").Query(16).Transpositions(false).MaxExpansions(32).Operator(AND),
+			map[string]interface{}{
+				"match": map[string]interface{}{
+					"issue_number": map[string]interface{}{
+						"query":          16,
+						"max_expansions": 32,
+						"transpositions": false,
+						"operator":       "and",
+					},
+				},
+			},
+		},
+		{
+			"match_bool_prefix",
+			MatchBoolPrefix("title", "sample text"),
+			map[string]interface{}{
+				"match_bool_prefix": map[string]interface{}{
+					"title": map[string]interface{}{
+						"query": "sample text",
+					},
+				},
+			},
+		},
+		{
+			"match_phrase",
+			MatchPhrase("title", "sample text"),
+			map[string]interface{}{
+				"match_phrase": map[string]interface{}{
+					"title": map[string]interface{}{
+						"query": "sample text",
+					},
+				},
+			},
+		},
+		{
+			"match_phrase_prefix",
+			MatchPhrasePrefix("title", "sample text"),
+			map[string]interface{}{
+				"match_phrase_prefix": map[string]interface{}{
+					"title": map[string]interface{}{
+						"query": "sample text",
+					},
+				},
+			},
+		},
+	})
+}
@@ -1,7 +1,7 @@
 package esquery
 
 import (
-	"encoding/json"
+	"github.com/fatih/structs"
 )
 
 /*******************************************************************************
@@ -9,19 +9,18 @@ import (
  * https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl-exists-query.html
  ******************************************************************************/
 
-type ExistsQuery string
-
-func Exists(field string) *ExistsQuery {
-	q := ExistsQuery(field)
-	return &q
+type ExistsQuery struct {
+	Field string `structs:"field"`
 }
 
-func (q ExistsQuery) MarshalJSON() ([]byte, error) {
-	return json.Marshal(map[string]interface{}{
-		"exists": map[string]string{
-			"field": string(q),
-		},
-	})
+func Exists(field string) *ExistsQuery {
+	return &ExistsQuery{field}
 }
 
+func (q *ExistsQuery) Map() map[string]interface{} {
+	return map[string]interface{}{
+		"exists": structs.Map(q),
+	}
+}
 
 /*******************************************************************************
@@ -29,19 +28,20 @@ func (q ExistsQuery) MarshalJSON() ([]byte, error) {
  * https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl-ids-query.html
  ******************************************************************************/
 
-type IDsQuery []string
-
-func IDs(vals ...string) *IDsQuery {
-	q := IDsQuery(vals)
-	return &q
+type IDsQuery struct {
+	IDs struct {
+		Values []string `structs:"values"`
+	} `structs:"ids"`
 }
 
-func (q IDsQuery) MarshalJSON() ([]byte, error) {
-	return json.Marshal(map[string]interface{}{
-		"ids": map[string][]string{
-			"values": []string(q),
-		},
-	})
+func IDs(vals ...string) *IDsQuery {
+	q := &IDsQuery{}
+	q.IDs.Values = vals
+	return q
 }
 
+func (q *IDsQuery) Map() map[string]interface{} {
+	return structs.Map(q)
+}
 
 /*******************************************************************************
@@ -55,8 +55,8 @@ type PrefixQuery struct {
 }
 
 type prefixQueryParams struct {
-	Value   string `json:"value"`
-	Rewrite string `json:"rewrite,omitempty"`
+	Value   string `structs:"value"`
+	Rewrite string `structs:"rewrite,omitempty"`
 }
 
 func Prefix(field, value string) *PrefixQuery {
@@ -71,12 +71,12 @@ func (q *PrefixQuery) Rewrite(s string) *PrefixQuery {
 	return q
 }
 
-func (q PrefixQuery) MarshalJSON() ([]byte, error) {
-	return json.Marshal(map[string]interface{}{
-		"prefix": map[string]prefixQueryParams{
-			q.field: q.params,
+func (q *PrefixQuery) Map() map[string]interface{} {
+	return map[string]interface{}{
+		"prefix": map[string]interface{}{
+			q.field: structs.Map(q.params),
 		},
-	})
+	}
 }
 
 /*******************************************************************************
@@ -90,14 +90,14 @@ type RangeQuery struct {
 }
 
 type rangeQueryParams struct {
-	Gt       interface{}   `json:"gt,omitempty"`
-	Gte      interface{}   `json:"gte,omitempty"`
-	Lt       interface{}   `json:"lt,omitempty"`
-	Lte      interface{}   `json:"lte,omitempty"`
-	Format   string        `json:"format,omitempty"`
-	Relation RangeRelation `json:"relation,omitempty"`
-	TimeZone string        `json:"time_zone,omitempty"`
-	Boost    float32       `json:"boost,omitempty"`
+	Gt       interface{}   `structs:"gt,omitempty"`
+	Gte      interface{}   `structs:"gte,omitempty"`
+	Lt       interface{}   `structs:"lt,omitempty"`
+	Lte      interface{}   `structs:"lte,omitempty"`
+	Format   string        `structs:"format,omitempty"`
+	Relation RangeRelation `structs:"relation,string,omitempty"`
+	TimeZone string        `structs:"time_zone,omitempty"`
+	Boost    float32       `structs:"boost,omitempty"`
 }
 
 func Range(field string) *RangeQuery {
@@ -144,12 +144,12 @@ func (a *RangeQuery) Boost(b float32) *RangeQuery {
 	return a
 }
 
-func (a RangeQuery) MarshalJSON() ([]byte, error) {
-	return json.Marshal(map[string]interface{}{
-		"range": map[string]rangeQueryParams{
-			a.field: a.params,
+func (a *RangeQuery) Map() map[string]interface{} {
+	return map[string]interface{}{
+		"range": map[string]interface{}{
+			a.field: structs.Map(a.params),
 		},
-	})
+	}
 }
 
 type RangeRelation uint8
@@ -160,20 +160,17 @@
 	WITHIN
 )
 
-func (a RangeRelation) MarshalJSON() ([]byte, error) {
-	var s string
+func (a RangeRelation) String() string {
 	switch a {
 	case INTERSECTS:
-		s = "INTERSECTS"
+		return "INTERSECTS"
 	case CONTAINS:
-		s = "CONTAINS"
+		return "CONTAINS"
 	case WITHIN:
-		s = "WITHIN"
+		return "WITHIN"
 	default:
-		return nil, ErrInvalidValue
+		return ""
 	}
-
-	return json.Marshal(s)
 }
 
 /*******************************************************************************
@@ -188,10 +185,10 @@ type RegexpQuery struct {
 }
 
 type regexpQueryParams struct {
-	Value                 string `json:"value"`
-	Flags                 string `json:"flags,omitempty"`
-	MaxDeterminizedStates uint16 `json:"max_determinized_states,omitempty"`
-	Rewrite               string `json:"rewrite,omitempty"`
+	Value                 string `structs:"value"`
+	Flags                 string `structs:"flags,omitempty"`
+	MaxDeterminizedStates uint16 `structs:"max_determinized_states,omitempty"`
+	Rewrite               string `structs:"rewrite,omitempty"`
 }
 
 func Regexp(field, value string) *RegexpQuery {
@@ -227,18 +224,18 @@ func (q *RegexpQuery) Rewrite(r string) *RegexpQuery {
 	return q
 }
 
-func (q RegexpQuery) MarshalJSON() ([]byte, error) {
+func (q *RegexpQuery) Map() map[string]interface{} {
 	var qType string
 	if q.wildcard {
 		qType = "wildcard"
 	} else {
 		qType = "regexp"
 	}
-	return json.Marshal(map[string]interface{}{
-		qType: map[string]regexpQueryParams{
-			q.field: q.params,
+	return map[string]interface{}{
+		qType: map[string]interface{}{
+			q.field: structs.Map(q.params),
 		},
-	})
+	}
 }
 
 /*******************************************************************************
@@ -267,12 +264,12 @@ type FuzzyQuery struct {
 }
 
 type fuzzyQueryParams struct {
-	Value          string `json:"value"`
-	Fuzziness      string `json:"fuzziness,omitempty"`
-	MaxExpansions  uint16 `json:"max_expansions,omitempty"`
-	PrefixLength   uint16 `json:"prefix_length,omitempty"`
-	Transpositions *bool  `json:"transpositions,omitempty"`
-	Rewrite        string `json:"rewrite,omitempty"`
+	Value          string `structs:"value"`
+	Fuzziness      string `structs:"fuzziness,omitempty"`
+	MaxExpansions  uint16 `structs:"max_expansions,omitempty"`
+	PrefixLength   uint16 `structs:"prefix_length,omitempty"`
+	Transpositions *bool  `structs:"transpositions,omitempty"`
+	Rewrite        string `structs:"rewrite,omitempty"`
 }
 
 func Fuzzy(field, value string) *FuzzyQuery {
@@ -314,12 +311,12 @@ func (q *FuzzyQuery) Rewrite(s string) *FuzzyQuery {
 	return q
 }
 
-func (q FuzzyQuery) MarshalJSON() ([]byte, error) {
-	return json.Marshal(map[string]interface{}{
-		"fuzzy": map[string]fuzzyQueryParams{
-			q.field: q.params,
+func (q *FuzzyQuery) Map() map[string]interface{} {
+	return map[string]interface{}{
+		"fuzzy": map[string]interface{}{
+			q.field: structs.Map(q.params),
 		},
-	})
+	}
 }
 
 /*******************************************************************************
@@ -333,8 +330,8 @@ type TermQuery struct {
 }
 
 type termQueryParams struct {
-	Value interface{} `json:"value"`
-	Boost float32     `json:"boost,omitempty"`
+	Value interface{} `structs:"value"`
+	Boost float32     `structs:"boost,omitempty"`
 }
 
 func Term(field string, value interface{}) *TermQuery {
@@ -356,12 +353,12 @@ func (q *TermQuery) Boost(b float32) *TermQuery {
 	return q
 }
 
-func (q TermQuery) MarshalJSON() ([]byte, error) {
-	return json.Marshal(map[string]interface{}{
-		"term": map[string]termQueryParams{
-			q.field: q.params,
+func (q *TermQuery) Map() map[string]interface{} {
+	return map[string]interface{}{
+		"term": map[string]interface{}{
+			q.field: structs.Map(q.params),
 		},
-	})
+	}
 }
 
 /*******************************************************************************
@@ -392,12 +389,13 @@ func (q *TermsQuery) Boost(b float32) *TermsQuery {
 	return q
 }
 
-func (q TermsQuery) MarshalJSON() ([]byte, error) {
+func (q TermsQuery) Map() map[string]interface{} {
 	innerMap := map[string]interface{}{q.field: q.values}
 	if q.boost > 0 {
 		innerMap["boost"] = q.boost
 	}
-	return json.Marshal(map[string]interface{}{"terms": innerMap})
+
+	return map[string]interface{}{"terms": innerMap}
 }
 
 /*******************************************************************************
@@ -411,9 +409,9 @@ type TermsSetQuery struct {
 }
 
 type termsSetQueryParams struct {
-	Terms                    []string `json:"terms"`
-	MinimumShouldMatchField  string   `json:"minimum_should_match_field,omitempty"`
-	MinimumShouldMatchScript string   `json:"minimum_should_match_script,omitempty"`
+	Terms                    []string `structs:"terms"`
+	MinimumShouldMatchField  string   `structs:"minimum_should_match_field,omitempty"`
+	MinimumShouldMatchScript string   `structs:"minimum_should_match_script,omitempty"`
 }
 
 func TermsSet(field string, terms ...string) *TermsSetQuery {
@@ -440,10 +438,10 @@ func (q *TermsSetQuery) MinimumShouldMatchScript(script string) *TermsSetQuery {
 	return q
 }
 
-func (q TermsSetQuery) MarshalJSON() ([]byte, error) {
-	return json.Marshal(map[string]interface{}{
-		"terms_set": map[string]termsSetQueryParams{
-			q.field: q.params,
+func (q TermsSetQuery) Map() map[string]interface{} {
+	return map[string]interface{}{
+		"terms_set": map[string]interface{}{
+			q.field: structs.Map(q.params),
 		},
-	})
+	}
 }
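All of the term-level types above now expose the same `Map() map[string]interface{}` shape. The shared interface that ties them together (presumably a small `Mappable`-style interface with a single `Map()` method) is defined outside these hunks, so the sketch below only illustrates the idea with stand-in types; none of the names other than `Map()` should be read as esquery's actual API.

```go
package main

import (
	"encoding/json"
	"fmt"
	"log"
)

// Mappable is a guess at the shared interface implied by the Map() methods
// above; the real definition lives elsewhere in this commit.
type Mappable interface {
	Map() map[string]interface{}
}

// toyTermQuery is a stand-in for esquery's TermQuery, used only to show how
// Map()-based types compose into a search body without hand-written JSON.
type toyTermQuery struct {
	field string
	value interface{}
}

func (q *toyTermQuery) Map() map[string]interface{} {
	return map[string]interface{}{
		"term": map[string]interface{}{
			q.field: map[string]interface{}{"value": q.value},
		},
	}
}

func main() {
	var q Mappable = &toyTermQuery{field: "user", value: "Kimchy"}

	body, err := json.Marshal(map[string]interface{}{"query": q.Map()})
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(string(body)) // {"query":{"term":{"user":{"value":"Kimchy"}}}}
}
```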
@@ -0,0 +1,151 @@
+package esquery
+
+import (
+	"testing"
+)
+
+func TestTermLevel(t *testing.T) {
+	runMapTests(t, []mapTest{
+		{
+			"exists",
+			Exists("title"),
+			map[string]interface{}{
+				"exists": map[string]interface{}{
+					"field": "title",
+				},
+			},
+		},
+		{
+			"ids",
+			IDs("1", "4", "100"),
+			map[string]interface{}{
+				"ids": map[string]interface{}{
+					"values": []string{"1", "4", "100"},
+				},
+			},
+		},
+		{
+			"simple prefix",
+			Prefix("user", "ki"),
+			map[string]interface{}{
+				"prefix": map[string]interface{}{
+					"user": map[string]interface{}{
+						"value": "ki",
+					},
+				},
+			},
+		},
+		{
+			"complex prefix",
+			Prefix("user", "ki").Rewrite("ji"),
+			map[string]interface{}{
+				"prefix": map[string]interface{}{
+					"user": map[string]interface{}{
+						"value":   "ki",
+						"rewrite": "ji",
+					},
+				},
+			},
+		},
+		{
+			"int range",
+			Range("age").Gte(10).Lte(20).Boost(2.0),
+			map[string]interface{}{
+				"range": map[string]interface{}{
+					"age": map[string]interface{}{
+						"gte":   10,
+						"lte":   20,
+						"boost": 2.0,
+					},
+				},
+			},
+		},
+		{
+			"string range",
+			Range("timestamp").Gte("now-1d/d").Lt("now/d").Relation(CONTAINS),
+			map[string]interface{}{
+				"range": map[string]interface{}{
+					"timestamp": map[string]interface{}{
+						"gte":      "now-1d/d",
+						"lt":       "now/d",
+						"relation": "CONTAINS",
+					},
+				},
+			},
+		},
+		{
+			"regexp",
+			Regexp("user", "k.*y").Flags("ALL").MaxDeterminizedStates(10000).Rewrite("constant_score"),
+			map[string]interface{}{
+				"regexp": map[string]interface{}{
+					"user": map[string]interface{}{
+						"value":                   "k.*y",
+						"flags":                   "ALL",
+						"max_determinized_states": 10000,
+						"rewrite":                 "constant_score",
+					},
+				},
+			},
+		},
+		{
+			"wildcard",
+			Wildcard("user", "ki*y").Rewrite("constant_score"),
+			map[string]interface{}{
+				"wildcard": map[string]interface{}{
+					"user": map[string]interface{}{
+						"value":   "ki*y",
+						"rewrite": "constant_score",
+					},
+				},
+			},
+		},
+		{
+			"fuzzy",
+			Fuzzy("user", "ki").Fuzziness("AUTO").MaxExpansions(50).Transpositions(true),
+			map[string]interface{}{
+				"fuzzy": map[string]interface{}{
+					"user": map[string]interface{}{
+						"value":          "ki",
+						"fuzziness":      "AUTO",
+						"max_expansions": 50,
+						"transpositions": true,
+					},
+				},
+			},
+		},
+		{
+			"term",
+			Term("user", "Kimchy").Boost(1.3),
+			map[string]interface{}{
+				"term": map[string]interface{}{
+					"user": map[string]interface{}{
+						"value": "Kimchy",
+						"boost": 1.3,
+					},
+				},
+			},
+		},
+		{
+			"terms",
+			Terms("user").Values("bla", "pl").Boost(1.3),
+			map[string]interface{}{
+				"terms": map[string]interface{}{
+					"user":  []string{"bla", "pl"},
+					"boost": 1.3,
+				},
+			},
+		},
+		{
+			"terms_set",
+			TermsSet("programming_languages", "go", "rust", "COBOL").MinimumShouldMatchField("required_matches"),
+			map[string]interface{}{
+				"terms_set": map[string]interface{}{
+					"programming_languages": map[string]interface{}{
+						"terms":                      []string{"go", "rust", "COBOL"},
+						"minimum_should_match_field": "required_matches",
+					},
+				},
+			},
+		},
+	})
+}
@@ -1,33 +0,0 @@
-package esquery
-
-import (
-	"testing"
-)
-
-func TestTermLevel(t *testing.T) {
-	runTests(t, []queryTest{
-		{"exists", Exists("title"), "{\"exists\":{\"field\":\"title\"}}\n"},
-
-		{"ids", IDs("1", "4", "100"), "{\"ids\":{\"values\":[\"1\",\"4\",\"100\"]}}\n"},
-
-		{"simple prefix", Prefix("user", "ki"), "{\"prefix\":{\"user\":{\"value\":\"ki\"}}}\n"},
-
-		{"complex prefix", Prefix("user", "ki").Rewrite("ji"), "{\"prefix\":{\"user\":{\"value\":\"ki\",\"rewrite\":\"ji\"}}}\n"},
-
-		{"int range", Range("age").Gte(10).Lte(20).Boost(2.0), "{\"range\":{\"age\":{\"gte\":10,\"lte\":20,\"boost\":2}}}\n"},
-
-		{"string range", Range("timestamp").Gte("now-1d/d").Lt("now/d").Relation(CONTAINS), "{\"range\":{\"timestamp\":{\"gte\":\"now-1d/d\",\"lt\":\"now/d\",\"relation\":\"CONTAINS\"}}}\n"},
-
-		{"regexp", Regexp("user", "k.*y").Flags("ALL").MaxDeterminizedStates(10000).Rewrite("constant_score"), "{\"regexp\":{\"user\":{\"value\":\"k.*y\",\"flags\":\"ALL\",\"max_determinized_states\":10000,\"rewrite\":\"constant_score\"}}}\n"},
-
-		{"wildcard", Wildcard("user", "ki*y").Rewrite("constant_score"), "{\"wildcard\":{\"user\":{\"value\":\"ki*y\",\"rewrite\":\"constant_score\"}}}\n"},
-
-		{"fuzzy", Fuzzy("user", "ki").Fuzziness("AUTO").MaxExpansions(50).Transpositions(true), "{\"fuzzy\":{\"user\":{\"value\":\"ki\",\"fuzziness\":\"AUTO\",\"max_expansions\":50,\"transpositions\":true}}}\n"},
-
-		{"term", Term("user", "Kimchy").Boost(1.3), "{\"term\":{\"user\":{\"value\":\"Kimchy\",\"boost\":1.3}}}\n"},
-
-		{"terms", Terms("user").Values("bla", "pl").Boost(1.3), "{\"terms\":{\"boost\":1.3,\"user\":[\"bla\",\"pl\"]}}\n"},
-
-		{"terms_set", TermsSet("programming_languages", "go", "rust", "COBOL").MinimumShouldMatchField("required_matches"), "{\"terms_set\":{\"programming_languages\":{\"terms\":[\"go\",\"rust\",\"COBOL\"],\"minimum_should_match_field\":\"required_matches\"}}}\n"},
-	})
-}