How to Retrieve More Than 10,000 Results from Elasticsearch in Python
Background
I have recently been collecting and analyzing data, storing it in Elasticsearch; the index now holds more than 100,000 documents. When I tried to pull all of the data out of ES for analysis, I ran into a problem. I was paginating with from/size: querying documents 0 through 10,000 worked fine, but as soon as I requested documents 10,000 through 20,000 the request failed. The query was:
GET index/_search
{
  "from": 10000,
  "size": 10000,
  "query": {
    "match_all": {}
  }
}
The request fails with the following error:
{
  "error" : {
    "root_cause" : [
      {
        "type" : "illegal_argument_exception",
        "reason" : "Result window is too large, from + size must be less than or equal to: [10000] but was [20000]. See the scroll api for a more efficient way to request large data sets. This limit can be set by changing the [index.max_result_window] index level setting."
      }
    ],
    "type" : "search_phase_execution_exception",
    "reason" : "all shards failed",
    "phase" : "query",
    "grouped" : true,
    "failed_shards" : [
      {
        "shard" : 0,
        "index" : "new_channel",
        "node" : "dLHMyyNfQVuY-RSE1tPguQ",
        "reason" : {
          "type" : "illegal_argument_exception",
          "reason" : "Result window is too large, from + size must be less than or equal to: [10000] but was [20000]. See the scroll api for a more efficient way to request large data sets. This limit can be set by changing the [index.max_result_window] index level setting."
        }
      }
    ],
    "caused_by" : {
      "type" : "illegal_argument_exception",
      "reason" : "Result window is too large, from + size must be less than or equal to: [10000] but was [20000]. See the scroll api for a more efficient way to request large data sets. This limit can be set by changing the [index.max_result_window] index level setting.",
      "caused_by" : {
        "type" : "illegal_argument_exception",