Auto import dangling indices #2067

Closed
kimchy opened this issue Jun 28, 2012 · 4 comments
Comments

kimchy (Member) commented Jun 28, 2012

Dangling indices happen when a node that has several indices stored locally joins a cluster, and those local indices do not exist in the cluster metadata. This usually does not happen, especially when the gateway.recover_after_nodes flag is set properly, but users can still get into this state by mistake.

A new setting, gateway.local.auto_import_dangled, controls this behavior. Possible values are no (never import dangling indices, but also delay their deletion), yes (import dangling indices), and closed (import dangling indices, but in the closed state). The default value is yes.
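For reference, a minimal sketch of how this setting could be placed in elasticsearch.yml (the exact file location depends on your installation, and the value shown is only illustrative):

```yaml
# Never auto-import dangling indices; deletion of their local data is also delayed.
# Other accepted values: yes (the default) and closed.
gateway.local.auto_import_dangled: no
```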

@kimchy kimchy closed this as completed in f2e39e4 Jun 28, 2012
hibnico pushed a commit to hibnico/elasticsearch that referenced this issue Jul 6, 2012
medcl (Contributor) commented Aug 11, 2012

mark

synhershko (Contributor) commented
What is the intended behavior?

We are seeing the following: running a cluster of 2 data nodes and a Java application that uses a Client instance as a non-data node, we index some documents into several different indices and shut down the application.

Then we issue a delete-all-indices request, curl -XDELETE 'http://localhost:9200/', and can see the command was acknowledged and performed.

The next time we run the application, we see the following: all indices being recreated on one of the data nodes, even though the Java app was hosting a non-data node and they were explicitly marked for deletion on the data node:

[2012-10-03 12:17:41,131][INFO ][cluster.service ] [Firearm] added {[Landslide][pT7UUb00TNWYBTeluR6mcw][inet[/192.168.1.8:9302]],}, reason: zen-disco-receive(join from node[[Landslide][pT7UUb00TNWYBTeluR6mcw][inet[/192.168.1.8:9302]]])
[2012-10-03 12:17:41,320][INFO ][gateway.local.state.meta ] [Firearm] auto importing dangled indices [2008-01-01-1200/OPEN][2010-10-29-12-00/OPEN][2005-01-01-0000/OPEN][2006-01-01-0000/OPEN][2009-01-01-0000/OPEN][2002-01-01-1200/OPEN][2004-01-01-0000/OPEN][2011-08-08-12-00/OPEN][2007-09-02-12-00/OPEN][2004-08-28-12-00/OPEN][2009-08-09-12-00/OPEN][2005-01-01-1200/OPEN][foo/OPEN][2010-01-01-0000/OPEN][2001-01-01-1200/OPEN][2002-01-01-0000/OPEN][2011-12-12-12-00/OPEN][2010-10-28-12-00/OPEN][2001-01-01-0000/OPEN][2007-01-01-1200/OPEN][2002-12-09-12-00/OPEN][2010-01-01-1200/OPEN][2004-01-01-1200/OPEN][2012-01-12-12-00/OPEN][2011-01-01-0000/OPEN][2010-11-23-12-00/OPEN][2011-08-07-12-00/OPEN][2011-01-01-1200/OPEN][2012-01-13-12-00/OPEN][2012-01-01-1200/OPEN][2012-01-01-0000/OPEN][2007-09-08-12-00/OPEN][2009-01-01-1200/OPEN][el-2005-01-01-0000/OPEN][2007-01-01-0000/OPEN][2008-01-01-0000/OPEN][2009-08-15-12-00/OPEN][2010-11-22-12-00/OPEN][2003-01-01-1200/OPEN][2004-08-22-12-00/OPEN][2006-01-01-1200/OPEN][2003-01-01-0000/OPEN][2011-12-13-12-00/OPEN] from [[Landslide][pT7UUb00TNWYBTeluR6mcw][inet[/192.168.1.8:9302]]]

Either I'm missing something, or this is an issue with both a non-data node storing data it shouldn't and no tombstones being kept.
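The scenario above can be sketched as a short sequence of curl calls (the index name and host are illustrative, not taken from the report, and this assumes a node listening on localhost:9200):

```shell
# Index a document so at least one index exists on the data nodes:
curl -XPUT 'http://localhost:9200/2012-01-01-0000/doc/1' -d '{"field": "value"}'

# Delete all indices, as in the report above:
curl -XDELETE 'http://localhost:9200/'

# After restarting the client application, inspect the cluster state to see
# whether the deleted indices reappeared via the dangling-index auto-import:
curl -XGET 'http://localhost:9200/_cluster/state?pretty=true'
```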

mute pushed a commit to mute/elasticsearch that referenced this issue Jul 29, 2015
gvkrf commented Feb 27, 2017

Hi kimchy,

I would like to set auto import of dangling indices to "no". Can you please help me out here: how do I do it, and which file do I need to change? Can you tell me the file location?

tvernum (Contributor) commented Feb 27, 2017

@gvkrf Please do not use GitHub issues to ask questions about how to use Elasticsearch.

Please head to our discussion site http://discuss.elastic.co/ and ask your question there.

5 participants