Could not execute search


(J T) #1

Getting the message "Could not execute search" / "Error Message: Unable to perform search query" / "Search status code: 500" on the main search page after a snapshot and restore.

elasticsearch.yml

cluster.name: graylog
path.repo: ["/mnt/nfsmount/0529_backup"]
network.host: <ServerIP>
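With path.repo pointing at the NFS mount, the snapshot repository and the restored indices can be verified from the Elasticsearch API. A hedged sketch for checking that the restore actually completed — the repository name "backup" and the host are placeholders, not taken from this setup:

```shell
# List all registered snapshot repositories
curl -s 'http://<ServerIP>:9200/_snapshot/_all?pretty'

# List snapshots inside an assumed repository named "backup"
curl -s 'http://<ServerIP>:9200/_snapshot/backup/_all?pretty'

# After a restore, confirm the graylog_* indices exist and are open/green
curl -s 'http://<ServerIP>:9200/_cat/indices/graylog*?v'
```

If an index that Graylog still references is missing from the `_cat/indices` output, searches can fail with a 500 even though Elasticsearch itself is healthy.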

Graylog server.conf

is_master = true
node_id_file = /etc/graylog/server/node-id
password_secret = <password_secret>
root_username = admin
root_password_sha2 = <password hash>
root_timezone = America/New_York
plugin_dir = /usr/share/graylog-server/plugin
rest_listen_uri = http://<ServerIP>:9000/api/
web_listen_uri = http://<ServerIP>:9000/
rotation_strategy = count
elasticsearch_max_docs_per_index = 20000000
elasticsearch_max_number_of_indices = 20
retention_strategy = delete
elasticsearch_shards = 4
elasticsearch_replicas = 0
elasticsearch_index_prefix = graylog
allow_leading_wildcard_searches = false
allow_highlighting = false
elasticsearch_analyzer = standard
output_batch_size = 500
output_flush_interval = 1
output_fault_count_threshold = 5
output_fault_penalty_seconds = 30
processbuffer_processors = 5
outputbuffer_processors = 3
processor_wait_strategy = blocking
ring_size = 65536
inputbuffer_ring_size = 65536
inputbuffer_processors = 2
inputbuffer_wait_strategy = blocking
message_journal_enabled = true
message_journal_dir = /var/lib/graylog-server/journal
lb_recognition_period_seconds = 3
mongodb_uri = mongodb://localhost/graylog
mongodb_max_connections = 1000
mongodb_threads_allowed_to_block_multiplier = 5
content_packs_dir = /usr/share/graylog-server/contentpacks
content_packs_auto_load = grok-patterns.json
proxied_requests_thread_pool_size = 32
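When a search returns a 500 after a restore, comparing the index ranges Graylog has stored against the indices that actually exist in Elasticsearch often points at the problem: a range referencing a deleted or missing index. A hedged sketch against the REST API configured above — credentials and host are placeholders:

```shell
# Index ranges Graylog believes exist (admin credentials assumed)
curl -s -u admin:<password> 'http://<ServerIP>:9000/api/system/indices/ranges'

# Indices that actually exist in Elasticsearch
curl -s 'http://<ServerIP>:9200/_cat/indices/graylog*?v'
```

Any index name that appears in the first list but not the second is a stale range and a likely cause of the search error.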

Graylog server.log

2018-05-29T19:07:23.683Z INFO [LookupTableService] Data Adapter whois/5b045ae2cb93e50aaf27e254 [551c73ba] STARTING
2018-05-29T19:07:23.684Z INFO [LookupTableService] Data Adapter abuse-ch-ransomware-domains/5b045ae2cb93e50aaf27e257 [5608bd7a] STARTING
2018-05-29T19:07:23.690Z INFO [LookupTableService] Data Adapter abuse-ch-ransomware-ip/5b045ae2cb93e50aaf27e255 [4ea53447] RUNNING
2018-05-29T19:07:23.705Z INFO [LookupTableService] Data Adapter spamhaus-drop/5b045ae2cb93e50aaf27e258 [71c866ec] RUNNING
2018-05-29T19:07:23.689Z ERROR [LookupDataAdapter] Couldn't start data adapter <abuse-ch-ransomware-domains/5b045ae2cb93e50aaf27e257/5608bd7a>
org.graylog.plugins.threatintel.tools.AdapterDisabledException: Abuse.ch service is disabled, not starting adapter. To enable it please go to System / Configurations.
at org.graylog.plugins.threatintel.adapters.abusech.AbuseChRansomAdapter.doStart(AbuseChRansomAdapter.java:80) ~[?:?]
at org.graylog2.plugin.lookup.LookupDataAdapter.startUp(LookupDataAdapter.java:59) [graylog.jar:?]
at com.google.common.util.concurrent.AbstractIdleService$DelegateService$1.run(AbstractIdleService.java:62) [graylog.jar:?]
at com.google.common.util.concurrent.Callables$4.run(Callables.java:122) [graylog.jar:?]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_171]
2018-05-29T19:07:23.722Z INFO [LookupTableService] Data Adapter abuse-ch-ransomware-domains/5b045ae2cb93e50aaf27e257 [5608bd7a] RUNNING
2018-05-29T19:07:23.721Z INFO [LookupTableService] Data Adapter otx-api-ip/5b045ae2cb93e50aaf27e253 [5b17fb18] RUNNING
2018-05-29T19:07:23.724Z INFO [LookupTableService] Data Adapter whois/5b045ae2cb93e50aaf27e254 [551c73ba] RUNNING
2018-05-29T19:07:23.725Z INFO [LookupTableService] Data Adapter tor-exit-node/5b045ae2cb93e50aaf27e256 [1ad00d2c] RUNNING
2018-05-29T19:07:23.734Z INFO [LookupTableService] Data Adapter otx-api-domain/5b045ae2cb93e50aaf27e252 [4adfd877] RUNNING
2018-05-29T19:07:23.802Z INFO [LookupTableService] Cache whois-cache/5b045ae2cb93e50aaf27e24d [74c592a1] STARTING
2018-05-29T19:07:23.804Z INFO [LookupTableService] Cache threat-intel-uncached-adapters/5b045ae2cb93e50aaf27e24f [41e8da9d] STARTING
2018-05-29T19:07:23.804Z INFO [LookupTableService] Cache spamhaus-e-drop-cache/5b045ae2cb93e50aaf27e24c [7930d868] STARTING
2018-05-29T19:07:23.805Z INFO [LookupTableService] Cache otx-api-domain-cache/5b045ae2cb93e50aaf27e24e [2577617b] STARTING
2018-05-29T19:07:23.822Z INFO [LookupTableService] Cache otx-api-ip-cache/5b045ae2cb93e50aaf27e24b [22a79239] STARTING
2018-05-29T19:07:23.860Z INFO [LookupTableService] Cache whois-cache/5b045ae2cb93e50aaf27e24d [74c592a1] RUNNING
2018-05-29T19:07:23.866Z INFO [LookupTableService] Cache threat-intel-uncached-adapters/5b045ae2cb93e50aaf27e24f [41e8da9d] RUNNING
2018-05-29T19:07:23.866Z INFO [LookupTableService] Cache otx-api-domain-cache/5b045ae2cb93e50aaf27e24e [2577617b] RUNNING
2018-05-29T19:07:23.869Z INFO [LookupTableService] Cache spamhaus-e-drop-cache/5b045ae2cb93e50aaf27e24c [7930d868] RUNNING
2018-05-29T19:07:23.874Z INFO [LookupTableService] Cache otx-api-ip-cache/5b045ae2cb93e50aaf27e24b [22a79239] RUNNING
2018-05-29T19:07:23.962Z INFO [LookupTableService] Starting lookup table abuse-ch-ransomware-ip/5b045ae2cb93e50aaf27e25a [1a30312d] using cache threat-intel-uncached-adapters/5b045ae2cb93e50aaf27e24f [41e8da9d], data adapter abuse-ch-ransomware-ip/5b045ae2cb93e50aaf27e255 [4ea53447]
2018-05-29T19:07:23.962Z INFO [LookupTableService] Starting lookup table abuse-ch-ransomware-domains/5b045ae2cb93e50aaf27e25b [32b7be8f] using cache threat-intel-uncached-adapters/5b045ae2cb93e50aaf27e24f [41e8da9d], data adapter abuse-ch-ransomware-domains/5b045ae2cb93e50aaf27e257 [5608bd7a]
2018-05-29T19:07:23.962Z INFO [LookupTableService] Starting lookup table tor-exit-node-list/5b045ae2cb93e50aaf27e25c [53497d2d] using cache threat-intel-uncached-adapters/5b045ae2cb93e50aaf27e24f [41e8da9d], data adapter tor-exit-node/5b045ae2cb93e50aaf27e256 [1ad00d2c]
2018-05-29T19:07:23.962Z INFO [LookupTableService] Starting lookup table otx-api-ip/5b045ae2cb93e50aaf27e25d [3a8d0c57] using cache otx-api-ip-cache/5b045ae2cb93e50aaf27e24b [22a79239], data adapter otx-api-ip/5b045ae2cb93e50aaf27e253 [5b17fb18]
2018-05-29T19:07:23.963Z INFO [LookupTableService] Starting lookup table whois/5b045ae2cb93e50aaf27e25e [212ffb72] using cache whois-cache/5b045ae2cb93e50aaf27e24d [74c592a1], data adapter whois/5b045ae2cb93e50aaf27e254 [551c73ba]
2018-05-29T19:07:23.963Z INFO [LookupTableService] Starting lookup table spamhaus-drop/5b045ae2cb93e50aaf27e25f [2e636a1e] using cache spamhaus-e-drop-cache/5b045ae2cb93e50aaf27e24c [7930d868], data adapter spamhaus-drop/5b045ae2cb93e50aaf27e258 [71c866ec]
2018-05-29T19:07:23.963Z INFO [LookupTableService] Starting lookup table otx-api-domain/5b045ae2cb93e50aaf27e260 [36dbcaea] using cache otx-api-domain-cache/5b045ae2cb93e50aaf27e24e [2577617b], data adapter otx-api-domain/5b045ae2cb93e50aaf27e252 [4adfd877]
2018-05-29T19:07:24.295Z INFO [JerseyService] Enabling CORS for HTTP endpoint
2018-05-29T19:07:36.391Z INFO [NetworkListener] Started listener bound to [<ServerIP>:9000]
2018-05-29T19:07:36.393Z INFO [HttpServer] [HttpServer] Started.
2018-05-29T19:07:36.394Z INFO [JerseyService] Started REST API at <http://<ServerIP>:9000/api/>
2018-05-29T19:07:36.394Z INFO [JerseyService] Started Web Interface at <http://<ServerIP>:9000/>
2018-05-29T19:07:36.395Z INFO [ServiceManagerListener] Services are healthy
2018-05-29T19:07:36.395Z INFO [ServerBootstrap] Services started, startup times in ms: {InputSetupService [RUNNING]=4, OutputSetupService [RUNNING]=20, BufferSynchronizerService [RUNNING]=33, KafkaJournal [RUNNING]=38, StreamCacheService [RUNNING]=108, ConfigurationEtagService [RUNNING]=108, JournalReader [RUNNING]=142, PeriodicalsService [RUNNING]=181, LookupTableService [RUNNING]=546, JerseyService [RUNNING]=12986}
2018-05-29T19:07:36.397Z INFO [InputSetupService] Triggering launching persisted inputs, node transitioned from Uninitialized [LB:DEAD] to Running [LB:ALIVE]
2018-05-29T19:07:36.401Z INFO [ServerBootstrap] Graylog server up and running.
2018-05-29T19:07:36.438Z INFO [InputStateListener] Input [Syslog UDP/58779f40b85fe8065554aa6f] is now STARTING
2018-05-29T19:07:36.440Z INFO [InputStateListener] Input [GELF UDP/58779f40b85fe8065554aa72] is now STARTING
2018-05-29T19:07:36.444Z INFO [InputStateListener] Input [GELF UDP/5877c7f3b85fe8038960fafa] is now STARTING
2018-05-29T19:07:36.584Z WARN [NettyTransport] receiveBufferSize (SO_RCVBUF) for input GELFUDPInput{title=WinLogs-gelf, type=org.graylog2.inputs.gelf.udp.GELFUDPInput, nodeId=null} should be 1048576 but is 212992.
2018-05-29T19:07:36.584Z WARN [NettyTransport] receiveBufferSize (SO_RCVBUF) for input GELFUDPInput{title=appliance-gelf-udp, type=org.graylog2.inputs.gelf.udp.GELFUDPInput, nodeId=75678ef0-d302-48d8-8b5e-f9a439b9e7d1} should be 1048576 but is 212992.
2018-05-29T19:07:36.598Z INFO [InputStateListener] Input [GELF UDP/5877c7f3b85fe8038960fafa] is now RUNNING
2018-05-29T19:07:36.603Z WARN [NettyTransport] receiveBufferSize (SO_RCVBUF) for input SyslogUDPInput{title=appliance-syslog-udp, type=org.graylog2.inputs.syslog.udp.SyslogUDPInput, nodeId=75678ef0-d302-48d8-8b5e-f9a439b9e7d1} should be 262144 but is 212992.
2018-05-29T19:07:36.605Z INFO [InputStateListener] Input [GELF UDP/58779f40b85fe8065554aa72] is now RUNNING
2018-05-29T19:07:36.613Z INFO [InputStateListener] Input [Syslog UDP/58779f40b85fe8065554aa6f] is now RUNNING
2018-05-29T19:07:38.479Z INFO [IndexRangesCleanupPeriodical] Removing index range information for unavailable indices: [test__0]
2018-05-29T19:08:44.131Z ERROR [AESTools] Could not decrypt value.
javax.crypto.BadPaddingException: Given final block not properly padded. Such issues can arise if a bad key is used during decryption.
at com.sun.crypto.provider.CipherCore.doFinal(CipherCore.java:991) ~[sunjce_provider.jar:1.8.0_171]
at com.sun.crypto.provider.CipherCore.doFinal(CipherCore.java:847) ~[sunjce_provider.jar:1.8.0_171]
at com.sun.crypto.provider.AESCipher.engineDoFinal(AESCipher.java:446) ~[sunjce_provider.jar:1.8.0_171]
at javax.crypto.Cipher.doFinal(Cipher.java:2164) ~[?:1.8.0_171]
at org.graylog2.security.AESTools.decrypt(AESTools.java:52) [graylog.jar:?]
at org.graylog2.security.ldap.LdapSettingsImpl.getSystemPassword(LdapSettingsImpl.java:137) [graylog.jar:?]
at org.graylog2.security.ldap.LdapSettingsServiceImpl.load(LdapSettingsServiceImpl.java:57) [graylog.jar:?]
at org.graylog2.security.realm.LdapUserAuthenticator.isEnabled(LdapUserAuthenticator.java:172) [graylog.jar:?]
at org.graylog2.security.realm.LdapUserAuthenticator.doGetAuthenticationInfo(LdapUserAuthenticator.java:90) [graylog.jar:?]
at org.apache.shiro.realm.AuthenticatingRealm.getAuthenticationInfo(AuthenticatingRealm.java:571) [graylog.jar:?]
at org.apache.shiro.authc.pam.ModularRealmAuthenticator.doMultiRealmAuthentication(ModularRealmAuthenticator.java:219) [graylog.jar:?]
at org.apache.shiro.authc.pam.ModularRealmAuthenticator.doAuthenticate(ModularRealmAuthenticator.java:269) [graylog.jar:?]
at org.apache.shiro.authc.AbstractAuthenticator.authenticate(AbstractAuthenticator.java:198) [graylog.jar:?]
at org.apache.shiro.mgt.AuthenticatingSecurityManager.authenticate(AuthenticatingSecurityManager.java:106) [graylog.jar:?]
at org.apache.shiro.mgt.DefaultSecurityManager.login(DefaultSecurityManager.java:274) [graylog.jar:?]
at org.apache.shiro.subject.support.DelegatingSubject.login(DelegatingSubject.java:260) [graylog.jar:?]
at org.graylog2.rest.resources.system.SessionsResource.newSession(SessionsResource.java:134) [graylog.jar:?]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_171]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_171]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_171]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_171]
at org.glassfish.jersey.server.model.internal.ResourceMethodInvocationHandlerFactory$1.invoke(ResourceMethodInvocationHandlerFactory.java:81) [graylog.jar:?]
at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher$1.run(AbstractJavaResourceMethodDispatcher.java:144) [graylog.jar:?]
at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.invoke(AbstractJavaResourceMethodDispatcher.java:161) [graylog.jar:?]
at org.glassfish.jersey.server.model.internal.JavaResourceMethodDispatcherProvider$TypeOutInvoker.doDispatch(JavaResourceMethodDispatcherProvider.java:205) [graylog.jar:?]
at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.dispatch(AbstractJavaResourceMethodDispatcher.java:99) [graylog.jar:?]
at org.glassfish.jersey.server.model.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:389) [graylog.jar:?]
at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:347) [graylog.jar:?]
at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:102) [graylog.jar:?]
at org.glassfish.jersey.server.ServerRuntime$2.run(ServerRuntime.java:326) [graylog.jar:?]
at org.glassfish.jersey.internal.Errors$1.call(Errors.java:271) [graylog.jar:?]
at org.glassfish.jersey.internal.Errors$1.call(Errors.java:267) [graylog.jar:?]
at org.glassfish.jersey.internal.Errors.process(Errors.java:315) [graylog.jar:?]
at org.glassfish.jersey.internal.Errors.process(Errors.java:297) [graylog.jar:?]
at org.glassfish.jersey.internal.Errors.process(Errors.java:267) [graylog.jar:?]
at org.glassfish.jersey.process.internal.RequestScope.runInScope(RequestScope.java:317) [graylog.jar:?]
at org.glassfish.jersey.server.ServerRuntime.process(ServerRuntime.java:305) [graylog.jar:?]
at org.glassfish.jersey.server.ApplicationHandler.handle(ApplicationHandler.java:1154) [graylog.jar:?]
at org.glassfish.jersey.grizzly2.httpserver.GrizzlyHttpContainer.service(GrizzlyHttpContainer.java:384) [graylog.jar:?]
at org.glassfish.grizzly.http.server.HttpHandler$1.run(HttpHandler.java:224) [graylog.jar:?]
at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:176) [graylog.jar:?]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_171]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_171]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_171]

I am at a loss; can anyone shed some light on where I am going wrong?


(J T) #3

My mistake here. I had deleted an index, so I had to go into MongoDB and drop the index_ranges collection:

db.index_ranges.drop()

and then recalculate the index ranges in Graylog.
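For anyone landing on this thread, the fix can be sketched end to end: dropping the collection discards the stale ranges, and Graylog then rebuilds them, either from the web interface (System / Indices, then the index maintenance menu) or via the REST API. Host and credentials below are placeholders:

```shell
# Drop the stale index ranges from MongoDB (database name "graylog" per mongodb_uri)
mongo graylog --eval 'db.index_ranges.drop()'

# Ask Graylog to rebuild ranges for all indices (admin credentials assumed)
curl -s -u admin:<password> -X POST 'http://<ServerIP>:9000/api/system/indices/ranges/rebuild'
```

The log line above ("Removing index range information for unavailable indices: [test__0]") is the same symptom from the server's side: ranges pointing at indices that no longer exist.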


(system) #4

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.