Failed to connect to http://localhost:8086 on Digital ocean droplet

#1

I have created a Droplet in DigitalOcean and installed InfluxDB on it. I have attached a Volume to which it writes data. It works fine for some time (maybe a day or two).
Then it suddenly shows this error:

root@influxdb-01:~# influx
Failed to connect to http://localhost:8086: Get http://localhost:8086/ping: dial tcp 127.0.0.1:8086: connect: connection refused
Please check your connection settings and ensure 'influxd' is running.
root@influxdb-01:~# sudo systemctl status influxd
● influxdb.service - InfluxDB is an open-source, distributed, time series database
   Loaded: loaded (/lib/systemd/system/influxdb.service; enabled; vendor preset: enabled)
   Active: active (running) since Wed 2019-05-15 09:41:21 UTC; 1s ago
     Docs: https://docs.influxdata.com/influxdb/
 Main PID: 13886 (influxd)
    Tasks: 5 (limit: 1152)
   CGroup: /system.slice/influxdb.service
           └─13886 /usr/bin/influxd -config /etc/influxdb/influxdb.conf

May 15 09:41:21 influxdb-01 systemd[1]: Started InfluxDB is an open-source, distributed, time series database.
May 15 09:41:23 influxdb-01 influxd[13886]: ts=2019-05-15T09:41:23.385025Z lvl=info msg="InfluxDB starting" log_id=0FQijFEG000 version=1.7.6 branch=1.7 commit=01c8dd416270f424ab0c40f9291e269ac6921964
May 15 09:41:23 influxdb-01 influxd[13886]: ts=2019-05-15T09:41:23.389548Z lvl=info msg="Go runtime" log_id=0FQijFEG000 version=go1.11 maxprocs=1

I don't understand why this happens every time, since the volume still has so much space left.
Please help.
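For reference, the volume filling up and the machine running out of RAM are different failure modes, and they can be checked separately (a minimal sketch; look for your attached volume's mount point in the `df` output):

```shell
# disk usage of all mounted filesystems (find the attached volume's mount point)
df -h

# available RAM and swap, in megabytes
free -m
```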

#2

Hi @itsksaurabh,

do you see errors with: journalctl -u influxdb ?
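To narrow the output down, it can be filtered for lines that look like crashes (the grep pattern is just a suggestion):

```shell
# show only lines that look like errors or crashes
journalctl -u influxdb --no-pager | grep -iE 'error|fatal|panic|out of memory'
```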

Best regards,

Marc

#3

@MarcV I am getting this:

root@influxdb-01:~# journalctl -u influxdb
-- Logs begin at Wed 2019-05-15 04:45:10 UTC, end at Wed 2019-05-15 10:21:36 UTC. --
May 15 04:45:10 influxdb-01 influxd[28481]: ts=2019-05-15T04:45:10.338543Z lvl=info msg="Compacting file" log_id=0FQSla60000 engine=tsm1 tsm1_level=1 tsm1_strategy=level trace_id=0FQSmTGG000 op_name=tsm1_
May 15 04:45:10 influxdb-01 influxd[28481]: ts=2019-05-15T04:45:10.338553Z lvl=info msg="Compacting file" log_id=0FQSla60000 engine=tsm1 tsm1_level=1 tsm1_strategy=level trace_id=0FQSmTGG000 op_name=tsm1_
May 15 04:45:10 influxdb-01 influxd[28481]: [httpd] 68.183.77.31 - abc [15/May/2019:04:45:10 +0000] "POST /write?consistency=any&db=trades HTTP/1.1" 204 0 "-" "Telegraf/1.10.3" 38411938-76cc-11e9-800
May 15 04:45:10 influxdb-01 influxd[28481]: [httpd] 68.183.77.31 - abc [15/May/2019:04:45:10 +0000] "POST /write?consistency=any&db=markets HTTP/1.1" 204 0 "-" "Telegraf/1.10.3" 38371b37-76cc-11e9-80
May 15 04:45:11 influxdb-01 influxd[28481]: [httpd] 68.183.77.31 - abc [15/May/2019:04:45:10 +0000] "POST /write?consistency=any&db=markets HTTP/1.1" 204 0 "-" "Telegraf/1.10.3" 38538ac3-76cc-11e9-80
May 15 04:45:11 influxdb-01 influxd[28481]: [httpd] 68.183.77.31 - abc [15/May/2019:04:45:10 +0000] "POST /write?consistency=any&db=trades HTTP/1.1" 204 0 "-" "Telegraf/1.10.3" 3838a1d4-76cc-11e9-800
May 15 04:45:11 influxdb-01 influxd[28481]: [httpd] 68.183.77.31 - abc [15/May/2019:04:45:10 +0000] "POST /write?consistency=any&db=markets HTTP/1.1" 204 0 "-" "Telegraf/1.10.3" 383af1f1-76cc-11e9-80
May 15 04:45:11 influxdb-01 influxd[28481]: [httpd] 68.183.77.31 - abc [15/May/2019:04:45:10 +0000] "POST /write?consistency=any&db=markets HTTP/1.1" 204 0 "-" "telegraf" 38442dcc-76cc-11e9-8005-625b
May 15 04:45:11 influxdb-01 influxd[28481]: [httpd] 68.183.77.31 - abc [15/May/2019:04:45:11 +0000] "POST /write?consistency=any&db=trades HTTP/1.1" 204 0 "-" "Telegraf/1.10.3" 38e29ffe-76cc-11e9-800
May 15 04:45:11 influxdb-01 influxd[28481]: [httpd] 68.183.77.31 - abc [15/May/2019:04:45:11 +0000] "POST /write?consistency=any&db=markets HTTP/1.1" 204 0 "-" "Telegraf/1.10.3" 38e2c150-76cc-11e9-80
May 15 04:45:11 influxdb-01 influxd[28481]: [httpd] 68.183.77.31 - abc [15/May/2019:04:45:11 +0000] "POST /write?consistency=any&db=trades HTTP/1.1" 204 0 "-" "Telegraf/1.10.3" 39065c42-76cc-11e9-800
May 15 04:45:11 influxdb-01 influxd[28481]: fatal error: runtime: out of memory
May 15 04:45:11 influxdb-01 influxd[28481]: runtime stack:
May 15 04:45:11 influxdb-01 influxd[28481]: runtime.throw(0x12dae55, 0x16)
May 15 04:45:11 influxdb-01 influxd[28481]:         /usr/local/go/src/runtime/panic.go:608 +0x72
May 15 04:45:11 influxdb-01 influxd[28481]: runtime.sysMap(0xc030000000, 0x4000000, 0x21cb618)
May 15 04:45:11 influxdb-01 influxd[28481]:         /usr/local/go/src/runtime/mem_linux.go:156 +0xc7
May 15 04:45:11 influxdb-01 influxd[28481]: runtime.(*mheap).sysAlloc(0x21b1860, 0x4000000, 0xffffffff000adb21, 0x7f4cd211bd00)
May 15 04:45:11 influxdb-01 influxd[28481]:         /usr/local/go/src/runtime/malloc.go:619 +0x1c7
May 15 04:45:11 influxdb-01 influxd[28481]: runtime.(*mheap).grow(0x21b1860, 0x1ac, 0x0)
May 15 04:45:11 influxdb-01 influxd[28481]:         /usr/local/go/src/runtime/mheap.go:920 +0x42
May 15 04:45:11 influxdb-01 influxd[28481]: runtime.(*mheap).allocSpanLocked(0x21b1860, 0x1ac, 0x21cb628, 0x20300b00000000)
May 15 04:45:11 influxdb-01 influxd[28481]:         /usr/local/go/src/runtime/mheap.go:848 +0x337
May 15 04:45:11 influxdb-01 influxd[28481]: runtime.(*mheap).alloc_m(0x21b1860, 0x1ac, 0x410100, 0x7f4cd238c500)
May 15 04:45:11 influxdb-01 influxd[28481]:         /usr/local/go/src/runtime/mheap.go:692 +0x119
May 15 04:45:11 influxdb-01 influxd[28481]: runtime.(*mheap).alloc.func1()
May 15 04:45:11 influxdb-01 influxd[28481]:         /usr/local/go/src/runtime/mheap.go:759 +0x4c
May 15 04:45:11 influxdb-01 influxd[28481]: runtime.(*mheap).alloc(0x21b1860, 0x1ac, 0x7f4cd2010100, 0x418095)
May 15 04:45:11 influxdb-01 influxd[28481]:         /usr/local/go/src/runtime/mheap.go:758 +0x8a
May 15 04:45:11 influxdb-01 influxd[28481]: runtime.largeAlloc(0x358000, 0x450001, 0x7f4d03e0e000)
May 15 04:45:11 influxdb-01 influxd[28481]:         /usr/local/go/src/runtime/malloc.go:1019 +0x97
May 15 04:45:11 influxdb-01 influxd[28481]: runtime.mallocgc.func1()
May 15 04:45:11 influxdb-01 influxd[28481]:         /usr/local/go/src/runtime/malloc.go:914 +0x46
May 15 04:45:11 influxdb-01 influxd[28481]: runtime.systemstack(0x0)
May 15 04:45:11 influxdb-01 influxd[28481]:         /usr/local/go/src/runtime/asm_amd64.s:351 +0x66
May 15 04:45:11 influxdb-01 influxd[28481]: runtime.mstart()
May 15 04:45:11 influxdb-01 influxd[28481]:         /usr/local/go/src/runtime/proc.go:1229
May 15 04:45:11 influxdb-01 influxd[28481]: goroutine 894 [running]:
May 15 04:45:11 influxdb-01 influxd[28481]: runtime.systemstack_switch()
May 15 04:45:11 influxdb-01 influxd[28481]:         /usr/local/go/src/runtime/asm_amd64.s:311 fp=0xc02785ac70 sp=0xc02785ac68 pc=0x45b890
May 15 04:45:11 influxdb-01 influxd[28481]: runtime.mallocgc(0x358000, 0x1090f40, 0xc00025b901, 0xc00bf1a000)
May 15 04:45:11 influxdb-01 influxd[28481]:         /usr/local/go/src/runtime/malloc.go:913 +0x896 fp=0xc02785ad10 sp=0xc02785ac70 pc=0x40def6
May 15 04:45:11 influxdb-01 influxd[28481]: runtime.growslice(0x1090f40, 0xc02f078000, 0x1a217, 0x1c800, 0x1d0ca, 0x2e54, 0x2e54, 0x2e54)
May 15 04:45:11 influxdb-01 influxd[28481]:         /usr/local/go/src/runtime/slice.go:204 +0x145 fp=0xc02785ad78 sp=0xc02785ad10 pc=0x444ad5
May 15 04:45:11 influxdb-01 influxd[28481]: github.com/influxdata/influxdb/tsdb/engine/tsm1.(*ring).keys(0xc000ae6fa0, 0x101, 0xc0000576a0, 0xf592b3, 0x15bfee0)
May 15 04:45:11 influxdb-01 influxd[28481]:         /go/src/github.com/influxdata/influxdb/tsdb/engine/tsm1/ring.go:123 +0x1b4 fp=0xc02785ae18 sp=0xc02785ad78 pc=0xfa93b4
May 15 04:45:11 influxdb-01 influxd[28481]: github.com/influxdata/influxdb/tsdb/engine/tsm1.(*Cache).Keys(0xc013564c60, 0x1, 0x159eaf9d2405d8e9, 0xc000000008)
May 15 04:45:11 influxdb-01 influxd[28481]:         /go/src/github.com/influxdata/influxdb/tsdb/engine/tsm1/cache.go:504 +0x78 fp=0xc02785ae68 sp=0xc02785ae18 pc=0xf43b58
May 15 04:45:11 influxdb-01 influxd[28481]: github.com/influxdata/influxdb/tsdb/engine/tsm1.NewCacheKeyIterator(0xc013564c60, 0x3e8, 0xc0122430e0, 0x20, 0xc00555aac0)
May 15 04:45:11 influxdb-01 influxd[28481]:         /go/src/github.com/influxdata/influxdb/tsdb/engine/tsm1/compact.go:1875 +0x40 fp=0xc02785af00 sp=0xc02785ae68 pc=0xf54d80
May 15 04:45:11 influxdb-01 influxd[28481]: github.com/influxdata/influxdb/tsdb/engine/tsm1.(*Compactor).WriteSnapshot.func1(0xc0122430e0, 0xc000117e60, 0xc01992d962, 0xc00ef1ad20, 0xc013564c60)
May 15 04:45:11 influxdb-01 influxd[28481]:         /go/src/github.com/influxdata/influxdb/tsdb/engine/tsm1/compact.go:846 +0x56 fp=0xc02785afb8 sp=0xc02785af00 pc=0xfbd5c6
May 15 04:45:11 influxdb-01 influxd[28481]: runtime.goexit()
May 15 04:45:11 influxdb-01 influxd[28481]:         /usr/local/go/src/runtime/asm_amd64.s:1333 +0x1 fp=0xc02785afc0 sp=0xc02785afb8 pc=0x45d971
May 15 04:45:11 influxdb-01 influxd[28481]: created by github.com/influxdata/influxdb/tsdb/engine/tsm1.(*Compactor).WriteSnapshot
May 15 04:45:11 influxdb-01 influxd[28481]:         /go/src/github.com/influxdata/influxdb/tsdb/engine/tsm1/compact.go:845 +0x226
May 15 04:45:11 influxdb-01 influxd[28481]: goroutine 1 [chan receive]:

#4

Hi @itsksaurabh ,

looks like there is a memory problem …

can you check whether the tsi1 or the in-memory index is used, with: influxd config | grep -i index
update: better is grep index /etc/influxdb/influxdb.conf

#5

@MarcV

influxd config|grep -i index
Merging with configuration at: /etc/influxdb/influxdb.conf
  index-version = "inmem"
  max-index-log-file-size = 1048576

#6

depending on your current influxd version you could convert your indexes to tsi1.
What is the version of your influxd?

influxd version
InfluxDB v1.7.4 (git: 1.7 ef77e72f435b71b1ad6da7d6a6a4c4a262439379)

#7

@MarcV
current version is

InfluxDB v1.7.6 (git: 1.7 01c8dd416270f424ab0c40f9291e269ac6921964)

#8

Here is some info about the tsi1 indexes and how to convert your existing indexes …
tsi1 indexes
In short, it comes down to:
- stopping the database,
- executing: influx_inspect buildtsi -datadir <data_dir> -waldir <wal_dir> (with the user that runs influxd),
- changing the index-version parameter, and
- restarting the database …
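Put together, the conversion might look like this (a sketch only; /var/lib/influxdb/data and /var/lib/influxdb/wal are the usual package defaults, so verify them against your influxdb.conf before running):

```shell
# stop the database before rebuilding the index
sudo systemctl stop influxdb

# rebuild the index as tsi1, running as the influxdb user
# (paths below are assumed package defaults; adjust to your config)
sudo -u influxdb influx_inspect buildtsi \
    -datadir /var/lib/influxdb/data \
    -waldir /var/lib/influxdb/wal

# switch index-version from "inmem" to "tsi1" in the config
sudo sed -i 's/index-version = "inmem"/index-version = "tsi1"/' /etc/influxdb/influxdb.conf

# start the database again
sudo systemctl start influxdb
```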

hope this helps :slight_smile:

#9

@MarcV Thank you so much. :slight_smile:

#10

you are welcome , let us know in a few days if the problem is solved :slight_smile:

#11

@MarcV the problem is still the same. It crashed again after a few hours.

#12

hi @itsksaurabh ,

do you see the “out of memory” again in journalctl ?

How much memory does the server have ?

#13

@MarcV no, this time it is not the “out of memory” issue.
The server is for testing purposes, so it has 2 GB RAM / 40 GB disk and runs on Ubuntu 18.04.2 LTS.

#14

do you see other issues in journalctl ?
Or in /var/log/messages (on Ubuntu, /var/log/syslog)?
Maybe the process was killed by the OOM killer …
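If the OOM killer is the culprit, it leaves a trace in the kernel log; something like this should find it:

```shell
# look for OOM-killer activity in the kernel ring buffer and the kernel journal
dmesg | grep -iE 'out of memory|oom-kill|killed process'
journalctl -k | grep -i 'oom'
```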

#15

@MarcV I have recreated the droplet with slightly more memory. I will let you know if it crashes this time. By the way, do you know how to tweak InfluxDB to run on a machine with a low hardware configuration?

#16

there are some hardware guidelines here … hardware sizing

you can also limit the CPU usage with the GOMAXPROCS environment variable
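One way to set it for the service is a systemd drop-in (a sketch; the drop-in path follows the standard systemd override convention):

```shell
# create a drop-in that caps influxd at a single OS thread
sudo mkdir -p /etc/systemd/system/influxdb.service.d
printf '[Service]\nEnvironment=GOMAXPROCS=1\n' | \
    sudo tee /etc/systemd/system/influxdb.service.d/override.conf

# reload systemd and restart the service so the limit takes effect
sudo systemctl daemon-reload
sudo systemctl restart influxdb
```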

hope this helps ,