Running out of memory
When you bulk load or back up your data, Dgraph can consume more memory than usual due to a high volume of writes. This can cause Out of Memory (OOM) crashes. You can take the following steps to help avoid OOM crashes:

- Increase the amount of memory available: If you run Dgraph with insufficient memory, that can result in OOM crashes. The recommended minimum RAM to run Dgraph on desktops and laptops (single-host deployment) is 16 GB. For servers in a cluster deployment, the recommended minimum is 8 GB per server. This applies to EC2 and GCE instances, as well as on-premises servers.
- Reduce the number of Go routines: You can troubleshoot OOM issues by reducing the number of Go routines (goroutines) used by Dgraph from the default value of eight. For example, you can reduce the goroutines that Dgraph uses to four by calling the dgraph alpha command with the following option: --badger "goroutines=4" (see the example after this list).
"Too many open files" errors
If Dgraph logs "too many open files" errors, you should increase the per-process open file descriptor limit to permit more open files. During normal operations, Dgraph must be able to open many files. Your operating system may have an open file descriptor limit with a low default value that isn't adequate for a database like Dgraph. If so, you might need to increase this limit. On Linux and Mac, you can get file descriptor limit settings with the ulimit command, as follows:
- Get hard limit: ulimit -n -H
- Get soft limit: ulimit -n -S
1048576 open files is the recommended minimum to use Dgraph in production, but you can try increasing this soft limit if you continue to see this error. To learn more, see the ulimit documentation for your operating system.
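If your soft limit is below this value, you can raise it for the current shell session before starting Dgraph. This is a minimal sketch; the change applies only to processes started from that session:

```sh
# Raise the soft open-file limit for this shell session only.
# The value can't exceed the hard limit reported by `ulimit -n -H`.
ulimit -S -n 1048576
```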
Depending on your OS, your shell session limits might not be the same as the Dgraph process limits. For example, you can check and update ulimit values on Ubuntu 20.04 systems as follows:
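As a hedged sketch of the service-managed case: when Dgraph runs under systemd, the process takes its open-file limit from the unit configuration rather than from your shell, so raising it requires a unit override. The unit name dgraph below is an assumption; adjust it to match your deployment:

```sh
# Raise the open-file limit for a systemd-managed Dgraph service.
# The unit name "dgraph" is hypothetical; substitute your own.
sudo mkdir -p /etc/systemd/system/dgraph.service.d
sudo tee /etc/systemd/system/dgraph.service.d/override.conf <<'EOF'
[Service]
LimitNOFILE=1048576
EOF
sudo systemctl daemon-reload
sudo systemctl restart dgraph
```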