As an Administrator, you can export data from Dgraph to an object store, NFS, or a file path.
When you export data, typically three files are generated:

- g01.gql_schema.gz: the GraphQL schema file. This file can be imported using the Schema APIs.
- g01.json.gz or g01.rdf.gz: the data from your instance in JSON or RDF format. By default, Dgraph exports data in RDF format.
- g01.schema.gz: the internal Dgraph schema.

You can export all of the data by executing a GraphQL mutation on the /admin endpoint of any Alpha node.
Before you begin:
Ensure that there is sufficient space on disk to store the export. Each Dgraph Alpha leader for a group writes its output as a gzipped file to the export directory specified through the --export flag (by default, a directory named export). If any of the groups fail because of insufficient disk space, the entire export process is considered failed and an error is returned.
Make a note of the export directories of the Alpha server nodes. For more information about configuring the Dgraph Alpha server, see Configuration.
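To start an export, post a mutation like the following sketch to the /admin endpoint of an Alpha node. The response fields shown, message and code, are the ones typically queried:

```graphql
mutation {
  # Sketch: format defaults to RDF if omitted
  export(input: { format: "rdf" }) {
    response {
      message
      code
    }
  }
}
```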
This mutation triggers an export from each Alpha group leader. Depending on the Dgraph configuration, several files are exported. We recommend that you copy the files from the Alpha server nodes to a safe place when the export is complete.
Each group's export data is written by that group's Alpha leader, so you need to retrieve the right export files from the Alpha instances in the cluster; Dgraph does not copy all files to the Alpha that initiated the export.
When the export is complete, a response similar to this appears:
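```json
{
  "data": {
    "export": {
      "response": {
        "message": "Export completed.",
        "code": "Success"
      }
    }
  }
}
```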
By default, Dgraph exports data in RDF format. To choose the format explicitly, replace <FORMAT> with json or rdf in a mutation like this sketch:
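```graphql
mutation {
  # Sketch: replace <FORMAT> with "json" or "rdf"
  export(input: { format: "<FORMAT>" }) {
    response {
      message
      code
    }
  }
}
```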
You can override the default folder path by setting the destination input field to the directory where you want to export data. Replace <PATH> with the absolute path of the export directory in a mutation like this sketch:
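```graphql
mutation {
  # Sketch: replace <PATH> with an absolute path, e.g. /tmp/dgraph-export
  export(input: { destination: "<PATH>" }) {
    response {
      message
      code
    }
  }
}
```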
You can export to AWS S3, Azure Blob Storage, or Google Cloud Storage. Note that the destination URL Dgraph expects includes the S3 endpoint, unlike the aws s3 command, which uses a shortened format: s3://<bucket-name>. You can use MinIO as a gateway to other object stores, such as Azure Blob Storage or Google Cloud Storage.
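For example, an export to S3 might look like this sketch; the us-west-2 endpoint and credential placeholders are illustrative, so use your bucket's region and keys:

```graphql
mutation {
  export(input: {
    # Illustrative endpoint; substitute your bucket's region
    destination: "s3://s3.us-west-2.amazonaws.com/<bucket-name>"
    accessKey: "<aws-access-key-id>"
    secretKey: "<aws-secret-access-key>"
  }) {
    response {
      message
      code
    }
  }
}
```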
You can use Azure Blob Storage through the MinIO Azure Gateway.
Before you begin:

- Create an Azure Blob Storage container. The container name is used as <bucket-name> when specifying the destination in the GraphQL mutation.
- Set MINIO_ACCESS_KEY and MINIO_SECRET_KEY to correspond to the Azure Storage account AccountName and AccountKey.

You can access Azure Blob Storage locally using one of these methods (sketches follow the list):
- Using MinIO Azure Gateway with the MinIO binary
- Using MinIO Azure Gateway with Docker
- Using MinIO Azure Gateway with the MinIO Helm chart for Kubernetes
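The first two methods might look like the following sketch, based on MinIO's (now-legacy) gateway commands; adjust versions and ports for your setup. For Kubernetes, consult your MinIO Helm chart version's documentation for its Azure gateway options.

```sh
# MinIO binary: the gateway's credentials are the Azure storage account's
# AccountName and AccountKey
export MINIO_ACCESS_KEY=<azure-storage-AccountName>
export MINIO_SECRET_KEY=<azure-storage-AccountKey>
minio gateway azure

# Docker: same idea, passing the credentials as environment variables
docker run -p 9000:9000 \
  -e "MINIO_ACCESS_KEY=<azure-storage-AccountName>" \
  -e "MINIO_SECRET_KEY=<azure-storage-AccountKey>" \
  minio/minio gateway azure
```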
With MinIO configured as a gateway, you can then point the export at MinIO in the GraphQL mutation, as in this sketch:
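```graphql
mutation {
  export(input: {
    # minio:// points Dgraph at the local gateway; <bucket-name> is the
    # Azure container name (host and port are illustrative)
    destination: "minio://127.0.0.1:9000/<bucket-name>"
    accessKey: "<minio-access-key>"
    secretKey: "<minio-secret-key>"
  }) {
    response {
      message
      code
    }
  }
}
```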
You can use Google Cloud Storage through the MinIO GCS Gateway.
Before you begin:

- Create a Google Cloud service account with access to your bucket and download its credentials.json key file.

When you have a credentials.json file, you can access GCS locally using one of these methods (sketches follow the list):
- Using MinIO GCS Gateway with the MinIO binary
- Using MinIO GCS Gateway with Docker
- Using MinIO GCS Gateway with the MinIO Helm chart for Kubernetes
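The first two methods might look like the following sketch, based on MinIO's (now-legacy) GCS gateway commands. Here MINIO_ACCESS_KEY and MINIO_SECRET_KEY are credentials you choose for clients of the gateway, and <project-id> is your Google Cloud project. For Kubernetes, consult your MinIO Helm chart version's documentation.

```sh
# MinIO binary: point the gateway at the service-account key file
export GOOGLE_APPLICATION_CREDENTIALS=/path/to/credentials.json
export MINIO_ACCESS_KEY=<minio-access-key>
export MINIO_SECRET_KEY=<minio-secret-key>
minio gateway gcs <project-id>

# Docker: mount the key file into the container and pass the same settings
docker run -p 9000:9000 \
  -v /path/to/credentials.json:/credentials.json \
  -e "GOOGLE_APPLICATION_CREDENTIALS=/credentials.json" \
  -e "MINIO_ACCESS_KEY=<minio-access-key>" \
  -e "MINIO_SECRET_KEY=<minio-secret-key>" \
  minio/minio gateway gcs <project-id>
```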
With MinIO configured as a gateway, you can then use a minio:// destination in the export GraphQL mutation, as in the Azure example above.
By default, Dgraph assumes the destination bucket is using HTTPS. If that's not the case, the export fails. To export to a bucket using HTTP (insecure), set the query parameter secure=false on the endpoint in the destination field, as in this sketch:
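```graphql
mutation {
  export(input: {
    # secure=false allows a plain-HTTP endpoint (host and port are illustrative)
    destination: "minio://127.0.0.1:9000/<bucket-name>?secure=false"
  }) {
    response {
      message
      code
    }
  }
}
```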
When exporting to S3 or MinIO where credentials aren't required, you can set anonymous to true, as in this sketch:
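```graphql
mutation {
  export(input: {
    # Illustrative endpoint for a public bucket; no accessKey/secretKey needed
    destination: "s3://s3.<region>.amazonaws.com/<bucket-name>"
    anonymous: true
  }) {
    response {
      message
      code
    }
  }
}
```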
Export is available wherever an Alpha is running. To encrypt an export, the Alpha must be configured with the --encryption key-file=value option. The --encryption key-file option is used for Encryption at Rest and is also used for encrypted exports.
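For example, starting an Alpha with encryption enabled might look like this sketch (the key-file path is illustrative):

```sh
# Exports from this Alpha are encrypted with the same key used for
# Encryption at Rest
dgraph alpha --encryption key-file=./enc_key_file
```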
Using curl to trigger an export

This is an example of how you can use curl to trigger an export.
1. Create a GraphQL file for the desired mutation, for example export.graphql (the filename is illustrative):
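```graphql
# Contents of export.graphql (sketch): export in RDF format
mutation {
  export(input: { format: "rdf" }) {
    response {
      message
      code
    }
  }
}
```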
2. Trigger the export by posting the file to an Alpha's /admin endpoint (adjust host and port for your setup):
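```sh
curl http://localhost:8080/admin --silent --request POST \
  --header "Content-Type: application/graphql" \
  --upload-file export.graphql
```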