
Check Elasticsearch cluster disk space with a one-liner curl command


1. Introduction

Having a cluster with hundreds of nodes and being asked to check the available disk space can cause a headache. Fortunately, there is an easy way to do it with a single command. In this article I will show you how to get it done quickly.

2. [Optional] Start your Elasticsearch cluster

If you already have an Elasticsearch cluster, you can go directly to step 3.

Otherwise, run the commands below to start one up.
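All three containers attach to a user-defined Docker network called elknodes (the --net flag below). If it does not exist yet, create it first:

docker network create elknodes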

docker volume create --opt type=tmpfs --opt device=tmpfs --opt o=size=5m europe01data
docker volume create --opt type=tmpfs --opt device=tmpfs --opt o=size=7m africa01data
docker volume create --opt type=tmpfs --opt device=tmpfs --opt o=size=11m arctica01data

docker run --rm \
--name europe01 \
--net elknodes \
-d \
-e ES_JAVA_OPTS="-Xms2g -Xmx2g" \
-e node.name="europe01" \
-p 9200:9200 \
-v europe01data:/usr/share/elasticsearch/data \
docker.elastic.co/elasticsearch/elasticsearch:8.12.1

After a moment you can reset the password of the elastic user:

# Feed the interactive prompts (confirmation plus the new password, twice) to the reset tool through a FIFO
docker exec -it europe01 bash -c "(mkfifo pipe1); ( (elasticsearch-reset-password -u elastic -i < pipe1) & ( echo $'y\n123456\n123456' > pipe1) );sleep 5;rm pipe1"
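To confirm the new password works, you can query the node directly (-k skips verification of the self-signed certificate):

curl -k -u elastic:123456 https://localhost:9200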

Then generate an enrollment token so the other nodes can join:

# -t allocates a pseudo-TTY, which adds carriage returns to the output, so strip \r and \n before storing the token
token=`docker exec -it europe01 elasticsearch-create-enrollment-token -s node | tr -d '\r\n'`
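You can quickly confirm the token was captured before moving on:

echo "$token"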

Now you can start the other two nodes:

docker run --rm \
-e ENROLLMENT_TOKEN="$token" \
-e node.name="africa01" \
-v africa01data:/usr/share/elasticsearch/data \
-p 9201:9200 \
--name africa01 \
--net elknodes \
-d \
-m 2GB docker.elastic.co/elasticsearch/elasticsearch:8.12.1

docker run --rm \
-e ENROLLMENT_TOKEN="$token" \
-e node.name="arctica01" \
-v arctica01data:/usr/share/elasticsearch/data \
--name arctica01 \
--net elknodes \
-d \
-m 2GB docker.elastic.co/elasticsearch/elasticsearch:8.12.1
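Once all three containers are up, you can check that they joined the cluster; the _cat/nodes API lists every node:

curl -k -u elastic:123456 "https://localhost:9200/_cat/nodes?v"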

3. Check disk usage and total capacity

Finally, your one-liner. It pulls per-node filesystem stats from the _nodes/stats API (trimmed down with filter_path) and sums them up with jq:

curl -k -u elastic:123456 -s "https://localhost:9200/_nodes/stats?filter_path=nodes.*.fs.data.available_in_bytes,nodes.*.fs.data.total_in_bytes" | jq -r '[.nodes | to_entries[].value.fs.data[]] | reduce .[] as $item ({}; .sum_total_in_megabytes += $item.total_in_bytes / (1024*1024) | .sum_available_in_megabytes += $item.available_in_bytes / (1024*1024) )'

and the response:

{
  "sum_total_in_megabytes": 23,
  "sum_available_in_megabytes": 22.59765625
}
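If you only need a quick per-node view instead of a cluster-wide sum, the _cat/allocation API exposes the same disk figures as a human-readable table; the h parameter selects the columns:

curl -k -u elastic:123456 "https://localhost:9200/_cat/allocation?v&h=node,disk.total,disk.avail,disk.used"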

4. [Optional] Load test data

If you followed step 2, you can load some data to see how the available disk space changes.

curl -k -u elastic:123456 -XPUT "https://localhost:9200/customerdata" \
-H 'content-type: application/json' -d'
{
  "settings": {
    "number_of_shards": 1,
    "number_of_replicas": 2
  }
}'
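You can verify that the index was created with the requested settings:

curl -k -u elastic:123456 "https://localhost:9200/customerdata/_settings?pretty"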

I prepared sample data for you, stored on the IPFS network. curl understands ipfs:// URLs (resolved through the gateway given in IPFS_GATEWAY) since version 8.4.0, which is why the curlimages/curl:8.5.0 image is used here.

docker run --rm -it \
-v "$PWD:/tmp" \
-e IPFS_GATEWAY="https://ipfs.filebase.io/" \
curlimages/curl:8.5.0 --output "/tmp/#1.png" "ipfs://{QmPC2FFxBp9Lh97rdSo3wj4pqmpnZ7LsZCCud8QpU8ukwK}"
# -w 0 disables line wrapping, so the base64 output stays on a single line (wrapped lines would break the NDJSON)
someData=`base64 -w 0 QmPC2FFxBp9Lh97rdSo3wj4pqmpnZ7LsZCCud8QpU8ukwK.png`

echo -n '' > QmPC2FFxBp9Lh97rdSo3wj4pqmpnZ7LsZCCud8QpU8ukwK.json
echo '{"index": {"_id": 1}}' >> QmPC2FFxBp9Lh97rdSo3wj4pqmpnZ7LsZCCud8QpU8ukwK.json
echo '{"customer_name": "'"$someData"'"}' >> QmPC2FFxBp9Lh97rdSo3wj4pqmpnZ7LsZCCud8QpU8ukwK.json

curl -k -u elastic:123456 -XPOST "https://localhost:9200/customerdata/_bulk" -H 'Content-Type: application/x-ndjson' --data-binary @QmPC2FFxBp9Lh97rdSo3wj4pqmpnZ7LsZCCud8QpU8ukwK.json
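To see how much space the document actually occupies, you can also check the store size of the index:

curl -k -u elastic:123456 "https://localhost:9200/_cat/indices/customerdata?v&h=index,docs.count,store.size"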

5. [Optional] Check disk space again

Now, after loading the sample, you will lose approximately 1 MB. With number_of_replicas set to 2, the single shard is stored on all three nodes, so the document costs roughly three times its size cluster-wide:

{
  "sum_total_in_megabytes": 23,
  "sum_available_in_megabytes": 21.609375
}

6. Summary

In this quick tutorial you have learned how to check the total disk capacity of your cluster as well as its current usage. I am sure it will be useful in your daily work.
