Grafana on ECS with Fargate
I have been using ECS for a while, and to monitor my infrastructure the well-known tool Grafana comes in handy for this job.
On AWS Marketplace you can find a Grafana AMI, but I wasn't willing to set up an EC2 instance with that AMI. I wanted to go serverless. With Fargate.
In this post I will share how to use secrets in a task, along with the policies needed for that. If you are not used to working with ECS and Fargate, you will be able to access the full script at the end of this post.
As always, I am using Terraform for this task.
With Fargate I need a way for Grafana to persist its basic configuration and some data. By default Grafana saves the data in a local SQLite database, but if for some reason our task dies, we lose that data. However, Grafana allows you to use another database to store the data, and for this example I am using Postgres.
Pre-requisites:
A Postgres instance with a grafana user and a database with the same name.
The next step is to save your credentials in AWS Systems Manager Parameter Store — yes, a big name for a service that stores your credentials. With that, I can expose the credentials to my ECS task through the environment without any problem.
resource "aws_ssm_parameter" "db_type" {
  name  = "/${var.app_name}/grafana/GF_DATABASE_TYPE"
  type  = "String"
  value = data.aws_db_instance.database.engine
}

resource "aws_ssm_parameter" "db_host" {
  name  = "/${var.app_name}/grafana/GF_DATABASE_HOST"
  type  = "String"
  value = data.aws_db_instance.database.address
}

resource "aws_ssm_parameter" "db_user" {
  name  = "/${var.app_name}/grafana/GF_DATABASE_USER"
  type  = "String"
  value = "grafana"
}

resource "aws_ssm_parameter" "db_name" {
  name  = "/${var.app_name}/grafana/GF_DATABASE_NAME"
  type  = "String"
  value = "grafana"
}

resource "aws_ssm_parameter" "db_pass" {
  name   = "/${var.app_name}/grafana/GF_DATABASE_PASSWORD"
  type   = "SecureString"
  key_id = data.aws_kms_key.selected.id
  value  = var.password
}
In the above Terraform snippet you can see that I am storing the credentials under a path inside Parameter Store; that way you can better manage the passwords and other credentials you may need. The host, user, and database name are not encrypted, and the password is encrypted with a KMS key.
The next step is to set up the policies for this task/service. I need to allow my task to access the credentials in those AWS services:
data "template_file" "ssm_policy" {
  template = file("${path.module}/policies/ecs-ssm-role-policy.json")

  vars = {
    ssm_path    = "arn:aws:ssm:${var.region}:${var.account_id}:parameter/${var.app_name}/${var.container_name}/*"
    kms_key_arn = "arn:aws:kms:${var.region}:${var.account_id}:key/${data.aws_kms_key.selected.id}"
  }
}
I am not using Secrets Manager, so I am not adding it to my policy. If you need that service, check the docs on how to add it to the policy. With this policy the Grafana task can only get the credentials inside the grafana path, and the database password can be decrypted with the KMS key.
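For reference, here is a minimal sketch of what the ecs-ssm-role-policy.json template could contain. The exact file is not shown in this post, so treat this as an assumption: it grants only the two permissions discussed above, with ${ssm_path} and ${kms_key_arn} filled in by the template_file data source:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["ssm:GetParameters"],
      "Resource": ["${ssm_path}"]
    },
    {
      "Effect": "Allow",
      "Action": ["kms:Decrypt"],
      "Resource": ["${kms_key_arn}"]
    }
  ]
}
```

The ssm:GetParameters action is what the ECS agent calls when it resolves the secrets list of a task definition, and kms:Decrypt is needed because the password parameter is a SecureString.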
The next step is to set up the ECS task. Here I need to pass a secrets list to my task definition so the values are retrieved and exposed to the task through the environment. I am building this using Terraform locals:
locals {
  secrets = [
    {
      name      = "GF_DATABASE_TYPE"
      valueFrom = aws_ssm_parameter.db_type.arn
    },
    {
      name      = "GF_DATABASE_HOST"
      valueFrom = aws_ssm_parameter.db_host.arn
    },
    {
      name      = "GF_DATABASE_USER"
      valueFrom = aws_ssm_parameter.db_user.arn
    },
    {
      name      = "GF_DATABASE_NAME"
      valueFrom = aws_ssm_parameter.db_name.arn
    },
    {
      name      = "GF_DATABASE_PASSWORD"
      valueFrom = aws_ssm_parameter.db_pass.arn
    }
  ]
}
I am adding the task to an external load balancer so I can access it from the internet, and setting up the security groups and the basic configuration we need when working with ECS.
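As a rough sketch of that security-group setup (the variable and resource names here, like var.vpc_id and aws_security_group.alb, are assumptions), the task's security group only needs to accept traffic from the load balancer on Grafana's port:

```hcl
# Hypothetical security group: only the ALB can reach Grafana on port 3000
resource "aws_security_group" "grafana" {
  name   = "${var.app_name}-grafana"
  vpc_id = var.vpc_id

  ingress {
    from_port       = 3000
    to_port         = 3000
    protocol        = "tcp"
    security_groups = [aws_security_group.alb.id] # assumed ALB security group
  }

  egress {
    from_port   = 0
    to_port     = 0
    protocol    = "-1"
    cidr_blocks = ["0.0.0.0/0"]
  }
}
```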
But Lays, there's a lot of scripting you are missing here! No worries, my little friend. You can find and study the full Terraform script on my GitLab at this link.
That's all folks.