▬▬▬▬▬▬ Announcements📢 ▬▬▬▬▬▬▬
🔥 If you're interested in a step-by-step course to learn the basics of HashiCorp Vault, check this course out: HashiCorp Vault 101 - Certified Vault Associate ► bit.ly/hc-vault101
In this course you get:
⭐ Everything you need to know about Vault to ace the Vault Associate Exam
⭐ 8+ hours of video content
⭐ The instructor's camera on, making you feel like you're right in the classroom
⭐ Hand-drawn animated diagrams to help you grasp the topics better
⭐ Lots of hands-on labs to learn by doing
⭐ Searchable English closed captions so you won't miss a word
⭐ Quizzes to help you grasp the material well
⭐ Join our Community
Dude, holy shit, I swear you explain things so well and so clearly that even a beginner in IT could fully understand this. You rock, my guy!!!!!!! Keep making these lessons, you're making everyone's lives so much easier and you don't make me feel dumb :)
Thank you so much for making this tutorial. Now I have a better understanding of Vault. I'm a junior and it was a struggle for me to actually realise why we use this technology at my job and how it works. Thanks a lot!
You have to use a utility that rotates the logs, such as logrotate. You need to be careful because if you run out of disk space, Vault will stop working by design. I talk about all that in depth and give you the config in my Vault 202 course if you're interested.
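For anyone who wants a starting point, here's a minimal logrotate sketch for a Vault audit log. The log path, schedule, and retention count are assumptions you'd adjust for your setup; the postrotate step exists because Vault reopens its audit log file when it receives SIGHUP:

```shell
# Hypothetical logrotate config for a Vault audit log (paths are examples).
# Written to /tmp here for illustration; normally it would live in
# /etc/logrotate.d/vault on the server running Vault.
cat > /tmp/vault-audit-logrotate <<'EOF'
/var/log/vault/vault_audit.log {
    daily
    rotate 7
    compress
    delaycompress
    missingok
    notifempty
    postrotate
        # Vault reopens the audit log file on SIGHUP
        systemctl kill --signal=HUP vault.service
    endscript
}
EOF

# Show the generated config
cat /tmp/vault-audit-logrotate
```

Not a definitive config, just the usual shape: rotate daily, keep a week of compressed logs, and signal Vault so it starts writing to the fresh file.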
I think you're talking about two different things. HashiCorp Vault can integrate with external storage providers through its storage backends. These backends persist the data that Vault manages, such as secrets and policies. A few examples of external storage backends that Vault supports:

- Amazon S3: Vault can use an S3 bucket as a storage backend. You configure the bucket and provide the necessary credentials.
- Google Cloud Storage: similar to S3, GCS can also be used as a storage backend.
- Azure Blob Storage: Vault supports Azure Blob Storage as well.
- Consul: while not strictly an "external" storage provider, Consul is commonly used with Vault for storage and high availability.
- DynamoDB: Vault can use AWS DynamoDB for storage, which helps with high availability and reliability.

In comparison, the External Secrets Operator (ESO) lets you dynamically fetch secrets from various external secret management services into Kubernetes, whereas Vault's storage backends are primarily for persisting Vault's own data securely. However, Vault can also serve as a secret management service itself, and its secrets can be accessed from Kubernetes using integrations like the Vault Kubernetes auth method and the Vault Agent Injector. Vault also has the Vault Secrets Operator (VSO), which is similar to ESO: developer.hashicorp.com/vault/tutorials/kubernetes/vault-secrets-operator
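As a concrete illustration, here's roughly what an S3 storage backend stanza looks like in a Vault server config file. The bucket name is made up, and in practice you'd normally supply AWS credentials via environment variables or an instance profile rather than inline:

```hcl
# Sketch of a Vault server config using S3 as the storage backend
# (bucket and region are placeholders)
storage "s3" {
  bucket = "my-vault-data"
  region = "us-east-1"
}

listener "tcp" {
  address     = "0.0.0.0:8200"
  tls_disable = "true"   # for local testing only; use TLS in production
}
```

Swapping the storage stanza for "gcs", "azure", "consul", or "dynamodb" (each with its own parameters) is how you'd point Vault at the other backends mentioned above.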
Thanks, does that mean Vault is used as a kind of proxy between clients and servers? For example, the DB server credentials stay the same, but Vault can dynamically create different sets of creds rather than distributing the DB credentials to applications. So in that case, does it somehow work as a reverse proxy for the credentials? :)
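For anyone curious, that "proxy for credentials" idea maps to Vault's database secrets engine: Vault holds one privileged DB account and mints short-lived per-client credentials on demand. A rough CLI sketch, where the connection URL, role name, and passwords are all placeholders, and which assumes a running, unsealed Vault server plus a reachable Postgres instance:

```shell
# Enable the database secrets engine
vault secrets enable database

# Tell Vault how to reach the database with a privileged account
# (connection details and credentials below are placeholders)
vault write database/config/my-postgres \
    plugin_name=postgresql-database-plugin \
    allowed_roles="app-role" \
    connection_url="postgresql://{{username}}:{{password}}@db.example.com:5432/appdb" \
    username="vault-admin" \
    password="superuser-password"

# Define a role: each read of its creds endpoint mints a short-lived DB user
vault write database/roles/app-role \
    db_name=my-postgres \
    creation_statements="CREATE ROLE \"{{name}}\" WITH LOGIN PASSWORD '{{password}}' VALID UNTIL '{{expiration}}';" \
    default_ttl=1h \
    max_ttl=24h

# Each call returns a brand-new username/password pair that expires on its own
vault read database/creds/app-role
```

So it's less a reverse proxy (traffic doesn't flow through Vault) and more a credential broker: apps talk to the DB directly, but with throwaway creds Vault created for them.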
In my project, we use Vault to log in to different AWS and on-prem servers. We run vault login, it prompts for which platform we want to log in to (aws/azure/gcp), and then asks prod, np, or dev. We then pass our creds linked to LDAP, plus an OTP that we get when setting up Vault for each individual through a generated secret. Just info for others.
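For anyone following along, the generic shape of that flow with Vault's built-in auth methods looks roughly like this; the username is a placeholder and the environment/platform selection described above would come from an org-specific wrapper script, not the vault CLI itself:

```shell
# Log in with the LDAP auth method (the CLI prompts for the password)
vault login -method=ldap username=jdoe

# Other auth methods work the same way, depending on what's enabled:
# vault login -method=userpass username=jdoe
# vault login -method=oidc
```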
Hi there, first of all thank you for making and uploading this video. I learned a lot about Vault's features and setup. I do have a question, however: is there a reason why it instantiates (at 40:48) a vault.db file that is a whopping 100 GB in size? And if not, is there a way to control this size? Again, thanks for the video!
Thank you. I hadn't looked into it before. I don't think you can tune it. I came across this learn guide in case it's helpful: learn.hashicorp.com/tutorials/vault/performance-tuning?in=vault/operations#storage-backend-tuning
@@TeKanAid thanks for taking the time to reply and advise on this. Came across this documentation too. Tried to add in some config values, but it also seemed to me that you can't control the value. Strange that it takes up so much space. Anyway, thanks again!
@@vincentverweij1053 I actually took a look and don't see that large a file. Not sure why you're getting that.

(⎈ |docker-desktop:default) Gabrail-Windows:sam:~/Deployment_Linux/Vault/Training/vault-101/Section06-Starting_a_Production_Vault_Server/vault/data$ ll
total 196K
drwxr-xr-x 3 sam sam 4.0K Feb 18 16:53 .
drwxr-xr-x 3 sam sam 4.0K Feb 18 16:52 ..
drwxr-xr-x 3 sam sam 4.0K Feb 18 16:53 raft
-rw------- 1 sam sam 180K Mar 2 17:34 vault.db

$ du -h ./vault.db
184K ./vault.db
What is the right way to manage the tokens in the secret.txt file? Moreover, what is the right way to manage the token we get from Vault after authenticating?
These are great questions and I cover them all in my Vault 101 and Vault 202 courses, but quickly: the root token should only be used to configure auth methods. One of those should grant admin access; then you should revoke the root token. You can always recreate a root token from the unseal keys.
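In CLI terms, that lifecycle looks roughly like this. The token value is a placeholder, and regenerating a root token requires a quorum of unseal (or recovery) key holders, so this is a sketch rather than a copy-paste recipe:

```shell
# Once auth methods and an admin policy are set up, revoke the root token
# (token value below is a placeholder)
vault token revoke hvs.EXAMPLEROOTTOKEN

# Later, if you ever need a root token again, start a generate-root session;
# this prints an OTP and a nonce for the operation
vault operator generate-root -init

# Then each unseal-key holder supplies their key share until quorum is reached
vault operator generate-root
```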
Thanks for the nice video on Vault. Can you tell me how we can authenticate with Vault using an AWS SSO user? With a normal user it works when I pass the access key and secret, but with SSO it doesn't. Have you tried authenticating with Vault using AWS SSO?
Sorry for the late reply. I haven't seen this. There's an old discussion here, but it seems unresolved: discuss.hashicorp.com/t/vault-integration-with-aws-sso-saml-2-0/5461
Hi Maitheen, we use groups to group entities. I go into much more detail with examples in my course: courses.tekanaid.com/p/hashicorp-vault-101-certified-vault-associate You can also read this tutorial from HashiCorp: learn.hashicorp.com/tutorials/vault/identity
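As a quick hedged sketch of what grouping entities looks like via the identity endpoints (the group name, policy name, and entity IDs below are all placeholders):

```shell
# Create an internal identity group, attach a policy, and add member entities
# (all names and IDs are placeholders)
vault write identity/group \
    name="engineering" \
    policies="eng-policy" \
    member_entity_ids="<entity-id-1>,<entity-id-2>"

# List groups by name to confirm it was created
vault list identity/group/name
```

Any entity in the group then inherits the group's policies on top of its own.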
I'm getting the below error when I try to enable the audit log path:

~$ vault audit enable file file_path=./logs/vault_audit.log
Error enabling audit device: Error making API request.

URL: PUT localhost:8200/v1/sys/audit/file
Code: 400. Errors:

* sanity check failed; unable to open "./logs/vault_audit.log" for writing: open ./logs/vault_audit.log: permission denied
This error indicates that the Vault server is unable to write to the specified log file path "./logs/vault_audit.log" because permission was denied. This could be caused by a few things:

- The directory "./logs" does not exist and needs to be created.
- The user running the Vault server process does not have permission to write to the directory.
- The permissions on the directory are not set correctly and need to be changed.

You can check whether the directory exists with ls -ld ./logs and inspect its permissions the same way on the full path, e.g. ls -ld /path/to/logs. Note that the relative path is resolved from the Vault server's working directory, not necessarily your shell's. You may need to adjust the permissions so the user running the Vault server can write to the directory, or run Vault as root or with sudo.
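A quick sketch of the usual fix; the path and permission mode are examples, and the final vault command is commented out because it needs a running server:

```shell
# Create the logs directory if it doesn't exist and make it writable
# by the user that runs the Vault server (here assumed to be the current user)
mkdir -p ./logs
chmod 750 ./logs

# Verify ownership and permissions
ls -ld ./logs

# Then retry enabling the audit device (requires a running Vault server):
# vault audit enable file file_path=./logs/vault_audit.log
```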