Unclutter Your Environment Variables

Stop manually sourcing env files and juggling export statements. Use direnv and modern secret managers to load project-specific environment variables automatically.

The Problem

Every project has its own set of environment variables — API keys, database URLs, feature flags, cloud credentials. The 12-Factor App methodology gets this right: store config in environment variables, not in code.

But managing those variables across multiple projects and environments (dev, staging, production, sandbox) gets messy fast. You end up with bash functions that source different files, VS Code workspace settings that only work in VS Code, or a .bashrc full of exports that bleed across contexts.

direnv: Per-Directory Environment Variables

direnv solves this cleanly. It hooks into your shell and automatically loads/unloads environment variables when you cd into or out of a project directory.

Installation

On macOS:

brew install direnv

On Linux, use your package manager or grab the binary from the releases page.
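
For example, with a package manager (package names may vary by distribution):

sudo apt install direnv   # Debian/Ubuntu
sudo dnf install direnv   # Fedora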

Then hook it into your shell. For bash, add to ~/.bashrc:

eval "$(direnv hook bash)"

For zsh, add to ~/.zshrc:

eval "$(direnv hook zsh)"

For fish, add to ~/.config/fish/config.fish:

direnv hook fish | source

Basic Usage

Create a .envrc file in your project root:

export DATABASE_URL="postgres://localhost:5432/myapp_dev"
export API_KEY="dev-key-12345"
export ENVIRONMENT="development"

The first time direnv sees a new or changed .envrc, it blocks loading until you explicitly approve it:

direnv allow .

This is a security feature — you do not want random .envrc files from cloned repos executing automatically.

Once allowed, every time you cd into the project directory, the variables load. When you cd out, they unload. No stale exports polluting your shell.
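
A quick check makes the behavior concrete; the values below come from the example .envrc above, and the project path is just a placeholder:

cd ~/projects/myapp       # direnv loads the approved .envrc
echo "$DATABASE_URL"      # postgres://localhost:5432/myapp_dev
cd ~                      # direnv unloads the variables
echo "$DATABASE_URL"      # prints nothing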

Loading .env Files

Many tools (Docker Compose, Node.js dotenv, Python python-dotenv) use .env files by convention. You can tell direnv to load these natively by creating ~/.config/direnv/direnv.toml:

[global]
load_dotenv = true

Now direnv will pick up .env files in addition to .envrc.
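
For example, a plain .env like this (placeholder values) will now be picked up automatically:

# .env
NODE_ENV=development
PORT=3000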

Python Virtual Environments

direnv can also activate Python virtual environments automatically. For venv:

# .envrc
export DATABASE_URL="postgres://localhost:5432/myapp_dev"
layout python3

This creates and activates a venv in .direnv/python-<version>. For pipenv projects:

# .envrc
export DATABASE_URL="postgres://localhost:5432/myapp_dev"
layout pipenv

No more forgetting to source .venv/bin/activate.
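
A quick way to confirm the layout is working after cd-ing into the project (the exact path depends on your Python version):

which python
# ~/projects/myapp/.direnv/python-3.12/bin/python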

Beyond direnv: Modern Secret Management

direnv handles variable loading, but you still need to manage the actual secrets. Here are tools worth combining with it.

1Password CLI

If you use 1Password, the op CLI lets you reference secrets stored in your vault directly from .envrc:

# .envrc
export API_KEY="$(op read 'op://DevVault/MyApp/api-key')"
export DB_PASSWORD="$(op read 'op://DevVault/MyApp/db-password')"

Secrets never touch disk in plaintext. Each time direnv evaluates the .envrc, it runs op read to fetch them from 1Password.

Install with:

brew install 1password-cli
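
If the op calls in your .envrc fail, sign in and test the reference manually first (the vault and item names here are the placeholders from above):

op signin
op read 'op://DevVault/MyApp/api-key'   # should print the secret if the reference resolves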

AWS SSM Parameter Store

For AWS-heavy workflows, pull secrets from SSM:

# .envrc
export DB_PASSWORD="$(aws ssm get-parameter --name /myapp/dev/db-password --with-decryption --query 'Parameter.Value' --output text)"
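
If you pull more than one parameter, a small helper function in the .envrc keeps it readable; the parameter names here are hypothetical:

# .envrc
ssm_param() {
  aws ssm get-parameter --name "$1" --with-decryption \
    --query 'Parameter.Value' --output text
}
export DB_PASSWORD="$(ssm_param /myapp/dev/db-password)"
export API_KEY="$(ssm_param /myapp/dev/api-key)"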

HashiCorp Vault

Same pattern with Vault:

# .envrc
export DB_PASSWORD="$(vault kv get -field=password secret/myapp/dev/db)"

sops (Secrets OPerationS)

sops encrypts files with AWS KMS, GCP KMS, Azure Key Vault, or PGP. You can keep encrypted .env.enc files in version control and decrypt them in your .envrc (the explicit --input-type flag matters because the .enc suffix keeps sops from detecting the dotenv format on its own):

# .envrc
eval "$(sops --decrypt --input-type dotenv --output-type dotenv .env.enc | direnv dotenv bash /dev/stdin)"
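
For reference, the encrypted file is produced once up front, roughly like this (assuming your keys are configured in a .sops.yaml creation rule):

sops --encrypt .env > .env.enc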

Security Essentials

Always add .envrc and .env to your .gitignore:

.envrc
.env
.env.*
!.env.example

Keep an .env.example with placeholder values checked into the repo so new contributors know which variables they need.
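
Something like this is enough, mirroring the earlier example with placeholder values:

# .env.example
DATABASE_URL=postgres://localhost:5432/myapp_dev
API_KEY=your-api-key-here
ENVIRONMENT=development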

Putting It Together

My typical project setup:

# .envrc
# Load .env if present (for simple local-only vars)
dotenv_if_exists

# Pull secrets from 1Password
export API_KEY="$(op read 'op://DevVault/ProjectX/api-key')"

# Activate Python venv
layout python3

# Project-specific PATH additions
PATH_add ./scripts

The result: cd into the project and everything is ready. Variables loaded, secrets fetched, virtualenv activated, PATH extended. cd out and it all goes away.

No more sourcing scripts manually. No more “it works on my machine” because someone forgot to export a variable. Just clean, per-project environment management.
