---
name: setting-up-astro-project
description: Initialize and configure Astro/Airflow projects. Use when the user wants to create a new project, set up dependencies, configure connections/variables, or understand project structure. For running the local environment, see managing-astro-local-env.
---
# Astro Project Setup
This skill helps you initialize and configure Airflow projects using the Astro CLI.
> **To run the local environment**, see the **managing-astro-local-env** skill.
> **To write DAGs**, see the **authoring-dags** skill.
---
## Initialize a New Project
```bash
astro dev init
```
Creates this structure:
```
project/
├── dags/ # DAG files
├── include/ # SQL, configs, supporting files
├── plugins/ # Custom Airflow plugins
├── tests/ # Unit tests
├── Dockerfile # Image customization
├── packages.txt # OS-level packages
├── requirements.txt # Python packages
└── airflow_settings.yaml # Connections, variables, pools
```
---
## Adding Dependencies
### Python Packages (requirements.txt)
```
apache-airflow-providers-snowflake==5.3.0
pandas==2.1.0
requests>=2.28.0
```
### OS Packages (packages.txt)
```
gcc
libpq-dev
```
### Custom Dockerfile
For complex setups (private PyPI, custom scripts):
```dockerfile
FROM quay.io/astronomer/astro-runtime:12.4.0
RUN pip install --extra-index-url https://pypi.example.com/simple my-package
```
**After modifying dependencies:** Run `astro dev restart`
---
## Configuring Connections & Variables
### airflow_settings.yaml
Loaded automatically on environment start:
```yaml
airflow:
  connections:
    - conn_id: my_postgres
      conn_type: postgres
      host: host.docker.internal
      port: 5432
      login: user
      password: pass
      schema: mydb
  variables:
    - variable_name: env
      variable_value: dev
  pools:
    - pool_name: limited_pool
      pool_slot: 5
```
### Export/Import
```bash
# Export from running environment
astro dev object export --connections --file connections.yaml
# Import to environment
astro dev object import --connections --file connections.yaml
```
---
## Validate Before Running
Parse DAGs to catch errors without starting the full environment:
```bash
astro dev parse
```
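`astro dev parse` imports your DAGs inside the Airflow runtime image, so it catches import and definition errors. As an even faster pre-check that needs no containers, a stdlib-only sketch can flag plain Python syntax errors in `dags/` before you parse. This is a hypothetical helper, not part of the Astro CLI, and it cannot catch the import-time or Airflow-level errors that `astro dev parse` does:

```python
import ast
from pathlib import Path


def syntax_errors(dags_dir: str = "dags") -> list[str]:
    """Return a message for each .py file under dags_dir that fails to parse."""
    errors = []
    for path in sorted(Path(dags_dir).rglob("*.py")):
        try:
            ast.parse(path.read_text(), filename=str(path))
        except SyntaxError as exc:
            errors.append(f"{path}: line {exc.lineno}: {exc.msg}")
    return errors


if __name__ == "__main__":
    for err in syntax_errors():
        print(err)
```

A check like this can run in a pre-commit hook in milliseconds; keep `astro dev parse` as the authoritative validation step.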
---
## Related Skills
- **managing-astro-local-env**: Start, stop, and troubleshoot the local environment
- **authoring-dags**: Write and validate DAGs (uses MCP tools)
- **testing-dags**: Test DAGs (uses MCP tools)
---
## FAQ
**How do I apply dependency changes to the running local environment?**
After updating `requirements.txt`, `packages.txt`, or the `Dockerfile`, run `astro dev restart` to rebuild the image and apply the changes.

**Where should I store secrets for connections and variables?**
Avoid committing secrets in `airflow_settings.yaml`. Use a secrets backend, environment variables, or a vault solution integrated into your CI/deployment pipeline.
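As a sketch of the environment-variable route: Airflow resolves any `AIRFLOW_CONN_<CONN_ID>` variable as a connection URI and any `AIRFLOW_VAR_<NAME>` variable as a plain variable, and the Astro CLI loads a `.env` file from the project root into the local containers. The connection values below are placeholders:

```shell
# Lines like these can live in the project's .env file (without `export`),
# which the Astro CLI loads into the local Airflow containers on start.
export AIRFLOW_CONN_MY_POSTGRES='postgres://user:pass@host.docker.internal:5432/mydb'
export AIRFLOW_VAR_ENV='dev'

# By naming convention, Airflow sees these as conn_id "my_postgres"
# and Variable "env" -- no entry in airflow_settings.yaml required.
echo "$AIRFLOW_VAR_ENV"
```

Keep the `.env` file out of version control (e.g. via `.gitignore`) and supply real values per environment.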