
This skill accelerates container-native CI with test intelligence, caching, and parallel builds, simplifying infrastructure management for rapid, reliable delivery.

npx playbooks add skill markus41/claude --skill harness-ci


---
name: harness-ci
description: Harness CI (Continuous Integration) for container-native builds with test intelligence, caching, parallelization, and build infrastructure management
allowed-tools: [Bash, Read, Write, Edit, Glob, Grep, Task, WebFetch, WebSearch]
dependencies: [harness-mcp, harness-cd]
triggers: [harness ci, harness build, build pipeline, ci pipeline, test intelligence, ci infrastructure]
---

# Harness CI Skill

Container-native CI builds with test intelligence, caching, parallelization, and infrastructure management.

## Build Infrastructure

- **Cloud (Recommended):** Zero-config hosted, auto-scaling, pre-installed tools
  ```yaml
  infrastructure:
    type: Cloud
    spec:
      os: Linux  # Linux, MacOS, Windows
  ```

- **Kubernetes:** Self-hosted via k8s clusters
  ```yaml
  infrastructure:
    type: KubernetesDirect
    spec:
      connectorRef: k8s_connector
      namespace: harness-builds
      os: Linux
  ```

- **VMs:** AWS, Azure, GCP pool-based scaling
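
  A self-hosted VM pool is referenced by name. A minimal sketch, assuming a pool named `vm_pool_aws` has been defined in your delegate's pool settings:
  ```yaml
  infrastructure:
    type: VM
    spec:
      type: Pool
      spec:
        poolName: vm_pool_aws  # placeholder: pool defined in your VM pool config
        os: Linux
  ```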

## Basic Pipeline Structure

```yaml
pipeline:
  name: Build Pipeline
  identifier: build_pipeline
  properties:
    ci:
      codebase:
        connectorRef: harness_code
        repoName: my-service
        build: <+input>
  stages:
    - stage:
        name: Build and Test
        type: CI
        spec:
          cloneCodebase: true
          infrastructure:
            type: Cloud
            spec:
              os: Linux
          execution:
            steps:
              - step:
                  name: Install
                  type: Run
                  spec:
                    shell: Sh
                    command: npm ci
              - step:
                  name: Test
                  type: Run
                  spec:
                    command: npm test -- --coverage
              - step:
                  name: Build
                  type: Run
                  spec:
                    command: npm run build
```

## Step Types

**Run:** Execute shell commands
```yaml
- step:
    name: Build
    type: Run
    spec:
      shell: Sh
      command: npm run build
      envVariables:
        NODE_ENV: production
      resources:
        limits:
          memory: 2Gi
          cpu: "1"
```

**RunTests (Test Intelligence):** Language/framework-aware test execution
```yaml
- step:
    name: Run Tests
    type: RunTests
    spec:
      language: Java  # Java, Kotlin, Scala, C#, Python, Ruby
      buildTool: Maven  # Maven, Gradle, Bazel, etc.
      runOnlySelectedTests: true  # Enable TI
      enableTestSplitting: true   # Parallel execution
      testAnnotations: org.junit.Test
      packages: com.myapp
```

**Docker Registry Build/Push**
```yaml
- step:
    name: Build and Push
    type: BuildAndPushDockerRegistry
    spec:
      connectorRef: docker_connector
      repo: myorg/myapp
      tags: [<+pipeline.sequenceId>, <+codebase.shortCommitSha>, latest]
      dockerfile: Dockerfile
      caching: true
      buildArgs:
        VERSION: <+pipeline.sequenceId>
```

**ECR/GCR/ACR:** Replace `BuildAndPushDockerRegistry` with `BuildAndPushECR`, `BuildAndPushGCR`, or `BuildAndPushACR` with appropriate connector refs.
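
For example, an ECR push might look like this (a sketch; the connector name, region, and account ID are placeholders):
```yaml
- step:
    name: Build and Push to ECR
    type: BuildAndPushECR
    spec:
      connectorRef: aws_connector  # placeholder AWS connector
      region: us-east-1            # placeholder region
      account: "123456789012"      # placeholder AWS account ID
      imageName: myorg/myapp
      tags: [<+pipeline.sequenceId>, latest]
```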

## Caching

**S3 Cache:**
```yaml
# Restore early in the stage; save after dependencies are installed
- step:
    name: Restore Cache
    type: RestoreCacheS3
    spec:
      connectorRef: aws_connector
      bucket: harness-cache
      key: npm-{{ checksum "package-lock.json" }}
      failIfKeyNotFound: false
- step:
    name: Save Cache
    type: SaveCacheS3
    spec:
      connectorRef: aws_connector
      bucket: harness-cache
      key: npm-{{ checksum "package-lock.json" }}
      sourcePaths: [node_modules]
```

**GCS Cache:** Replace S3 steps with `SaveCacheGCS`/`RestoreCacheGCS`.
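
The GCS variants take the same shape (a sketch; `gcp_connector` is a placeholder connector name):
```yaml
- step:
    name: Restore Cache
    type: RestoreCacheGCS
    spec:
      connectorRef: gcp_connector  # placeholder GCP connector
      bucket: harness-cache
      key: npm-{{ checksum "package-lock.json" }}
      failIfKeyNotFound: false
```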

## Parallelism

**Matrix Strategy:** Run steps with multiple configurations
```yaml
- step:
    name: Test Matrix
    type: Run
    spec:
      command: npm test
      envVariables:
        NODE_VERSION: <+matrix.nodeVersion>
        DB_TYPE: <+matrix.database>
    strategy:
      matrix:
        nodeVersion: ["16", "18", "20"]
        database: [postgres, mysql]
      maxConcurrency: 4
```

**Parallelism:** Run the same step N times, sharding work by index
```yaml
- step:
    name: Parallel Tests
    type: Run
    spec:
      # HARNESS_STEP_INDEX is zero-based; Jest shards are one-based
      command: npm test -- --shard=$((HARNESS_STEP_INDEX + 1))/$HARNESS_STEP_TOTAL
    strategy:
      parallelism: 4
```

**Parallel Step Groups:**
```yaml
- stepGroup:
    name: Parallel Build
    steps:
      - parallel:
          - step:
              name: Build Frontend
              type: Run
              spec:
                command: npm run build:frontend
          - step:
              name: Build Backend
              type: Run
              spec:
                command: npm run build:backend
```

## Background Services

Start services (databases, caches) for integration tests:
```yaml
- step:
    name: PostgreSQL
    type: Background
    spec:
      image: postgres:14
      envVariables:
        POSTGRES_USER: test
        POSTGRES_PASSWORD: test
        POSTGRES_DB: testdb
      portBindings:
        "5432": "5432"
      resources:
        limits:
          memory: 1Gi

- step:
    name: Wait for DB
    type: Run
    spec:
      command: until pg_isready -h localhost -p 5432; do sleep 1; done
```

## Plugins & Actions

**Slack Notification:**
```yaml
- step:
    name: Notify Slack
    type: Plugin
    spec:
      image: plugins/slack
      settings:
        webhook: <+secrets.getValue("slack_webhook")>
        channel: builds
        template: "Build {{#success build.status}}succeeded{{else}}failed{{/success}}"
```

**S3 Upload:**
```yaml
- step:
    name: Upload Artifacts
    type: Plugin
    spec:
      image: plugins/s3
      settings:
        bucket: build-artifacts
        source: dist/**/*
        target: builds/<+pipeline.sequenceId>
```

**GitHub Actions:**
```yaml
- step:
    name: Setup Node
    type: Action
    spec:
      uses: actions/setup-node@v3
      with:
        node-version: "18"
        cache: npm
```

## Artifact Management

Upload build outputs to cloud storage:
- **S3:** Type `S3Upload`, spec: `bucket`, `sourcePath`, `target`
- **GCS:** Type `GCSUpload`, spec: `bucket`, `sourcePath`, `target`
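
A minimal S3 upload sketch (the connector, region, and bucket names are placeholders):
```yaml
- step:
    name: Upload to S3
    type: S3Upload
    spec:
      connectorRef: aws_connector   # placeholder AWS connector
      region: us-east-1             # placeholder region
      bucket: build-artifacts
      sourcePath: dist
      target: builds/<+pipeline.sequenceId>
```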

## CI Expressions

| Expression | Description |
|------------|-------------|
| `<+codebase.branch>` | Git branch name |
| `<+codebase.commitSha>` | Full commit SHA |
| `<+codebase.shortCommitSha>` | Short SHA (7 chars) |
| `<+codebase.commitMessage>` | Commit message |
| `<+pipeline.sequenceId>` | Build number |
| `<+pipeline.executionId>` | Execution UUID |
| `<+secrets.getValue("key")>` | Secret value |

## Triggers

**Push Trigger:**
```yaml
trigger:
  name: Build on Push
  pipelineIdentifier: build_pipeline
  source:
    type: Webhook
    spec:
      type: Push
      connectorRef: harness_code
      repoName: my-service
      payloadConditions:
        - key: targetBranch
          operator: In
          value: [main, develop]
```

**Pull Request & Tag:** Use `type: PullRequest` or `type: Tag` with `actions` or `tagCondition`.
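
For instance, a pull-request trigger might look like this (a sketch mirroring the push trigger above; the `actions` values are typical PR events):
```yaml
trigger:
  name: Build on PR
  pipelineIdentifier: build_pipeline
  source:
    type: Webhook
    spec:
      type: PullRequest
      connectorRef: harness_code
      repoName: my-service
      actions: [Open, Reopen, Synchronize]
```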

## Troubleshooting

| Issue | Solution |
|-------|----------|
| Build timeout | Increase timeout, optimize steps |
| Cache miss | Verify checksum file path |
| Image pull failed | Check connector credentials |
| TI not working | Verify language/buildTool config |
| Out of memory | Increase step memory limits |

**Debug:**
```yaml
- step:
    name: Debug
    type: Run
    spec:
      command: |
        echo "Branch: <+codebase.branch>"
        echo "Build: <+pipeline.sequenceId>"
        env | sort
        df -h
```

## Related Documentation

- [Harness CI Docs](https://developer.harness.io/docs/continuous-integration)
- [Test Intelligence](https://developer.harness.io/docs/continuous-integration/use-ci/run-tests/ti-overview)
- [Caching](https://developer.harness.io/docs/continuous-integration/use-ci/caching-ci-data/caching-overview)
- [Build Infrastructure](https://developer.harness.io/docs/continuous-integration/use-ci/set-up-build-infrastructure)

Overview

This skill integrates Harness CI for container-native builds with test intelligence, caching, parallelization, and build infrastructure management. It lets teams run reproducible builds and tests on cloud-hosted or self-hosted infrastructure, with built-in optimizations for speed and reliability. The skill focuses on practical pipeline steps, caching strategies, matrix and parallel execution, and artifact handling.

How this skill works

Pipelines are defined as YAML stages and steps that run on Cloud, Kubernetes, or VM infrastructure. Steps include Run (shell commands), RunTests (framework-aware test intelligence with test splitting), Docker build-and-push steps, and cache/restore steps for S3 or GCS. Matrix and parallel strategies distribute work across multiple nodes, while background services provide ephemeral databases and caches for integration tests.

When to use it

  • When you need fast, container-native CI with automatic scaling and minimal infra setup.
  • When you want test intelligence to run only impacted tests and split suites for parallel execution.
  • When caching node_modules, build artifacts, or other dependencies can reduce build times.
  • When you need matrix builds or parallelism to validate multiple runtime versions or configurations.
  • When you require managed artifact uploads to S3/GCS and notifications (Slack, webhooks).

Best practices

  • Prefer Cloud infrastructure for zero-config, auto-scaling builds; use Kubernetes for self-hosted control.
  • Enable caching keyed by checksum (e.g., package-lock.json) to avoid stale caches and reduce misses.
  • Use RunTests with runOnlySelectedTests and enableTestSplitting to minimize test time and flakiness.
  • Limit maxConcurrency for matrix strategies to control cost and avoid resource contention.
  • Run background services for integration tests and add explicit readiness checks before test steps.
  • Store secrets in the Harness secrets manager and reference them via expressions like <+secrets.getValue("key")>.

Example use cases

  • Node.js monorepo: restore npm cache, run parallel test shards, build frontend and backend in parallel, push multi-tag Docker images.
  • Java microservices: use RunTests with Maven, enable TI to run only affected tests, split suites across runners.
  • Cross-version validation: matrix build across Node versions and database types to catch regressions early.
  • CI for pull requests: trigger pipelines on PRs to run fast targeted tests and report status back to Git hosting.
  • Artifact pipeline: build artifacts, upload to S3/GCS, and notify Slack on success or failure.

FAQ

How do I speed up test runs?

Enable Test Intelligence (runOnlySelectedTests) and test splitting (enableTestSplitting), use caching for dependencies, and run tests in parallel or with a matrix.

Which cache backend should I use?

Use S3 for AWS users or GCS for GCP users; key caches by checksum of lockfiles to ensure consistency.