Bitbucket Pipelines Basic Build
A bitbucket-pipelines.yml configuration that runs tests on every branch push and adds a full deploy pipeline for the main branch — building assets and syncing them to AWS S3 using an Atlassian Marketplace pipe.
Overview
Bitbucket Pipelines is the native CI/CD service embedded in Bitbucket Cloud. Like other modern CI systems it reads a single YAML configuration file — bitbucket-pipelines.yml — from the root of your repository. What sets Bitbucket Pipelines apart is its tight integration with the Atlassian ecosystem: native support for Jira issue transitions triggered by deploys, built-in deployment environments that surface in the repository's Deployments dashboard, and a curated marketplace of "pipes" (think GitHub Actions actions or GitLab CI templates, but distributed through the Atlassian Marketplace).
This example demonstrates the two most important pipeline sections: default, which runs on every push to every branch and provides fast feedback to developers, and branches.main, which overrides the pipeline for the main branch to include a full deployment sequence. The separation keeps feature branches lightweight while ensuring that merges to main always trigger a complete build-and-deploy cycle.
Complete YAML
```yaml
image: node:20

pipelines:
  default:
    - step:
        name: Install & Test
        caches:
          - node
        script:
          - npm ci
          - npm test
  branches:
    main:
      - step:
          name: Install & Test
          caches:
            - node
          script:
            - npm ci
            - npm test
      - step:
          name: Deploy to S3
          deployment: production
          script:
            - npm run build
            - pipe: atlassian/aws-s3-deploy:1.6.0
              variables:
                AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
                AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
                AWS_DEFAULT_REGION: 'us-east-1'
                S3_BUCKET: $S3_BUCKET
                LOCAL_PATH: 'dist'

definitions:
  caches:
    node: node_modules
```
Pipelines and branch overrides
The global image: node:20 at the top of the file sets the default Docker image for every step that doesn't specify its own image. This avoids repeating the image declaration in every step while still allowing individual steps to override it when needed (for example, a deploy step might use a lightweight Alpine image with AWS CLI pre-installed instead of the full Node image).
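As a sketch of a per-step override (the amazon/aws-cli image is a real Docker Hub image, but the tag shown and the bucket variable are illustrative choices, not part of the example above):

```yaml
image: node:20                           # global default for all steps

pipelines:
  branches:
    main:
      - step:
          name: Deploy to S3
          image: amazon/aws-cli:2.15.0   # per-step override; tag is illustrative
          script:
            # This image ships the AWS CLI, so no install step is needed
            - aws s3 sync dist "s3://$S3_BUCKET"
```

Only the step that declares its own `image` deviates from the global default; every other step still runs in node:20.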
The pipelines.default section defines the pipeline that runs when a push is made to any branch that doesn't match a more specific rule. It contains a single step that installs dependencies and runs the test suite. This is the "minimum viable CI" that gives every developer immediate feedback without consuming unnecessary build minutes on branches that aren't ready to deploy.
The pipelines.branches.main section completely overrides the default pipeline for the main branch. When Bitbucket sees a push to main, it ignores the default pipeline and runs the branches.main pipeline instead. This pipeline has two sequential steps: the same install-and-test step (it always runs first to confirm the build is still passing after the merge), followed by the deploy step. Steps within a pipeline run sequentially by default; if the test step fails, the deploy step does not run.
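Because branches.main fully replaces the default pipeline, the install-and-test step ends up duplicated. Standard YAML anchors, which Bitbucket Pipelines supports, can remove that duplication; a sketch (the anchor name test-step is arbitrary):

```yaml
definitions:
  steps:
    - step: &test-step        # anchor the shared step once
        name: Install & Test
        caches:
          - node
        script:
          - npm ci
          - npm test

pipelines:
  default:
    - step: *test-step        # reuse via alias
  branches:
    main:
      - step: *test-step
      - step:
          name: Deploy to S3
          deployment: production
          script:
            - npm run build
```

The alias expands to an identical copy of the anchored step, so a change to the test commands only needs to be made in one place.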
Caching
Bitbucket Pipelines supports caching via the caches key on each step and a definitions.caches block at the bottom of the file. The definitions.caches block maps a cache name to a directory path. Here, node: node_modules tells Bitbucket to cache the node_modules directory under the name node.
When a step with caches: [node] starts, Bitbucket checks whether a saved node cache already exists for the repository. If it does, the archive is downloaded and extracted into node_modules before your script runs, potentially saving 30–120 seconds of npm ci time. Note that a plain path-based cache like this one is written only on the first successful build after the cache is empty; it is not invalidated when package-lock.json changes. Bitbucket expires caches automatically after one week, and you can clear a cache manually from the Pipelines UI whenever the cached dependencies drift too far from the lockfile.
Bitbucket also ships with several built-in cache definitions (pip, maven, gradle, composer) that you can reference by name without defining them in definitions.caches. The node cache is one of these built-ins, so technically the definitions block in this example is redundant — but it is good practice to include it explicitly so the path is visible to anyone reading the config, and it gives you the ability to customise the path if your project keeps node_modules in a non-standard location.
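If you want the cache to be invalidated when the lockfile changes, Bitbucket also supports file-based cache keys in definitions.caches. A sketch, using a custom name (node-modules, chosen here to avoid shadowing the built-in node cache):

```yaml
definitions:
  caches:
    node-modules:
      key:
        files:
          - package-lock.json   # cache is keyed on this file's hash
      path: node_modules

pipelines:
  default:
    - step:
        name: Install & Test
        caches:
          - node-modules        # reference the keyed cache by name
        script:
          - npm ci
          - npm test
```

With this form, a new cache archive is saved whenever the hash of package-lock.json changes, so stale dependency snapshots stop accumulating.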
Atlassian pipes
Pipes are Bitbucket's equivalent of GitHub Actions actions. They are pre-built Docker-based integrations published to the Atlassian Marketplace that encapsulate complex deployment logic. Instead of writing a shell script to install the AWS CLI, configure credentials, run aws s3 sync, and handle errors, you reference atlassian/aws-s3-deploy:1.6.0 and pass a handful of variables. The pipe version is pinned to 1.6.0 for reproducibility — this is important because unpinned pipes can receive breaking updates.
Variables passed to a pipe can reference Bitbucket repository variables (secret values set in the repository settings UI, never committed to the config file). Here, $AWS_ACCESS_KEY_ID, $AWS_SECRET_ACCESS_KEY, and $S3_BUCKET are all repository variables. Bitbucket masks their values in pipeline logs automatically. The AWS_DEFAULT_REGION is a non-secret value so it is hardcoded directly in the config.
The deployment: production key on the deploy step registers the step with Bitbucket's Environments feature. This causes the deployment to appear in the "Deployments" tab of the repository with a timestamp, the deployer's name, and a link to the specific pipeline run. If your Bitbucket workspace is connected to Jira, the deployment also triggers any configured Jira automation rules — for example, automatically transitioning linked issues from "In Review" to "Done" when a deploy succeeds.
The full catalogue of pipes is browsable at bitbucket.org/product/features/pipelines/integrations. Popular pipes cover AWS (S3, ECS, Lambda, CloudFormation), Google Cloud, Azure, Heroku, Slack, and Datadog — most common deployment targets are covered without any custom scripting.
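For instance, a notification step using the atlassian/slack-notify pipe could be appended after the deploy step; a sketch, where $SLACK_WEBHOOK is an assumed repository variable and the pinned pipe version is illustrative:

```yaml
      - step:
          name: Notify Slack
          script:
            - pipe: atlassian/slack-notify:2.2.0
              variables:
                WEBHOOK_URL: $SLACK_WEBHOOK   # secret repository variable
                MESSAGE: 'Production deploy of main succeeded'
```

Because steps run sequentially and stop on failure, this notification only fires when the S3 deploy before it has succeeded.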