
How to Use FastBCP with Docker

Pierre-Antoine Collet, ARPE.IO Developer
2026-02-16 · 3 min

The FastBCP Docker image allows you to easily integrate this high-performance bulk copy tool into your data integration workflows, without having to install FastBCP directly on your machines.

Why Use FastBCP with Docker?

The official arpeio/fastbcp Docker image offers several advantages:

  • Simplified deployment: No installation required, just Docker
  • Portability: Works anywhere Docker is installed
  • Automatic updates: Images are rebuilt for each new FastBCP release and receive weekly security patches
  • Native integration: Compatible with Kubernetes, Docker Compose, Airflow, and other orchestrators

Prerequisites

  • Docker 24+ installed on your machine
  • A valid FastBCP license (the examples below target FastBCP ≥ 0.28.0)
  • Access to a source database (SQL Server, PostgreSQL, Oracle, etc.)

Pull the Docker Image

The image is available on Docker Hub. You can use the latest tag or pin a specific version:

# Latest version
docker pull arpeio/fastbcp:latest

# Specific version
docker pull arpeio/fastbcp:v0.28.3
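
A quick way to confirm what you pulled is to inspect the local image with standard Docker commands:

# List local FastBCP images and check the build date of the latest tag
docker images arpeio/fastbcp
docker image inspect arpeio/fastbcp:latest --format '{{.Created}}'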

Example: Export SQL Server Data to Parquet on S3

Here's a complete example of using FastBCP with Docker to export data from SQL Server to a Parquet file stored on Amazon S3.

1. Prepare Your License

Since version 0.28.0, the license is passed directly as a parameter:

export licenseContent=$(cat ./FastBCP.lic)
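
If you want this step to fail fast when the license file is missing, a small guard around the same export is enough (a minimal sketch; adjust the path to wherever your FastBCP.lic is stored):

# Abort early if the license file is not where we expect it
if [ ! -f ./FastBCP.lic ]; then
  echo "FastBCP.lic not found" >&2
  exit 1
fi
export licenseContent=$(cat ./FastBCP.lic)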

2. Run FastBCP in Docker

docker run --rm \
-e AWS_ACCESS_KEY_ID=${AWS_ACCESS_KEY_ID} \
-e AWS_SECRET_ACCESS_KEY=${AWS_SECRET_ACCESS_KEY} \
-e AWS_REGION=${AWS_REGION} \
arpeio/fastbcp:latest \
--connectiontype "mssql" \
--server "host.docker.internal,1433" \
--user "FastUser" \
--password "FastPassword" \
--database "tpch_test" \
--query "SELECT * FROM dbo.orders WHERE year(o_orderdate)=1998" \
--fileoutput "orders.parquet" \
--directory "s3://arpeioftoutput/dockertest/" \
--paralleldegree 12 \
--parallelmethod "Ntile" \
--distributekeycolumn "o_orderkey" \
--merge false \
--license "$licenseContent"

Command Details

  • Environment variables: AWS credentials are passed via -e to access the S3 bucket
  • host.docker.internal: Allows access to a SQL Server running on your host machine (on Linux, see the note after this list)
  • --fileoutput: Output file name (Parquet format)
  • --directory: Destination S3 path for the exported data
  • --paralleldegree 12: Splits the export across 12 parallel threads to speed up the transfer
  • --license: Your FastBCP license passed inline
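
Note that host.docker.internal is available out of the box with Docker Desktop on Windows and macOS. On a Linux host it is not defined by default; since Docker 20.10 you can map it to the host gateway yourself, which is plain Docker behavior and nothing specific to FastBCP:

# On a Linux host (Docker 20.10 or later), add this flag to the docker run command above
docker run --rm \
  --add-host host.docker.internal:host-gateway \
  ...

The environment variables, image name, and FastBCP arguments stay exactly as shown above.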

To discover all the possibilities of the Docker image, check out the complete documentation.
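
For automated pipelines, a thin shell wrapper is usually enough: docker run --rm returns the container's exit code, so a failing export stops the job. Here is a minimal sketch reusing the exact command from the example above (the script name is arbitrary, and the connection details, bucket, and credentials are the same placeholders):

#!/usr/bin/env bash
# run_export.sh - FastBCP export wrapped for use in a scheduled job or pipeline
set -euo pipefail

# Load the license once, as in step 1
licenseContent=$(cat ./FastBCP.lic)

# With set -e, the script exits with FastBCP's own exit code if the export fails
docker run --rm \
  -e AWS_ACCESS_KEY_ID="${AWS_ACCESS_KEY_ID}" \
  -e AWS_SECRET_ACCESS_KEY="${AWS_SECRET_ACCESS_KEY}" \
  -e AWS_REGION="${AWS_REGION}" \
  arpeio/fastbcp:latest \
  --connectiontype "mssql" \
  --server "host.docker.internal,1433" \
  --user "FastUser" \
  --password "FastPassword" \
  --database "tpch_test" \
  --query "SELECT * FROM dbo.orders WHERE year(o_orderdate)=1998" \
  --fileoutput "orders.parquet" \
  --directory "s3://arpeioftoutput/dockertest/" \
  --paralleldegree 12 \
  --parallelmethod "Ntile" \
  --distributekeycolumn "o_orderkey" \
  --merge false \
  --license "$licenseContent"

echo "FastBCP export completed"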

Conclusion

The FastBCP Docker image greatly simplifies the deployment and use of FastBCP in your data integration environments. Whether for one-off exports or automated data pipelines, Docker provides the flexibility and portability required by modern data architectures.

Try it now and share your feedback!