Is there any feature in BigQuery that can migrate a whole dataset to another project without manually copying the data?

There's no built-in feature but I helped write a tool that we've open-sourced that will do this for you: https://github.com/uswitch/big-replicate.

It lets you synchronise/copy tables between projects, or between datasets within the same project. Most of the details are in the project's README, but for reference the invocation looks a little like this:

java -cp big-replicate-standalone.jar \
  uswitch.big_replicate.sync \
  --source-project source-project-id \
  --source-dataset 98909919 \
  --destination-project destination-project-id \
  --destination-dataset 98909919

You can set options to control how many tables are copied, how many jobs run concurrently, and where the intermediate data is staged in Cloud Storage. The destination dataset must already exist, but that also means you can copy data between locations (US, EU, Asia, etc.).
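Since the destination dataset has to exist up front, you'd typically create it first with the bq CLI, picking whatever location you want the copy to land in. A minimal sketch, assuming the project ID and dataset name from the example above and an EU destination:

# create the destination dataset in the EU before running the sync
bq --location=EU mk --dataset destination-project-id:98909919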

Binaries are built on CircleCI and published to GitHub releases.


You can first copy the BigQuery dataset to the new project and then delete the original dataset.

The copy dataset UI is similar to the copy table UI. Just click the "Copy Dataset" button on the source dataset and specify the destination dataset in the pop-up form; see the screenshots below. Check out the public documentation for more use cases.

Copy dataset button (screenshot)

Copy dataset form (screenshot)
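If you'd rather not click through the UI, the same dataset copy can also be driven from the bq CLI via the BigQuery Data Transfer Service. The sketch below is based on my reading of the public documentation, so verify the flag names and parameter keys there; all project and dataset names are placeholders:

# schedule a one-off dataset copy via the Data Transfer Service
bq mk --transfer_config \
  --project_id=destination-project-id \
  --data_source=cross_region_copy \
  --target_dataset=destination_dataset \
  --display_name='Copy my dataset' \
  --params='{"source_project_id":"source-project-id","source_dataset_id":"source_dataset"}'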


A short shell script that copies all tables from one dataset to another:

SOURCE_DATASET=$1  # source, e.g. project1:dataset
DEST_PREFIX=$2     # destination, e.g. project2:dataset2.any_prefix_

# list the tables in the source dataset (raise the default listing limit)
# and copy each one, keeping its name plus the optional prefix
for f in $(bq ls -n 10000 "$SOURCE_DATASET" | grep TABLE | awk '{print $1}')
do
  CP_COMMAND="bq cp $SOURCE_DATASET.$f $DEST_PREFIX$f"
  echo "$CP_COMMAND"
  $CP_COMMAND
done
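
To run it, pass the source dataset and the destination prefix as the two arguments (the script filename here is just an illustration):

./copy_tables.sh project1:dataset1 project2:dataset2.copied_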