Creating a Subset .bak
How to build a curated, automated daily .bak that contains only the tables Mosaic needs.
This guide walks through building a daily, automated .bak that contains only the tables Mosaic needs — typically a small subset (for example, 13 tables for Deltek Ajera) of a much larger ERP database.
Why a subset database?
SQL Server Express backs up entire databases, not arbitrary table lists. SSMS has no checkbox for "include only these tables in my backup."
The standard pattern is:
- Create a small staging database (e.g. `MosaicExport`) on the same SQL Server instance.
- Each night, copy the required tables from the source ERP database into `MosaicExport`.
- Back up `MosaicExport` to a `.bak` file.
- Sync that `.bak` to S3 (see the parent page for the AWS CLI step).
This keeps your .bak small (well under the 15 GB upload limit), avoids exposing unrelated company data, and runs unattended.
Recommended approach: nightly rebuild via T-SQL
The simplest approach uses plain T-SQL (SELECT … INTO) wrapped in a Windows Scheduled Task. It runs on SQL Server Express and requires no extra tooling.
Step A — Create the staging database (one-time)
Run once in SSMS, connected to the same instance that hosts your ERP database:
```sql
CREATE DATABASE MosaicExport;
```

Step B — Author the rebuild + backup script
Save as C:\MosaicExport\rebuild_and_backup.sql. Replace [SourceDbName] with your ERP database name and add one DROP + SELECT INTO block per table you need to send.
The example below uses the Deltek Ajera On-Premise integration — Mosaic reads up to 13 tables. Adapt the table list to your own integration as needed.
```sql
USE MosaicExport;
GO
-- Drop and recreate each table from the source DB.
-- Example: Deltek Ajera On-Premise (up to 13 tables).
IF OBJECT_ID('dbo.AxVEC', 'U') IS NOT NULL DROP TABLE dbo.AxVEC;
SELECT * INTO dbo.AxVEC FROM [SourceDbName].dbo.AxVEC;
IF OBJECT_ID('dbo.AxProject', 'U') IS NOT NULL DROP TABLE dbo.AxProject;
SELECT * INTO dbo.AxProject FROM [SourceDbName].dbo.AxProject;
IF OBJECT_ID('dbo.AxCompany', 'U') IS NOT NULL DROP TABLE dbo.AxCompany;
SELECT * INTO dbo.AxCompany FROM [SourceDbName].dbo.AxCompany;
IF OBJECT_ID('dbo.AxEntity', 'U') IS NOT NULL DROP TABLE dbo.AxEntity;
SELECT * INTO dbo.AxEntity FROM [SourceDbName].dbo.AxEntity;
IF OBJECT_ID('dbo.AxProjectType', 'U') IS NOT NULL DROP TABLE dbo.AxProjectType;
SELECT * INTO dbo.AxProjectType FROM [SourceDbName].dbo.AxProjectType;
IF OBJECT_ID('dbo.AxActivity', 'U') IS NOT NULL DROP TABLE dbo.AxActivity;
SELECT * INTO dbo.AxActivity FROM [SourceDbName].dbo.AxActivity;
IF OBJECT_ID('dbo.AxTransaction', 'U') IS NOT NULL DROP TABLE dbo.AxTransaction;
SELECT * INTO dbo.AxTransaction FROM [SourceDbName].dbo.AxTransaction;
IF OBJECT_ID('dbo.AxTimesheet', 'U') IS NOT NULL DROP TABLE dbo.AxTimesheet;
SELECT * INTO dbo.AxTimesheet FROM [SourceDbName].dbo.AxTimesheet;
IF OBJECT_ID('dbo.AxEmployeeType', 'U') IS NOT NULL DROP TABLE dbo.AxEmployeeType;
SELECT * INTO dbo.AxEmployeeType FROM [SourceDbName].dbo.AxEmployeeType;
IF OBJECT_ID('dbo.AxRateTable', 'U') IS NOT NULL DROP TABLE dbo.AxRateTable;
SELECT * INTO dbo.AxRateTable FROM [SourceDbName].dbo.AxRateTable;
IF OBJECT_ID('dbo.AxRateTableDate', 'U') IS NOT NULL DROP TABLE dbo.AxRateTableDate;
SELECT * INTO dbo.AxRateTableDate FROM [SourceDbName].dbo.AxRateTableDate;
IF OBJECT_ID('dbo.AxRateTableDetail', 'U') IS NOT NULL DROP TABLE dbo.AxRateTableDetail;
SELECT * INTO dbo.AxRateTableDetail FROM [SourceDbName].dbo.AxRateTableDetail;
IF OBJECT_ID('dbo.AxInvoiceGroup', 'U') IS NOT NULL DROP TABLE dbo.AxInvoiceGroup;
SELECT * INTO dbo.AxInvoiceGroup FROM [SourceDbName].dbo.AxInvoiceGroup;

BACKUP DATABASE MosaicExport
TO DISK = 'C:\MosaicExport\MosaicExport.bak'
WITH INIT, STATS = 10;
```

A few notes:
- `SELECT … INTO` copies schema + data but not indexes, constraints, or foreign keys. That's fine for an export — Mosaic only reads the data.
- `WITH INIT` overwrites the previous `.bak` so the file stays small and the daily sync uploads a single, current file.
- The account running this script needs `SELECT` on the source tables and `db_owner` (or at least `db_backupoperator` + `db_ddladmin`) on `MosaicExport`.
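Those grants can be applied with plain T-SQL. A minimal sketch, assuming a hypothetical Windows service login `CORP\svc-mosaic` (replace with the account your scheduled task runs as; schema-wide `SELECT` is broader than strictly needed, and you can grant per table instead):

```sql
-- Hypothetical service login; replace with the account the task runs as.
USE [SourceDbName];
CREATE USER [CORP\svc-mosaic] FOR LOGIN [CORP\svc-mosaic];
GRANT SELECT ON SCHEMA::dbo TO [CORP\svc-mosaic];
GO
USE MosaicExport;
CREATE USER [CORP\svc-mosaic] FOR LOGIN [CORP\svc-mosaic];
ALTER ROLE db_owner ADD MEMBER [CORP\svc-mosaic];
GO
```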
Step C — Wrap it in a batch file
Save as C:\MosaicExport\rebuild_and_backup.bat:
```bat
sqlcmd -S .\SQLEXPRESS -E -i C:\MosaicExport\rebuild_and_backup.sql -o C:\MosaicExport\rebuild_and_backup.log
```

- `-S .\SQLEXPRESS` — change to match your instance name (e.g. `.` for a default instance, `SERVER01\SQL2019` for a named one).
- `-E` uses the Windows account the task runs as. If you need SQL authentication instead, replace it with `-U <user> -P <password>`.
- `-o` writes a log so you can confirm success after each run.
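If you want a failed run to surface clearly instead of hiding in the log, sqlcmd's `-b` flag aborts on a SQL error and sets a non-zero exit code the batch file can check. A sketch (the failure-log path is an assumption):

```bat
@echo off
rem -b makes sqlcmd exit with an error code when the script fails.
sqlcmd -S .\SQLEXPRESS -E -b -i C:\MosaicExport\rebuild_and_backup.sql -o C:\MosaicExport\rebuild_and_backup.log
if errorlevel 1 (
    echo %date% %time% rebuild FAILED >> C:\MosaicExport\failures.log
    exit /b 1
)
```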
Step D — Schedule it in Windows Task Scheduler
- Open Task Scheduler → Create Task (not "Create Basic Task" — you need the advanced options).
- General tab:
  - Name: `Mosaic — Rebuild & Backup`.
  - Select Run whether user is logged on or not.
  - Check Run with highest privileges.
  - Set the run-as account to a domain or local user with the SQL permissions described above.
- Triggers tab → New → Daily at e.g. 1:00 AM local time. Pick a time at least an hour before Mosaic's 3:00 AM EST refresh, leaving room for the backup and the S3 sync to complete.
- Actions tab → New → Start a program → Program/script: `C:\MosaicExport\rebuild_and_backup.bat`.
- Conditions / Settings tabs: leave the defaults, but you may want to enable "Run task as soon as possible after a scheduled start is missed."
- Save. You'll be prompted for the run-as account password.
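If you'd rather script the task than click through the UI, `schtasks` can create an equivalent task from the command line. A sketch, assuming the hypothetical account `CORP\svc-mosaic` (`/rp *` prompts for the password instead of putting it on the command line):

```bat
schtasks /create /tn "Mosaic - Rebuild and Backup" ^
  /tr "C:\MosaicExport\rebuild_and_backup.bat" ^
  /sc daily /st 01:00 ^
  /ru CORP\svc-mosaic /rp * /rl highest
```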
Step E — Sync the .bak to S3
Mosaic will provide you with three values for your S3 destination:
| Field | Example value (replace with what Mosaic sends you) |
|---|---|
| Access key | {AWS_ACCESS_KEY_ID} |
| Secret access key | {AWS_SECRET_ACCESS_KEY} |
| Bucket name | {S3_BUCKET_NAME} |
If the AWS CLI isn't already on the SQL Server machine, install it first using AWS's official installation instructions (the MSI installer is the simplest path on Windows).
Then configure it once using aws configure. When prompted, paste the values Mosaic provided:
```
> aws configure
AWS Access Key ID [None]: {AWS_ACCESS_KEY_ID}
AWS Secret Access Key [None]: {AWS_SECRET_ACCESS_KEY}
Default region name [None]: us-east-1
Default output format [None]: json
```

This stores the credentials in `%USERPROFILE%\.aws\credentials` for the run-as account, so subsequent uploads don't need them on the command line.
Then upload the .bak:
```bat
aws s3 cp C:\MosaicExport\MosaicExport.bak s3://{S3_BUCKET_NAME}/
```

Wrap that command in a second batch file (e.g. `C:\MosaicExport\sync_to_s3.bat`) and add a second Windows Scheduled Task that runs ~30 minutes after Step D, so the upload happens after the backup file is fully written.
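A minimal `sync_to_s3.bat` along those lines (the log path is an assumption; the bucket name is whatever Mosaic provided):

```bat
@echo off
rem Upload the nightly .bak and capture the CLI's output for troubleshooting.
aws s3 cp C:\MosaicExport\MosaicExport.bak s3://{S3_BUCKET_NAME}/ >> C:\MosaicExport\sync_to_s3.log 2>&1
if errorlevel 1 echo %date% %time% upload FAILED >> C:\MosaicExport\sync_to_s3.log
```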
Verification checklist
Before considering the integration "done":
- Run `rebuild_and_backup.bat` manually. Confirm `MosaicExport.bak` appears in `C:\MosaicExport\` and `rebuild_and_backup.log` ends with `BACKUP DATABASE successfully processed …`.
- Restore the `.bak` to a scratch SQL Server instance. Confirm every table in your list is present and that row counts match the source DB.
- After the first overnight run, open Task Scheduler → History and confirm Last Run Result = `(0x0)` for both the backup task and the S3 sync task.
- Confirm the `.bak` is visible in the S3 bucket the next morning, and that Mosaic's 3:00 AM EST refresh ingests it (your CSM can verify the integration ran).
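For the restore check, a sketch in T-SQL: `RESTORE FILELISTONLY` shows the logical file names you need for `WITH MOVE`, and a row-count query over the restored copy gives numbers to compare against the source (the scratch database name, file paths, and logical names below are assumptions; check them against your own output):

```sql
-- Inspect the logical file names inside the .bak first.
RESTORE FILELISTONLY FROM DISK = 'C:\MosaicExport\MosaicExport.bak';

-- Restore under a scratch name; adjust logical names and paths to match.
RESTORE DATABASE MosaicExport_Verify
FROM DISK = 'C:\MosaicExport\MosaicExport.bak'
WITH MOVE 'MosaicExport' TO 'C:\MosaicExport\Verify.mdf',
     MOVE 'MosaicExport_log' TO 'C:\MosaicExport\Verify_log.ldf';

-- Row counts for every table in the restored copy.
SELECT t.name, SUM(p.rows) AS row_count
FROM MosaicExport_Verify.sys.tables t
JOIN MosaicExport_Verify.sys.partitions p
  ON p.object_id = t.object_id AND p.index_id IN (0, 1)
GROUP BY t.name
ORDER BY t.name;
```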
Alternatives
If SELECT … INTO doesn't fit (e.g. you need to preserve indexes/constraints, or you can't grant cross-database SELECT), these are also viable — but heavier:
- SSMS Generate Scripts wizard — script selected tables (schema + data) into a fresh DB, then back it up. Good for one-offs, awkward to automate.
- BCP utility — export each table to a flat file from the source DB and `BULK INSERT` it into `MosaicExport`. Useful when the source and staging DBs are on different instances.
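As a sketch of the BCP route for one table, using `bcp ... in` rather than `BULK INSERT` so both halves stay on the command line (server names are placeholders; repeat per table, and the target table must already exist with a matching schema):

```bat
rem Export from the source instance in native format (-n) with Windows auth (-T).
bcp [SourceDbName].dbo.AxProject out C:\MosaicExport\AxProject.dat -S SOURCESERVER\SQLEXPRESS -T -n

rem Import into the staging instance.
bcp MosaicExport.dbo.AxProject in C:\MosaicExport\AxProject.dat -S STAGINGSERVER\SQLEXPRESS -T -n
```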
For most customers, the T-SQL approach above is the path of least resistance.
