Copy DB backup to AWS S3

Explore seamless data protection with our latest tech blog, guiding you through the process of securely copying database backups to AWS S3. Unlock the power of cloud storage and safeguard your valuable data with ease.

AMAZON WEB SERVICES

Abhishek Gupta

1/16/2024 · 3 min read

Copy DB backup to AWS S3

In today’s tech-driven landscape, safeguarding your database backups is paramount for ensuring data integrity and business continuity. In this blog post, we’ll explore a robust solution using PowerShell to seamlessly copy your database backups to Amazon S3, the highly scalable cloud storage service provided by AWS. By leveraging this solution, you’ll not only enhance the security and accessibility of your backups but also streamline your backup management process.

Benefits:

Enhanced Data Security: By storing your database backups on Amazon S3, you benefit from AWS’s robust security features, including encryption and access controls, ensuring your data remains protected at all times.

Scalability and Durability: Amazon S3 offers virtually unlimited scalability and durability, eliminating concerns about storage limitations or data loss. Your backups are stored redundantly across multiple availability zones, enhancing data resilience.

Cost-effectiveness: With Amazon S3, you only pay for the storage you use, making it a cost-effective solution for storing database backups. Additionally, by automating the backup process, you minimize manual effort and associated costs.

Accessibility and Flexibility: Amazon S3 provides seamless access to your backups from anywhere, enabling easy retrieval and restoration of data when needed (see the retrieval sketch after this list). This flexibility is crucial for meeting compliance requirements and responding to data recovery scenarios.

Automation with Windows Task Scheduler: Our PowerShell script can be easily scheduled using Windows Task Scheduler, allowing you to automate the backup copy process according to your desired schedule. This ensures consistency and reliability in your backup strategy, without the need for manual intervention.
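To illustrate the retrieval point above: the AWSPowerShell module's Read-S3Object cmdlet downloads an object to a local file. A minimal sketch, where the bucket name matches the script below and the object key and destination path are hypothetical placeholders:

# Download a backup copy from S3 for restore (placeholder key and path)
Import-Module AWSPowerShell
Read-S3Object -BucketName 'db-backup-daily' -Key '<ServerName>/MyDatabase.BAK' `
    -File 'F:\db-restore\MyDatabase.BAK' -Region '<Region>' -AccessKey '<AccessKey>' -SecretKey '<SecretKey>'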

# Define variables
$scriptname ='copy-db-backup-daily-to-s3'
$accessKey = '<AccessKey>'
$secretKey = '<SecretKey>'
$region = '<Region>'
$server = '<ServerName>'
$logpath = 'F:\db-backup-copy-logs'
$date = Get-Date

# Helper to get a fresh timestamp for log entries
function Get-FreshTimestamp {
    return Get-Date -Format "yyyyMMddHHmmss"
}
$timestamp = Get-FreshTimestamp

# Specify local backup directory path
$BackupDir = "F:\db-backup\daily"

# Create the log file and record the start time
$logFile = "$logpath\$scriptname-$timestamp.log"
Add-Content -Path $logFile -Value "Script started at $date"

# Import AWS PowerShell module; -ErrorAction Stop makes a failed import trigger the catch block
try {
    Import-Module AWSPowerShell -ErrorAction Stop
    Add-Content -Path $logFile -Value "$timestamp Import-Module AWSPowerShell - Successful"
}
catch {
    # If an error occurs, log the error details and abort
    $errorMessage = $_.Exception.Message
    $timestamp = Get-FreshTimestamp
    Add-Content -Path $logFile -Value "$timestamp Error occurred : $errorMessage"
    Add-Content -Path $logFile -Value "Script aborted at $date"
    exit
}

# Define S3 bucket names
$dailyBucket = "db-backup-daily"

# List SQL backup folders in the local directory
$BackupFolders = Get-ChildItem -Path $BackupDir -Directory

# Loop through the backup folders
foreach ($localBackupDir in $BackupFolders) {

    # List SQL backup files written within the last day
    $backupFiles = Get-ChildItem -Path $localBackupDir.FullName -Filter "*.BAK" | Where-Object { $_.LastWriteTime -gt (Get-Date).AddDays(-1) }

    # Loop through the backup files
    foreach ($backupFile in $backupFiles) {

        # Placeholder: assume the backup succeeded; replace with a real status check if you record one
        $backupStatus = $true
        if ($backupStatus) {
            $timestamp = Get-FreshTimestamp
            Add-Content -Path $logFile -Value "$timestamp Copy started for backup file $($backupFile.Name) to $dailyBucket"
            try {
                # Get the S3 object metadata for an existing copy, if any
                $s3Object = Get-S3Object -BucketName $dailyBucket -Key "$server/$($backupFile.Name)" -Region $region -AccessKey $accessKey -SecretKey $secretKey

                # Check if the S3 object exists and compare the last modified timestamps
                if ($null -ne $s3Object) {
                    $localFileModified = $backupFile.LastWriteTime
                    $s3ObjectModified = $s3Object.LastModified
                    if ($localFileModified -gt $s3ObjectModified) {
                        # If the local file is newer than the S3 object, upload the file
                        Write-Host "Copying $($backupFile.Name) to S3..."
                        Write-S3Object -BucketName $dailyBucket -Key "$server/$($backupFile.Name)" -File $backupFile.FullName -Region $region -AccessKey $accessKey -SecretKey $secretKey
                        $timestamp = Get-FreshTimestamp
                        Add-Content -Path $logFile -Value "$timestamp Copy completed for backup file $($backupFile.Name) to $dailyBucket"
                    } else {
                        Write-Host "Skipped $($backupFile.Name). It's already up to date on S3."
                        $timestamp = Get-FreshTimestamp
                        Add-Content -Path $logFile -Value "$timestamp Copy skipped for backup file $($backupFile.Name) to $dailyBucket"
                    }
                } else {
                    # If the S3 object doesn't exist, upload the file
                    Write-Host "Copying $($backupFile.Name) to S3..."
                    Write-S3Object -BucketName $dailyBucket -Key "$server/$($backupFile.Name)" -File $backupFile.FullName -Region $region -AccessKey $accessKey -SecretKey $secretKey
                    $timestamp = Get-FreshTimestamp
                    Add-Content -Path $logFile -Value "$timestamp Copy completed for backup file $($backupFile.Name) to $dailyBucket"
                }
            }
            catch {
                # If an error occurs, log the error details and continue with the next file
                $errorMessage = $_.Exception.Message
                $timestamp = Get-FreshTimestamp
                Add-Content -Path $logFile -Value "$timestamp Error occurred : $errorMessage"
            }
        }
    }
}
$date = Get-Date
Add-Content -Path $logFile -Value "Script completed at $date"
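A note on credentials: the access key and secret key appear inline above to keep the example self-contained, but AWS Tools for PowerShell can store them in a named profile instead. A minimal sketch, where the profile name db-backup-copy is a hypothetical placeholder:

# Store the keys once in a named credential profile (run once, interactively)
Set-AWSCredential -AccessKey '<AccessKey>' -SecretKey '<SecretKey>' -StoreAs 'db-backup-copy'

# Subsequent calls can then reference the profile instead of raw keys
Write-S3Object -BucketName 'db-backup-daily' -Key 'example.BAK' `
    -File 'F:\db-backup\daily\example.BAK' -Region '<Region>' -ProfileName 'db-backup-copy'

This keeps secrets out of the script file itself and out of any copies checked into version control.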

Script Placement:

To ensure seamless execution and management of your backup process, it’s recommended to store your PowerShell script in a secure location accessible to your database administrators or backup operators. Consider creating a dedicated folder or repository within your organization’s file system or version control system, ensuring proper access controls are in place to restrict unauthorized access.
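As one way to apply those access controls on an NTFS volume, the built-in icacls tool can strip inherited permissions from the script folder and grant access only to a dedicated group. A hedged sketch, where the folder path and group name are hypothetical placeholders:

# Restrict the script folder to SYSTEM and a backup-operators group (placeholder names)
icacls "F:\scripts\db-backup" /inheritance:r /grant:r "CONTOSO\BackupOperators:(OI)(CI)F" /grant:r "SYSTEM:(OI)(CI)F"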

Automation:

Once the script is stored in the designated location, you can configure Windows Task Scheduler to execute the script at scheduled intervals, providing a hassle-free and automated solution for copying your database backups to Amazon S3.
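As a sketch of that setup, the task can be registered from PowerShell itself using the ScheduledTasks module (available on Windows Server 2012 / Windows 8 and later); the script path and 2:00 AM schedule below are hypothetical placeholders:

# Register a daily task that runs the backup-copy script (placeholder path and time)
$action  = New-ScheduledTaskAction -Execute 'powershell.exe' `
    -Argument '-NoProfile -ExecutionPolicy Bypass -File "F:\scripts\copy-db-backup-daily-to-s3.ps1"'
$trigger = New-ScheduledTaskTrigger -Daily -At 2am
Register-ScheduledTask -TaskName 'copy-db-backup-daily-to-s3' -Action $action `
    -Trigger $trigger -User 'SYSTEM' -RunLevel Highest

Running the task as SYSTEM (or a dedicated service account) avoids tying the schedule to an individual user's session.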