I’ve been writing a bunch about Azure Data Studio, and recording videos on the topic as well. A comment I received recently asked how to export a database from Azure Data Studio, which made me want to explore that topic here.
Export?
When we say export, what exactly do we mean? It could be as simple as exporting data to a flat file for consumption in Excel or something. It could be creating a backup. Maybe we mean creating a bacpac file. We could also be looking at creating individual scripts for objects within the database. Finally, what about a full export of the database object definitions? Any or all of these could be what the question was about. So, let’s quickly address each of them.
Each of these could be a manual process or an automated process. Instead of trying to address all of these at the same time, I’m going to break them down into individual steps, unique to Azure Data Studio, and then summarize with a section on automation at the end.
Flat File Export
While there is an extension that lets you import flat files, there is not yet a tool for reversing that process within Azure Data Studio. However, that doesn’t mean there isn’t a way to get this done. Let’s say we want to get some data out to Excel, and we can define a query (which intentionally uses * because we want to export all the columns):
SELECT *
FROM Sales.SalesOrderHeader AS soh
JOIN Sales.SalesOrderDetail AS sod
ON soh.SalesOrderID = sod.SalesOrderID
WHERE soh.OrderDate > '2014-06-26';
If we execute the query within Azure Data Studio, the results will look something like this:
If you look all the way over to the right side of the screen, you’ll see this:
These are the export icons. In order, they are:
- Save as CSV
- Save as Excel
- Save as JSON
- Save as XML
- Chart
In short, there’s a pretty easy way to get a result set out to a flat file in a variety of formats.
Backup
This topic is pretty short. It’s SQL Server. Even though I’m running these examples against a Linux container in Docker, a backup is a backup is a backup. Scripting a backup in SQL Server doesn’t change a lick. There is a built-in backup utility in Azure Data Studio. Right-click on a database and select Backup from the context menu (not at all dissimilar to working in SQL Server Management Studio). You’ll get a new window that looks like this:
I’m not going to run through all of that. It’s a backup. You can see that all the standard settings for a backup are available. Finally, you can choose either to run the backup defined within the window or to script it out.
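If you do script it, what you get is the standard BACKUP DATABASE command. As a rough sketch (the AdventureWorks name and the Linux container path here are assumptions, so substitute your own), the output looks something like this:

-- A sketch of the kind of script the Backup window generates.
-- Database name and path are assumptions; adjust them for your environment.
BACKUP DATABASE [AdventureWorks]
TO DISK = N'/var/opt/mssql/data/AdventureWorks.bak'
WITH NOFORMAT, NOINIT,
    NAME = N'AdventureWorks-Full Database Backup',
    SKIP, NOREWIND, NOUNLOAD, STATS = 10;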
BACPAC
I’m not a fan. However, the bacpac is one method of extracting a database definition or exporting an entire database, so it’s worth discussing. To get this functionality, you will need to install the SQL Server Import extension.
With that installed, you will see a couple of new context menus. The one we’re interested in is the “Data-tier Application Wizard”. That will open the following:
There are two choices that interest us. First is “Extract a data-tier application…”. This wizard will extract a dacpac that contains all the structures of your database, but not the data. The second is the “Export the schema and data from a database…” wizard. This will create a bacpac, which is a dacpac plus data, extracting everything we need.
T-SQL Scripts
You also have the ability to create scripts. Within Azure Data Studio, right click on any object and the context menu will include “Script as Create.” This will generate your standard T-SQL script for the object in question. What you don’t get, at least that I’ve found, is any way to generate scripts for multiple objects.
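To give a sense of the output, here’s roughly what “Script as Create” produces for a table. This dbo.SalesSummary table is a made-up example; the actual script will match whatever object you right-clicked:

-- A sketch of the kind of script "Script as Create" generates.
-- dbo.SalesSummary is a hypothetical table used only for illustration.
CREATE TABLE [dbo].[SalesSummary]
(
    [SalesOrderID] int NOT NULL,
    [OrderDate] datetime NOT NULL,
    [TotalDue] money NOT NULL,
    CONSTRAINT [PK_SalesSummary] PRIMARY KEY CLUSTERED ([SalesOrderID])
);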
Automating Export
Everything we’ve looked through so far, except for backups, has been GUI-driven stuff. So, where’s the automation for all this? The simple answer is, same place it’s always been. You’ll write T-SQL for the things controlled within the database, just as you’ve always done. You’ll write PowerShell for everything else. That’s it. No real magic as such. However, if you are looking to automate processes through PowerShell (which I strongly, STRONGLY, recommend), there’s nothing wrong with a little cheating. Go and get the DBA Tools and use them to make your life a lot easier.
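As a trivial example of the T-SQL side, here’s a sketch that backs up every user database in a loop, the sort of thing you might schedule through SQL Agent. The backup path is an assumption; on the PowerShell side, dbatools wraps this same pattern in commands like Backup-DbaDatabase.

-- A sketch only: back up each user database to a file named for the database.
-- The /var/opt/mssql/data path is an assumption; point it at your own backup location.
DECLARE @name sysname,
        @path nvarchar(260);

DECLARE dbs CURSOR FOR
    SELECT name
    FROM sys.databases
    WHERE database_id > 4; -- skip the system databases

OPEN dbs;
FETCH NEXT FROM dbs INTO @name;

WHILE @@FETCH_STATUS = 0
BEGIN
    SET @path = N'/var/opt/mssql/data/' + @name + N'.bak';
    BACKUP DATABASE @name TO DISK = @path;
    FETCH NEXT FROM dbs INTO @name;
END;

CLOSE dbs;
DEALLOCATE dbs;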
Conclusion
Azure Data Studio does include graphical mechanisms for exporting either data or objects from within the databases on your instances. However, nothing it introduces is actually new or different, simply because you’ve always had the capabilities that Azure Data Studio exposes. It just shows you different ways to get the export done.
Hi, Grant. Thanks for a very good post.
> I’m not a fan.
Could you expand on your opinion about what’s wrong with the bacpac approach?
Thanks. Fair question. I’ve never found the bacpac to be as stable as the backup/restore process or attach/detach. The goal is for it to work for migrations from on-prem to Azure, but it doesn’t work that well, and it doesn’t fail very gracefully. Also, it’s not transaction-aware, so you have to quiesce the database in order for the export to be successful. Others have hit other issues with some aspects of how bacpac works. I hope that helps.
Thanks for the good answer. Maybe a new post – “All the demons of .bacpac files”, or “Why I hate .bacpac files”?
I will add this to the list. It’s a very good suggestion.
Hi,
How can I find the physical address of my backup?
And how can I use it to upload my project to my server (the host, I mean)?
I’ve now finished my project and want to upload it to the host.
How do I do that for the database (in Azure Data Studio)?
You set the physical address of your backup. It’s not something you discover, it’s something you determine. As to moving it to another location, we’re talking either an import, a restore, or applying a bacpac. A quick search on any of these should give you an answer.
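As a minimal sketch of what that looks like (the database name, logical file names, and paths here are all assumptions):

-- Sketch only; names and paths are assumptions, not something you discover.
-- You decide the physical location when you take the backup...
BACKUP DATABASE [MyProject]
TO DISK = N'/var/opt/mssql/data/MyProject.bak';

-- ...then copy the file to the target server and point the restore at it.
-- The logical file names (MyProject, MyProject_log) are assumptions;
-- RESTORE FILELISTONLY will show you the real ones.
RESTORE DATABASE [MyProject]
FROM DISK = N'C:\Backups\MyProject.bak'
WITH MOVE N'MyProject' TO N'C:\Data\MyProject.mdf',
     MOVE N'MyProject_log' TO N'C:\Data\MyProject_log.ldf';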
Sorry, I forgot to say that I have macOS,
and I run SQL Server in Docker and connect with Azure Data Studio.
Now how can I access the physical address of my backup in Azure Data Studio?
I’m not sure. I haven’t run it through a container on a Mac, so I don’t know what the path would be.
OK, I see that inside a container on the Mac there is no physical address,
so I think I should create a volume and then map it to a physical address,
but after creating it I see this error:
SQL Server 2019 will run as non-root by default.
This container is running as user mssql.
Anyway, on “Export the schema and data from a database…”:
can I transfer and deploy my database to the server on my host?
Sounds like possibly a permissions error inside the container to access the drive.
As to migrating the dacpac to another server, yes, you should be able to do that. You can’t go down in versions though.
Hey,
I cannot find the backup option when I connect to an Azure SQL instance. Any idea what the issue could be?
P.S.: I am using ADS on Linux.
That’s because there is no backup of Azure SQL Database.
Note: to enable the BACPAC option, instead of the “SQL Server Import” extension you need to install the “SQL Server dacpac” extension, found here: https://docs.microsoft.com/en-us/sql/azure-data-studio/extensions/sql-server-dacpac-extension?view=sql-server-ver15