There is an irritating quirk in Datadude 2008 (from GDR1 onwards, I hear) during Schema Comparison; I am not sure whether Datadude 2005 behaves the same way. It occurs when the source is a project, the target is a database, and you try to write updates to the target. What happens is: nothing happens! The schema comparison screen continues to look exactly as it did before, still showing the differences and the actions to be performed, which is indeed irritating, until of course you open the Error List window.
What actually happens when you click the Write Updates button is this: if some change cannot be applied to the target database, such as adding a non-null column without a default to a table with data, or if there is simply a syntax error in the source project, such as ADD [NewColumn] varchar10, the changes are not applied to the target database. The Error List does get updated, but it does not pop up, even when it is hidden, which gives the "illusion" that nothing has happened. Also, when there is a syntax error, as in the latter scenario, the Export to Editor button gets grayed out.
This may be a minor issue, but it sure is irritating when you don't know about it beforehand. Solution: whenever you find that updates are not being written to the target database, or that the Export to Editor button is grayed out, all you have to do is open the Error List…
I have been getting used to Datadude (a.k.a. Visual Studio Team System 2008 Database Edition) recently (I jumped directly onto the Datadude 2008 bandwagon).
Coming to the topic: when generating schema scripts, say to create a new database or alter an existing one, Datadude uses SQLCMD commands. It uses these, for example, to set the database name in a parameterized fashion, like this:
:setvar DatabaseName "MyDatabase"
:setvar DefaultDataPath "D:\Databases\"

CREATE DATABASE [$(DatabaseName)]
ON PRIMARY (
    NAME = [$(DatabaseName)],
    FILENAME = '$(DefaultDataPath)My_Database.mdf',
    SIZE = 51200 KB,
    MAXSIZE = UNLIMITED,
    FILEGROWTH = 10240 KB
)
LOG ON (
    NAME = [$(DatabaseName)_log],
    FILENAME = '$(DefaultDataPath)My_Database.ldf',
    SIZE = 102400 KB,
    MAXSIZE = 2097152 MB,
    FILEGROWTH = 1024 KB
);
When you need to parameterize things such as database names or file paths in a create-database script, the usual T-SQL parameters leave you in a fix: a database name, for instance, cannot be an ordinary T-SQL variable, so you would need to write slightly more complex T-SQL for the task, such as dynamic SQL, which I feel is somewhat unnatural. SQLCMD variables therefore make it quite simple and clean. What hurts me, though, is that SQLCMD has been around since SQL Server 2005, and I have only just adopted it.
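For comparison, the dynamic SQL workaround looks something like this (a minimal sketch; the database name is an illustrative placeholder):

```sql
-- Dynamic SQL workaround: a database name cannot be an ordinary
-- T-SQL variable in a CREATE DATABASE statement, so the statement
-- has to be built as a string and then executed.
DECLARE @DatabaseName sysname = N'MyDatabase';  -- placeholder name
DECLARE @sql nvarchar(max) =
    N'CREATE DATABASE ' + QUOTENAME(@DatabaseName) + N';';
EXEC sp_executesql @sql;
```

QUOTENAME guards against a name that needs bracketing, but the whole approach is still clumsier than a simple :setvar.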
Note that SQLCMD commands do not work under the default settings of the Management Studio Query Editor; you first need to enable SQLCMD mode from the Query menu.
The primary use of SQLCMD, though, is for command-prompt based querying. Still, it comes in quite handy in situations like the ones mentioned above.
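From the command prompt, a generated script like the one above can be run with the sqlcmd utility, overriding the scripting variables with the -v switch (the server and file names here are placeholders):

```shell
REM Run the generated script against a local instance using
REM Windows authentication (-E), overriding the SQLCMD variables.
REM .\SQLEXPRESS and CreateDatabase.sql are placeholder names.
sqlcmd -S .\SQLEXPRESS -E -i CreateDatabase.sql -v DatabaseName="MyDatabase" DefaultDataPath="D:\Databases\"
```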
SQL Server 2008 introduced a few Enterprise Edition-only features that change the structure of a database when implemented. These features are:
- Change data capture
- Transparent data encryption
- Data compression
Because of this, whenever you try to restore or attach such a database on a SQL Server 2008 instance of a lower edition, SQL Server will not allow it.
So what can you do when you have to move a database to a lower edition? Remove the offending features, of course; there is nothing else you can do. But how would you know which of these Enterprise Edition-only features are enabled in your database? Use the new dynamic management view sys.dm_db_persisted_sku_features. It is a simple view which returns each such feature enabled in your database, along with its ID. If the view returns any rows, all you need to do is disable those features before backing up.
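A quick check looks like this, run in the context of the database you intend to move (the database name is a placeholder):

```sql
USE MyDatabase;  -- the database you plan to move (placeholder name)
GO

-- Lists the edition-bound features persisted in this database.
-- An empty result set means the database can be restored or
-- attached on a lower edition.
SELECT feature_name, feature_id
FROM sys.dm_db_persisted_sku_features;
```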
Note: Partitioning, of course, was present in SQL Server 2005 as well, and there too you had to remove partitioning before you could restore a database on a lower edition.