Patrick Sirr's Blog

Importing Multiple Scripts using DTE


We have had a lot of requests from folks who would rather not go through the Import Script Wizard when they have multiple T-SQL source scripts. Fortunately, Visual Studio provides the ability to execute parameterized commands through its automation (DTE) object model. For a typical list of such commands visit this MSDN Library entry. In this blog I’ll lead you through a useful command provided by the General Distribution Release (GDR) for Visual Studio Team System 2008 Database Edition that allows you to bypass the Import Script Wizard and import your scripts directly into your project system.

Step 1: Create a script

Create a script named c:\myScript.sql with the following contents:

CREATE TABLE [dbo].[Table1]
(
column_1 int NOT NULL,
column_2 int NULL,
column_3 as (column_1 * 14.3)
)
GO
EXEC sp_addextendedproperty @name='TableWithComputedColumn',
@value ='column_3',
@level0type = 'SCHEMA',
@level0name = 'dbo',
@level1type = 'TABLE',
@level1name = 'Table1',
@level2type = NULL,
@level2name = NULL
GO
CREATE VIEW [dbo].[View1] AS SELECT column_3 FROM [dbo].[Table1]
GO
CREATE USER [User1] WITHOUT LOGIN WITH DEFAULT_SCHEMA = dbo;
GO
GRANT UPDATE ON [dbo].[Table1] TO [User1]

Create another script named c:\myScript2.sql with the following contents:

CREATE PROCEDURE [dbo].[Procedure1]
@param1 int = 0,
@param2 int
AS
SELECT @param1, @param2
RETURN 0

Step 2: Create a Project

Launch Visual Studio and create a SQL Server 2008 database project.

Step 3: Select the new project in the Solution Explorer

When you execute DTE commands, they execute in the context of the project that is selected in the Solution Explorer. You should back up your project or, better yet, check in your project to version control before you run this DTE command.


Step 4: Display the Command Window

By using the Command Window, you can run DTE commands and provide parameters to those commands that allow them. To display the Command Window, open the View menu, point to Other Windows, and click Command Window.

Step 5: Import the Script

The import script DTE command is “Project.ImportScript”. Now you are at the point where you can run the command and have some fun.

First, try the command without any parameters ("Project.ImportScript").

You’ll notice that the Import Script wizard appears. This is the command that is executed when you right-click on the project node and click “Import Script”. Not so interesting. Dismiss this dialog box. Next you will parameterize the command.

The parameters for “Project.ImportScript” are as follows.

Project.ImportScript /FileName [filename] [/encoding Unicode|UTF32|UTF8|UTF7] [/Overwrite] [/IgnoreExtendedProperties] [/IgnorePermissions] [/AddImportedPermissionsToModel]

/FileName [filename]

This is the file that you want to import. Place quotes around the file name if it contains embedded spaces.

/encoding Unicode|UTF32|UTF8|UTF7

The encoding of the file

[/Overwrite]

If provided, duplicate objects that are discovered will be overwritten.

[/IgnoreExtendedProperties]

Suppress importing extended properties

[/IgnorePermissions]

Suppress importing permissions

[/AddImportedPermissionsToModel]

If provided, any imported permissions will be added to the model. If this switch is omitted, the imported permissions script is given a Build Action of "Not In Build".

Now import your myScript.sql file. You can ignore extended properties and permissions. Run the following command:

Project.ImportScript /FileName "C:\myScript.sql" /IgnoreExtendedProperties /IgnorePermissions

Notice that your project now has the following files:

MyProject\Schema Objects\Schemas\dbo\Tables\Table1.table.sql

CREATE TABLE [dbo].[Table1]
(
column_1 int NOT NULL,
column_2 int NULL,
column_3 as (column_1 * 14.3)
)

MyProject\Schema Objects\Schemas\dbo\Views\View1.view.sql

CREATE VIEW [dbo].[View1] AS SELECT column_3 FROM [dbo].[Table1]

If you now run Project.ImportScript /FileName "C:\myScript.sql" /Overwrite /IgnorePermissions, the table script will be altered to:

CREATE TABLE [dbo].[Table1]
(
column_1 int NOT NULL,
column_2 int NULL,
column_3 as (column_1 * 14.3)
)
GO
EXEC sp_addextendedproperty @name='TableWithComputedColumn',
@value ='column_3',
@level0type = 'SCHEMA',
@level0name = 'dbo',
@level1type = 'TABLE',
@level1name = 'Table1',
@level2type = NULL,
@level2name = NULL

To import your second script, run Project.ImportScript /FileName "C:\myScript2.sql" in the Command Window. After you run the command, the project system contains the following file:

MyProject\Schema Objects\Schemas\dbo\Programmability\Stored Procedures\Procedure1.proc.sql

CREATE PROCEDURE [dbo].[Procedure1]
@param1 int = 0,
@param2 int
AS
SELECT @param1, @param2
RETURN 0

Obviously you can now repeat this process for all the SQL files that make up your schema.
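Because Project.ImportScript is an ordinary DTE command, that repetition is easy to automate. Below is a minimal C# sketch, assuming you already have an EnvDTE.DTE reference (dte) from an add-in or external automation, and that the target project is selected in Solution Explorer; the folder path is just an example.

// Import every .sql file in a folder by replaying the Project.ImportScript command.
// 'dte' is an EnvDTE.DTE instance; the receiving project must be selected in Solution Explorer.
string scriptFolder = @"C:\MyScripts";
foreach (string file in System.IO.Directory.GetFiles(scriptFolder, "*.sql"))
{
    string args = string.Format(
        "/FileName \"{0}\" /IgnoreExtendedProperties /IgnorePermissions", file);
    dte.ExecuteCommand("Project.ImportScript", args);
}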

Conclusion

Note that any DTE command is available through DTE automation and is accessible through EnvDTE80.DTE2.ExecuteCommand or EnvDTE80.DTE2.Commands.Raise. You can use the DTE commands to create add-ins, external processes, or packages to automate your specific system. The General Distribution Release (GDR) for Visual Studio Team System 2008 Database Edition provides other parameterized commands as well. I will blog about these soon.

 

- Patrick Sirr

“DataDude” Programmer


Schema Compare DTE Commands


In this blog I’ll lead you through the various parameters for the “Data.NewSchemaComparison” command available in the General Distribution Release (GDR) for Visual Studio Team System 2008 Database Edition and Visual Studio 2010 Beta 2. This command launches the Visual Studio Schema Compare editor and can, optionally, begin a comparison between database projects, .dbschema files or SQL Server databases.

The C# console application source code for Visual Studio Team System 2008 GDR is available here.

The C# console application source code for Visual Studio 2010 Beta 2 is available here.

If you have the Visual Studio Team System 2008 GDR or Visual Studio 2010 Beta 2 installed, open it and navigate to the Command Window. To display the Command Window, open the View menu, point to Other Windows, and click Command Window. At the prompt type “Data.NewSchemaComparison”. The New Schema Comparison dialog will appear. This is the expected result when executing “Data.NewSchemaComparison” without parameters. To avoid this dialog and immediately produce a schema comparison result the following parameters are available:

Visual Studio Team System 2008 GDR :

Data.NewSchemaComparison [/ProviderType ConnectionBased | ProjectBased | FileBased] [/ConnectionString connection] | [/DatabaseName databaseName] | [/ConnectionName name] | [/ProjectName proj] | [/FileName fileName] [/DataSourceGuid dataSource] [/ProviderType ConnectionBased | ProjectBased | FileBased] [/ConnectionString connection] | [/DatabaseName databaseName] | [/ConnectionName name] | [/ProjectName proj] | [/FileName fileName] [/DataSourceGuid dataSource]

Visual Studio 2010 Beta 2 :

Data.NewSchemaComparison [/ProviderType ConnectionBased | ProjectBased | FileBased] [/ConnectionString connection] | [/DatabaseName databaseName] | [/ConnectionName name] | [/ProjectName proj] | [/FileName fileName] [/DspFamily family] [/ProviderType ConnectionBased | ProjectBased | FileBased] [/ConnectionString connection] | [/DatabaseName databaseName] | [/ConnectionName name] | [/ProjectName proj] | [/FileName fileName] [/DspFamily family]

The command parameters may look daunting, but the general form is “Data.NewSchemaComparison /ProviderType &lt;source options&gt; /ProviderType &lt;target options&gt;”.

/ProviderType ConnectionBased | ProjectBased | FileBased

This identifies the type of the source or target provider in the schema compare.

/ConnectionString connection

For ConnectionBased providers, this is the connection string

/DatabaseName

For ConnectionBased providers, this is the name of the database

/ConnectionName name

This is the title used in the schema compare editor for this source or target

/ProjectName proj

For ProjectBased providers, this is the name of the project. This project must be open in the Solution Explorer.

/FileName fileName

For FileBased providers, this is the full path to the .dbschema file

/DataSourceGuid dataSource

NOTE: This option has been deprecated in Visual Studio 2010 Beta 2

For ConnectionBased providers, this is the data source guid. For SQL Server providers this guid is “067EA0D9-BA62-43f7-9106-34930C60C528”

/DspFamily

NOTE: This option is only available in Visual Studio 2010 Beta 2
Specifies the family of database schema providers which should be used to model the database.  For SQL Server use “sql”.

Example 1: Connection-Based Providers

I have a local SQL Server 2008 named instance called “SQL2008”. On that server I have two databases – “Larry” and “Moe”. The following command will launch a schema compare session between these two databases when entered into the Command Window:

Visual Studio Team System 2008 GDR :

Data.NewSchemaComparison /ProviderType ConnectionBased /ConnectionString "Data Source=.\sql2008;Integrated Security=True;Pooling=False" /ConnectionName Larry /DatabaseName Larry /DataSourceGuid 067EA0D9-BA62-43f7-9106-34930C60C528 /ProviderType ConnectionBased /ConnectionString "Data Source=.\sql2008;Integrated Security=True;Pooling=False" /ConnectionName Moe /DatabaseName Moe /DataSourceGuid 067EA0D9-BA62-43f7-9106-34930C60C528

Visual Studio 2010 Beta 2 :

Data.NewSchemaComparison /ProviderType ConnectionBased /ConnectionString "Data Source=.\sql2008;Integrated Security=True;Pooling=False" /ConnectionName Larry /DatabaseName Larry /DspFamily sql /ProviderType ConnectionBased /ConnectionString "Data Source=.\sql2008;Integrated Security=True;Pooling=False" /ConnectionName Moe /DatabaseName Moe /DspFamily sql

Example 2: Project-Based Providers

I have a solution with two database projects. The solution is currently open in Visual Studio and both projects are loaded. I can schema compare the projects by typing the following command into the Command Window.

Visual Studio Team System 2008 GDR & Visual Studio 2010 Beta 2 :

Data.NewSchemaComparison /ProviderType ProjectBased /ProjectName Abbott /ProviderType ProjectBased /ProjectName Costello

Example 3: File-Based Providers

I have two .dbschema files located at “C:\Laurel.dbschema” and “C:\Hardy.dbschema”. I can schema compare the files by typing the following command into the Command Window.

Visual Studio Team System 2008 GDR & Visual Studio 2010 Beta 2 :

Data.NewSchemaComparison /ProviderType FileBased /FileName c:\Laurel.dbschema /ProviderType FileBased /FileName c:\Hardy.dbschema

Example 4: Mixed Providers

At this point I have done schema comparisons between connections, projects and files. Any mix of source or target providers is also allowed. For example, I can compare my project Abbott.dbproj with my database Larry located in my local SQL Server 2008 instance through the command below.

Visual Studio Team System 2008 GDR :

Data.NewSchemaComparison /ProviderType ConnectionBased /ConnectionString "Data Source=.\sql2008;Integrated Security=True;Pooling=False" /ConnectionName Larry /DatabaseName Larry /DataSourceGuid {067EA0D9-BA62-43f7-9106-34930C60C528} /ProviderType ProjectBased /ProjectName Abbott

Visual Studio 2010 Beta 2 :

Data.NewSchemaComparison /ProviderType ConnectionBased /ConnectionString "Data Source=.\sql2008;Integrated Security=True;Pooling=False" /ConnectionName Larry /DatabaseName Larry /DspFamily sql /ProviderType ProjectBased /ProjectName Abbott

Example 5: Driving Schema Compare through Automation

I’ve used the Command Window to launch the Schema Compare editor. In this example I will control Visual Studio from another process, launch the Schema Compare editor, and save the change script to a file. When Visual Studio launches it places a DTE object in the Running Object Table (ROT). Other applications can use the ROT entry to control Visual Studio through the DTE object model. Each DTE entry in the ROT has a unique name beginning with “!VisualStudio” and ending with the process id. During the development of this application I used “irotview.exe” to view the various Visual Studio instance names.


I first launch Visual Studio and wait for it to start responding (the following example is for Visual Studio 2008):

// Launch Visual Studio
Process vs = new Process();
vs.StartInfo.FileName =
    Environment.ExpandEnvironmentVariables(@"%VS90COMNTOOLS%..\IDE\devenv.exe");
vs.Start();
while (!vs.Responding)
{
    System.Threading.Thread.Sleep(1000);
}

The next step is to get the EnvDTE._DTE instance from the ROT. Accessing the ROT is documented elsewhere so I won’t spend time with it here. I’ve authored a class that, given a Process object, will return either null or an EnvDTE._DTE instance. Note that there is some delay between when Visual Studio is responding and when it places the _DTE instance on the ROT. This is why I’ve placed the ROT query in a while loop.

EnvDTE._DTE dteInstance = null;

while (vs.HasExited == false &&
       dteInstance == null)
{
    dteInstance = ROTHelper.GetVisualStudioInstance(vs);
}
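ROTHelper is a class in the downloadable sample, so I won’t reproduce it here. For readers who want a starting point, here is a minimal sketch of how such a helper might find the DTE entry; the moniker prefix and the COM interop calls are standard, but treat the details as an approximation rather than the sample’s exact code.

using System.Diagnostics;
using System.Runtime.InteropServices;
using System.Runtime.InteropServices.ComTypes;

internal static class RotHelperSketch
{
    [DllImport("ole32.dll")]
    private static extern int GetRunningObjectTable(uint reserved, out IRunningObjectTable prot);

    [DllImport("ole32.dll")]
    private static extern int CreateBindCtx(uint reserved, out IBindCtx ppbc);

    // Returns the EnvDTE._DTE instance registered by the given Visual Studio process, or null.
    public static EnvDTE._DTE GetVisualStudioInstance(Process vsProcess)
    {
        IRunningObjectTable rot;
        if (GetRunningObjectTable(0, out rot) != 0)
            return null;

        IEnumMoniker enumMoniker;
        rot.EnumRunning(out enumMoniker);

        IMoniker[] moniker = new IMoniker[1];
        while (enumMoniker.Next(1, moniker, System.IntPtr.Zero) == 0)
        {
            IBindCtx bindCtx;
            CreateBindCtx(0, out bindCtx);

            string displayName;
            moniker[0].GetDisplayName(bindCtx, null, out displayName);

            // DTE entries look like "!VisualStudio.DTE.9.0:<processId>"
            if (displayName.StartsWith("!VisualStudio.DTE") &&
                displayName.EndsWith(":" + vsProcess.Id))
            {
                object runningObject;
                rot.GetObject(moniker[0], out runningObject);
                return runningObject as EnvDTE._DTE;
            }
        }
        return null;
    }
}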

Once the EnvDTE._DTE instance is available I use the ExecuteCommand method to launch a Schema Compare editor.

// Create a schema compare session which compares
// two databases
string commandArg = string.Format(CultureInfo.InvariantCulture,
            "/ProviderType ConnectionBased " +
            "/ConnectionString {0} " +
            "/ConnectionName {1} " +
            "/DatabaseName {1} " +
            "/DataSourceGuid {2} " +
            "/ProviderType ConnectionBased " +
            "/ConnectionString {3} " +
            "/ConnectionName {4} " +
            "/DatabaseName {4} " +
            "/DataSourceGuid {5}",
            "\"Data Source=.\\sql2008;Integrated Security=True;Pooling=False\"",
            "Larry",
            guidSqlServerDataSource.ToString(),
            "\"Data Source=.\\sql2008;Integrated Security=True;Pooling=False\"",
            "Moe",
            guidSqlServerDataSource.ToString());

// Execute the command
dteInstance.ExecuteCommand("Data.NewSchemaComparison", commandArg);

The Schema Compare editor has no event telling me when it has finished comparing. In order to export the change script I use ExecuteCommand and catch COMExceptions until it succeeds or the Visual Studio process exits. The COMExceptions are expected if the command I am executing is currently disabled in the Visual Studio IDE.

bool done = false;
string outputFile = @"C:\Export.sql";

while (!done &&
       vs.HasExited == false)
{
    try
    {
        Console.WriteLine("Outputting script to " + outputFile);
        dteInstance.ExecuteCommand("Data.SchemaCompareExportToFile",
            outputFile);
        done = true;
    }
    catch (COMException)
    {
        Console.WriteLine("Waiting for message to become available");
        Thread.Sleep(1000);
    }
}

Finally, I close Visual Studio

if (vs.HasExited == false)
{
    // Close VS
    dteInstance.Application.Quit();
}

Conclusion:

I’ve exercised the Schema Compare editor using files, projects and databases and had some fun controlling Visual Studio through the ROT! In future blogs I’ll introduce more of the DTE commands available to you.

- Patrick Sirr

Data Compare DTE Commands


In this blog I’ll lead you through the various parameters for the “Data.NewDataComparison” command available in the General Distribution Release (GDR) for Visual Studio Team System 2008 Database Edition. This command launches the Visual Studio Data Compare editor.

If you have the Visual Studio Team System 2008 GDR installed, open it and navigate to the Command Window. To display the Command Window, open the View menu, point to Other Windows, and click Command Window. At the prompt type “Data.NewDataComparison”. The New Data Comparison dialog will appear. This is the expected result when executing “Data.NewDataComparison” without parameters. To avoid this dialog and immediately produce a data comparison result the following parameters are available:

Data.NewDataComparison /SrcServerName srcServer /SrcDatabaseName srcDatabaseName /SrcDisplayName displayName [/SrcUserName username] [/SrcPassword password] /TargetServerName targetServer /TargetDatabaseName targetDatabaseName /TargetDisplayName displayName [/TargetUserName username] [/TargetPassword password]

The command has the general form “Data.NewDataComparison &lt;source options&gt; &lt;target options&gt;”.

/SrcServerName

/TargetServerName

Specify the name of the server that contains the data for the source/target of the comparison.

/SrcDatabaseName

/TargetDatabaseName

Specify the name of the database that contains the data for the source/target of the comparison.

/SrcDisplayName

/TargetDisplayName

Specify the name that you want to appear in the Data Compare window for the source/target of the comparison.

/SrcUserName

/TargetUserName

Specify the user name that you want to use to connect to the database that contains the data for the source/target of the comparison.

If you’re using Windows Authentication you can omit this option.

/SrcPassword

/TargetPassword

Specify the password for the user name that you want to use to connect to the database that contains the data for the source/target of the comparison. 

If you’re using Windows Authentication you can omit this option.

I have two named SQL Server 2008 instances on my box – SQL2008_1 and SQL2008_2. The first instance has a database named ‘Drake’ and the other ‘Josh’. I’m using Windows Authentication, so no username or password is required. A Data Compare session can be launched by typing the following command into the Command Window:

Data.NewDataComparison /SrcServerName .\SQL2008_1 /SrcDatabaseName Drake /SrcDisplayName Drake /TargetServerName .\SQL2008_2 /TargetDatabaseName Josh /TargetDisplayName Josh

If differences are available, I can now use the command “Data.DataCompareExportToFile [filename]” to create a data update script. When I execute “Data.DataCompareExportToFile C:\JoshUpgrade.sql" in the Command Window a data upgrade script will be created.
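These two commands can also be driven from automation, just like the Schema Compare commands in the previous post. A minimal sketch follows, assuming dteInstance is an EnvDTE._DTE obtained from the ROT as shown there; the export is retried because the command is disabled until the comparison finishes.

// Launch a Data Compare session between Drake and Josh, then export the update script.
string args =
    "/SrcServerName .\\SQL2008_1 /SrcDatabaseName Drake /SrcDisplayName Drake " +
    "/TargetServerName .\\SQL2008_2 /TargetDatabaseName Josh /TargetDisplayName Josh";
dteInstance.ExecuteCommand("Data.NewDataComparison", args);

bool exported = false;
while (!exported)
{
    try
    {
        dteInstance.ExecuteCommand("Data.DataCompareExportToFile", @"C:\JoshUpgrade.sql");
        exported = true;
    }
    catch (System.Runtime.InteropServices.COMException)
    {
        System.Threading.Thread.Sleep(1000); // export not yet enabled; keep waiting
    }
}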

Conclusion

At this point I’ve illustrated how to use the Import Script, Schema Compare and Data Compare DTE commands. I hope you’re starting to see the flexibility this provides for authoring your own addins, packages or applications tailored to your specific requirements.

- Patrick Sirr

“DataDude” Programmer

Database Project Import Scripts Add In


In this blog I’ll introduce you to the Import Scripts add-in for the General Distribution Release (GDR) for Visual Studio Team System 2008 Database Edition. With this add-in Visual Studio can import multiple scripts to populate a target database project. I’ve included the source code, but in order to compile the install project you’ll need to install Votive 3.0 (I used 3.0.4707.0).

To install run InstallImportScripts.msi (available in this zip). This install will create these files on your filesystem:

%VSINSTALLDIR%\Common7\Ide\PublicAssemblies\ImportScriptsRecursively.dll

C:\ProgramData\Microsoft\MSEnvShared\Addins\ImportScriptsRecursively.AddIn

After installing run Visual Studio 2008 and bring up the Add-in Manager by selecting Tools.Add-in Manager from the main toolbar. Verify that “Database Project Script Import” is available and selected to load.


A new menu item called “Import Scripts…” will appear in the Data menu whenever at least one Database Project is open in Solution Explorer.


Selecting this menu button will bring up the Scripts Selector dialog. Use this dialog to:

· Select the project which will receive the imported database schema objects (the target project)

· Select the directory to search for .sql files

· Select or deselect individual files or folders from the import process. Note that, by default, any scripts named Script.PostDeployment.sql, Script.PreDeployment.sql or IgnoredOnImport.sql will not be selected.

· Select the default encoding for the .sql files. If you run the add-in and no database schema objects are imported, verify that the encoding of your files matches the encoding you’ve selected in the Scripts Selector dialog.

· Choose among the various import script options available on the Import Script Wizard.

· If you choose to concatenate your scripts into one large file, memory pressure will increase but performance will improve dramatically.


Once you select ‘Run Import’ the Output Window will be activated and messages will appear providing you with feedback on the import script process. Each file will produce output similar to the example below. The output includes 1) a progress indicator (in the case below 4/700) and 2) the DTE command which can be executed in the Command Window to re-import that particular script (in the case below Project.ImportScript /FileName "C:\temp\AdventureWorks\AdventureWorks\Schema Objects\Database Level Objects\Security\Schemas\Person.schema.sql" /Encoding UTF8).

Executing 4/700 : Project.ImportScript /FileName "C:\temp\AdventureWorks\AdventureWorks\Schema Objects\Database Level Objects\Security\Schemas\Person.schema.sql" /Encoding UTF8

Started importing file: C:\temp\AdventureWorks\AdventureWorks\Schema Objects\Database Level Objects\Security\Schemas\Person.schema.sql

File name C:\temp\AdventureWorks\AdventureWorks\Schema Objects\Database Level Objects\Security\Schemas\Person.schema.sql (size: 270)

Parsing SQL script

Total number of batches in script: 2

Total number of statements in script: 2

Finished importing file: C:\temp\AdventureWorks\AdventureWorks\Schema Objects\Database Level Objects\Security\Schemas\Person.schema.sql

Conclusion

The Import Scripts Add-in allows you to import multiple scripts at once. This should be a useful addin when upgrading from CTPs (which do not support upgrade) or importing from a script archive into a database project for the first time.

- Patrick Sirr

- DataDude Programmer

Navigating the Data Dude Object Model


In this blog I’ll introduce you to the basics of navigating the General Distribution Release (GDR) for Visual Studio Team System 2008 Database Edition object model. I’ve wrapped it all into an add-in so you can step through the code and extend it as you wish. I hope you find it useful! The first part of this blog will describe the basics for navigating the model. After we’ve covered that ground I’ll introduce you to the add-in.

Basic Navigation

I’m assuming you are writing an add-in. If so, your add-in will be initialized with a DTE2 object in its OnConnection method. This provides you with all the open projects in your solution. To find the database projects, loop over all projects, finding those with an EnvDTE.Project.Kind of “C8D11400-126E-41CD-887F-60BD40844F9E”.
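That loop might look something like the sketch below (dte2 is the DTE2 instance handed to OnConnection; Project.Kind strings are typically brace-delimited and upper-case, so the comparison is deliberately loose):

// Collect the database projects in the open solution.
// The GUID is the database project kind quoted above.
const string databaseProjectKind = "C8D11400-126E-41CD-887F-60BD40844F9E";

System.Collections.Generic.List<EnvDTE.Project> databaseProjects =
    new System.Collections.Generic.List<EnvDTE.Project>();

foreach (EnvDTE.Project project in dte2.Solution.Projects)
{
    // Note: this sketch does not recurse into solution folders.
    if (project.Kind != null &&
        project.Kind.IndexOf(databaseProjectKind, System.StringComparison.OrdinalIgnoreCase) >= 0)
    {
        databaseProjects.Add(project);
    }
}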

Now that you’ve identified the EnvDTE.Project instance you can gain access to the DataDude model through the properties collection of this project. Get the “DataSchemaModel” property and cast to a Microsoft.Data.Schema.SchemaModel.DataSchemaModel. Note that you’ll need to add references to the DataDude assemblies Microsoft.Data.Schema.dll and Microsoft.Data.Schema.Sql.dll.

public static DataSchemaModel GetModel(EnvDTE.Project project)
{
    object obj = project.Properties.Item("DataSchemaModel");
    return ((EnvDTE.Property)obj).Value as DataSchemaModel;
}

Note that the DataSchemaModel is thread-safe but is not XmlSerializable. So if you’re using DTE from an external process the DataSchemaModel property will return null.

Let’s continue and query the model for all the tables. Use the member DataSchemaModel.GetElements querying for ISqlTable. Note that we are choosing to use ElementQueryFilter.Internal as the argument because we only want those tables which have been defined by the user – not those brought in through an external reference to a .dbschema file. Here’s an example of how to retrieve all the tables in the project:

public static IList<ISqlTable> GetTables(DataSchemaModel model)
{
    return model.GetElements<ISqlTable>(ElementQueryFilter.Internal);
}

Great! Now we have all the tables. Note that the objects returned by this query support other interfaces. For instance, ISqlTable supports ISqlColumnSource which makes navigating the table columns much easier.
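As a sketch of what that buys you, the loop below walks each table’s columns; it assumes ISqlColumnSource exposes a Columns collection of ISqlColumn, so check the exact member names in the GDR object model browser before relying on them.

// Enumerate each table's columns through ISqlColumnSource.
// The Columns member and ISqlColumn usage are assumptions based on the
// interface mentioned above; verify against the GDR assemblies.
foreach (ISqlTable table in GetTables(model))
{
    ISqlColumnSource columnSource = (ISqlColumnSource)table;
    foreach (ISqlColumn column in columnSource.Columns)
    {
        System.Console.WriteLine("{0} -> {1}", table.Name, column.Name);
    }
}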

Even though you have access to the model, changes to it are prohibited and will throw an exception. This is because the project system’s model is controlled through scripts; it is only through these scripts that the model can be modified. OK, so the next logical question is – “Where is this table defined? What script? Where in that script?” Good question! Cast your object (i.e. the ISqlTable instance) to IScriptSourcedElement. If this interface returns an ElementSource then the element does indeed live in a source file in the project system. The CacheIdentifier field will be the full path of the file in the project system, and the length/offset give the location within that file of this particular object. Note also that IScriptSourcedElement gives you access to the ScriptDom, allowing you to walk the ASTs if your needs include walking our parse tree. If you download the add-in source below and look at VsUtils.GetSelectedModelElements you’ll see where I use the ElementSource.

Add-In Example

Now I’ll introduce you to a General Distribution Release (GDR) for Visual Studio Team System 2008 Database Edition Add-in which will alter the script in your project to comment out selected schema-owned elements. I’ve included the source code but, in order to compile the install project you’ll need to install Votive 3.0 (I used 3.0.4707.0). This add-in uses the DataSchemaModel features I talked about in the previous section.

To install run CommentOutAddIn.msi (available in this zip). This install will create these files on your filesystem:

%VSINSTALLDIR%\Common7\Ide\PublicAssemblies\VSTSDBCommentAddIn.dll

C:\ProgramData\Microsoft\MSEnvShared\Addins\CommentOutElementAddIn.AddIn

After installing run Visual Studio 2008 and bring up the Add-in Manager by selecting Tools.Add-in Manager from the main toolbar. Verify that “Database Project Comment Out Add-in” is available and selected to load.

A new menu item called “Comment Out…” will appear in the Data menu whenever at least one Database Project is open in Solution Explorer.

Selecting this menu button will bring up the Comment Out dialog. Select the Database Project to search and click the “Find Schema Objects” button. This will search the project for all Tables, Views and Procedures and populate a tree control with the results.

Select the schema objects you’d like to comment out and click “Comment Out Objects”. The dialog will close and all the selected elements will be commented out in their respective files.

For each file you should see a message similar to this in the output window:

C:\TEMP\DATABASE3\DATABASE3\SCHEMA OBJECTS\SCHEMAS\SALES\TABLES\SPECIALOFFERPRODUCT.TABLE.SQL is being modified so selected schema objects may be commented out.

Here’s the result of commenting out a table:

/*CREATE TABLE [Sales].[SpecialOfferProduct] (
    [SpecialOfferID] INT              NOT NULL,
    [ProductID]      INT              NOT NULL,
    [rowguid]        UNIQUEIDENTIFIER ROWGUIDCOL NOT NULL,
    [ModifiedDate]   DATETIME         NOT NULL
);*/

Conclusion

The Comment Out Add-in is capable of commenting out Tables, Views and Procs in your scripts. But more than that, I hope the source code provides some help for those wishing to write add-ins tailored to their specific needs. Enjoy!

- Patrick Sirr

“DataDude” Senior Programmer

Using T4 Templates with GDR Database Projects


For this blog I’ve developed an add-in for General Distribution Release (GDR) for Visual Studio Team System 2008 Database Edition projects (better known as “Data Dude”) which will generate Create, Read, Update and Delete (CRUD) stored procedures for a particular table in your model.  I wanted to provide the ability to affect the content of those procedures without recompiling my add-in so I’m using Visual Studio’s built in, template based, code generation tool - known as T4.  It’s not well known but extremely powerful.  There’s a good MSDN article here. I’ll go through the key pieces of the code, but there’s nothing like running it for yourself, so I’ve provided the entire source in this zip.

Overview of the Add-In

The zip reference above contains an MSI called InstallCRUDGenerator.msi. Once you install, make sure the add-in is loaded by selecting the “Tools.Add-in Manager” menu button:


Once loaded, any scripts containing tables will display a “CRUDGenerator” button in your Solution Explorer context menu.


If you select this, the CRUD Generation Dialog will appear.


This dialog allows you to select your source table script and then view the generated CRUD stored procedures. If you’re happy with the generated procedures select “Save Procedures to Project” and 4 new stored procedures will be created in your project.

Now for the details! The text box in the middle of the CRUD Generation Dialog contains your T4 template. At the start of this blog I gave you a good T4 link, so I won’t go into details on the scripting language itself, but let’s adjust the Create procedure generated above. First, let’s add my company name to the comments for this stored procedure. The comment is generated on lines 10 through 13 of the T4 template:

/*
CREATE procedure for <#= tableFullName #>
Generated <#= DateTime.Now.ToString() #>
*/

Let’s change this to

/*
Copyright (c) Microsoft Corporation. All rights reserved.
CREATE procedure for <#= tableFullName #>
Generated <#= DateTime.Now.ToString() #>
*/

…and hit the “Generate” button.

Notice now that our copyright appears in the generated stored procedure T-SQL and we did this without modifying our add-in. Very cool!

Feel free to experiment and modify the T4 templates I provide in this add-in. There are 5 templates installed into “%ProgramFiles%\Microsoft Visual Studio 9.0 CRUD Template AddIn”:

· CrudCreate.tt

o Responsible for generating the ‘Create’ stored procedure

· CrudDelete.tt

o Responsible for generating the ‘Delete’ stored procedure

· CrudRead.tt

o Responsible for generating the ‘Read’ stored procedure

· CrudUpdate.tt

o Responsible for generating the ‘Update’ stored procedure

· CrudToolbox.tt

o Useful utilities for walking the Data Dude model. This is where the columns are analyzed to determine how they are used in the templates above.

CrudToolbox.tt contains the most code. It’s here where I walk the Data Dude object model to determine the columns and column types for the selected table. If you wish to change this file, remember that it too can be altered without having to restart Visual Studio. Just open it in a text editor or another Visual Studio instance, edit it, and then hit the ‘Generate’ button as we did before.

Overview of the Add-In Code

In this section I’ll provide you with an overview of the CRUD Generator Add-in code. For anyone that’s ever generated a Visual Studio Add-in project you’ll know that Connect.cs is where the add-in initializes. It’s there where I add the ‘CRUD Generator’ button to the Solution Explorer context menu and handle the button click. My handling is a simple pass-off to a class I call the CRUDEngine:

public void Exec(string commandName, vsCommandExecOption executeOption, ref object varIn, ref object varOut, ref bool handled)
{
    handled = false;
    if (executeOption == vsCommandExecOption.vsCommandExecOptionDoDefault)
    {
        if (commandName == "CRUDGenerator.Connect.CRUDGenerator")
        {
            CRUDEngine engine = new CRUDEngine(ProjectFinder.GetFirstDatabaseProject(_applicationObject));
            engine.Run();
            handled = true;
            return;
        }
    }
}

This engine creates an instance of my WinForms dialog and provides that dialog with a class I call the CRUDGenerator:

public void Run()
{
    CRUDDialog dialog = new CRUDDialog();
    dialog.Generator = new CRUDGenerator(_project);
    dialog.ShowDialog();
}

I won’t go over my CRUDDialog since most of it is simple WinForms programming. The CRUDGenerator is where we’ll focus next. This class is responsible for executing the T4 scripting engine and returning the results. For more information on executing T4 programmatically see this msdn page.

The most important thing to note when walking the Data Dude object model is that it is not serializable (although it is thread-safe). This means you cannot access the object model from another process (i.e. through out-of-process DTE) or from another AppDomain. Fortunately, during the process of executing the T4 template engine you have the opportunity to provide a class which must derive from ITextTemplatingEngineHost. My instance of that host is called VSDBCustomHost, and in it I provide the current AppDomain:

public AppDomain ProvideTemplatingAppDomain(string content)
{
    return AppDomain.CurrentDomain;
}

The core of this class is its RunTemplate method. In that method I create an instance of a VSDBCustomHost and process the template:

public string RunTemplate(ISqlTable selectedTable,
                        string templateFullPath,
                        out CompilerErrorCollection errors)
{
    string tableScript = GetTableText(selectedTable);
    errors = null;
    if (string.IsNullOrEmpty(templateFullPath))
        return string.Empty;

    VSDBCustomHost host = new VSDBCustomHost();
    Microsoft.VisualStudio.TextTemplating.Engine engine = new Microsoft.VisualStudio.TextTemplating.Engine();
    host.TemplateFile = templateFullPath;
    host.IncludePath = Path.Combine(VsUtils.GetVSTSDBDirectory(), @"Extensions\SqlServer\");
    host.VsUtils = _vsUtils;
    AppDomain domain = host.ProvideTemplatingAppDomain(string.Empty);
    string output = string.Empty;
    try
    {
        // Set the table to be generated into the app domain
        domain.SetData("TT_TABLE", selectedTable);
        // Read the text template.
        string input = File.ReadAllText(templateFullPath);
        // Transform the text template.
        output = engine.ProcessTemplate(input, host);
        errors = host.Errors;
    }
    finally
    {
        domain.SetData("TT_TABLE", null);
    }
    return output;
}

At the beginning of this blog is the link to the zip file containing all the source for this add-in. From here I’ll leave you to peruse the source.

Conclusion

Hopefully you’ve now got an appreciation of the power of T4 templates! Remember that if you want to use these templates with Data Dude projects you’ll have to provide your own templating engine host, because the Data Dude model is not serializable and therefore cannot cross process or AppDomain boundaries. Feel free to ping me if you’ve got questions! Have a good 2009!

 

Patrick Sirr

Data Dude Programmer

Template Driven Sql Generation


Recently Duke Kamstra and I demonstrated how you can use Visual Studio 2008 Database and Server projects to deploy Sql Server permission objects to your various Sql Server instances. It’s common to develop on a local box with admin rights, and then restrict those rights as you start deploying to test, pre-production and, eventually, your production Sql Server instance. Database projects assume your schema model is static – an assumption that may work for Tables, Views and Sprocs, but not so much for Logins. In this blog I’ll show you three ways to manage SQL security principals that change as a schema is deployed to different staging environments: Pre-Build Events, Post-Deployment scripts and Visual Studio Extensibility.

Pre-Build Events

With this mechanism you’re swapping in new source files prior to each build. In the sample solution below I have two Visual Studio Configurations – ‘Sandbox’ and ‘Integration’. For each security object I’ve created 2 new files and navigated to the Properties Window to change their Build Action to ‘Not in Build’.

The pre-build event action is found in the project properties ‘Build Events’ tab.

In the pre-build textbox I’ve inserted a script for swapping in new security objects just prior to the build.

if [$(Configuration)] == [Integration] (
    xcopy /r /y "$(ProjectDir)User1.user.integration.sql" "$(ProjectDir)User1.user.sql"
    echo copied "$(ProjectDir)User1.user.integration.sql" to "$(ProjectDir)User1.user.sql"
) else if [$(Configuration)] == [Sandbox] (
    xcopy /r /y "$(ProjectDir)User1.user.sandbox.sql" "$(ProjectDir)User1.user.sql"
    echo copied "$(ProjectDir)User1.user.sandbox.sql" to "$(ProjectDir)User1.user.sql"
) else (
    del "$(ProjectDir)User1.user.sql"
)

Note that I’m using “xcopy /r”, which overwrites the target file even if it’s marked read-only. I consider this a reasonable approach because, even if you do have this file in version control, it contains no useful information. The interesting, versionable information actually resides in the .sandbox.sql and .integration.sql files.

Advantages

  • Object references in the schema model will be validated during the build process
  • Deploy will accurately compare the identifiers for security objects between the project and the target database

Disadvantages

  • Refactoring the security object will not modify the definition in the configuration specific scripts.
  • Schema Compare will compare the identifier defined in the script which is “In Build” to the target. This may result in unexpected differences.
  • You need to create a new Visual Studio configuration for each target SQL Server instance you will be deploying to.

Post-Deployment File

Another approach to creating security objects is to place all your security-based T-Sql in your Post-Deployment Script.

Here’s an example of a Post-Deployment script used in this way:

/*

Post-Deployment Script Template

*/

IF (('$(DatabaseName)') = 'TR9_Database_Dev')
BEGIN
    PRINT N'Creating Sandbox Security Objects...';
    IF NOT EXISTS (SELECT * FROM sys.server_principals WHERE name = N'TR9-DEV\TR9_CONTROL')
    BEGIN
        CREATE LOGIN [TR9-DEV\TR9_CONTROL] FROM WINDOWS;
        EXECUTE sp_addsrvrolemember @loginame = N'TR9-DEV\TR9_CONTROL', @rolename = N'dbcreator';
        IF NOT EXISTS (SELECT * FROM sys.database_principals WHERE name = N'TR9-DEV\TR9_CONTROL')
        BEGIN
            CREATE USER [TR9Control] FOR LOGIN [TR9-DEV\TR9_CONTROL];
            EXECUTE sp_addrolemember @rolename = N'TR9_Control', @membername = N'TR9Control';
        END
    END
    IF NOT EXISTS (SELECT * FROM sys.server_principals WHERE name = N'TR9-DEV\TR9_USERS')
    BEGIN
        CREATE LOGIN [TR9-DEV\TR9_USERS] FROM WINDOWS
        IF NOT EXISTS (SELECT * FROM sys.database_principals WHERE name = N'TR9-DEV\TR9_USERS')
        BEGIN
            CREATE USER [TR9Users] FOR LOGIN [TR9-DEV\TR9_USERS];
            EXECUTE sp_addrolemember @rolename = N'TR9_Users', @membername = N'TR9Control';
            EXECUTE sp_addrolemember @rolename = N'TR9_Users', @membername = N'TR9Users';
        END
    END
END
ELSE
BEGIN
    PRINT N'Creating Integration Security Objects...'
    IF NOT EXISTS (SELECT * FROM sys.server_principals WHERE name = N'TR9-INT\TR9_CONTROL')
    BEGIN
        CREATE LOGIN [TR9-INT\TR9_CONTROL] FROM WINDOWS;
        EXECUTE sp_addsrvrolemember @loginame = N'TR9-INT\TR9_CONTROL', @rolename = N'dbcreator';
        IF NOT EXISTS (SELECT * FROM sys.database_principals WHERE name = N'TR9-INT\TR9_CONTROL')
        BEGIN
            CREATE USER [TR9Control] FOR LOGIN [TR9-INT\TR9_CONTROL];
            EXECUTE sp_addrolemember @rolename = N'TR9_Control', @membername = N'TR9Control';
        END
    END
    IF NOT EXISTS (SELECT * FROM sys.server_principals WHERE name = N'TR9-INT\TR9_USERS')
    BEGIN
        CREATE LOGIN [TR9-INT\TR9_USERS] FROM WINDOWS
        IF NOT EXISTS (SELECT * FROM sys.database_principals WHERE name = N'TR9-INT\TR9_USERS')
        BEGIN
            CREATE USER [TR9Users] FOR LOGIN [TR9-INT\TR9_USERS];
            EXECUTE sp_addrolemember @rolename = N'TR9_Users', @membername = N'TR9Control';
            EXECUTE sp_addrolemember @rolename = N'TR9_Users', @membername = N'TR9Users';
        END
    END
END

Once again, since the objects do not exist as part of the project system model they cannot participate in the schema differencing process which occurs prior to deployment. Any errors in the objects or your script will not appear until you deploy.

Advantages

  • All of your security objects are defined in one script.

Disadvantages

  • Since the security objects are not defined in the database schema they will not participate in Refactoring or Schema Compare.
  • You may need a conditional in the script for each target SQL Server you will be deploying to. If your security objects are similar enough, you may be able to use SqlCmd variables instead of the IF/ELSE scheme I used.

Visual Studio Extensibility

Now I’ll introduce you to the Visual Studio Add-In ProcessSqlTemplateAddIn and associated MSBuild Task. These components use Visual Studio Extensibility to pre-process your T-Sql scripts and bind the value of SQLCMD variables to the respective values defined in the active project configuration. When you select a project configuration the MSBuild task scans the project for template script files. Any variables that are discovered in the templates are replaced with the respective value.

Install the Add-In and MSBuild Task using this msi. The MSI will install the following components:

  • %ProgramFiles%\MSBuild\Microsoft\VisualStudio\v9.0\TeamData\ProcessSqlTemplate.Task.Targets
  • %ProgramFiles%\Microsoft Visual Studio 9.0\Common7\IDE\PublicAssemblies\ProcessSqlTemplateAddIn.dll
  • %ProgramFiles%\Microsoft Visual Studio 9.0\VSTSDB\ProcessSqlTemplate.Task.dll
  • %ProgramData%\Microsoft\MSEnvShared\Addins\ProcessSqlTemplateAddIn.AddIn

After installing, create a Server Project and in that project create a file called ‘Login1.login.template.sql’. The “template.sql” at the end of the filename indicates to the Add-In that this is a template script. Copy the following text to that file:

--*CREATE LOGIN [$(SVR)\TR9_CONTROL] FROM WINDOWS
--*GO
--*CREATE LOGIN [$(SVR)\TR9_USERS] FROM WINDOWS

The “--*” indicates that this text is part of the template.

Next, edit your project properties to create two new .sqlcmdvars files – one for your Sandbox and one for your Integration configuration. In the Sandbox .sqlcmdvars file define $(SVR) as “SVR-DEV”. In your Integration .sqlcmdvars file define $(SVR) as “SVR-INT”.
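To make the transformation concrete, here is a rough C# sketch of the kind of expansion the task performs: strip the “--*” markers and substitute $(NAME) tokens with values from the active configuration’s .sqlcmdvars file. This is only an illustration; the real logic lives in the ProcessSqlTemplate.Task project.

// Illustrative sketch (not the task's actual code): expand a .template.sql file by
// dropping the leading "--*" markers and replacing $(NAME) tokens with sqlcmdvars values.
// Requires System.Collections.Generic, System.IO and System.Text.
static string ExpandTemplate(string templatePath, IDictionary<string, string> sqlCmdVars)
{
    StringBuilder output = new StringBuilder();
    foreach (string rawLine in File.ReadAllLines(templatePath))
    {
        string line = rawLine.StartsWith("--*") ? rawLine.Substring(3) : rawLine;
        foreach (KeyValuePair<string, string> variable in sqlCmdVars)
        {
            line = line.Replace("$(" + variable.Key + ")", variable.Value);
        }
        output.AppendLine(line);
    }
    return output.ToString();
}

// For example, with { "SVR", "SVR-DEV" } the template above expands to
// CREATE LOGIN [SVR-DEV\TR9_CONTROL] FROM WINDOWS, and so on.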

Now we need to chain our new MSBuild task into the build process. As part of the ProcessSqlTemplateAddIn MSI a targets file was installed. This target file contains the definition of the ProcessSqlTemplate MSBuild task. Unload the project and insert the following Import statement into your project file after the ‘Microsoft.Data.Schema.SqlTasks.targets’ import statement:

<Import Project="$(MSBuildExtensionsPath)\Microsoft\VisualStudio\v9.0\TeamData\ProcessSqlTemplate.Task.Targets" />

Reload the project and you’ll see that switching configurations will alter the contents of your ‘Login1.login.template.sql’ script according to the value of your sqlcmdvars for that configuration. A sample project is included here.

Before I leave this topic there’s one advantage I’d like to explain. Because this approach is bound to your configuration’s SqlCmd variables file, you can provide these variable values on the MSBuild command line. To do this, create a new ItemGroup in your project file with the name of the SqlCmd variable you may want to override:

<ItemGroup>
    <SqlCommandVariableOverride Include="SVR=$(SVR)" Condition="'$(SVR)' != ''"/>
</ItemGroup>

With this ItemGroup in place I can now execute MSBuild supplying the configuration as well as an override for the $(SVR) sqlcmd variable. There’s a good forum post on this topic here.

msbuild /t:rebuild /property:Configuration=Sandbox;SVR=Patrick TR9_Database\TR9_Database.sln

Advantages

  • The database developer only needs to implement one SQL script defining the security object.
  • The identifier for the security object is represented as a variable that is bound to the correct value when the project is built.
  • Deploy will correctly compare the project’s schema definition with the target database.
  • It makes use of SqlCmd variables which are overridable on the command line.

Disadvantages

  • The Add-In and MSBuild task need to be installed on each Developer’s computer as well as the TFS Build server.
  • The .dbproj file must be modified to include the MSBuild target.
  • Refactoring will not modify the schema template definitions.

Visual Studio Extensibility Source Code

The source code for the Add-in, MSBuild task and the installer is located here. To compile this you’ll need Visual Studio 2008, the Visual Studio 2008 VSSDK, and Votive 3.0 (I used 3.0.4707.0).

The MSBuild add-in is located in the ProcessSqlTemplateAddIn project. Visit the source file Connect.impl.cs for the most interesting details of the implementation. Note that Connect implements IVsUpdateSolutionEvents and IVsSolutionEvents. These events help the add-in listen for database projects opening and closing as well as for configuration changes. When a configuration does change, ProcessAllTemplates is called, which calls on the MSBuild task to do the template processing.

The MSBuild task is located in the ProcessSqlTemplate.Task project. Take a look at ProcessSqlTemplateTask.Execute for its entry method.

Conclusion

In this blog we’ve discussed three possibilities for deploying server objects to different Sql Server instances from a single Visual Studio Solution. To install the msi or view the source code you’ll need Visual Studio Team System 2008 Database Edition GDR.

The source code for the Visual Studio ProcessSqlTemplate MSBuild Task and Add-in is here.

The sample project demonstrating how to use the Add-In is here.

The ProcessSqlTemplateAddIn MSI is located here.

Sample Visual Studio 2008 Fonts and Color Provider


Adding a new Font and Color category to the Visual Studio Options dialog is, admittedly, a little tricky. There are several services your package must proffer, an interface you must implement and registry keys you must provide (one of them very poorly named!). I was recently asked to provide a fully working sample to help with a debugging session so I thought I’d share it.

For this blog I found two exceptional resources. Dr Ex. has a good blog post about the process for contributing to the Visual Studio fonts and colors dialog located here. There was also a post by Matthew Etter located here about creating a derived RegistrationAttribute to construct the necessary Visual Studio keys. Going from those postings to an actual working contributor is still a bit of a challenge though.

The sample involves one package which contributes a “FontAndColorCategory” to the Options dialog. It also contributes one color called “MyColor”.


The aforementioned blog posts and MSDN help should enable you to navigate and understand the source, so I’m not going to repeat anything here. Hopefully a fully working example will expedite your own development efforts.

· The full source for the fonts and color provider can be downloaded here.

-- Patrick


Interesting Memory Leak in .Net 3.5 Binary Deserialization

$
0
0

Recently I was analyzing an application written in managed code for memory problems.  In managed code a common cause of eating up memory is statically allocated objects which are not nulled out after they are no longer needed.

The application I was debugging made use of binary deserialization to reconstruct a graph of objects. Many of these objects needed some additional context, so a StreamingContext object was created with an additional context state. Here’s a sample of what the code was doing:

using (FileStream fs = myfile.OpenRead())
{
    ContextObject ct = new ContextObject();
    ct.Name = "testing";
    StreamingContext context = new StreamingContext(StreamingContextStates.Other, ct);
    BinaryFormatter bf = new BinaryFormatter();
    bf.Context = context;
    SerializableObject obj = (SerializableObject)bf.Deserialize(fs);
}

Everything was working well until I noticed that deserialization would consistently increase the memory footprint (recognize that, in my case, the additional context object was quite large). I wrote a sample application and analyzed the problem using WinDBG. The sample program is located here.

Open WinDBG and select “Open Executable”. Select LeakApplication.exe and hit F5. LeakApplication will run waiting for console input. Select ‘Break’ in WinDBG and type “!dumpheap -type LeakApplication.ContextObject”. You’ll see something like this:

0:003> !dumpheap -type LeakApplication.ContextObject
 Address       MT     Size
01f6b7b0 0042340c       12
total 1 objects
Statistics:
      MT    Count    TotalSize Class Name
0042340c        1           12 LeakApplication.ContextObject
Total 1 objects

WinDBG is telling us that 1 LeakApplication.ContextObject may be in memory. In order to determine if this is accurate type “!gcroot 01f6b7b0”. Here are the results:

0:003> !gcroot 01f6b7b0
Note: Roots found on stacks may be false positives. Run "!help gcroot" for
more info.
Scan Thread 0 OSTHread 12f0
ESP:20ef20:Root:01fcb148(System.Runtime.Serialization.Formatters.Binary.BinaryFormatter)->
01f6b7b0(LeakApplication.ContextObject)
ESP:20ef3c:Root:01f6b7b0(LeakApplication.ContextObject)->
01f6b7b0(LeakApplication.ContextObject)
ESP:20ef40:Root:01f6b7bc(System.Runtime.Serialization.Formatters.Binary.BinaryFormatter)->
01f6b7b0(LeakApplication.ContextObject)
ESP:20ef54:Root:01f6b7b0(LeakApplication.ContextObject)->
01f6b7b0(LeakApplication.ContextObject)
Scan Thread 2 OSTHread a28
DOMAIN(002F1DE8):HANDLE(Pinned):4013fc:Root:02f51010(System.Object[])->
01f6ad88(System.Collections.Generic.Dictionary`2[[System.Runtime.Serialization.MemberHolder, mscorlib],[System.Reflection.MemberInfo[], mscorlib]])->
01f6ae5c(System.Collections.Generic.Dictionary`2+Entry[[System.Runtime.Serialization.MemberHolder, mscorlib],[System.Reflection.MemberInfo[], mscorlib]][])->
01f6cb5c(System.Runtime.Serialization.MemberHolder)->
01f6b7b0(LeakApplication.ContextObject)

Sure enough the BinaryFormatter is holding onto the LeakApplication.ContextObject. This is a known leak and, unfortunately, the best you can do is to mitigate the problem. In my case I created a small, disposable property bag object to hold onto my larger “additional” context objects. After deserialization simply call Dispose on the property bag and let it NULL out any members, thus clipping the object graph. Your property bag will still leak, but at least your larger object graph will be available for garbage collection.
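Here is a minimal sketch of that mitigation. The ContextPropertyBag class and its members are hypothetical names for illustration; the idea is simply that Dispose clips the reference the formatter keeps alive.

// Hypothetical disposable property bag: the formatter's StreamingContext holds the bag,
// and Dispose() nulls out the reference to the large context data so that graph can be
// collected. The tiny bag itself still leaks, but the bulk of the memory does not.
sealed class ContextPropertyBag : System.IDisposable
{
    public ContextObject LargeContext { get; set; }

    public void Dispose()
    {
        LargeContext = null;
    }
}

// Usage during deserialization:
using (FileStream fs = myfile.OpenRead())
using (ContextPropertyBag bag = new ContextPropertyBag { LargeContext = new ContextObject { Name = "testing" } })
{
    StreamingContext context = new StreamingContext(StreamingContextStates.Other, bag);
    BinaryFormatter bf = new BinaryFormatter();
    bf.Context = context;
    SerializableObject obj = (SerializableObject)bf.Deserialize(fs);
}   // bag.Dispose() runs here and clips the object graph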

-- Patrick

Converting .dbp files to .dbproj files


With the release of Visual Studio 2010 Beta 2, database projects with the .dbp extension are deprecated. There are several ways to convert your project manually:

  • Import Script. Unfortunately this will shred your scripts for schema objects and move any conditional logic into the scriptsignored file.
  • Add Existing Item. You could create a Visual Studio 2010 project and use “Add Existing Item” to bring your .dbp scripts into your new project. The downside with this is 1) it is time consuming and 2) you lose the workspace the version control provider needs – therefore losing the history of your files.
  • “Include In Project”. Create a new Visual Studio 2010 project and drag your .dbp project folder structure on top of the .dbproj file. After you’re finished, select “Show All Items” in the Visual Studio 2010 Solution Explorer and then “Include In Project” for your .dbp files. Unless you create the new .dbproj project over the top of your .dbp project you’ll lose version control history.

DbpToDbproj.exe

To help Database Project (.dbp) customers with the upgrade to Visual Studio 2010 I’ve written DbpToDbproj. The source and executable are located here.

This project builds a command-line executable with parameters allowing you to create either a Visual Studio 2010 2005DSP or 2008DSP project, and indicate if the file structure for the ‘Schema Objects’ folder is ‘by schema’ or ‘by type’.

c:\Temp>DbpToDbproj.exe /?
Database projects that have the file extension .dbp have been deprecated in Visual Studio 2010
DbpToDbproj creates a database project with the .dbproj extension to replace the dbp.

DbpToDbproj [drive:][path]filename [/target:[2005|2008]] [/DefaultFileStructure:[ByType | BySchema]]
[drive:][path]filename
Specifies drive, directory and filename for the dbp file to be converted.

/target:[2005|2008]
Specifies whether the .dbproj to be created is a Sql2005 or a Sql2008 project.

/DefaultFileStructure:[ByType | BySchema]]
Specifies whether the directory structure of the .dbproj should be by object schema or type.


As an example, here’s a .dbp project from Visual Studio 2008.


Running DbpToDbproj in the same directory as your existing .dbp project will create a file called ‘&lt;dbpfilename&gt;.dbproj’.

C:\Temp\dbpProject\dbpProject>DbpToDbproj.exe dbpProject.dbp
Converting...
C:\Temp\dbpProject\dbpProject\dbpProject.dbp
...to...
C:\Temp\dbpProject\dbpProject\dbpProject.dbproj
Successfully converted dbp to C:\Temp\dbpProject\dbpProject\dbpProject.dbproj

When the .dbproj file is opened you’ll receive an upgrade prompt.  The reason for this is that my command-line utility does not interact with your Version Control system.  Instead I place an MSBuild property in the dbproj file called “PostUpgradeAddToSCC”.  This property is a flag to the database project system that upgrade has added new files and that these new files should be added to SCC.  This only happens after the upgrade process so I had to make the system think an upgrade was necessary.  If you don’t use version control you can edit Program.cs and remove the line where I create the “PreviousProjectVersion” MSBuild property.  The absence of this property will prevent the upgrade wizard from appearing.
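For illustration only, writing those properties with the MSBuild 4.0 object model might look roughly like the sketch below; the property names come from the paragraph above, but the values shown are placeholders and the real Program.cs may do this differently.

// Illustrative sketch, not the tool's actual code: stamp the upgrade-related
// properties into the generated .dbproj. The values here are placeholders.
var project = new Microsoft.Build.Evaluation.Project(@"C:\Temp\dbpProject\dbpProject\dbpProject.dbproj");
project.SetProperty("PostUpgradeAddToSCC", "true");
project.SetProperty("PreviousProjectVersion", "placeholder-old-version"); // omit if you don't use version control
project.Save();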

After opening the .dbproj file the following project will appear in the solution. Notice that the script files are maintained as they were in the .dbp project. Each of these script files has a build action of ‘Not In Build’, meaning they do not contribute to the dbproj model. The conversion also creates the standard “Schema Objects”, “Data Generation Plans”, and “Schema Comparisons” folders you’d get with any other Visual Studio 2010 project. If you’re using the new Database Project (.dbproj) purely for version control you can always delete these directories – otherwise you’ll still be able to import from an existing database or a script.


DbpToDbproj will preserve the connections found in the .dbp by writing them into a file called Connections.txt. From these connection strings you should be able to recreate the connections in Server Explorer manually.

Conclusion

Hopefully this will make the conversion to Visual Studio 2010 a little easier for everyone with .dbp projects. Once again, the source and executable are located here. If you have any questions please visit our forums here or ping me through this blog. Thanks!

-- Patrick

Template Driven Sql Generation Updated for Visual Studio 2010


More than a year ago I presented a simple template processor for Visual Studio 2008 Database Projects. This add-in gives you the ability to write Transact-SQL in a simple template form with embedded SQLCMD variables. It’ll turn

--*CREATE TABLE [dbo].[$(MyVar)]
--*(
--*    column_1 int NOT NULL,
--*    column_2 int NULL
--*)

Into valid Transact-SQL

CREATE TABLE [dbo].[MyReleaseTableName]
(
    column_1 int NOT NULL,
    column_2 int NULL
)
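The $(MyVar) token is a standard SQLCMD variable, which in this example resolves to MyReleaseTableName. Inside a database project the value would typically come from the project’s SQLCMD variables settings; expressed in plain SQLCMD syntax, the equivalent definition would be:

:setvar MyVar MyReleaseTableName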

I’ve updated the add-in to work with Visual Studio 2010. The essential details haven’t changed, so please refer to my past post. If you wish to recompile the source you’ll need Visual Studio 2010 (Professional or Ultimate) and the WiX Toolset.

The source code for the Visual Studio ProcessSqlTemplate MSBuild Task and Add-in is here.

The ProcessSqlTemplateAddIn MSI is located here.

Database Project Extensibility – Merge Table and View Scripts


 

In this blog I’ve created a project feature extension that combines Table and View child scripts (for example, constraint or extended-property scripts for a table) into the script that defines the parent object. Some developers like to have all the T-SQL defining a particular object in one .sql file. Typically you’d run this after using Import Schema to populate your database project. The package is built using a vsix file as the deployment mechanism and can be downloaded here. The source for this project is located here.

Overview of the Package

Adding the button to Schema View

The package contributes a ‘Merge scripts into parent schema object’ button to the Schema View context menu for Tables and Views. When it is pressed, the selected Tables and Views gather their hierarchical statements into the parent script. If a child script is emptied of content it is excluded from the project. If your project is under version control you may wish to review the output window and delete these files from your SCC provider.

image
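As a hypothetical before-and-after (the object, constraint, and file names below are made up for illustration and are not produced by the extension), a child script such as Customer.PK_Customer.sql would have its statements appended to the parent Customer.sql, after which the now-empty child script is excluded from the project:

-- Before the merge, the constraint lives in its own child script (Customer.PK_Customer.sql):
ALTER TABLE [dbo].[Customer]
    ADD CONSTRAINT [PK_Customer] PRIMARY KEY ([CustomerId]);
GO

-- After the merge, Customer.sql holds the table definition and the constraint together:
CREATE TABLE [dbo].[Customer]
(
    [CustomerId] int NOT NULL,
    [Name]       nvarchar(100) NULL
);
GO

ALTER TABLE [dbo].[Customer]
    ADD CONSTRAINT [PK_Customer] PRIMARY KEY ([CustomerId]);
GO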

Overview of the Code
One of our requirements is to add a button to the Schema View context menu. In order to do that we’ll need to create a Visual Studio package. To create Visual Studio packages you’ll need to install the SDK located here. Once it is installed, you’ll have the option to create a new “Visual Studio Integration Package” project based on the vsix format. See the file ‘MergeScripts.vsct’ for details on how I added this button to the Schema View context menu.

 

Implementing the database project feature extension

I also need to hook into Visual Studio 2010 database project extensibility. To create an extension to a Visual Studio 2010 database project you must:

 

  • Create a class which implements one of the extension point interfaces. All these interfaces ultimately derive from Microsoft.Data.Schema.Extensibility.IExtension.
  • Publish your extension via a file named <MyFile>.Extensions.xml, which can be placed in the %ProgramFiles%\Microsoft Visual Studio 10.0\VSTSDB, %ProgramFiles%\Microsoft Visual Studio 10.0\Common7\IDE, or %LocalAppData%\Microsoft\VisualStudio\10.0<hive>\Extensions directory.
  • Your extensions file must correspond to the following schema:

 

<?xml version="1.0" encoding="utf-8"?>
<xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema"
            xmlns="urn:Microsoft.Data.Schema.Extensions"
            elementFormDefault="qualified"
            attributeFormDefault="unqualified"
            targetNamespace="urn:Microsoft.Data.Schema.Extensions">
  <xsd:element name="extensions">
    <xsd:complexType>
      <xsd:sequence>
        <xsd:element name="extension" minOccurs="0" maxOccurs="unbounded">
          <xsd:complexType>
            <xsd:attribute name="assembly" type="xsd:string" use="optional"/>
            <xsd:attribute name="type" type="xsd:string" use="required"/>
            <xsd:attribute name="enabled" type="xsd:boolean" use="required"/>
          </xsd:complexType>
        </xsd:element>
      </xsd:sequence>
      <xsd:attribute name="assembly" type="xsd:string" use="required"/>
      <xsd:attribute name="version" type="xsd:unsignedByte" use="required"/>
    </xsd:complexType>
  </xsd:element>
</xsd:schema>

The database extension created for this project is called MergeScriptsFeature and is published in the MergeScripts.Extensions.xml:

 

<?xml version="1.0" encoding="utf-8"?>
<extensions assembly=""
            version="1"
            xmlns="urn:Microsoft.Data.Schema.Extensions"
            xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
            xsi:schemaLocation="urn:Microsoft.Data.Schema.Extensions Microsoft.Data.Schema.Extensions.xsd">
  <extension type="MergeScripts.MergeScriptsFeature"
             assembly="MergeScripts, Version=1.0.0.0, Culture=neutral, PublicKeyToken=0cae59df434c3018"
             enabled="true" />
</extensions>

MergeScriptsFeature has the following declaration:

[DatabaseSchemaProviderCompatibility(typeof(SqlDatabaseSchemaProvider))]

class MergeScriptsFeature : IDatabaseProjectFeature, IOleCommandTarget

The DatabaseSchemaProviderCompatibility attribute identifies the database schema provider affinity. In this case “SqlDatabaseSchemaProvider” is used indicating that this feature will work with all SQL database projects.

When a database project is opened, it queries for all extensions that are compatible with the current database schema provider and that also implement IDatabaseProjectFeature. Each of those features is instantiated and initialized with the IDatabaseProjectNode for that project. With this node, features can listen to events – in my case I listen to ProjectClosed and FileIconRequest.

The MergeScriptsFeature also needs to chain into the IOleCommandTarget implemented by the Database Schema View. To do this, the feature retrieves the ‘SchemaModelViewerService’ from the agnostic database package, called ‘DataPackage’. Once that service raises the DatabaseSchemaViewInitialized event (or if the Schema View is already displayed), the MergeScriptsFeature IOleCommandTarget is chained in through IDatabaseSchemaView.RegisterForLimitedCommandRouting.
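A highly simplified sketch of this flow is shown below. Every type in it is a local stand-in declared purely for illustration; the real IDatabaseProjectFeature, IDatabaseProjectNode and IDatabaseSchemaView types in Microsoft.Data.Schema expose different, richer signatures, so read this as C#-shaped pseudocode for the sequence rather than the actual API.

using System;

// Local stand-ins named after the members mentioned above (illustration only).
public interface IProjectNodeSketch
{
    event EventHandler ProjectClosed;
    event EventHandler FileIconRequest;
}

public interface ISchemaViewSketch
{
    void RegisterForLimitedCommandRouting(object oleCommandTarget);
}

public class MergeScriptsFeatureSketch
{
    // The project system instantiates each compatible feature when a project opens
    // and hands it the project node, so the feature can subscribe to the events it needs.
    public void Initialize(IProjectNodeSketch projectNode)
    {
        projectNode.ProjectClosed += (sender, args) => { /* release cached project state */ };
        projectNode.FileIconRequest += (sender, args) => { /* supply a custom item icon */ };
    }

    // Called once the Schema View reports it is initialized (or immediately if it is
    // already displayed); this is where the feature chains its IOleCommandTarget.
    public void OnSchemaViewInitialized(ISchemaViewSketch schemaView)
    {
        schemaView.RegisterForLimitedCommandRouting(this);
    }
}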

When the merge button is pressed, MergeScriptsFeature.DoMerge() is called. This routine creates a CommandFactory and buffers up the source code control and file operations necessary to do the actual merge. Once all of these commands are buffered, they are ordered and executed in CommandFactory.Execute().

 

Installing the Package

Installing a vsix package can be done on a per-user basis or for all users on the machine. To install for a specific user, just double-click the .vsix file. This unzips the contents and places them into “%LocalAppData%\Microsoft\VisualStudio\10.0Exp\Extensions\Microsoft Corporation\MergeScripts\1.0”. The next time Visual Studio is launched, the contents of this package (such as menu definitions) are merged into the Visual Studio shell. To verify that the vsix extension is installed, bring up the Extension Manager, available under the main menu “Tools.Extension Manager”. You should see the MergeScripts package.

image

To install the vsix package for all users (called a Trusted Extension) you need to have administrator rights on the machine. You can then install using the “VSIXInstaller /admin” command. This will install the extension into %VSInstallDir%\Common7\IDE\Extensions.
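For example, assuming the built package file is named MergeScripts.vsix (the actual file name depends on how you built the package), the command would be along the lines of:

VSIXInstaller.exe /admin MergeScripts.vsix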

 

Conclusion

I know an extension like this (or an option during Import Script) has been requested multiple times on our forums. Until we get around to adding the option to Visual Studio, I hope this extension will serve your needs. If you have any questions please visit our forums here or contact me via this blog. Thanks!

-- Patrick

Using T4 Templates with Visual Studio 2010 Database Projects
