Wednesday, April 30, 2008

A Function Gotcha with the Use of an Asterisk - SQL Server Central

Excel is my favourite tool for LD (Lazy Development).

Here's an interesting quirk that functions in SQL have when using Select *, and a workaround using Excel as a performance accelerator for development.

During a recent upgrade of our software, I discovered what I assume is a little-known gotcha that I feel is appropriate to document and pass on. The gotcha occurs with user-defined table functions (UDFs) when selecting columns from a table using the asterisk (select all) method. The problem occurs only when columns are subsequently added to or removed from the underlying table without recreating the function. When these conditions occur, the result is interesting and undesirable output. The SQL below illustrates this problem using SQL 2000.
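The failure mode can be sketched with a short script along these lines (hypothetical table and function names, not the article's own repro):

CREATE TABLE dbo.Customer (ID int, Name varchar(50))
GO
CREATE FUNCTION dbo.fnCustomers()
RETURNS TABLE
AS RETURN (SELECT * FROM dbo.Customer)
GO
-- Add a column without recreating the function
ALTER TABLE dbo.Customer ADD City varchar(50)
GO
-- The function's column list is now stale: City is missing from the output,
-- and after a column is dropped, values can even surface under the wrong names
SELECT * FROM dbo.fnCustomers()
GO
-- On SQL 2005 and later the metadata can be refreshed; on SQL 2000, drop and recreate:
-- EXEC sp_refreshsqlmodule 'dbo.fnCustomers'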

A Function Gotcha with the Use of an Asterisk - SQL Server Central

One of the best features in Excel 2007 for SQL DBAs is the Conditional Format - Highlight Duplicates.  By using this, the tedious task of writing duplicate-finding sprocs is replaced with cut-paste-Conditional Format.

Now if only there were a data entry tool from Excel directly into SQL besides PerformancePoint.

Wednesday, April 23, 2008

Snapshots of Life, Faith and Work.: Custom SSIS Slowly Changing Dimension Type 2

 

Unfortunately, when you use the SCD wizard to create a type 2 slowly changing dimension, you have to choose between using a current indicator or a valid date range technique. You cannot have both, at least not through the wizard. To add both, you will have to edit the package after the wizard is complete.

Complete the Slowly Changing Dimension wizard. I prefer choosing the Single Column / Current Flag option for Historic Attribute Changes during the wizard.

Snapshots of Life, Faith and Work.: Custom SSIS Slowly Changing Dimension Type 2

SqlBI.eu - Tabledifference SSIS component

 

Table Difference is an SSIS custom component designed to simplify the management of slowly changing dimensions and, in general, to check the differences between two tables or data flows with SSIS.

Its main advantage over the standard Microsoft SCD component is its speed and ease of use. The component receives input from two sorted sources and generates different outputs for unchanged, new, deleted, or updated rows.

The project is freeware, and full source code is available.

SqlBI.eu

Friday, April 18, 2008

Peter Eb on Custom Number formats in Excel (and this works in Reporting Services too!)

This saves so much hassle when dealing with $K or $M, or even $B values.

HOWTO: Show numbers as 1,000s in Excel

Posted Thursday, April 17, 2008 6:00 PM by petereb

This time there's nothing really specific about PPS. It's more of an Excel post actually... 

I've seen this a few times. Sometimes the data form authors want to show is big, and they want to show 1 instead of 1000 or 1 instead of 1000000. There are a bunch of complicated games you can play with formulas and rounding to get close to this behavior. But in this case you shouldn't settle for close. This is bizarrely not explicitly shown in the Excel UI, but it is well supported by the custom number format functionality. You just use a custom number format:

in thousands: #,

in millions: #,,
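For example, assuming en-US separators and the value 1234567:

with #, the cell displays 1235 (scaled by 1,000 and rounded)

with #,, the cell displays 1 (scaled by 1,000,000)

The underlying cell value is unchanged; only the display is scaled.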


Not sure offhand what to do in other regional settings. You may still have to use a comma "," or you might need to use the decimal separator "." or whatever it is called in your culture... In my test the number format works well.


Formatting a matrix properly is a little tricky until you get the hang of it, but this is a short post, so I'll save larger formatting discussions for another one. For data entry into cells with this formatting, you still type the entire number. If you want to avoid that part too, then I suggest doing that conversion in the fact table to begin with...

Peter Eb.

Wednesday, April 16, 2008

SQL Server And XML: How to find all stored procedures used by Report Server?

 

How to find all stored procedures used by Report Server?

Report definitions [RDL files] are stored in the "catalog" table of ReportServer database. This table has a field "content" which stores the report definition as an image/text value.

The following query will extract a list of all reports and the stored procedures used by them, by querying the catalog table of report server.
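The post's query isn't reproduced here, but the technique looks roughly like this (the Content column is an image, so it has to be converted before it can be searched; the stored procedure name is just an example):

SELECT c.Name AS ReportName
FROM ReportServer.dbo.Catalog c
WHERE c.Type = 2 -- 2 = report
AND CONVERT(varchar(max), CONVERT(varbinary(max), c.Content))
    LIKE '%uspGetEmployeeManagers%'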

SQL Server And XML: How to find all stored procedures used by Report Server?

SOA Challenges: Entity Aggregation

When building a Service Oriented Architecture, a Canonical Schema is important. 

Further details in this article from 2004.

Schema Reconciliation

Not only do disparate services have different views of an entity, they may also have their own schematic representation of their views. In order to achieve a single unified view of an entity, it is imperative for the EA service to harmonize the schematic differences between different services. To do this, the EA service should first know how each service represents an entity. Additionally, the EA service should define a holistic view of the entity by logically consolidating the various views. Equally important is for the EA service to discover a way to represent the transformation between a service's view of an entity and the holistic view. The following figure shows the customer information represented in more than one service.

SOA Challenges: Entity Aggregation

SQL Scripter

Generate T-SQL Data Scripts


SQL Scripter is a powerful tool for SQL Server 7.0/2000/2005 database administrators and developers
to generate data scripts in a readable and executable T-SQL format.

SQL Scripter

Build a Metadata-Driven ETL Platform by Extending Microsoft SQL Server Integration Services

SQL 2008 Metadata-Driven ETL Whitepaper.

Platform Architecture

The design goals of the platform include improving the productivity of developers, enforcing ETL standards, supporting a cost-effective way to deploy large data warehouses on commodity hardware, and providing a centralized metadata repository for lineage tracking. The solution architecture of the platform is shown in Figure 1. The intent of this paper is not to document all the implementation details. Rather, it describes the concepts and key components of the platform and how they are connected with SQL Server Integration Services. For a more detailed architecture diagram, see Figure 6 at the end of this paper.

[Figure 1: Solution architecture of the metadata-driven ETL platform]

Build a Metadata-Driven ETL Platform by Extending Microsoft SQL Server Integration Services

10 Easy Steps to Beautiful Sql Code

Note: Shortly after I put this article together, I realized that someone else wrote something eerily similar, so I decided not to submit as an article to SQL Server Central or another content aggregator. 

Also, the web service I was using stopped working a couple days after I wrote this.  :(

Anyway, this should offer a quick and easy guide to setting up a CLR web service call from SQL Server.  Just don't bother using the web service I'm providing as an example.

 

This project uses a SQL Server Common-Language Runtime (CLR) Function which calls a 3rd-party Web Service to format and beautify SQL Server Stored Procedures directly from Query Analyzer. Beautiful code is usually easier to read and allows for easier maintainability.

What do I need?

I am using Microsoft SQL Server 2008 CTP 6 for this example, however it should be backwards compatible with SQL Server 2005. I also have Visual Studio 2005 installed.

Alternatives to this method include the SQL Pretty Printer plugin for Visual Studio (http://www.sqlparser.com/).

Disclaimer

I do not recommend using this in a production environment or with confidential stored procedures as it calls a web service and opens the database up to potentially malicious calls. You may need to change the locations where files and databases are created. Use at your own risk.

For this demo, you will need to enable the CLR in SQL Server.

Steps to Implement a CLR in SQL Server

From SQL Server Management Studio, Query Analyzer

--Execute this statement

EXEC sp_configure 'clr enabled', 1;

RECONFIGURE WITH OVERRIDE;

GO

When sending results to text, you may need to set the "Maximum number of characters displayed in each column" property of SQL Management Studio - Query Analyzer to a higher value than the default of 256. I have this set to 8000, but for larger stored procedures you may require more. Better yet, send the results to a file.


From the Visual Studio Command Prompt

Begin by creating the C# class library required for the solution. You may want to use NAnt, MSBuild, Visual Studio or a batch file to automate these tasks.

Step 1: Create the "wiz-dal" (WSDL) class wrapper for the Web Service

C:\Program Files\Microsoft Visual Studio 8\SDK\v2.0\Bin\wsdl /o:SqlFormatterWebService.cs /n:SqlFormatterWebService.

Step 2: Create the sqlformat.cs class

Open up Notepad and save this as the following file:

C:\Program Files\Microsoft Visual Studio 8\SDK\v2.0\Bin\sqlformat.cs

using System;
using System.Data;
using System.Data.SqlClient;
using System.Data.SqlTypes;
using Microsoft.SqlServer.Server;
using SqlFormatterWebService.Test;

public partial class StoredProcedures
{
    [Microsoft.SqlServer.Server.SqlFunction]
    public static string FormatSql([SqlFacet(MaxSize = -1)] SqlString inputsql)
    {
        return GetFormattedSql(inputsql.ToString());
    }

    public static string GetFormattedSql(string inputsql)
    {
        SQLFormatterWebService sqlformatter = new SQLFormatterWebService();
        return sqlformatter.beautifySql("tester", "mssql", "sql", inputsql, "");
    }
}

Step 3: Execute the following from the Visual Studio Command Prompt:

C:\Program Files\Microsoft Visual Studio 8\SDK\v2.0\Bin\csc /t:library sqlformat.cs sqlformatterwebservice.cs

Once the class library is created, the sqlformat.dll file will be available to add as an assembly reference in SQL Server.

From Sql Management Studio - Query Analyzer

Step 4: Create a database to store the CLR function (optional)

CREATE DATABASE [ClrWebServices] ON PRIMARY 
( NAME = N'ClrWebServices', FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL10.SQL2008\MSSQL\DATA\ClrWebServices.mdf' , SIZE = 3072KB , MAXSIZE = UNLIMITED, FILEGROWTH = 1024KB )
LOG ON
( NAME = N'ClrWebServices_log', FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL10.SQL2008\MSSQL\DATA\ClrWebServices_log.ldf' , SIZE = 1024KB , MAXSIZE = 2048GB , FILEGROWTH = 10%)
GO

Step 5: Change the database owner to a non-domain service account (optional)

sp_changedbowner 'localhostname\localserviceuser'

Step 6: Set database as trustworthy

There are many reasons why you would not do this in a production environment. Alternatives are in this forum posting:

http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=2027982&SiteID=1

USE ClrWebServices 
GO
ALTER DATABASE ClrWebServices SET TRUSTWORTHY ON
GO

Step 7: Create the assembly reference for the function

CREATE ASSEMBLY SqlFormat
FROM 'C:\Program Files\Microsoft Visual Studio 8\SDK\v2.0\Bin\sqlformat.dll'
WITH PERMISSION_SET = UNSAFE;
GO

Step 8: Create the assembly reference for the function serializer

The XML Serializer is required when using a Web Service in CLR code.
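Step 3 does not produce this serializer assembly. One way to generate it is with the sgen tool from the same SDK directory (the command below is an assumption based on the paths used above):

C:\Program Files\Microsoft Visual Studio 8\SDK\v2.0\Bin\sgen /a:sqlformat.dll

This emits sqlformat.XmlSerializers.dll next to the input assembly.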

CREATE ASSEMBLY [SqlFormat.XmlSerializers]
FROM 'C:\Program Files\Microsoft Visual Studio 8\SDK\v2.0\Bin\sqlformat.XmlSerializers.dll'
WITH PERMISSION_SET = SAFE;
GO

Step 9: Create the SQL function wrapper

CREATE FUNCTION SqlFormat(@inputSql nvarchar(max)) 
RETURNS nvarchar(max)
AS
EXTERNAL NAME SqlFormat.StoredProcedures.FormatSql
GO

Step 10: Test the procedure against a single stored procedure

Use AdventureWorks 
Go
set nocount on
go
select ClrWebServices.dbo.SqlFormat(text)
from sysobjects o
inner join syscomments c on c.id = O.id
where xtype='P'
and name = 'uspGetEmployeeManagers'

 


 


Conclusion

This article demonstrates one way of calling a web service from a CLR stored procedure, and how a web service may improve and extend the objects created inside SQL Server.

 

References:

http://en.wikipedia.org/wiki/Web_Services_Description_Language

http://en.wikipedia.org/wiki/Common_Language_Runtime

http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=2027982&SiteID=1

http://support.microsoft.com/kb/913668

http://www.setfocus.com/technicalarticles/clrfunctionforsqlserver.aspx

http://sqlblogcasts.com/blogs/simons/archive/2006/11/28/CREATE-ASSEMBLY-Permissions---Could-not-obtain-information-about-Windows-NT-group-user-.aspx

http://sachabarber.net/?p=69

http://www.sqlteam.com/article/writing-clr-stored-procedures-in-charp-introduction-to-charp-part-1

http://www.sqlteam.com/article/writing-clr-stored-procedures-in-charp-returning-data-part-2

http://davidhayden.com/blog/dave/archive/2006/04/25/2924.aspx

Microsoft SQL Server 2008 White Paper: Reporting Services

One of the biggest features I see in 2008 is the Report Designer application. This removes the need to purchase Visual Studio or a 3rd-party add-in for building reports, and makes it much easier to create ad-hoc queries.

Microsoft SQL Server 2008 Reporting Services provides a complete server-based platform that is designed to support a wide variety of reporting needs, including managed enterprise reporting, ad-hoc reporting, embedded reporting, and Web-based reporting, to enable organizations to deliver relevant information where needed across the entire enterprise. Reporting Services 2008 provides the tools and features necessary to author a variety of richly formatted reports from a wide range of data sources, and provides a comprehensive set of familiar tools used to manage and secure an enterprise reporting solution. Reports are processed and delivered quickly and effectively, enabling users to either receive reports automatically through subscriptions, to access reports from a central report repository on an ad-hoc basis, or to consume reports in the context of their business processes through reports directly embedded into their business or Web applications.

Microsoft SQL Server 2008 White Paper: Reporting Services

Common Solutions for T-SQL Problems - Home

A big problem with large systems is blocking.  This article can help troubleshoot why your database is "just hanging out" when it should be doing something.

Troubleshooting Deadlocks in SQL Server
Jonathan Kehayias, March 17, 2008

One of the more difficult and frustrating things to isolate and eliminate in SQL Server is deadlocks. In all but the rarest exceptions, deadlocks are a side effect of blocking and not a SQL Server bug. They can generally be resolved by changing application or stored procedure code to access tables in the same logical order, or by changing the actual database schema or indexing structures. The first step in deadlock elimination is to identify which two statements are actually causing the deadlocks to occur. In some cases, more than two statements contribute to a deadlock. The methods available to troubleshoot deadlocks in SQL Server differ between SQL Server 2000 and SQL Server 2005.
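As a first step, deadlock details can be written to the SQL Server error log with trace flags, no extra tools required (1204 is the older SQL 2000-style report; 1222, on SQL 2005 and later, is more detailed):

DBCC TRACEON (1204, -1)
DBCC TRACEON (1222, -1) -- SQL Server 2005 and later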

Common Solutions for T-SQL Problems - Home

Common Solutions for T-SQL Problems - Home

Includes some details on record exclusion with EXCEPT.

Solutions for Common T-SQL Problems

Information presented in this WIKI represents the suggestions, ideas, and opinions of Volunteer Moderators and Answerers who support the Microsoft MSDN SQL Server Forums. (Unless specifically stated otherwise, nothing should be construed to represent the official positions or opinions of Microsoft and/or its Employees.)

Common Solutions for T-SQL Problems - Home

How can I update Multiple Tables at once

One way to update multiple tables in SQL.  Hint - use a view.
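A single UPDATE through a view can still only touch one base table, so the view trick usually means pairing the view with an INSTEAD OF trigger. A sketch with hypothetical tables:

CREATE VIEW dbo.vPersonContact AS
SELECT p.PersonID, p.Name, c.Email
FROM dbo.Person p
JOIN dbo.Contact c ON c.PersonID = p.PersonID
GO
CREATE TRIGGER dbo.trgPersonContactUpdate ON dbo.vPersonContact
INSTEAD OF UPDATE
AS
BEGIN
    UPDATE p SET Name = i.Name
    FROM dbo.Person p JOIN inserted i ON i.PersonID = p.PersonID

    UPDATE c SET Email = i.Email
    FROM dbo.Contact c JOIN inserted i ON i.PersonID = p.PersonID
END
GO
-- One statement now updates both base tables:
UPDATE dbo.vPersonContact
SET Name = 'Jo', Email = 'jo@example.com'
WHERE PersonID = 1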

How can I update Multiple Tables at once
"CAN'T BE DONE" -- Crazy boy

How can I update Multiple Tables at once

Thursday, April 10, 2008

DB on GPUs

Upgrade your video card - get better database performance?  Perhaps in the near future...



Execution time of a predicate evaluation with 60% selectivity by a CPU-based algorithm and a GPU-based algorithm. Timings for the GPU-based algorithm include the time to copy data values into the depth buffer. Considering only the computation time, the GPU is nearly 20 times faster than a compiler-optimized SIMD implementation.

DB on GPUs

I want some Moore - Short circuiting Sql Parameters

 

OR Conditions

Run this statement:

SELECT * FROM t1 
WHERE id = 3
OR id / 0 = 1

Here it is a whole different story from the AND-only case above: you immediately see the error "Divide by zero error encountered". If you take a closer look, you'll see that we never get to the execution plan phase. Why does this differ so much from the AND conditions? Because ORs immediately mean a table/index scan. Therefore no parameterization is done, and at query evaluation time all of the values are known. The divide by zero is caught before the execution plan is even built or taken from cache, and our query fails. It doesn't matter that our first condition is already true.


I want some Moore

CREATE ASSEMBLY Permissions - Could not obtain information about Windows NT group/user - SimonS Blog on SQL Server Stuff

 

CREATE ASSEMBLY Permissions - Could not obtain information about Windows NT group/user

If you encounter this

Could not obtain information about Windows NT group/user <MyDomain>/<MyName>, error code 0x5.

When you try to CREATE an ASSEMBLY, you might think, "Well, I'm not connected as that user." The user it squawks about is the database owner.

I run on a laptop and often find myself not connected to the domain (although I can still log in with cached domain credentials). When I create a database, it automatically becomes owned by my domain account, so when I then try to create an assembly, it squawks because it can't verify the domain account that owns the database.

The simple way around this is to create a sql user account and change the database owner to that user.

CREATE ASSEMBLY Permissions - Could not obtain information about Windows NT group/user - SimonS Blog on SQL Server Stuff

Tuesday, April 08, 2008

Megginson Technologies: Land and Hold Short » Blog Archive » Cost of owning a plane in 2007

Something tells me I will not be owning a plane any time soon.

Here’s what it cost to own and operate a 1979 Piper Warrior II in Ottawa, Canada in 2007 with 80 hours air time (a bit more flight time, of course). Since the US and Canadian dollars are basically at par now, there’s no need to convert:

TOTAL:
$9,801.08
$122.51/hr

These are real costs, including sales taxes, not the BS costs you hear people throwing around at the airport. Reserves are $20/hour for engine and paint. I also pay about $500/year for charts and recurrent training, but I’d pay the same as a renter, so I don’t count those as ownership costs.

Megginson Technologies: Land and Hold Short » Blog Archive » Cost of owning a plane in 2007

Database maintenance routines

List of maintenance procedures.

Database maintenance routines

EDIT: Updated link to the new version of isp_ALTER_INDEX.

Over the past couple of weeks, I have posted various database maintenance stored procedures.  Here's a listing of them:

Do not put these in the master database.  An Admin database should be created where non-application specific objects are put.  All of the stored procedures are designed so that they do not need to be in any particular database.

Use them at your own risk.  If you don't use them, at the very least create your own.  Don't use the maintenance plan wizards or the sqlmaint utility!

Database maintenance routines

SQL Server Forums - SQL Server Frequently Asked Questions FAQ

Frequently Given Answers in SQL Server Forums.

All Forums
General SQL Server Forums
New to SQL Server
SQL Server Frequently Asked Questions FAQ

SQL Server Forums - SQL Server Frequently Asked Questions FAQ

Data Models - ESRI Support

Testing Spatial & Geography support in SQL 2008?  Got Data?

Academic and industry leaders collaborate with ESRI to create and design data model templates that can be used with one GIS platform. The result is this set of data model structures that can be implemented for each of the industries and scientific disciplines that ESRI serves.

Read about the goals and process in the Introduction to ArcGIS Data Models, and visit the Data Model tips and tricks page designed as a resource for common methods and best practices.

Data Models - ESRI Support

Know Dot Net - NET Refactor - SQL Parameters

More tools to remove dynamic SQL from applications.

Cleaning up Dynamic SQL

Using dynamic SQL is a bad habit that many developers must admit to having used. And, even worse, when you start to use it, you don't just use it once or twice; you use it so many times over a project that you may be ashamed to admit it. Old habits die hard! Dynamic SQL is dangerous. On the Web, it opens your site to injection attacks and hacking. In a desktop or client-server application, it opens the door to the old single-quote (name = O'Rielly) bug-a-boo, to which all who would be honest have been a victim. The SQL Parameters menu option of NET Refactor, found under the SQL Refactor menu, converts dynamic parameters to the use of Database Command.Parameters automatically.

Know Dot Net - NET Refactor - SQL Parameters

What did you learn today? - Cool new features I found in SQL 2005

PERSISTED computed columns and the OUTPUT clause are 2 "new" features I discovered in SQL 2005 while researching SQL 2008.

OUTPUT Clause

A lot of people already know about this pretty cool feature, but I didn’t. In the past, if you do an insert to a table, and the table has an identity column, after the insert, you would look at @@IDENTITY or scope_identity(). That always seemed hackish to me.

You can now add an OUTPUT clause to an INSERT, UPDATE, or DELETE statement; it will return the rows changed by the statement. (For an INSERT, the OUTPUT clause goes between the column list and VALUES, not at the end.)

INSERT INTO Person.Address
    (AddressLine1, AddressLine2, City,
     StateProvinceID, PostalCode, rowguid, ModifiedDate)
OUTPUT INSERTED.AddressID
VALUES (@AddressLine1, @AddressLine2, @City,
        @StateProvinceID, @PostalCode, @rowguid, @ModifiedDate);

You use the pseudo tables INSERTED and DELETED, just like in triggers.
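The other feature mentioned above, PERSISTED computed columns, stores the computed value with the row (and allows indexing it) instead of recalculating it on every read. A minimal sketch with a hypothetical table:

CREATE TABLE dbo.OrderLines
(
    OrderID int NOT NULL,
    Qty int NOT NULL,
    UnitPrice money NOT NULL,
    LineTotal AS (Qty * UnitPrice) PERSISTED -- computed on write, stored on disk
)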

What did you learn today? - Cool new features I found in SQL 2005

Monday, April 07, 2008

Dan's Notes: BizTalk replace functoid trick

Of course, you could also replace functoids the easy way. :)

If you need to change a functoid that has many links associated with it, instead of deleting the functoid and creating a new one and re-adding all the links, you can just drop the new functoid on the old functoid in the grid. All links and parameters are preserved, just the functoid changes.

Dan's Notes: BizTalk replace functoid trick

Reverse Engineering BizMaps in BizTalk

Interesting trick with BizTalk maps. The Mapping Editor uses XML in the background. To manipulate maps, use the Open With shortcut to your favourite XML editor, e.g. Notepad.

Notice below that there are ID's set for individual Functoids.  These functoids are stored in C:\Program Files\Microsoft BizTalk Server 2006\Developer Tools\Microsoft.BizTalk.BaseFunctoids.dll

Looking at this class library in Reflector, you can browse to get the ID of the functoid under Base.ID. For the StringLeftFunctoid class it is 0x66. It is stored as hex, so dropping the 0x and converting the value to decimal (e.g. using http://www.easycalculation.com/hex-converter.php) gives you 102.

By further reverse-engineering of the BizTalk.BaseFunctoids.dll library, you can see some different styles of implementing the various functoids.

If you don't like working with design tools for maintenance and simple things like changing base functoids, then you can modify this file, at the risk of blowing everything up.

Now if only you could do this mapping in Visio... and execute it... now that would be a powerful Data Design/ETL/Workflow tool.

<?xml version="1.0" encoding="utf-16"?>
<!-- Generated using BizTalk Mapper on Mon, Apr 07 2008 02:03:28 PM -->
<mapsource Name="BizTalk Map" BizTalkServerMapperTool_Version="2.0" Version="2" XRange="100" YRange="420" OmitXmlDeclaration="Yes" TreatElementsAsRecords="No" OptimizeValueMapping="Yes" GenerateDefaultFixedNodes="Yes" PreserveSequenceOrder="No" CopyPIs="No" method="xml" xmlVersion="1.0" IgnoreNamespacesForLinks="Yes">
  <SrcTree>
    <Reference Location=".\KeyStats.xsd" />
  </SrcTree>
  <TrgTree>
    <Reference Location=".\KeyStats_2.xsd" />
  </TrgTree>
  <ScriptTypePrecedence>
    <CSharp Enabled="Yes" />
    <ExternalAssembly Enabled="Yes" />
    <VbNet Enabled="Yes" />
    <JScript Enabled="Yes" />
    <XsltCallTemplate Enabled="Yes" />
    <Xslt Enabled="Yes" />
  </ScriptTypePrecedence>
  <TreeValues>
    <TestValues />
    <ConstantValues />
  </TreeValues>
  <Pages>
    <Page Name="Page 1">
      <Links>
        <Link LinkID="1" LinkFrom="1" LinkTo="/*[local-name()='&lt;Schema&gt;']/*[local-name()='KeyStats']/*[local-name()='ShortName']" Label="" />
        <Link LinkID="2" LinkFrom="1" LinkTo="/*[local-name()='&lt;Schema&gt;']/*[local-name()='KeyStats']/*[local-name()='Exchange']" Label="" />
        <Link LinkID="3" LinkFrom="1" LinkTo="/*[local-name()='&lt;Schema&gt;']/*[local-name()='KeyStats']/*[local-name()='IndustryName']" Label="" />
        <Link LinkID="4" LinkFrom="1" LinkTo="/*[local-name()='&lt;Schema&gt;']/*[local-name()='KeyStats']/*[local-name()='FCFF']" Label="" />
        <Link LinkID="5" LinkFrom="1" LinkTo="/*[local-name()='&lt;Schema&gt;']/*[local-name()='KeyStats']/*[local-name()='FCFE']" Label="" />
        <Link LinkID="6" LinkFrom="1" LinkTo="/*[local-name()='&lt;Schema&gt;']/*[local-name()='KeyStats']/*[local-name()='MarketCap']" Label="" />
        <Link LinkID="7" LinkFrom="1" LinkTo="/*[local-name()='&lt;Schema&gt;']/*[local-name()='KeyStats']/*[local-name()='MarketCap_USD']" Label="" />
        <Link LinkID="8" LinkFrom="1" LinkTo="/*[local-name()='&lt;Schema&gt;']/*[local-name()='KeyStats']/*[local-name()='FirmValue']" Label="" />
        <Link LinkID="9" LinkFrom="/*[local-name()='&lt;Schema&gt;']/*[local-name()='KeyStats']/*[local-name()='KeyStats_Child1']" LinkTo="2" Label="" />
        <Link LinkID="10" LinkFrom="2" LinkTo="/*[local-name()='&lt;Schema&gt;']/*[local-name()='KeyStats']/*[local-name()='Symbol']" Label="" />
      </Links>
      <Functoids>
        <Functoid FunctoidID="1" X-Cell="56" Y-Cell="215" Functoid-FID="376" Functoid-Name="Nil Value" Label="">
          <Input-Parameters />
        </Functoid>
        <Functoid FunctoidID="2" X-Cell="56" Y-Cell="212" Functoid-FID="106" Functoid-Name="String Extract" Label="">
          <Input-Parameters>
            <Parameter Type="Link" Value="9" Guid="{5AE7B85F-255C-498D-BF5C-1F1DBABF180B}" />
            <Parameter Type="Constant" Value="1" Guid="{7DF87C5C-729A-4726-B4E7-F51FD52CC659}" />
            <Parameter Type="Constant" Value="3" Guid="{80449EC2-724D-45E8-B64F-67EA60B42D94}" />
          </Input-Parameters>
        </Functoid>
        <Functoid FunctoidID="3" X-Cell="53" Y-Cell="213" Functoid-FID="101" Functoid-Name="String Find" Label="">
          <Input-Parameters />
        </Functoid>
      </Functoids>
    </Page>
  </Pages>
</mapsource>

Friday, April 04, 2008

Generate dynamic SQL statements in SQL Server - Program - SQL Server - Builder AU

 

sp_executesql or EXECUTE()

There are two ways to execute dynamic SQL in SQL Server: use the sp_executesql system stored procedure or the EXECUTE() operator. Sometimes the two methods can produce the same result, but there are differences in how they behave.

The system stored procedure sp_executesql allows parameters to be passed into and out of the dynamic SQL statement, whereas EXECUTE() does not. Because the SQL statement is passed into sp_executesql as a parameter, it is less susceptible to SQL injection attacks than EXECUTE(). And since sp_executesql is a stored procedure, passing SQL strings to it gives a higher chance that the SQL string will remain cached, which should lead to better performance when the same statement is executed again. In my opinion, sp_executesql also results in code that is cleaner and easier to read and maintain. These reasons are why sp_executesql is the preferred way to execute dynamic SQL statements.
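A minimal side-by-side comparison (hypothetical table dbo.t1):

-- EXECUTE(): the value has to be concatenated into the string
DECLARE @idstr varchar(10)
SET @idstr = '3'
EXECUTE ('SELECT * FROM dbo.t1 WHERE id = ' + @idstr)

-- sp_executesql: the value travels as a typed parameter
DECLARE @id int
SET @id = 3
EXEC sp_executesql
    N'SELECT * FROM dbo.t1 WHERE id = @id',
    N'@id int',
    @id = @id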

Generate dynamic SQL statements in SQL Server - Program - SQL Server - Builder AU

Ben Hall's Blog: Creating partition tables in SQL Server 2005

 

Creating partition tables in SQL Server 2005

Table partitioning is a feature of SQL Server 2005 that allows you to separate your data into blocks (partitions) to improve performance and manageability, as each partition can be stored in its own filegroup, which in turn can be on a separate disk array. In this post, I will cover how to partition a table containing a series of orders, which I want to partition based on their order date.

The first thing I want to do is create a database.

CREATE DATABASE PartitionTest
GO

USE PartitionTest
GO

Next, I want to create a table which will contain all of my 'orders'.

CREATE TABLE Orders (ID INT NOT NULL, OrderDate DATETIME NOT NULL)

Next, I got SQL Data Generator to generate me 10 million 'Orders' for the table.  This gave me a large number of dates between 01/01/1953 and 30/12/2010 (values I had set) - so a nice big table which really needs to be partitioned. With that in place, we can create our partition. 

First, we need to create a partition function, giving it a name and the data type we are partitioning on - in this case datetime. We say RANGE LEFT (each boundary value goes into the partition to its left), then list the series of values we want to split the data on. In this case, anything up to 31/12/1950 goes into partition 1, anything between 1/1/1951 and 31/12/1960 into partition 2, etc.

CREATE PARTITION FUNCTION OrderDateRangePFN(datetime)
AS
RANGE LEFT FOR VALUES ('19501231 23:59:59.997', --YYYYMMDD
                        '19601231 23:59:59.997',
                        '19701231 23:59:59.997',
                        '19801231 23:59:59.997',
                        '19901231 23:59:59.997',
                        '20101231 23:59:59.997',
                        '20201231 23:59:59.997')

At this point, we create a partition scheme. In this case, I'm saying all the partitions should be stored in the primary filegroup; however, this is where we would allocate the partitions to separate filegroups.

CREATE PARTITION SCHEME OrderDateRangeScheme AS
PARTITION OrderDateRangePFN
ALL TO ([PRIMARY])

At this point, we have a table with 10 million rows, a partition function, and a partition scheme. However, nothing links the partition to the table. What we need to do is create a new primary key on which we can set the partition scheme. If you have any existing keys, they will need to be dropped and re-created with the partition scheme included.

ALTER TABLE Orders
ADD CONSTRAINT OrdersPK
   PRIMARY KEY CLUSTERED (OrderDate, ID)
ON OrderDateRangeScheme(OrderDate)
GO

If the table wasn't already created, we could include the partition schema in the create table script.

CREATE TABLE Orders (ID int, OrderDate DATETIME) ON OrderDateRangeScheme(OrderDate)
GO

In the above function, the year 2000 partition was missing. If we wanted to include this we would alter the function and include a split range

ALTER PARTITION FUNCTION OrderDateRangePFN() SPLIT RANGE ('20001231 23:59:59.997')

After executing all of this, your table will successfully be partitioned. To verify the details of the partition, there are a few queries we can execute.

The system view sys.partitions contains all of the information; we can query it to return all the partition information for a table.

SELECT * FROM sys.partitions
WHERE OBJECT_ID = OBJECT_ID('Orders')

This returns the partition number and how many rows are in each partition.  This is useful for identifying if your partitions are split correctly or if one partition needs to be split down more.


The other way is to use the $PARTITION function, which gives the partition number for a row of data. This lets you see which data is in which partition.

SELECT *, $Partition.OrderDateRangePFN(OrderDate) AS [Partition] FROM [Orders] ORDER BY [Partition]


For more information, visit http://www.simple-talk.com/sql/sql-server-2005/partitioned-tables-in-sql-server-2005/

Ben Hall's Blog: Creating partition tables in SQL Server 2005

Thursday, April 03, 2008

The Bit Bucket (Greg Low) : SQL Server 2008: Interesting Full-Text Dynamic Management Function

SQL 2008 word parser using FTS.

That's really sweet as it lets you parse text into words. The 1033 was the locale I chose and the other two parameters were a stop word list (formerly called a noise word list) and whether or not it should be accent sensitive.
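The DMF being described is sys.dm_fts_parser. As a sketch (the sample sentence is mine), a call with the 1033 locale, a stoplist id of 0 and accent insensitivity looks like this:

```sql
-- Break a string into the words the full-text engine would index
SELECT display_term, occurrence
FROM sys.dm_fts_parser('"The quick brown fox jumps"', 1033, 0, 0)
```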

The Bit Bucket (Greg Low) : SQL Server 2008: Interesting Full-Text Dynamic Management Function

Visual Studio 2005 with SQL 2008

If you want to add a database project to Visual Studio 2005 with SQL 2008, one way is to use an ODBC reference.

The SQL Native client reference will only work for SQL 2000/2005.

Vadivel's blog: How to find out recently run queries in SQL Server 2005?

A realtime view of your SQL Server 2005 queries.

How to find out recently run queries in SQL Server 2005?

Prior to SQL Server 2005, if we wanted to find the list of recently run queries we had to depend on SQL Profiler.
Now in SQL Server 2005 life has become easier(!). With the help of a Dynamic Management View (DMV) and a table-valued function we can list the required result.
Points to note:
1. sys.dm_exec_query_stats -- This DMV returns aggregate performance statistics for cached query plans. The view contains one row per query statement within the cached plan, and the lifetime of the rows are tied to the plan itself. When a plan is removed from the cache, the corresponding rows are eliminated from this view.
2. sys.dm_exec_sql_text -- This table-valued function returns the text of the SQL batch that is identified by the specified sql_handle.
Solution:
SELECT
    dmStats.last_execution_time AS 'Last Executed Time',
    dmText.text AS 'Executed Query'
FROM
    sys.dm_exec_query_stats AS dmStats
CROSS APPLY
    sys.dm_exec_sql_text(dmStats.sql_handle) AS dmText
ORDER BY
    dmStats.last_execution_time DESC
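The same pair of objects supports richer reporting. A sketch (the column names come from sys.dm_exec_query_stats) that also surfaces how often and how long each cached statement ran:

```sql
-- Recently run queries with execution counts and total elapsed time
SELECT TOP 20
    dmStats.last_execution_time,
    dmStats.execution_count,
    dmStats.total_elapsed_time,
    dmText.text AS executed_query
FROM sys.dm_exec_query_stats AS dmStats
CROSS APPLY sys.dm_exec_sql_text(dmStats.sql_handle) AS dmText
ORDER BY dmStats.last_execution_time DESC
```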

Vadivel's blog: How to find out recently run queries in SQL Server 2005?

Tuesday, April 01, 2008

Patrick Wellink - BizTalk Templates

 

Now it's only a matter of seconds to create the projects. The naming convention should be SomeProject.ServiceKind,ServiceName. The templates really expect a dotted name where the last part after the last dot is the actual service name.

  1. If you want to try this yourself, extract the files in the zip file ESB Concept Scenarios.zip to the directory C:\Program Files\Microsoft BizTalk Server 2006\Developer Tools\BizTalkProjects\ESB Concept Scenarios, so that the directory will contain two files. SimpleTemplate.ico and SimpleTemplate.vsz.
  2. Next you extract the following file SimpleTemplate.zip to the following directory BizTalkWizards. The result should look the same as the picture above.
  3. Next start Visual Studio, create a new BizTalk project, select SimpleTemplate and see stuff happen. (Don't forget the name you enter for the project should be in the form SomeProject.ServiceKind,ServiceName, otherwise it won't work.)

If you don't like the template or want to change it, simply go to the BizTalk Wizard directory and perform the following steps.

  1. Select the template you want to change.
  2. Next go to the 'Templates\1033\OriginalProject' directory.
  3. Open the solution.
  4. Make any change you want, and save the project.
  5. Copy only the Orch / Map and XSD files one level up. Do not copy the SLN and BTProj files. (Have a look at both of them in Notepad and you will probably spot the differences.)

Patrick Wellink

Benny Mathew's BizTalk Blog - Open Source tools

 

Open Source BizTalk Utilities on CodePlex

I am sure most of you (not talking about the BizTalk gurus out there) have worked with some of the open source utilities available for BizTalk, such as the BizTalk Adapter Wizard, the BizTalk Server Pipeline Component Wizard or BizUnit. However, what some of you may not know is the wealth of other open source utilities for BizTalk available on CodePlex.

Here are some of the interesting ones and their descriptions as found on CodePlex :

SDC Tasks Library - The SDC Tasks are a collection of MSBuild tasks designed to make your life easier. You can use these tasks in your own MSBuild projects. You can use them stand alone and, if all else fails, you can use them as sample code.

BizUnit - Framework for Automated Testing of Distributed Systems - BizUnit enables automated tests to be rapidly developed. BizUnit is a flexible and extensible declarative test framework, targeted at distributed systems, that rapidly enables automated testing; for example, it is widely used to test BizTalk solutions. BizUnit is fully extensible. Its approach is to enable test cases to be constructed from generic reusable test steps; test cases are defined in XML, which allows them to be auto-generated and also enables the 'fixing up' of URLs for different environments, e.g. test, staging and production environments.

BizTalk Server Pattern Wizard - The BizTalk Pattern Wizard is an extensible tool to help you capture, share, and reuse your orchestration best practices. By using the BizTalk Pattern Wizard, you can capture a best practice, turn it into a generalized and configurable pattern, and share that pattern with the rest of your team or the entire BizTalk community. The wizard comes with over a dozen patterns ready for implementation in your next project.

WCF Adapter for BizTalk Server 2006

BizTalk Adapter Wizard for BizTalk Server 2006 - The BizTalk Adapter Wizard for BizTalk Server 2006 is a Visual Studio 2005 project wizard which creates all of the framework code for your custom BizTalk adapter. The adapter wizard is accessible from the Visual Studio menu: File - New - Project - BizTalk projects.

BizTalk Server 2006 Documenter - Creates compiled help files for a given BTS 2006 installation. This tool can be run on an ad-hoc basis using the UI or from the command line as a post build/deploy task to create a compiled help file describing a BTS 2006 installation. It will compile: BTS Host configuration, Send/Receive port configuration, Orchestration diagrams, Schema and Map content, Pipeline process flow, Adapter configuration, Rule engine vocabularies and policies, More… and publish them as compiled help files. Optionally you can embed custom HTML content and custom descriptions for all BTS artifacts to produce a more customized look and feel to the CHM output

BizTalk Server 2006 Orchestration Profiler - Creates CHM report files illustrating the level of coverage for specified BizTalk orchestrations. This tool can be run to gain a consolidated view of orchestration tracking data for a specified period of time to help developers get an idea of how their orchestrations are being processed and how much coverage they are getting in their testing. In addition to simple coverage information the data presented helps to identify latency and code path exceptions by highlighting long running and error prone orchestration shapes.

PowerShell BizTalk Provider - A full PowerShell provider for exposing BizTalk Server as a filesystem. Administer your BizTalk installation. List all applications / orchestrations / Schemas. Stop an application, enlist an orchestration ... Use the full power of the shell to script away the pain of GUI based mass-management.

MapCop - A program for testing BizTalk map files (.BTM) for a set of best practices.

BizTalk Instance Controller - Limit the number of instances of any BizTalk Service.

Have Fun!

Benny Mathew's BizTalk Blog

Blog BizTalk - » Install BizTalk Server 2006 Orchestration Designer for Business Analysts with Visio 2007

Here's how to configure the BizTalk Orchestration Designer for Business Analysts (ODBA) plug-in, built for Visio 2003, to use Visio 2007.

I found that the installer looks for this registry key: HKLM\Software\Microsoft\Office\11.0\Visio, but cannot find it. However, I found the equivalent registry key for Visio 2007: HKLM\Software\Microsoft\Office\12.0\Visio. I exported this registry branch to a file and replaced all instances of "12.0" with "11.0".

After merging this file with the registry, I was able to install and use ODBA.

Blog BizTalk - » Install BizTalk Server 2006 Orchestration Designer for Business Analysts with Visio 2007