Friday, December 30, 2005

SQL Server forums: development, administration, security, performance tuning tips

SQL Server forums: development, administration, security, performance tuning tips

Huge - SQL Monster

Donald Farmer on SQL Server Integration Services : Oracle and SQL Server Integration Services

Donald Farmer on SQL Server Integration Services : Oracle and SQL Server Integration Services: "SQL Server to Oracle. What can I say? S*!& happens. "

Wednesday, December 28, 2005

Office Excel Add-in for SQL Server Analysis Services

Link to new version 1.5 available.

Friday, December 23, 2005

Brian Knight, SQL Server MVP

Speeding up the VS/SQL User Interface Load Time

One of the frustrating things about the new Visual Studio interface is its load speed. This applies to SQL Server 2005 as well. I have a very new, quite speedy computer, and the splash screen still takes 5-10 seconds to load. On my old machine, it's more like 10-15 seconds of splash screen before SQL Server Management Studio shows the login dialog box. A remedy I stumbled across was to add the -nosplash switch at the end of the executable call in the shortcut. After suppressing the splash screen, the load time on my good machine was down to 1-2 seconds before being prompted for a login. I go in and out of these tools so much that this is quite useful.

To do this, right-click the shortcut in the Start menu for your tool, such as Business Intelligence Development Studio or SQL Server Management Studio, and click Properties. In the Target text box, change the shortcut to point to the following for SQL Server Management Studio:

"C:\Program Files\Microsoft SQL Server\90\Tools\Binn\VSShell\Common7\IDE\SqlWb.exe" -nosplash

For Business Intelligence Development Studio, the same logic applies:

"C:\Program Files\Microsoft Visual Studio 8\Common7\IDE\devenv.exe" -nosplash

Also, this works for the Visual Studio environment in general. I personally hop in and out of these tools all day long and this solved a lot of frustration.

Thursday, December 22, 2005

Slowly Changing Dimensions

Thomas Pagel BI Blog: "One of the very nice enhancements of SQL Server 2005 Integration Services (SSIS) compared to DTS is the introduction of the wizard for slowly changing dimensions (SCD). It makes it easy to create dimension tables that store changing data and track those changes. There are some blogs about that; e.g., Jamie has a nice example of a package using the wizard: http://blogs.conchango.com/jamiethomson/archive/2005/06/06/1543.aspx"

Task Parameters

Task Parameters

Parameter placeholder syntax by connection type:

OLEDB: ?
ADO.NET: @param
ADO: ?
ODBC: ?
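As a sketch of the difference (the Orders table and parameter names here are hypothetical, for illustration only):

```sql
-- OLEDB / ADO / ODBC connection managers: positional ? markers
SELECT * FROM Orders WHERE CustomerID = ? AND OrderDate > ?

-- ADO.NET connection manager: named @parameters
SELECT * FROM Orders WHERE CustomerID = @CustomerID AND OrderDate > @OrderDate
```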

Thomas Pagel BI Blog :

Thomas Pagel BI Blog: "So we're getting closer and closer to RTM, and we all hope that Microsoft will deliver a perfect product after such a long time of Betas (sorry, IDWs and CTPs). But you can be sure that this will not be a 100% finished product, just as Windows 2003 and other releases were not perfect on day one. I'm not one of the guys telling you 'wait for Service Pack 1'; SQL 2005 is too good a product to spend more time without it. But I'm sure you have to expect some issues, mostly related to performance.

One issue we became aware of is a performance flaw when you use NON EMPTY with 'non-trivial' calculated members (well, this seems to start with already quite trivial calculations). We had queries running for minutes and asked Microsoft why they took so long. They suggested replacing the calculated member with the formula behind it. We hit this with just a ParallelPeriod function used in a calculated member. When we changed the query to use the base measure and put the ParallelPeriod in the query itself, performance was very good (a few seconds for a quite complex MDX query).

Remember that NON EMPTY is used very often; for instance, Reporting Services' query designer uses it by default. So it's quite likely that you will run into this issue.

It's confirmed that this flaw will NOT be fixed in RTM.
And there are others I'm not aware of yet..."

Thursday, December 15, 2005

Sequence number in Reporting Services

Want row numbers in a report, like Excel's row numbering?

RunningValue(1, Sum, Nothing) gives you a sequence number.
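In a detail-row textbox, the expression looks like this (RowNumber is the built-in Reporting Services equivalent, noted here as an alternative):

```
' Detail-row textbox expression: running sum of 1 per row
=RunningValue(1, Sum, Nothing)

' Built-in equivalent
=RowNumber(Nothing)
```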

Wednesday, December 14, 2005

Visual Studio Magazine - Build Client-Side Reports Easily

Visual Studio Magazine - Build Client-Side Reports Easily

Office Business Scorecard Manager 2005

Office Business Scorecard Manager 2005: "SharePoint Lists as Data Sources in Business Scorecard Manager
posted Wednesday, December 07, 2005 1:42 PM by Patrick Husting
SharePoint runs on SQL Server, and SQL Server supports ODBC, so of course it makes sense to be able to use SharePoint lists as data sources in Business Scorecard Manager, once you have ramped up on the SharePoint schema.

Here's an example connection string:

Driver={SQL Server};Server=localhost;Database=STS_MACHINENAME_1;Trusted_Connection=yes;

Here's a sample query that does a count:

select count(*)
from userdata
inner join lists on lists.tp_id = userdata.tp_listid
and lists.tp_title = 'SOME_LIST_TITLE'
where nvarchar3 = 'SOME_COLUMN_HEADER_1' and nvarchar2 = 'SOME_COLUMN_HEADER_2'

Here's one that does a ratio:

declare @nApples decimal
declare @nOranges decimal
declare @ratio decimal

set @nApples = (
SELECT count(*)
FROM userdata
inner join lists on lists.tp_id = userdata.tp_listid
and lists.tp_title = 'Fruit'
and nvarchar2='FruitBasket'
and nvarchar1 <> 'UNRIPE'
and ((nvarchar3 = 'RED') or(nvarchar3='REDISH')))

set @nOranges = (
SELECT count(*)
FROM userdata
inner join lists on lists.tp_id = userdata.tp_listid
and lists.tp_title = 'Fruit'
and nvarchar2='FruitBasket'
and nvarchar1 <> 'UNRIPE'
and ((nvarchar3 = 'ORANGE') or(nvarchar3= 'ORANGEISH')))
"
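The quoted post is cut off before the ratio itself is computed; presumably (this last step is my assumption, not part of the quote) it finishes along these lines:

```sql
-- Hypothetical completion: guard against division by zero,
-- then compute the apples-to-oranges ratio.
set @ratio = case when @nOranges = 0 then 0
                  else @nApples / @nOranges end

select @ratio as AppleOrangeRatio
```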

Office Business Scorecard Manager 2005 : BSM05 Sample Project Plan

Office Business Scorecard Manager 2005 : BSM05 Sample Project Plan

64 Advantage

64 Advantage

Here's training for the future of 64-bit computing.

PromptSQL: SQL Intellisense for MS Query Analyzer, VS.NET 2003, Management Studio and VS 2005

Autocomplete for SQL

Friday, December 09, 2005

daclark's WebLog

daclark's WebLog: "Comparing machines in SMS
Here's a query for comparing Services on two machines. Just Clone the 'Compare software inventory on two computers' report and replace the SQL with below.

select (case when inv1.Name0 is not null then @name1 else @name2 end) as Netbios_Name0,
(case when inv1.Name0 is not null then inv1.DisplayName0 else inv2.DisplayName0 end) as DisplayName0,
(case when inv1.Name0 is not null then inv1.Name0 else inv2.Name0 end) as Name0,
(case when inv1.Name0 is not null then inv1.StartMode0 else inv2.StartMode0 end) as StartMode0,
(case when inv1.Name0 is not null then inv1.StartName0 else inv2.StartName0 end) as StartName0

from ( select s.ResourceID, s.DisplayName0, s.Name0, s.PathName0, s.StartMode0, s.StartName0
from v_GS_Service s join v_R_System sys on s.ResourceID=sys.ResourceID where sys.Netbios_Name0 = @name1 ) as inv1

full outer join ( select s.ResourceID, s.DisplayName0, s.Name0, s.PathName0, s.StartMode0, s.StartName0
from v_GS_Service s join v_R_System sys on s.ResourceID=sys.ResourceID where sys.Netbios_Name0 = @name2 ) as inv2

on inv1.Name0=inv2.Name0 and inv1.StartMode0=inv2.StartMode0
where (inv1.Name0 is NULL) or (inv2.Name0 is NULL) order by DisplayName0, Name0, Netbios_Name0


Here's a query for comparing AddR"

Friday, December 02, 2005

Creating Outlook Add-ins with Visual Studio 2005 Tools for Office

Creating add ins for Office with new tools.

calculated sum - MSDN

calculated sum - MSDN: "Not sure if this solution will work for you, but you can add a Calculated Field to the Data Set, say: 'FilteredQty', defined as (assuming Quantity is an integer):

CInt(iif(Fields!Country.Value = 'USA', Fields!Quantity.Value, 0))

Then in your report you can use this expression:

Sum(Fields!FilteredQty.Value), which should return 4 rather than 10.

To make it dynamic, 'USA' can be replaced by a report parameter."

Thursday, December 01, 2005

Download details: Microsoft KPIUtil.exe tool for Microsoft Office Business Scorecard Manager 2005

Download details: Microsoft KPIUtil.exe tool for Microsoft Office Business Scorecard Manager 2005: "Microsoft KPIUtil.exe tool for Microsoft Office Business Scorecard Manager 2005
Quick Info
Version: 1.0
Date Published: 11/28/2005
Language: English
Download Size: 38 KB - 216 KB*
*Download size depends on selected download components."

Tuesday, November 29, 2005

Empty Address Lines - MSDN

Empty Address Lines - MSDN: "I have not tried this, but you can try using a formula for the LINE HEIGHT property of the address lines, under Properties/Format:

=iif(Fields!AddressLine2.Value = '', '0pt', '9pt')

This hopefully will set its line height to 0pt if it's empty, or 9pt if not."

Monday, November 21, 2005

Wednesday, November 16, 2005

Bob's SQL Reporting Services Blog

Link
How to create an inline bar chart

Here's a neat trick a co-worker and I discovered this afternoon for creating an inline bar chart (an inline data visualization in a table or other data region).

1. Add an embedded image to your report that will serve as the "bar" (a simple horizontal gradient usually looks nice)

2. Add a column to your table, and place an Image report item in the detail row cell.

3. Select the embedded image from step 1 as the source for the Image report item.

4. Set the padding on the Image report item to adjust the size and position of the bar within the cell

5. [Here's the kicker] Create an expression for the right (or left) padding that divides a field value by some maximum value, clamps it to a given range if necessary using Math.Min/Max, and then multiplies by the total size of the bar.

Example: image1.Padding.Right=((1.0-(Math.Min(Math.Max(Fields!TotalSales.Value,0),100000)/100000))*72) & "pt"

Friday, November 11, 2005

Production Debugging for .NET Framework Applications

Production Debugging for .NET Framework Applications

Debugging Memory problems

Writing High-Performance Managed Applications : A Primer

.NET General

Writing High-Performance Managed Applications : A Primer


Gregor Noriskin
Microsoft CLR Performance Team

June 2003

Applies to:
Microsoft® .NET Framework

Summary: Learn about the .NET Framework's Common Language Runtime from a performance perspective. Learn how to identify managed code performance best practices and how to measure the performance of your managed application. (19 printed pages)

Web Test Tools

Web Site Test Tools and Site Management Tools
More than 290 tools listed in 12 categories

Building managed code using Common Language Runtime (CLR) Integration in SQL Server 2005 - The Code Project - C#

using System;
using System.Data;
using System.Data.Sql;
using System.Data.SqlTypes;
using System.Data.SqlClient;
using Microsoft.SqlServer.Server;

public partial class StoredProcedures
{
    [Microsoft.SqlServer.Server.SqlProcedure]
    public static void sp_GetEmployeeInfo()
    {
        // "context connection=true" reuses the calling session's
        // connection, so no credentials are needed.
        SqlConnection connection =
            new SqlConnection("context connection=true");
        try
        {
            connection.Open();
            SqlCommand sqlCommand =
                new SqlCommand("SELECT EMPID, EMAIL FROM EMPLOYEE",
                               connection);
            SqlDataReader sqlReader = sqlCommand.ExecuteReader();
            // Stream the result set back to the client.
            SqlContext.Pipe.Send(sqlReader);
        }
        finally
        {
            if (connection != null)
                connection.Close();
        }
    }
};
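Before this procedure can be called from T-SQL, the compiled assembly has to be registered in the database. A sketch of that step (the assembly name and DLL path here are hypothetical):

```sql
-- Enable CLR integration (off by default in SQL Server 2005)
EXEC sp_configure 'clr enabled', 1;
RECONFIGURE;
GO

-- Register the compiled assembly (name and path are hypothetical)
CREATE ASSEMBLY EmployeeProcs
FROM 'C:\Assemblies\EmployeeProcs.dll'
WITH PERMISSION_SET = SAFE;
GO

-- Expose the static method as a stored procedure
CREATE PROCEDURE sp_GetEmployeeInfo
AS EXTERNAL NAME EmployeeProcs.StoredProcedures.sp_GetEmployeeInfo;
GO

EXEC sp_GetEmployeeInfo;
```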

Visual C# - LINQ

101 LINQ Samples

MSDN .NET Framework Developer Center: The LINQ Project

MSDN .NET Framework Developer Center: The LINQ Project: "The LINQ Project

The LINQ Project is a codename for a set of extensions to the .NET Framework that encompass language-integrated query, set, and transform operations. It extends C# and Visual Basic with native language syntax for queries and provides class libraries to take advantage of these capabilities."

GotDotNet User Sample: Building managed code using Common Language Runtime (CLR) Integration in SQL Server 2005

GotDotNet User Sample: Building managed code using Common Language Runtime (CLR) Integration in SQL Server 2005

Michael Rys : The light at the end of the tunnel is bright sunshine!

Michael Rys : The light at the end of the tunnel is bright sunshine!: "Enterprise Readiness:

* Multi-platform support: x32, x64 (WOW and native), IA64 native, NUMA-aware
* Multi-instance support with 7.0, 2000 and of course 2005
* Mobile, Tablet, Express (free), Workgroup, Standard and Enterprise - if we don't have it, you don't need it
* Table Partitioning - partition by key easily; drop a partition and add a new one, no need to delete millions of rows
* Upgrade Advisor - scan your existing 7.0 and 2000 servers and SPs and get ready for an easy upgrade
* SQL Server Management Studio - built in the VS IDE and able to manage all services from one UI
* Enhanced Full-Text Search - integrated backup/restore, thesaurus support, 100x faster index build
* ONLINE: index build, page/file restore, concurrent log and db backup, memory add, fast recovery (online during rollback) - all operations online
* Column-level encryption
* Off-by-default - reduced attack surface
* Code signing
* Vastly improved diagnostics - Watson dumps allow us to fix 80% of customer issues with NO CUSTOMER REPRO; dedicated admin thread; dynamic management"

Don Kiely's Technical Blatherings : SQL Server 2005: Foundation of Service-Oriented Data Architecture

Link

The original paper is by Microsoft's David Campbell, "Service Oriented Database Architecture: App Server-Lite?", which he presented at the 2005 ACM SIGMOD International Conference on Management of Data and which is part of its Proceedings. The basic idea of the paper is that a database engine is a logical place to implement features that allow it to function as a full-fledged service program in an application based on SOA. There are three major functions that a service program has to implement to fulfill such a role:

* It has to support Web services (and other types of endpoints, but this is the most important and most common) so that it can process request and response messages with other components in the system.
* It needs the ability to process service requests, meaning that it has to be able to read and parse messages, which are most commonly XML. It also has to participate in complex dialogs and conversations, in the terminology of SOA.
* It has to host service logic, the ability to implement complex business logic.

This isn't an exhaustive list by any means, of course, but it provides the foundation for a database in an SOA environment, what Microsoft is calling a Service-Oriented Data Architecture. It turns out that this is a fascinating way to look at databases in distributed applications, and I had a good time exploring the possibilities in my whitepaper. It finally made sense why Microsoft put so much effort into implementing some of the features in SQL Server 2005, such as:

* Native Web Service Access. Combined with Windows Server 2003's kernel-mode http.sys driver, which eliminates the need for IIS as a middleperson, SQL Server has a very efficient way to receive and send SOA messages.
* Service Broker, the new transactional middleware that supports massively scalable services. This is the best example of the synergy of implementing such a feature in SQL Server itself, in the database engine, rather than merely integrating with other technologies such as Microsoft's own MSMQ. After all, messages are "interesting data."
* Notification Services, the SQL Server name for what other platforms call database change notifications. Unlike client-server and n-tier applications that use a highly centralized data store, SOA applications are likely to have data all over the place, some of which is cached from various sources. Those caches need an efficient way of knowing when to refresh themselves, and notifications provide that way. Notification Services isn't just for sending messages to a PDA anymore, and I'm intrigued by the possibilities.
* SQLCLR, the linchpin that makes it all possible. To be a viable service program requires complex business logic, and T-SQL, extended stored procedures, and OLE automation (COM) just don't cut the mustard. Hosting the .NET Common Language Runtime in SQL Server is simply a stroke of genius.

Tuesday, November 08, 2005

Sara Ford's WebLog : How to recursively copy files using the &lt;Copy&gt; Task in MSBuild

Sara Ford's WebLog : How to recursively copy files using the &lt;Copy&gt; Task in MSBuild: "posted on Tuesday, November 08, 2005 10:21 AM by saraford
How to recursively copy files using the &lt;Copy&gt; Task in MSBuild

From the MSBuild Team Blog...

Have you ever run into a situation where you had to recursively copy files from one folder into another as a part of your build process? We certainly have run into it a few times, and I thought I'd share with you a neat trick that involves the use of metadata and the Copy task.

Before you can copy a set of files, you need to be able to recursively add those files to an item list. Here's how you do that when declaring items.





The ** wildcard is used in item includes to indicate recursive includes.

So, once you have an item declared as such, you can rely on a piece of standard meta-data that goes with all items (i.e. RecursiveDir) to accomplish your recursive copy. The RecursiveDir metadata when used will return the evaluated value of the ** metadata for each item - and you can use that value to preserve the folder structure for each file when performing the copy. Here's how you invoke the task using this piece of metadata to copy recursively.
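The XML snippets were stripped from the quoted post; a reconstruction from the surrounding description (item, target, and folder names are hypothetical) looks like this:

```xml
<!-- Recursively gather files: the ** wildcard matches any depth of subfolders -->
<ItemGroup>
  <MySourceFiles Include="c:\MySourceTree\**\*.*" />
</ItemGroup>

<!-- Copy, preserving the folder structure via the %(RecursiveDir) metadata -->
<Target Name="CopyFiles">
  <Copy SourceFiles="@(MySourceFiles)"
        DestinationFiles="@(MySourceFiles->'c:\MyDestinationTree\%(RecursiveDir)%(Filename)%(Extension)')" />
</Target>
```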

Differences between Visual Studio 2005 Standard and Professional Editions

Link



With the whoops I mentioned here comes the discussion of exactly what the difference is between the two versions of Visual Studio 2005 (Standard and Professional). First of all, here are the differences that Microsoft has made public in their feature chart. Before you spend too much time with the scorecard, let me summarize the major differences:

Standard:

* User Experience: Streamlined
* Deployment Tools: Click Once
* Extensibility: Consume Extensions
* Reporting: SQL Server Reporting Services
* Debugging: Local
* 64-bit CPU support, Server Explorer, SQL Server 2005 integration: No

Professional:

* User Experience: Full
* Deployment Tools: Yes
* Extensibility: Full
* Reporting: SQL Server Reporting Services and Crystal Reports
* Debugging: Local and remote
* 64-bit CPU support, Server Explorer, SQL Server 2005 integration: Yes

Most of the differences are fairly straightforward. That leaves the questions of deployment tools, extensibility, and "user experience". Here is what I know about the first two.

"Consume Extensions" means that you can use an existing Visual Studio add-in. "Full", on that note, means there is also IDE support for creating new add-ins. Fair enough.

"Deployment Tools: Click Once" means there is a mechanism to deploy a product through the IDE, but not to create setup projects that produce an .MSI file.

So, what about user experience? There has been some talk here, here, and here about what it means, but it is all fairly vague. I do not have a clear-cut answer, but I have a theory:

I was at a presentation earlier this year where the presenter was typing code. It looked like the presenter was using ReSharper at a user group presentation! I was told that was not the case, but rather an extra set of new features built into the IDE. Maybe the folks at Microsoft sent the ReSharper folks a nice check and incorporated that technology into the Professional Edition (and above)! That would mean:

For Standard, User Experience: Streamlined == IntelliSense

For Professional, User Experience: Full == IntelliSense + ReSharper-like functionality

Again, just my theory. But thanks are in order to Geoff Snowman, who provided me with a lot of this information and continues to check out our future "experiences".

One thing is for certain: the Standard edition of Visual Studio 2005 is no wimp. In the past, Standard editions were basically a compiler plus a text editor only slightly better than Notepad. Such is not the case here. For learning, hobbyists, or those needing a bit more than the Express versions offer, it appears to be sized just right.

However, for those of us who continue to make a living in the .NET world, it's the Professional version and above.

Edit: Just received this note indirectly from Prashant Sridharan, the Visual Studio Project Manager:

Standard is basically the union of the Express SKUs with mobile and remote data support added. It doesn't have support for remote debugging or server-side development, it doesn't have SQLCLR support, and it doesn't have the deeper data tools in Pro.

Friday, November 04, 2005

Office Developer Center: Redesigning an Excel VBA Solution for .NET Using Visual Studio 2005 Tools for Office

Link

Summary: Learn about redesigning an Excel VBA solution to take full advantage of Visual Studio 2005 Tools for Office and the .NET Framework. This article looks at the rearchitecture of the Trey Research Internet Information Services (IIS) Log Analyzer solution. (22 printed pages)

Friday, October 14, 2005

Jamie Thomson - Life, the universe and SSIS!

Jamie Thomson - Life, the universe and SSIS! or How to pass DTS datetime values into a stored proc.

Mark Hill Rambling About ... Analysis Services

Mark Hill Rambling About ... Analysis Services Good article on aggregations in 2005

More head-hurting articles...

Detecting Duplicate Indexes in Microsoft SQL Server 2005 databases using PIVOT

Nick Grattan - My Blog - Using the OUTPUT Clause with INSERT, UPDATE and DELETE in SQL Server 2005

No more @@IDENTITY or SCOPE_IDENTITY()? My head hurts...

Jamie Thomson - Life, the universe and SSIS! : SSIS: The mind-shift from DTS

Jamie Thomson - Life, the universe and SSIS! : SSIS: The mind-shift from DTS

Jamie Thomson - Life, the universe and SSIS! : SSIS Nugget: Get error descriptions

Jamie Thomson - Life, the universe and SSIS! : SSIS Nugget: Get error descriptions

x5 - ISNULL in C# (2.0 only stuff)

x5 - ISNULL in C#

Separating Data Layer from Reporting Layer in RS

How to preview a report without direct access to its data source in SQL Server Reporting Services 2000
You may have noticed that starting with SSRS SP1, when you preview a report which contains data in the designer, a ReportName.RDL.Data file appears on your file system. We store the data that was just displayed by your preview action in this file so we don't have to round-trip to the server over and over again each time you preview the report (assuming you don't change any parameter values, etc.)
If you want, you can use this data file to make your report more "portable" -- Basically taking the report.rdl and report.rdl.data files to another machine that doesn't have access to the "real" data store. You can also use this method to send the report/data to a buddy if you need some help troubleshooting the report. Cool hack!
Close Visual Studio and add the following lines to Program Files\Microsoft Visual Studio .NET 2003\Common7\IDE\devenv.exe.config file:

What this entry does is force the preview pane to bypass checking that the cache (data file) is up-to-date and that any report parameter values selected in the report match those used to generate the cached data in the local file.
Next, Create a new VS Report Project and add the existing (rdl) report to it. Drop your rdl.data file in the same folder.
(This is the strange part.) Add a new shared data source to the project, giving it the SAME name as the old data source your report used in the other project it lived in previously. You don't even need to specify a connection string in this data source -- it'll be using the cached data in your rdl.data file.
Finally (if necessary), modify the report so that any parameters in it don't use queried values (which you no longer have data for), and so that all parameters have valid default values. You also need to remove any other datasets that use data not coming from your .rdl.data file.
Preview!

Russell Christopher's Semi-Useful BI Musings

Russell Christopher's Semi-Useful BI Musings

Very useful musings...

Russell Christopher's Semi-Useful BI Musings

Russell Christopher's Semi-Useful BI Musings

Very useful musings...

Thursday, October 13, 2005

Read from a text file into SQL table without BCP

One of my co-workers asked how to read a text file into a table.

Without using BCP, here is one way of doing it... (must have xp_cmdshell enabled.)

-- This procedure reads the all.txt text file, which contains the currently
-- running date, and displays the results

DECLARE @FileCommand as varchar(255)
SET @FileCommand = 'type c:\all.txt'
-- =============================================
-- Create table basic template
-- =============================================
IF EXISTS(SELECT name
FROM sysobjects
WHERE name = 'TempDateCheck'
AND type = 'U')
DROP TABLE TempDateCheck

CREATE TABLE TempDateCheck (
Contents varchar(100) NULL)

INSERT INTO TempDateCheck
EXEC xp_cmdshell @FileCommand

SELECT Contents
FROM TempDateCheck
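Another route (a sketch, assuming the target table already exists and the SQL Server service account can read the file) is BULK INSERT, which avoids xp_cmdshell entirely:

```sql
-- Load each line of c:\all.txt as one row of TempDateCheck
BULK INSERT TempDateCheck
FROM 'c:\all.txt'
WITH (ROWTERMINATOR = '\n')

SELECT Contents
FROM TempDateCheck
```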

Wednesday, September 28, 2005

SQL BI : Direct MOLAP insert vs. proactive cache

SQL BI : Direct MOLAP insert vs. proactive cache

Time-Dynamic MDX

Time-Dynamic MDX

Convention 3: Create the Closed Member Property
The third convention is the most flexible of the four. To use this technique, you create a member property called Closed on the month level of the Time dimension. This member property in the cube corresponds to a column in the Time-dimension table of the star schema. When a month is closed, this column in the star schema is set to 1; if the month isn't closed, the column is set to 0. MDX formulas can test for a 1 in this member property to determine whether the month is closed.

CREATE SET RecentMonth AS
'Filter( Time.Month.Members,
   Time.CurrentMember.Properties("Closed") = "1"
 ).Item(
   Count( Filter( Time.Month.Members,
     Time.CurrentMember.Properties("Closed") = "1" ) ) - 1 )'

Convention 4: A Server-Based Solution
The fourth convention for determining the most recent month is a server-based solution. To implement this solution, you use Analysis Manager to create the RecentMonth set on the OLAP server. This set contains one month member, and you use Analysis Manager to update the set manually whenever a new month closes. Although it contains a manual step, this technique isn't hard to use. You can automate this solution by changing the RecentMonth set definition at the end of the scheduled cube-load task. You can change server-based set definitions by using the Decision Support Objects (DSO) object model from Visual Basic (VB). Check the SQL Server Books Online (BOL) topic "DSO" for more information about how to create and update server-defined sets.

Tuesday, September 27, 2005

http://blogs.msdn.com/dataaccess/archive/2005/07.aspx

Connecting to SQL 2005

SSIS Nugget: Get error descriptions *****

http://blogs.conchango.com/jamiethomson/archive/2005/08/08/1969.aspx

If you use an error output, then you will find that you get an error number and an error column (i.e. the column in which the error occurred), but you do not get an error description. Not to worry: you can get the error description using a script component.
Here I've built a data flow that contains an error in the source data.

SSIS 2005 Naming Conventions

http://www.sqljunkies.com/WebLog/macaw/default.aspx

ETL/SSIS/Naming Conventions
We are developing SSIS solutions with project teams, and figured it would be a good idea to define a set of ETL, SSIS, and naming conventions in order to make it easier for team members to collaborate. We would like to share our thoughts on these conventions, and wonder what strategies other SSIS developers out there are currently using.
ETL Strategy
As a general strategy for our ETL process, we use three SQL Server databases: SA_IN, SA_OUT, and DWH (where SA means Staging Area). Data flows through these databases in the following steps (see this picture for a graphical representation of the entire process):
First, all relevant data from the source systems (flat files, production databases, etc.) will be copied 1:1 to the SA_IN database.
In the SA_IN database, we build a view for each dimension and fact table we want to create. Each view's SELECT statement selects and joins the SA_IN columns and tables that are needed to populate the corresponding dimension or fact table. We also use the view to give columns correct names (aliases) and to perform some simple transformations (e.g., replace NULL values with the ISNULL function).
Then we use SSIS to copy the result data from each SA_IN view into the SA_OUT database. Note that the tables in the SA_OUT database can be created on the fly: when configuring the SQL Server Destination components in our SSIS package, we use the "Create..." button to create a new table in the SA_OUT database (the table design will then be based on the output columns of the data source, in this case our SA_IN view).
On the SA_OUT database we perform complex transformations with SSIS components when necessary.
Next, we use SSIS to copy all the data 1:1 from the SA_OUT database to the DWH database. On the DWH database we have defined foreign key constraints (relationships) to maintain referential integrity. Rows that do not satisfy these constraints will be written to an error table in the SA_OUT database and will be reported back to the administrator.
Finally, we use SSIS to process the SSAS cubes that are built on top of the DWH data warehouse database.
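The SA_IN views described in the second step might look like this sketch (view, table, and column names here are hypothetical, not from the original post):

```sql
-- One view per target dimension table: select and join the SA_IN
-- tables, alias columns to their warehouse names, and apply simple
-- cleanups such as ISNULL.
CREATE VIEW dbo.vw_DimCustomer AS
SELECT c.CustomerID                 AS CustomerAppKey,
       ISNULL(c.Name, 'Unknown')    AS CustomerName,
       ISNULL(r.RegionName, 'N/A')  AS Region
FROM dbo.Customers c
LEFT JOIN dbo.Regions r ON r.RegionID = c.RegionID
```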
SSIS Package Structure
One of the decisions we had to make was whether we should create many small SSIS packages or a few large packages. We decided on a middle course. Each SSIS solution contains one _Overall package that executes an _Extract, _Transform, and _Load package. These three packages in turn execute a package for each source table / data warehouse table that needs to be extracted, transformed, or loaded into the data warehouse (this picture shows how these packages fit in the ETL process as a whole). We are using separate packages for each table so different people can work on the ETL for different tables at the same time.
Naming Conventions
We also decided to define naming conventions for both database objects and SSIS packages. SSAS seems to detect fact and dimension tables based on their names: when a table name contains the string "dim", SSAS will treat it by default as a dimension table. The same holds for the string "fact" for fact tables. So we use a Dim and Fact prefix for dimension and fact tables respectively. Another naming convention we have often seen in Microsoft products is to use the prefixes PK and FK for primary keys and foreign keys. We've chosen to follow this notation, and also added a Key and AppKey suffix to differentiate surrogate keys from application (OLTP) keys. We have also posted a summary of the naming conventions we are currently considering. Note that some package names start with an underscore. The purpose of this prefix is twofold: (a) it differentiates the top-level packages from the lower-level packages, and (b) it makes these packages easy to find in the solution explorer (the solution explorer orders packages alphabetically by package name, so all the packages that start with an underscore will be grouped together at the top of the list).
Maarten Strunk [Macaw]

Saturday, September 24, 2005

Beginner questions frequently asked... and answered...

Beginner questions frequently asked... and answered...

Nice!!!

Beginner questions frequently asked... and answered...
This is a list of some (mostly beginner) things I've learned here on the SQLTeam forums and written down over time. I had posted it in the forums here, and Madhivanan gave me the idea to blog it, so I did. :)) Maybe this will help in the planned beginners forum. :))

1. How to update a column with incrementing numbers:

- whole table (identity style)
declare @table1 table (id int, name varchar(50))
insert into @table1
select null, 'text1' union all
select null, 'text2' union all
select null, 'text3' union all
select null, 'text4'
select * from @table1
declare @inc int
set @inc = 0
UPDATE @table1 SET @inc = id = @inc + 1
select * from @table1
- groups of data:
declare @table table (id int, diag int, count1 int, rank int)
insert into @table
select 1, 42, 75, null union all
select 1, 49, 50, null union all
select 1, 38, 22, null union all
select 2, 70, 48, null union all
select 2, 33, 27, null union all
select 2, 30, 12, null union all
select 2, 34, 5, null union all
select 2, 54, 3, null union all
select 3, 42, 75, null union all
select 3, 49, 50, null union all
select 3, 38, 22, null

declare @cnt int
set @cnt = 0
UPDATE t1
SET @cnt = rank = case
when not exists (select 1 from @table where id = t1.id and count1 > t1.count1)
then 1
else @cnt + 1
end
from @table t1
select * from @table

2. How to get top N rows for each group?

Declare @myTable table (cat varchar(100), subcat varchar(100), rank int )
Insert Into @myTable
Select 'A', 'a', 2
Union All Select 'A', 'b', 52
Union All Select 'A', 'c', 14
Union All Select 'A', 'd', 46
Union All Select 'A', 'e', 37
Union All Select 'A', 'f', 95
Union All Select 'A', 'g', 73
Union All Select 'A', 'h', 67
Union All Select 'A', 'i', 80
Union All Select 'A', 'j', 03
Union All Select 'B', 'a', 18
Union All Select 'B', 'b', 44
Union All Select 'B', 'c', 52
Union All Select 'B', 'd', 60
Union All Select 'B', 'e', 28
Union All Select 'B', 'f', 06
Union All Select 'B', 'g', 70
Union All Select 'B', 'h', 90
Union All Select 'B', 'i', 89
Union All Select 'B', 'j', 31
declare @n int
Set @n = 5
Select Cat, subCat, rank
From @myTable as A
Where (Select count(1) From @myTable Where cat=A.cat and rank <= A.rank)<=@n
Order By Cat, Rank Desc

3. How to pivot/cross tab/transpose data?

http://weblogs.sqlteam.com/jeffs/archive/2005/05/02/4842.aspx
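The linked article covers this in depth; as a minimal sketch of the SQL Server 2000 CASE-based cross-tab approach (the table and column names here are made up for illustration):

```sql
-- Pivot quarterly amounts into columns with SUM(CASE ...).
declare @sales table (product varchar(20), quarter int, amount int)
insert into @sales
select 'Widget', 1, 100 union all
select 'Widget', 2, 150 union all
select 'Gadget', 1, 80

select product,
       sum(case when quarter = 1 then amount else 0 end) as Q1,
       sum(case when quarter = 2 then amount else 0 end) as Q2
from @sales
group by product
```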

4. How to make a search with multiple optional arguments?

SELECT ...
WHERE (((Col1 = @Col1) or (@Col1 is null))
and ((Col2 = @Col2) or (@Col2 is null))
and ((Col3 = @Col3) or (@Col3 is null)) and ...) and (other conditions)

5. How to put a column into a delimited form?

use northwind
Declare @ColumnList varchar(1000)
SELECT @ColumnList = COALESCE(@ColumnList + ', ', '') + column_name
FROM INFORMATION_SCHEMA.Columns
WHERE table_name = 'Orders'

6. How to export results of a stored procedure to a txt file?

Run this in the DOS command line
- arguments are case sensitive
osql /U sa /P password /d pubs /S Server9 /Q "sp_help" -o ofile.txt

7. How to reset an identity column?

DBCC CHECKIDENT('TableName', RESEED, NewSeedValue) -- NewSeedValue is a numeric literal, e.g. 0
or
Truncate table TableName


8. Parsing delimited words from a column:

Create table myTable (myCol varchar(50), Col1 varchar(10), Col2 varchar(10),
col3 varchar(10), col4 varchar(10), col5 varchar(10))

Insert Into myTable Values ('AA1|BBB1|CCCC1|DDD1|EEEE1', Null,Null,Null,Null,Null)
Insert Into myTable Values ('AAA2|BBB2|CC2|DDD2|EEEE2', Null,Null,Null,Null,Null)
Insert Into myTable Values ('AAAAA3|BBB3|CCCC3|DDD3|EEEE3', Null,Null,Null,Null,Null)
Insert Into myTable Values ('AAA4|BBB4|CCCC4|DD4|EEEE4', Null,Null,Null,Null,Null)
Insert Into myTable Values ('A5|BBB5|CCCC5|DDD5|E5', Null,Null,Null,Null,Null)
Insert Into myTable Values ('AAA6|BBB6|CCCC6|DDD6|EEEE6', Null,Null,Null,Null,Null)
Insert Into myTable Values ('AAA7|BBB7|CCCCC7|DDD7|EEEE7', Null,Null,Null,Null,Null)
Insert Into myTable Values ('AA8|BBBBBBB8|CCCC8|DDD8|EEEE8', Null,Null,Null,Null,Null)

declare @i1 int
declare @i2 int
declare @i3 int
declare @i4 int

update myTable set
@i1 = charindex('|', myCol),
col1 = left(myCol, @i1-1),
@i2 = charindex('|',myCol,@i1+1),
col2 = substring(myCol, @i1+1, @i2-@i1-1),
@i3 = charindex('|',myCol, @i2+1),
col3 = substring(myCol, @i2+1, @i3-@i2-1),
@i4 = charindex('|',myCol, @i3+1),
col4 = substring(myCol, @i3+1, @i4-@i3-1),
col5 = substring(myCol, @i4+1, 50)
select * from myTable

9. How to get Nth max value?

declare @N int
set @N = 5
Select *
From Employee E1
Where (@N-1) = (Select Count(Distinct(E2.Salary)) From Employee E2 Where E2.Salary > E1.Salary)


10. How to use a variable in an IN part of where clause?

- declare @values nvarchar(50)
set @values = ',2,3,4,'
select * from Table1 WHERE charindex(',' + convert(varchar(10), TableId) + ',', @values) > 0
- use dbo.Split() function

11. How to get a random row from a table?

- select top 1 columns....
from table
order by newid()
- choosing between first 20 rows
set ROWCOUNT 20
select top 1 *
from (Select * from table) as t
order by newid()
set ROWCOUNT 0

12. How to LTrim any character from a value?

- SUBSTRING(@mystring, CHARINDEX(LEFT(REPLACE(@mystring, @char, ''),1), @mystring), LEN(@mystring))
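To see how the expression works, here is a small worked example (the variable values are made up): REPLACE strips every @char, LEFT grabs the first surviving character, and CHARINDEX finds where that character first appears in the original string, which is where SUBSTRING starts.

```sql
-- Trim leading 'x' characters from a string.
declare @mystring varchar(50), @char char(1)
set @mystring = 'xxxHello'
set @char = 'x'
select substring(@mystring,
                 charindex(left(replace(@mystring, @char, ''), 1), @mystring),
                 len(@mystring)) as trimmed
-- returns 'Hello'
```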


13. How to dynamically rank rows?

- select rank=count(*), a1.au_lname, a1.au_fname
from authors a1, authors a2
where a1.au_lname + a1.au_fname >= a2.au_lname + a2.au_fname
group by a1.au_lname, a1.au_fname
order by 1
- groups:
Declare @data table(idn int, diag int, recCount int)

insert into @data
Select 1, 42, 75 union
Select 1, 49, 50 union
Select 1, 38, 22 union
Select 2, 70, 48 union
Select 2, 33, 27

select a.*,
(select count(*)
from @data b
where a.idn = b.idn and a.RecCount >= b.RecCount) as Rank
from
@data a

14. How to get a running total (value in current row = sum of all previous values)?

- SELECT DayCount, Sales, Sales+COALESCE((SELECT SUM(Sales)
FROM Sales b
WHERE b.DayCount < a.DayCount),0)
AS RunningTotal
FROM Sales a
ORDER BY DayCount

15. How to get just date or just time from a datetime value?

- just date: SELECT DATEADD(d, DATEDIFF(d, 0, GetDate()), 0)
- just time: SELECT DATEADD(d, -DATEDIFF(d, 0, GetDate()), GetDate())

16. how to get a number of repeating chars in a string?

- select len(@str) - len(replace(@str, @delimiter, ''))

17. How to multiply all of the values in a column?

- SELECT CAST(ROUND(COALESCE(EXP(SUM(LOG(ABS(nullif(col,0))))),0),0) AS INTEGER) AS output_value FROM @mytable
- set nocount on
declare @mytable table (col smallint)
insert @mytable(col) select 6 union select -7 union select 7 union select null union select 2
declare @x bigint
set @x = 1
select @x = @x * col from @mytable where coalesce(col,0) > 0
select col from @mytable
select @x as positive_product

18. Split function:

CREATE FUNCTION dbo.Split
(
@RowData nvarchar(2000),
@SplitOn nvarchar(5)
)
RETURNS @RtnValue table
(
Id int identity(1,1),
Data nvarchar(100)
)
AS
BEGIN
While (Charindex ( @SplitOn,@RowData)>0)
Begin
Insert Into @RtnValue (data)
Select
Data = ltrim(rtrim(Substring( @RowData,1,Charindex ( @SplitOn,@RowData)-1)))

Set @RowData = Substring( @RowData,Charindex ( @SplitOn,@RowData)+1,len( @RowData))
End

Insert Into @RtnValue (data)
Select Data = ltrim(rtrim(@RowData))

Return
END

Microsoft SQL Server - Programmatically Save DTS Packages to Files - SQLTeam.com

Microsoft SQL Server - Programmatically Save DTS Packages to Files - SQLTeam.com: "DECLARE @TARGETDIR varchar(1000)
SET @TARGETDIR = 'C:\DTSTest\'

SELECT distinct
'DTSRUN.EXE /S '
+ CONVERT(varchar(200), SERVERPROPERTY('servername'))
+ ' /E '
+ ' /N '
+ '"' + name + '"'
+ ' /F '
+ '"' + @TARGETDIR + name + '.dts"'
+ ' /!X'
FROM msdb.dbo.sysdtspackages P"

Wednesday, September 14, 2005

MSMVPs.com Weblogs

MS MVPs weblogs - big list

Captain LoadTest

Captain LoadTest: "Question number two asks the candidate to list the types of testing with which s/he has experience. This reply included integration testing (also stated in his resume, correctly spelled). My follow-up asked him to define integration testing; a common ploy to make sure I'm not just being fed buzz-words. It was a definition he could not supply, or even attempt.
A candidate should be able to define every 'word' he claims experience with. If you can not define it you obviously do not have enough experience in it to make it applicable. If you can not define 'integration testing', I will not hold it against you providing you do not list experience in it. Similarly, if you do not list it, and I ask you what you know about it, be straight; tell me straight-up that you cannot define it. You will rate higher in my book than someone who stumbles through an obviously concocted and blatantly incorrect response."

Amazon.com: Books: Expert .NET Delivery Using NAnt and CruiseControl.NET

Tom Hollander's blog : Instrumentation in Enterprise Library

Turning off event logging & perf counters in App Blocks: http://blogs.msdn.com/tomholl/archive/2005/02/18/376187.aspx

Friday, August 26, 2005

MSSQL Server Reporting Services: Mastering OLAP Reporting: Extending Conditional Formatting: SWITCH and Drilldown Defaults

MSSQL Server Reporting Services: Mastering OLAP Reporting: Extending Conditional Formatting: SWITCH and Drilldown Defaults

Bait & SWITCH()

MSSQL Server Reporting Services : Black Belt Components: Manage Nulls in OLAP Reports

http://www.databasejournal.com/features/mssql/article.php/3436441

This article is a member of the series MSSQL Server 2000 Reporting Services. The series is designed to introduce MSSQL Server 2000 Reporting Services ("Reporting Services"), with the objective of presenting an overview of its features, together with many tips and techniques for real-world use. For more information on the series, as well as the hardware / software requirements to prepare for the exercises we will undertake, please see my initial Database Journal article, A New Paradigm for Enterprise Reporting.

Basic assumptions underlying the series are that...

Thursday, August 18, 2005

Sample asp page for OLAP



<% set rsProject = nothing

set comPS = nothing

set prmPS = nothing

set conPS = nothing %>

Thursday, July 14, 2005

Tuesday, April 19, 2005

cool stuff - stored proc resultset queries

This command allows you to link to the local SQL Server & select individual columns from a stored procedure

sp_addlinkedserver @server = 'LOCALSERVER', @srvproduct = '', @provider = 'SQLOLEDB', @datasrc = @@servername
GO

select * from openquery(localserver,'exec mystoredproc')
GO

Great for comparing the results of 2 stored procedures, or doing union queries against multiple stored procedures without temp tables.

http://www.sommarskog.se/share_data.html

Wednesday, March 09, 2005

Reporting Services companies

One of the things the company I work for specializes in is Reporting Services deployments.

Here's a list of some companies with add-ons for RS.

Cizer
Capricorn Software, LTD
Central Information Systems
Groeneveld Management Services BV
The Henson Group
JM Consulting
Microsoft
Northwest Training Systems, Inc.
NxTrend
ReportsTrader.com
Tridec
ZAPTECH
Other

Wednesday, January 05, 2005

Changing the Process Buffer Size to Improve OLAP Cube Processing

http://solidqualitylearning.com/blogs/Dejan/archive/2004/10/15/194.aspx
Over the last few days I played with the Process Buffer size (in Analysis Manager, check the server properties, Processing tab). This option, along with many other performance tips, is described in the excellent article Microsoft SQL Server 2000 Analysis Services Performance Guide by Carl Rabeler, Len Wyatt, and Dave Wickert. After a few tests, I got the best performance when I changed the default value of 32 MB to 256 MB. I was really surprised by the processing time improvement: the full processing time for a quite large cube went down from 3 hours 58 minutes to 1 hour 14 minutes. Processing is now more than three times faster after adjusting just a single option. After this experience, I would suggest that everyone using OLAP cubes in production start testing the process buffer size, especially if you are close to the limit of the time window available for processing. The best size will vary depending on your hardware and cubes, but you will almost certainly want to increase the default value.