Wednesday, January 30, 2008

Revenge of the BLOB - SQL Server Central

One of my articles has been published to SQL Server Central.  It was an odd scenario that I had never seen put to use in SQL Server before.  Storing a table of records in a single table column as a BLOB sounds like madness to me, or perhaps something retro from the '80s.

I think it was more fun digging up the origin of the Blob than finding a solution, though...

Link to Revenge of the BLOB - SQL Server Central

Tuesday, January 29, 2008

The Data Platform Insider : Microsoft SQL Server 2008 Roadmap Clarification

Looks like SQL 2008 is not quite ready for primetime just yet...  

To continue in this spirit of open communication, we want to provide clarification on the roadmap for SQL Server 2008. Over the coming months, customers and partners can look forward to significant product milestones for SQL Server.  Microsoft is excited to deliver a feature-complete CTP during the Heroes Happen Here launch wave and a release candidate (RC) in Q2 calendar year 2008, with the final release to manufacturing (RTM) of SQL Server 2008 expected in Q3. Our goal is to deliver the highest quality product possible and we simply want to use the time to meet the high bar that you, our customers, expect.

This does not in any way change our plans for the February 27 launch and we look forward to seeing many of you in Los Angeles and other events around the world. Please keep the great feedback coming and thank you again for your ongoing support of SQL Server!

Source: The Data Platform Insider : Microsoft SQL Server 2008 Roadmap Clarification

Monday, January 21, 2008

A Reporting System Architecture - SQL Server Central

An interesting use of Database Mirroring to create a low-impact reporting solution.

These seemingly simple requirements added more complexity to the project than one might think. We mulled over numerous ideas and varying solutions to satisfy the business needs. Our final solution involved a staging database set up on the reporting server (a separate box) that is populated via SQL Server mirroring. ETL scripts written in SSIS then use a snapshot of the mirror database to denormalize the data and populate two reporting databases (details below). The solution turned out to be elegant, to perform well, and to have essentially zero negative impact on the current OLTP system.
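
The snapshot step is the interesting bit, since a mirror database can't be read directly. A minimal sketch of what it might look like, assuming a hypothetical mirrored database SalesDB with logical data file name SalesDB_Data:

-- Create a point-in-time, read-only snapshot of the mirror for the ETL to read
-- (database, file, and path names are hypothetical)
CREATE DATABASE SalesDB_ReportSnapshot
ON (NAME = SalesDB_Data,
    FILENAME = 'D:\Snapshots\SalesDB_ReportSnapshot.ss')
AS SNAPSHOT OF SalesDB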

A Reporting System Architecture - SQL Server Central

Saturday, January 19, 2008

Comparing Multiple Datasets with the INTERSECT and EXCEPT operators

A manual Diff tool for SQL 2005...

With SQL Server 2005, Microsoft introduced the INTERSECT and EXCEPT operators to further extend what you could already do with the UNION and UNION ALL operators.

  • INTERSECT - gives you the final result set where values in both of the tables match
  • EXCEPT - gives you the final result set where data exists in the first dataset and not in the second dataset

The advantage of these operators is that they give you a distinct listing across all of the columns, much as the UNION operator does, without having to do a GROUP BY or compare every single column.
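
A quick illustration, using two hypothetical tables with matching column lists:

SELECT col1, col2 FROM dbo.TableA
INTERSECT
SELECT col1, col2 FROM dbo.TableB  -- distinct rows that appear in both tables

SELECT col1, col2 FROM dbo.TableA
EXCEPT
SELECT col1, col2 FROM dbo.TableB  -- distinct rows in TableA that are not in TableB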

Source: Comparing Multiple Datasets with the INTERSECT and EXCEPT operators

Monday, January 14, 2008

Basics of Statistics in SQL Server 2005 - Load testing without any data

For example, you can vastly increase the apparent rowcount and pagecount of the Sales.SalesOrderDetail table by issuing this command:


-- Undocumented options: make the optimizer treat the table as if it
-- held 5,000,000 rows spread over 5,000,000 pages
UPDATE STATISTICS Sales.SalesOrderDetail
WITH ROWCOUNT = 5000000, PAGECOUNT = 5000000

Why would you want to do this? SQL Server's query optimizer does things differently on small tables and on large tables. During development, you likely don't have a whole lot of data in your tables. Rather than spend time generating huge amounts of sample data, you can use the undocumented options of UPDATE STATISTICS to see how the query optimizer will treat your tables after they've been in production for a while. This can be a quick and dirty method to uncover potential performance issues before they arise, and perhaps point out the need for additional indexes or other schema work. I'd recommend confirming any such changes by retesting with realistic amounts of sample data, though.
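
If you try this, remember the faked counts stick around until something corrects them. One way to put things right afterwards (a sketch, assuming the AdventureWorks sample database):

-- Recompute the real row and page counts in the catalog
DBCC UPDATEUSAGE (AdventureWorks, 'Sales.SalesOrderDetail') WITH COUNT_ROWS
-- Then rebuild the statistics from the actual data
UPDATE STATISTICS Sales.SalesOrderDetail WITH FULLSCAN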

Basics of Statistics in SQL Server 2005

Microsoft PowerPoint - procedurecache-sql.ppt


Lots of useful stuff on optimizing queries in SQL 2005.

Becoming a Query Plan Caching Superhero!

Microsoft PowerPoint - procedurecache-sql.ppt

Wednesday, January 09, 2008

Very Large Database index optimization


We're talking hundreds of GBs or 1 TB or more. These databases are now common on SQL Server 2000 and 2005 - at TechEd IT Forum this week (and at SQL Connections the week before) there are many customers with multi-TB databases. Any experienced DBA knows the value of running consistency checks, even when the system is behaving perfectly and the hardware is rock-solid. The two problems that people have with running a full CHECKDB on their VLDB are:

  • It takes a long time to run (based on many factors – see my previous post here for details).
  • It uses lots of resources – memory, CPU, IO bandwidth, tempdb space.

So it uses lots of resources for a long time. Even with a decent-sized maintenance window, the CHECKDB may run over into normal operations. There's also the case of a system that's already pegged in one or more resource dimensions. Whatever the case, there are a number of options:

  • Don't run consistency checks
  • Run CHECKDB using the WITH PHYSICAL_ONLY option
  • Use SQL Server 2005's partitioning feature and devise a consistency checking plan around that
  • Figure out your own scheme to divide up the consistency checking work over several days (see the sketch after this list)
  • Offload the consistency checks to a separate system
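
To make the second and fourth options concrete, a minimal sketch (the database and table names are hypothetical):

-- Option 2: physical-only check, much cheaper than a full CHECKDB
DBCC CHECKDB (BigDB) WITH PHYSICAL_ONLY

-- Option 4: spread the work out, e.g. allocation and catalog checks one night,
-- then a different subset of tables each night after
DBCC CHECKALLOC (BigDB)
DBCC CHECKCATALOG (BigDB)
DBCC CHECKTABLE ('dbo.BigTable1')
DBCC CHECKTABLE ('dbo.BigTable2')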

In Recovery... - Paul S. Randal on SQL Server

Monday, January 07, 2008

Howard @ MSFT selling BI: Proclarity tricks


Proclarity tricks

There is a cool trick that lets you use a ProClarity Analytics Server Page web part in a PerformancePoint dashboard while keeping the toolbar functionality and supporting the passing of filters.

Basically you set up the ProClarity Analytics Server Page report view like you normally would, except that after you select the desired report, you modify the server URL parameter to point to a different web page on the server.

Here are the steps. You’ll need to plug in your PAS server URLs:

#1 – start by using http://pas-server/pas in the Server URL when configuring your PC Report View; then navigate to and select the PC view you wish to display

#2 – after having chosen a view, change the Server URL to:  http://pas-server/pas/en/src/proclarity.asp?uiConfig=tb;ht

#3 – Then, if you want to connect up a filter, drag the Member UniqueName property from the filter onto your PC Report View

Bonus – adding the URL parameter fs; places an extra button on the toolbar, which comes in handy if you want to let end users open the underlying view in the PC Web Pro client:

http://pas-server/pas/en/src/proclarity.asp?uiConfig=tb;ht;fs;

Howard @ MSFT selling BI: Proclarity tricks

Inferred Members & SSAS 2005

How to ignore missing keys in dimensions.

This is about how you can handle new fact records that have one or more dimension keys that are not part of your dimension table, or where a dimension key is null or blank.

Either you can solve this in the ETL process by adding an inferred member in each dimension, like 'N/A' or 'unknown', or you can let SSAS2005 handle this.

This post is about the last scenario.
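
For contrast, the ETL approach from the first option might look roughly like this (DimCustomer and FactStage are hypothetical tables):

-- Make sure an inferred 'Unknown' member exists in the dimension
IF NOT EXISTS (SELECT 1 FROM dbo.DimCustomer WHERE CustomerKey = -1)
    INSERT INTO dbo.DimCustomer (CustomerKey, CustomerName)
    VALUES (-1, 'Unknown')

-- Point any missing or unmatched fact keys at the inferred member
UPDATE f
SET f.CustomerKey = -1
FROM dbo.FactStage AS f
LEFT JOIN dbo.DimCustomer AS d ON d.CustomerKey = f.CustomerKey
WHERE d.CustomerKey IS NULL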

Thomas.I Microsoft BI & Analytics

Friday, January 04, 2008

Journey to SQL Authority with Pinal Dave

Very useful.

How many times have we wondered what the last few queries run on SQL Server were? The following quick script shows the most recently executed queries, along with the time each ran, on SQL Server 2005.

-- Pull the text and last execution time of cached queries, newest first
SELECT deqs.last_execution_time AS [Time], dest.text AS [Query]
FROM sys.dm_exec_query_stats AS deqs
CROSS APPLY sys.dm_exec_sql_text(deqs.sql_handle) AS dest
ORDER BY deqs.last_execution_time DESC

Journey to SQL Authority with Pinal Dave