Thursday, September 29, 2011

Maximizing SQL Server Throughput with RSS Tuning - Microsoft SQL Server Development Customer Advisory Team - MSDN Blogs

Running rings around your network cards.

In the end, we continued to use our “workaround” of scaling the network load out to 4 NIC cards, which gave us enough network bandwidth as well as RSS CPUs to handle the heavy network traffic. You can certainly use more powerful 10Gbps NICs, but remember to configure the “RSS rings” to an appropriate value.
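If you want a quick sanity check from inside SQL Server that the network really is the bottleneck before touching NICs or RSS settings, the wait statistics are a good place to start. This is just a generic diagnostic sketch, not something from the CAT team’s article:

  -- ASYNC_NETWORK_IO generally means the client or the network path can't keep up
  -- with the rate at which SQL Server is handing back results.
  SELECT wait_type, waiting_tasks_count, wait_time_ms, signal_wait_time_ms
  FROM sys.dm_os_wait_stats
  WHERE wait_type = 'ASYNC_NETWORK_IO';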

Maximizing SQL Server Throughput with RSS Tuning - Microsoft SQL Server Development Customer Advisory Team - MSDN Blogs

Tuesday, September 27, 2011

RAMMap

One of the best tools for finding out exactly what is consuming memory in Windows Vista or higher.

Have you ever wondered exactly how Windows is assigning physical memory, how much file data is cached in RAM, or how much RAM is used by the kernel and device drivers? RAMMap makes answering those questions easy. RAMMap is an advanced physical memory usage analysis utility for Windows Vista and higher.

RAMMap

Monday, September 26, 2011

SQL and SQL Analysis Services aren’t friends in the sandbox

When you start using Analysis Services with larger dimensions and cubes, you may notice that your SQL Server performs poorly if it’s on the same server.  By default, Analysis Services, SQL Server, Integration Services and Reporting Services (and perhaps even Full-Text Search) are all fighting for valuable memory.  Setting memory caps on each of those services, and making sure other services aren’t chewing up memory, is important for a well-performing SQL Server.

Greg Galloway points out a great tool above from Sysinternals called RAMMap. It looks like a great utility for finding out just what is gnawing at your memory, and whether there’s spyware hijacking your system.

I can’t count the number of times I’ve seen a new client’s server in the following state. The server has SQL and SSAS on the same box. Both are set to the default memory limits (which is no memory cap for SQL and 80% of server memory for SSAS). By the time I see the server, SQL and SSAS are fighting each other for memory, causing each other to page out, and performance degrades dramatically. (Remember what I said about disk being a million times slower than RAM?)

Home - Greg Galloway
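A quick way to see from inside SQL Server whether it’s being squeezed like that (a generic sketch; both DMVs exist in SQL Server 2008 and later):

  -- How much physical memory SQL Server is actually holding, and whether the
  -- process is flagged as low on memory (a common symptom when SQL and SSAS fight).
  SELECT physical_memory_in_use_kb / 1024 AS sql_physical_memory_mb,
         memory_utilization_percentage,
         page_fault_count,
         process_physical_memory_low,
         process_virtual_memory_low
  FROM sys.dm_os_process_memory;

  -- Overall server memory state as Windows reports it.
  SELECT total_physical_memory_kb / 1024     AS total_physical_memory_mb,
         available_physical_memory_kb / 1024 AS available_physical_memory_mb,
         system_memory_state_desc
  FROM sys.dm_os_sys_memory;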

What’s further confusing with the SQL configuration is that SQL Server takes its max memory setting in MB, while Analysis Services has just “80”, which means use up to 80% of physical RAM.  Typing in a number greater than 100 makes Analysis Services treat the value as an absolute number of bytes of RAM to use.  So much for consistency.
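For the SQL Server side of the cap, the setting can be scripted instead of going through the SSMS properties dialog. A minimal sketch, assuming you’ve decided to leave SQL Server 12 GB on the box (the number is just an example, and the SSAS TotalMemoryLimit still has to be set separately in the instance properties or msmdsrv.ini):

  -- Cap SQL Server's memory; the value is in MB.
  EXEC sp_configure 'show advanced options', 1;
  RECONFIGURE;
  EXEC sp_configure 'max server memory (MB)', 12288;  -- 12 GB, example value only
  RECONFIGURE;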

Don’t forget to clear out the Event Viewer logs if you have a mysterious “services.exe” eating up memory.  I have seen a large Security log take up 700MB of RAM.

Tuesday, September 20, 2011

SQL# (SQLsharp) Functionality

SQL# is a set of CLR functions that perform some interesting tasks not easily done from T-SQL: things like pulling Twitter feeds, fetching web pages, or working with the file system.

Yep, I can tweet from SQL Server thanks to SQL#. In fact, I sent that tweet out.

I can also pick up my Twitter stream.
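Under the hood SQL# is just a set of CLR assemblies, so the instance has to allow CLR code before any of it works. A minimal sketch of that prerequisite; the commented-out call at the end uses a made-up function name purely to show the calling pattern, so check the SQL# documentation for the real names:

  -- CLR integration is off by default and has to be enabled once per instance.
  EXEC sp_configure 'clr enabled', 1;
  RECONFIGURE;

  -- CLR scalar functions are then called like any other schema-qualified function.
  -- SELECT SQL#.Some_WebFunction('http://example.com');  -- hypothetical name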

SQL# (SQLsharp): A Review

Automating SSAS cube docs using SSRS, DMVs and spatial data | Purple Frog Systems

 

This article outlines a method of documenting cubes with some stored procedures and Reporting Services reports.  The only flaw, which is really a SQL Server limitation, is that there’s no way to specify the linked server name dynamically without resorting to dynamic SQL.
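The pattern boils down to an OPENQUERY call against a linked server that points at the SSAS instance, and because OPENQUERY only accepts a literal server name, making it configurable means building the statement as a string. A rough sketch, with a hypothetical linked server name and one of the SSAS schema rowsets:

  -- 'SSAS_LINKED' is a hypothetical linked server pointing at Analysis Services.
  DECLARE @linkedServer sysname = N'SSAS_LINKED';
  DECLARE @sql nvarchar(max) =
      N'SELECT * FROM OPENQUERY(' + QUOTENAME(@linkedServer) + N',
          ''SELECT CATALOG_NAME, CUBE_NAME, LAST_SCHEMA_UPDATE
            FROM $SYSTEM.MDSCHEMA_CUBES'')';
  EXEC sp_executesql @sql;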

This could be useful for managing change in cubes, and providing end-user or technical documentation.

Being a business intelligence consultant, I like to spend my time designing data warehouses, ETL scripts and OLAP cubes. An unfortunate consequence of this is having to write the documentation that goes with the fun techy work. So it got me thinking, is there a slightly more fun techy way of automating the documentation of OLAP cubes…

There are some good tools out there such as BI Documenter, but I wanted a way of having more control over the output, and also automating it further so that you don’t have to run an overnight build of the documentation.

I found a great article by Vincent Rainardi describing some DMVs (Dynamic Management Views) available in SQL 2008, which got me thinking: why not just build a number of SSRS reports calling these DMVs, which would then dynamically create the cube structure documentation in real time whenever the report rendered…

This post is the first in a 3 part set which will demonstrate how you can use these DMVs to automate the SSAS cube documentation and user guide.

Automating SSAS cube docs using SSRS, DMVs and spatial data | Purple Frog Systems

Thursday, September 08, 2011

Defining Dimension Granularity within a Measure Group

When dealing with multiple measure groups in a cube, you can see values repeating if a measure group is at a higher grain than the detail rows.  Here are a couple of articles and some MSDN info identifying best practices around this.

All but the simplest data warehouses will contain multiple fact tables, and Analysis Services allows you to build a single cube on top of multiple fact tables through the creation of multiple measure groups. These measure groups can contain different dimensions and be at different granularities, but so long as you model your cube correctly, your users will be able to use measures from each of these measure groups in their queries easily and without worrying about the underlying complexity.

http://www.packtpub.com/article/measures-and-measure-groups-microsoft-analysis-services-part2

Users will want to dimension fact data at different granularity or specificity for different purposes. For example, sales data for reseller or internet sales may be recorded for each day, whereas sales quota information may only exist at the month or quarter level. In these scenarios, users will want a time dimension with a different grain or level of detail for each of these different fact tables. While you could define a new database dimension as a time dimension with this different grain, there is an easier way with Analysis Services.
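To make the grain difference concrete, here is a hypothetical pair of fact tables of the kind the excerpt describes; nothing below comes from the article itself, it just illustrates a day-level fact next to a quarter-level one:

  -- Sales recorded per day: the measure group's grain is the Date key.
  CREATE TABLE dbo.FactInternetSales (
      DateKey     int   NOT NULL,  -- one row per day (and product, customer, ...)
      ProductKey  int   NOT NULL,
      SalesAmount money NOT NULL
  );

  -- Quotas only exist per calendar quarter, so against the same Date dimension
  -- this measure group is tied to a higher-level attribute (e.g. CalendarQuarter)
  -- rather than to the Date key itself.
  CREATE TABLE dbo.FactSalesQuota (
      CalendarYear    smallint NOT NULL,
      CalendarQuarter tinyint  NOT NULL,
      QuotaAmount     money    NOT NULL
  );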

Defining Dimension Granularity within a Measure Group

Thursday, September 01, 2011

Merrill Aldrich : Handy Trick: Move Rows in One Statement

Moving data around has never been so easy.

The fact that we can take output from DELETE and feed it to INSERT actually models what we are trying to do perfectly. And, we get some advantages:

  1. This is now a single, atomic statement on its own.
  2. The logic about which rows to move is specified only once, which is neater.
  3. The logic about which rows to move is only processed one time by the SQL Server engine.
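A minimal sketch of the pattern with hypothetical table names: the DELETE’s OUTPUT clause becomes the row source for the INSERT, so the move happens in one atomic statement.

  -- Move closed orders from the active table to an archive table in one statement.
  INSERT INTO dbo.OrdersArchive (OrderID, CustomerID, OrderDate, Status)
  SELECT OrderID, CustomerID, OrderDate, Status
  FROM (
      DELETE FROM dbo.Orders
      OUTPUT deleted.OrderID, deleted.CustomerID, deleted.OrderDate, deleted.Status
      WHERE Status = 'Closed'
  ) AS moved;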

Merrill Aldrich : Handy Trick: Move Rows in One Statement

Degenerate dimensions in SSAS « The Official BI Twibe Blog

By default, SQL Server 2008 Analysis Services sets the dimension error handling property to Custom, which breaks any dimension that has duplicate attribute key values.  For instance, if you have a state or province column in your dimension and it’s not unique, processing will “report and stop” when it handles the error.

The fix is to select “report and continue”, or set the error handling back to “default”, or set up a composite key that makes the attribute unique, or configure attribute relationships that make it unique.
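Before touching the error configuration it’s worth just finding the duplicates in the relational source. A generic sketch with hypothetical table and column names:

  -- Attribute key values (here StateProvince) that map to more than one row are
  -- exactly what trips the duplicate-key error during dimension processing.
  SELECT StateProvince, COUNT(*) AS row_count
  FROM dbo.DimGeography
  GROUP BY StateProvince
  HAVING COUNT(*) > 1
  ORDER BY row_count DESC;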

There are some other things that SSAS does behind the scenes.  This article provides some more info.

The reason is that when processing the dimension, SSAS by default does a right trim and this eliminates not only the spaces, but also any of the three special characters (tab, line feed and carriage return) we added! Note how this differs from T-SQL where these characters are not impacted by RTRIM as can be seen here:
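As a rough sketch of the comparison the sentence above refers to (my own reconstruction, not the article’s sample): T-SQL’s RTRIM only strips trailing spaces, so a trailing tab, line feed or carriage return survives.

  -- RTRIM leaves CHAR(9), CHAR(10) and CHAR(13) alone, so all 6 bytes remain.
  SELECT DATALENGTH(RTRIM('abc' + CHAR(9) + CHAR(10) + CHAR(13))) AS bytes_after_rtrim;  -- 6

  -- Only an actual trailing space is removed.
  SELECT DATALENGTH(RTRIM('abc' + CHAR(9) + ' ')) AS bytes_after_rtrim;  -- 4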

Degenerate dimensions in SSAS « The Official BI Twibe Blog