Friday, December 30, 2011

Multi-Value (Select All) parameter in Reporting Services | Microsoft Connect

 

One area where Reporting Services falls down a bit is the parameter selection process in the built-in Report Manager and integrated modes.  Usually this means building custom parameter web parts or ASP.NET portals to work around it.  There are some workarounds for issues related to multi-value parameters and the infamous ‘Select All’ checkbox that may help if your database is getting clobbered.

Multi-Value (Select All) parameter in Reporting Services | Microsoft Connect

Monday, December 26, 2011

SQL Server Reporting Services Team Blog - Site Home - MSDN Blogs

Is Reporting Services going the way of the dodo, to be replaced by Power View (Project Crescent)?

No.

Comparing Power View, Report Builder, and Report Designer

Power View won’t replace the existing Reporting Services reporting products.

Report Designer is a sophisticated design environment that developers and IT pros can use for embedded reporting in their applications. In Report Designer they can create operational reports, shared data sources, and shared datasets, and author report viewer controls.

In Report Builder, IT pros and power users can create powerful operational reports, and reusable report parts and shared datasets.

Report Builder and Report Designer create RDL reports; Power View creates RDLX reports. Power View cannot open RDL reports, and vice versa.

Both Report Designer and Report Builder are shipping in Microsoft SQL Server 2012 Release Candidate 0 (RC 0) Reporting Services, along with Power View. For more information, see Tools (SSRS).

SQL Server Reporting Services Team Blog - Site Home - MSDN Blogs

It is interesting that Microsoft is launching tools like PowerPivot and Power View, knowing that they will cannibalize their existing Analysis Services and Reporting Services products and roadmaps for future installations.

My question is, why don’t they just go all-in and build out these tools as replacements, instead of diverging paths?  Why not incorporate all the best features of Power View into Report Builder, and leverage the RDL language that is designed for reporting? 

Same goes for Analysis Services and PowerPivot.  Why have two separate servers for each product?  What is the value (other than licensing) of splitting these technologies into two siloed environments?  The sum of the parts is much better than the individual pieces.

Still looking for some cohesion and sense of direction for the future of SQL Server reporting and BI tools… and hoping for a better 2012.

A deployment checklist for the new Denali / SQL 2012 tools

http://msdn.microsoft.com/en-us/library/hh231687(v=sql.110).aspx

Saturday, December 24, 2011

SQL Reports Scheduler|SQL Server Reporting Services|SSRS|Report Distribution|Business Intelligence|SQL-RD|ChristianSteven Software

The Swiss Army knife of Reporting Services tools.  From two men who both aren’t from Switzerland.  I’m not really sure what it doesn’t do.

SQL-RD is a Business Intelligence application for Windows that saves time and money by making it easy to define single reports or packages of Microsoft® SQL Reporting Services reports, schedule and run them automatically, and send the output to print, fax, disk, FTP, SMS or email in a number of standard formats.

SQL Reports Scheduler|SQL Server Reporting Services|SSRS|Report Distribution|Business Intelligence|SQL-RD|ChristianSteven Software

Friday, December 23, 2011

technoblab

Day 23: Tips from Technoblab for report headers/footers, format codes, customizing the toolbar, working with images, etc.

Lots of nice code examples here

A database field can be placed in neither a report header nor a footer; doing so throws the error "The Value expression for the textbox <fieldname> refers to a field. Fields cannot be used in page headers or footers".
There is, however, a workaround. Place a hidden textbox in the report body, call it tb_hidden, and assign to it the database field that you need in the header. Then place a textbox in the report header and assign the hidden textbox's value to it,
ex: =ReportItems!tb_hidden.Value
This however has a drawback: the value will appear only on the first page. If your report runs to many pages then you need to take a different approach.
Pass the value to appear in the header as a parameter from your .aspx page, call it database_field, and assign this to the textbox in the header,
ex: =Parameters!database_field.Value

technoblab

Thursday, December 22, 2011

SQL Server Reporting Services Parameter Logging & Caching

Day 22: Auto-caching and logging of SQL RS parameters

The solution tracks the selected parameters from a report execution and uses them later for the next execution to take them as default parameters. This enables users to log parameters and inspect the execution statistics of reports as well as caching former execution parameters.

SQL Server Reporting Services Parameter Logging & Caching

Wednesday, December 21, 2011

Using SharePoint List data for Reporting Services (SSRS) parameter source « Dan English's BI Blog

 

Excellent article on how to filter distinct values from a list using SSRS and SharePoint.

Using SharePoint List data for Reporting Services (SSRS) parameter source « Dan English's BI Blog

SSIS ReportGeneratorTask

Day 21 was probably part of Day 20 too.  Still worth a mention.   Code Generator for SQL RS reports

SSIS Task for SQL Server 2008 to create server reports or local reports from a recordset data source.

SSIS ReportGeneratorTask

Tuesday, December 20, 2011

SSIS Community Tasks and Components

Day 20: An aggregator of all the SSIS components on Codeplex and elsewhere

Anything and everything you ever need to do with SSIS is somewhere on this page.  Includes some Reporting Services functionality too.

This is a community project for developing additional tasks and components for SSIS, and identifying other CodePlex projects that do the same. 51 Integration Services extensibility projects exist on CodePlex - find them and over a hundred more from around the 'net right here.

SSIS Community Tasks and Components

Monday, December 19, 2011

Microsoft SQL Server Community & Samples

Day 17, 18, and 19: Whatever you want to do with SQL is somewhere in these samples…

Microsoft SQL Server Community Projects & Samples

Microsoft SQL Server Community & Samples

Friday, December 16, 2011

SQL Server 2008 & R2 Reporter

Day 16: One of the more useful tools out there – integration of Reporting Services with SSMS for exporting table data to a report!

Because the import/export wizard sucks.

SQLReporter enables you to export a table or view to a Reporting Services report. Then you can export it to Excel or PDF. Quick and nice.

SQL Server 2008 & R2 Reporter

GUI Tool Reveals SQL Injection Vulnerabilities


SQL Injection attacks are probably the most common attacks on internet-facing SQL Servers due to their ease of use.  This tool helps expose vulnerabilities to end-users and non-IT executives, visually.

The reasoning is that most existing SQL Injection testing tools are executed from a command line and “lack an intuitive user interface”. In other words, if you can’t show the problem in a pretty web page, people won’t really believe it exists.

GUI Tool Reveals SQL Injection Vulnerabilities

Thursday, December 15, 2011

SSRS Report Downloader

Day 15: Leech for Reporting Services

SSRS downloader is a tool to download all report files from a folder from Reporting Services at once. It uses reportservice2005.asmx endpoint web service to do this.

SSRS Report Downloader

SimpleDBExtension - SSRS extension for SimpleDB

Day 15: Amazon SimpleDB query extension for your cloud computing needs

If you’d like to tell your manager you’re running reports “off the cloud”, here’s one way to do it.

http://aws.amazon.com/simpledb/

This project is a SQL Server Reporting Services 2008 data processing extension that facilitates querying of Amazon's SimpleDB.

SimpleDBExtension - SSRS extension for SimpleDB

Wednesday, December 14, 2011

SSRS PowerShell Provider

Day 14: A Powershell provider for SSRS

Doesn’t look too current but probably still works…

A PowerShell provider for SSRS (SQL Server Reporting Services). The provider allows for navigation of SSRS installations, management of reports, etc, all through a scriptable interface.

SSRS PowerShell Provider

Tuesday, December 13, 2011

Database Documentation Reports

Day 13: Self-service documentation

These are reports for MS SQL Management Studio and MS SQL Reporting Services for creation of database documentation with the use of descriptions from Extended Properties.
Reports contain the detailed information on tables, views and stored procedures etc. Use of reports engine allows exporting generated reports into Excel, Word, PDF etc. formats easily.

Database Documentation Reports

Monday, December 12, 2011

Sperm Report

Skipping the weekend again. 

Day 12: The unfortunately-named sperm report project, blowing out data from OLAP cubes with “Spermline visualization”

Essentially with these four reports you can create 400 user reports (actually as many as you like). Change what is displayed on rows, columns, filter or time by simply creating a new "linked report" and setting a new set of parameters.

Sperm Report

Friday, December 09, 2011

RSBuild SQL Server Reporting Service Deployment Tool

Day 9: An automated build tool for SQL RS

RSBuild is a deployment tool for SQL Server Reporting Service. It currently supports two types of tasks: executing SQL Server scripts and publishing SQL Server Reporting Service reports and shared data sources. From version 1.1.0 onwards SQL Server 2008 Reporting Services is supported as well as all previously supported versions of SQL Server Reporting Services.

RSBuild SQL Server Reporting Service Deployment Tool

SQLCAT Community Projects and Code Samples

Day 9: Auditing in SQL from the SQLCat team (and Denny Lee)

This project is a place for the SQL CAT team to share their expertise with the community and collaborate with other community members outside Microsoft who wish to contribute code samples to enable more productive use of the SQL Server platform for high-scale, enterprise customers.

SQLCAT Community Projects and Code Samples

Thursday, December 08, 2011

RSToolKit

Day 8: RSToolKit, a tool for creating, moving, documenting, and exporting reports.

This could be very useful in a distributed, farm or DRP environment.

RSToolKit allows users to administer different SQL Server Reporting Services instances with one application and offers different methods to deploy reports. It supports all SQL Server Reporting Services versions that use the ReportService2005 and ReportExecution2005 SOAP APIs.

RSToolKit

Wednesday, December 07, 2011

Drilltrough and filtering on SSAS-cubes in SSRS

Day 7: A framework for navigating an Analysis Services cube from SSRS

Basically the user-friendliness of the interface is achieved by allowing users to drill up/down on the different hierarchies by simply clicking columns or rows, and applying filters by clicking on icons in rows or columns.
The performance is achieved by only retrieving the necessary data from the cube.

Drilltrough and filtering on SSAS-cubes in SSRS

Tuesday, December 06, 2011

Reporting Services Tracer

Day 6 – A debugger for Reporting Services

Wouldn't it be nice to be able to peek under the hood and see what APIs Report Manager or a custom application calls? This is exactly what the Reporting Services Tracer (RsTracer) sample was designed to handle. RsTracer intercepts the server calls and outputs them to a trace listener, such as the Microsoft DebugView for Windows. RsTracer helps you see the APIs that a Reporting Services client invokes and what arguments it passes to each interface. It also intercepts the server response. Armed with this information, you can easily reproduce the same feature in your custom management application.

Reporting Services Tracer

And a utility mentioned above from my favourite Windows utility guy Mark Russinovich

http://technet.microsoft.com/en-us/sysinternals/bb896647

Monday, December 05, 2011

SSIS Report Generator Task (Custom Control Flow Component)

Day 5 (sorry I skipped the weekend) – Report Generator for Integration Services ETL

As the name says, this "Control Flow" custom component can be used to render your SQL Server Reporting Services reports to files in the following formats: PDF, Word, Excel, HTML 4.0, MHTML, CSV, XML

SSIS Report Generator Task (Custom Control Flow Component)

Friday, December 02, 2011

SSRS report deployment tool (SSRSBuddy)

Day 2 – SSRS Buddy

A quick fix to upload multiple report definitions to Reporting Services.

SSRSBuddy enables deployment of multiple reports and report models onto an SSRS 2005 instance, using shared data sources

SSRS report deployment tool (SSRSBuddy)

Thursday, December 01, 2011

SQL Server Metadata Toolkit 2008

Here’s to the start of 25 days of Reporting Services tools from CodePlex

Day 1 – The Metadata Toolkit

SQL Server Metadata Toolkit
MSDN's SQL 2005 toolkit, updated to 2008, for managing metadata in SQL Server Integration Services, Analysis Services and Reporting Services using built-in features including data lineage, business and technical metadata, and impact analysis.

SQL Server Metadata Toolkit 2008

Scalability Upgrades Free SQL Server 2012 Assistant Tool

A complement to the upgrade adviser and best practices analyzer.

Microsoft Gold Certified Partner, Scalability Experts, has announced availability of its Upgrade Assistant for SQL Server 2012 (UAFS) tool. As reported on ASP Free, with the UAFS tool, users can automate the process of application compatibility testing to determine any issues that may arise from upgrading to SQL Server 2012 from SQL Server 2008.

Scalability Upgrades Free SQL Server 2012 Assistant Tool

SQL Server 2012 RC0 and PowerPivot V2 RC0 - Blogs

SQL Server 2012 RC0 was put out a couple of weeks ago.  It includes a tool called PowerPivot, which Microsoft is touting as the replacement (ahem, augmentation) for Analysis Services cubes.

The shiny features of PowerPivot V2 are:

  • Hierarchies
  • KPIs
  • Security
  • Perspectives
  • Sort-by-Column
  • Measure-definition through PowerPivot window
  • Data view
  • New DAX-functions

SQL Server 2012 RC0 and PowerPivot V2 RC0 - Blogs

PowerPivot removes some of the complexity and design decisions from a developer and places them in the hands of an end-user.  Pulling in data from various cloud, relational (and cube) sources, the PowerPivot add-in for Excel 2010 (now 2012?) builds an Analysis Services cube in the background.  The cube is stored in-memory (or perhaps paged to disk, but still in-memory technically), until it is saved to disk as an XLSX zip file with an ABF Analysis Services Vertipaq-style backup inside.

Before Analysis Services, Pivot tables were something mainly power Excel users dealt with.  Analysis Services started bringing Pivot tables to the foreground, with all of their limitations.  Corporate users looked to 3rd-party tools for cube browsing, until IT budgets were cut and Excel became “good enough”. 

As users needed more data in their pivot tables, they had to find the 1 person in the company with knowledge of these “cubes”, and bring with them budget to change the corporate data structure.  Now they need to find one of the 10 users in their company who are power Excel gurus and know about this thing called Powerpivot.

It will be interesting to see what happens when PowerPivot is truly embedded in Excel, without the add-in feel.

Monday, November 28, 2011

How I passed SharePoint 2010 exam 70-667 (Part 1 of 4) - Tales from a SharePoint farm

Sharepoint 2010 is a complex beast of an application, with many facets and configuration options.  Taking the exams should be a prerequisite to any upgrade or deployment.

These notes are the best I have seen so far to help prepare for the configuration exam.

How I passed SharePoint 2010 exam 70-667 (Part 1 of 4) - Tales from a SharePoint farm

PowerShell Data Processing Extension for SQL Server Reporting Services

A useful tool to generate reports based on Powershell outputs.

Plug this into SQL Server Reporting Services to create reporting based on your PowerShell scripts. The PowerShell DPE transforms your PowerShell output into a DataSet that can be consumed by SSRS.

PowerShell Data Processing Extension for SQL Server Reporting Services

Saturday, November 26, 2011

Monday, November 14, 2011

SQLBI - Marco Russo

SQLBI - Marco Russo
Project Crescent is now Power View (on the theme of PowerPivot), and it doesn't support the many-to-many revolution (2.0).

"
If you are looking to create many-to-many relationships based models in PowerPivot and in BISM Tabular, probably one day you will be interested in using them with Power View (formerly codename “Crescent”). There is a very bad news for you: it appears that, at least in its first release, Power View will not support this scenario, showing a behavior that (and this really worries me) is very different from Excel."

SQLBI - Marco Russo

SQLBI - Marco Russo: "The Many-to-Many Revolution 2.0"

Marco Russo and Alberto Ferrari seem to confirm my assumption that the artist formerly known as UDM has become BISM.

Linking PowerPivot tables in a many-to-many scenario is, well, a bit hacky in my opinion. Quirks like case insensitivity can also cause issues for users.

The whitepaper linked below may give some ideas for a better approach to these challenges.


These are the news in this edition:

  • Alberto Ferrari joined me as co-author of the paper
  • We added a new pattern for BISM Multidimensional (formerly known as UDM)
  • We translated several existing patterns to the BISM Tabular model. Because BISM Tabular doesn't support many-to-many relationships in its data model, you have to rely on DAX formulas to obtain the desired results. This produces many changes in data modeling and we tried to cover these differences in the paper, too.

The paper is freely available in PDF format. We will publish the single patterns described in the paper as web articles, in order to improve readability and indexing from search engines (today everybody uses a web search engine instead of looking for a document on a local disk, just because it's faster).


Thursday, November 10, 2011

EEG shows awareness in some vegetative patients - Health - CBC News

I wonder if something as simple as a $90 OCZ NIA would allow people to test for this?

EEG shows awareness in some vegetative patients - Health - CBC News: "Owen said another goal is to see whether brain-computer interface technology being developed may one day be used to unlock the world even more for patients who are cognitively aware but unable to let anybody know."


Tuesday, November 08, 2011

Create a date and time stamp in your batch files | Remote Administration For Windows

DOS scripts at their best.  DOS better be in Windows 8…

Here is an interesting one. I found a way to take the %date% environment variable, and turn it into a valid string for a filename – without any extra programs or scripts.

For the longest time I used a little utility I created to do this. The problem with that is the utility needs to be around if you want to send the batch file to someone.

What I didn't know was that you can use the character combination ':~' to pull a substring out of an environment variable. That is when I realized you could use this to pull out parts of the current date (or time).

Here is how it works. Let's take the %date% variable and print it out:

Create a date and time stamp in your batch files | Remote Administration For Windows

Here’s an example including time that backs up a projects directory to the C: drive under a Backup timestamped directory.

@echo off
rem Build a YYYYMMDD stamp from %date% (assumes a US-style date such as Fri 12/30/2011)
set YYYYMMDD=%date:~-4,4%%date:~-10,2%%date:~-7,2%
rem Build an HHMMSS stamp from %time% (e.g. 14:35:07.42)
set HHMMSS=%time:~-11,2%%time:~-8,2%%time:~-5,2%
echo %HHMMSS%


c:
md \Backup_%YYYYMMDD%%HHMMSS%

echo Backing Up files to \Backup_%YYYYMMDD%%HHMMSS%\

xcopy /s C:\Projects\*.* \Backup_%YYYYMMDD%%HHMMSS%\

echo Files backed up to \Backup_%YYYYMMDD%%HHMMSS%\

Goto Quit

:Quit

Friday, November 04, 2011

SQL Server 2012 Editions Announced

 

Microsoft has just announced SQL Server 2012 Editions information on official SQL Server 2012 site.

SQL Server 2012 will be available in three main editions:

  1. Enterprise
  2. Business Intelligence
  3. Standard

The other editions are Web, Developer and Express.

Here are the salient features of each edition:

Enterprise

  • Advanced high availability with AlwaysOn
  • High performance data warehousing with ColumnStore
  • Maximum virtualization (with Software Assurance)
  • Inclusive of Business Intelligence edition’s capabilities

Business Intelligence

  • Rapid data discovery with Power View
  • Corporate and scalable reporting and analytics
  • Data Quality Services and Master Data Services
  • Inclusive of the Standard edition’s capabilities

Standard

  • Standard continues to offer basic database, reporting and analytics capabilities

There is a comparison chart of various other aspects of the above editions. Please refer here.

Additionally SQL Server 2012 licensing is also explained here.

Reference: Pinal Dave (http://blog.SQLAuthority.com)

Journey to SQLAuthority

Wednesday, November 02, 2011

Under the hood: Copy database diagrams in SQL Server 2005

One way to copy database diagrams between databases.

-- Copy all diagrams from dbA to dbB; the target database must already have
-- diagram support (i.e. a dbo.sysdiagrams table), and watch for duplicate diagram names.
INSERT INTO dbB.dbo.sysdiagrams
SELECT [name], [principal_id], [version], [definition]
FROM dbA.dbo.sysdiagrams

Under the hood: Copy database diagrams in SQL Server 2005

Wednesday, October 26, 2011

Vizubi 2.0 Full Tech Specs – Detailed Features

Vizubi is a Microsoft PowerPivot-like add-in for Excel 2003, 2007, and 2010.  Its greatest feature is, well, compatibility.  PowerPivot only works with Excel 2010, and the best version of PowerPivot isn't even released.  PowerPivot will be targeted towards new Office and SharePoint adoption vs. supporting legacy platforms.  This probably means that the majority of customers and partners will have at least a 2-3 year sales cycle ahead of them.

Vizubi’s in beta now.

Other than that, it has a laundry-list of interesting features on the site.  It seems to play in a similar space to QlikView and other in-memory column-oriented databases.  It even supports the QlikView file format.

From Syntes, a team that brought features to Cognos, Business Objects, Board, and QlikView.  Its primary products before this were NPrinting and NScheduler, plugins for QlikView.

Vizubi 2.0 Full Tech Specs – Detailed Features

Saturday, October 15, 2011

Flipboard for the web

Flipboard is one of my favourite iPad apps.  It's a content styler and aggregator for RSS feeds from my favourite sites like Reddit, Facebook, Twitter, Boingboing, and Buzzfeed.  They would be considered daily reads.  The tool translates very cluttered sites into a nice, flippable interface with the ability to zoom photos, share content and drill into the underlying web site.

The Treesaver HTML5 framework is a web styling tool that works with the iPad to format and present readable content similar to Flipboard. http://siliconangle.com/blog/2011/11/01/flipboard-hires-html-5-star-but-no-web-version-planned/

According to the founder of Treesaver, he chose HTML for his framework because "toasters will someday do HTML."   I would buy an internet-enabled toaster if it was less than $120 and it had a Flipboard-like recipe app.  Smell burnt toast?  Stop reading the internets.
Here are four Flipboard-style interfaces for Windows.

Wednesday, October 12, 2011

Delegation, Claims, Active Directory…Oh My!…Aw Crap! « Denny Lee

Denny Lee is a BI advisor at Microsoft.  He goes through the joys of configuring security, Kerberos and delegation around SharePoint and PowerPivot.

There are many blogs on this subject, which to me indicates a failing in Microsoft's security deployments, and the overcomplexity of SharePoint when it comes to the Windows security model.  Users accessing PowerPivot within SharePoint may notice that they don't have access, while other users with the same level of permissions do.

This scenario works out well when a VP can’t get into their director’s PowerPivot workbooks.

Sometimes the issue comes down to plumbing within the Active Directory infrastructure, where years of upgrades have caused legacy issues to creep up.  New users will be assigned default security permissions where migrated users may not have these permissions.

Following the lifetime of a security token isn’t my idea of fun, but it is definitely a challenge that will keep security consultants employed for the next few years…

Delegation, Claims, Active Directory…Oh My!…Aw Crap! « Denny Lee

OLAP PivotTable Extensions

This is one of my favourite Excel add-ins when dealing with Analysis Services cubes.  It can pull the MDX out of a pivot table, search cubes, add in calculated measures on the fly, and allow users to share measures.

The latest releases fix some bugs and provide a very useful feature of clearing a pivot table’s cache.  The pivot table cache, a monstrosity of XML code, is known to blow up many an Excel workbook.

Show properties as a caption allows captions to be exposed as real members in the workbook.

Worth downloading if you use OLAP cubes regularly.

OLAP PivotTable Extensions

Monday, October 10, 2011

Friday, October 07, 2011

Customer Proof of Concept on New HP DL980 - Running SAP Applications on SQL Server - Site Home - MSDN Blogs

Did he say 512 GB of RAM?  Yes, yes he did.

Recently we conducted a Performance Proof of Concept for a large customer using the new 8 Intel Nehalem-EX E7540 8-core processor HP DL980 G7 server. This blog discusses some of the configurations and tuning conducted during the PoC. One HP DL980 with 512GB of RAM was used for SQL Server and 9 x 2 Intel Nehalem-EP 5670 processor were used as application servers.

Customer Proof of Concept on New HP DL980 - Running SAP Applications on SQL Server - Site Home - MSDN Blogs

Tuesday, October 04, 2011

Using PowerPivot to analyze MS Dynamics NAV

 

Project Description
The project shows how to prepare MS Dynamics NAV data for analysis in PowerPivot for Excel. It includes a Data Warehouse demo database, a SQL procedure to transfer data from Navision to the DW, and an Excel example.
Happy BI for NAV.

Using PowerPivot to analyze MS Dynamics NAV

HTML5 Adoption Might Hurt Apple's Profit, Research Finds

Apple has tried to limit the use of cross-compiler technologies that give developers a develop-once, deploy-everywhere solution.  They want Cocoa and Objective-C to be the platform for mobile computing.

Unfortunately developers usually go towards the path of least-resistance, and even though HTML5 is just another dog with the same fleas, it is becoming the platform of choice. 

By adopting HTML5, it opens the market up for Android and, to a lesser extent, Windows Mobile.  Not to mention any PC or Mac with an HTML5 web browser.  Though those HTML5 features again vary depending on the browser and platform.

Did you know Google Chrome can run C++ apps natively inside the browser?  Google Chrome is its own HTML5-based operating system.

The catch with Windows Mobile and the new Windows 8 Metro is that HTML5 has always been a lesser player at Microsoft.  They tried to block the path of least resistance with Microsoft Silverlight.  Developers and companies were eventually adopting that standard over Flash and Flex, until MS proposed HTML5 and *cough* JavaScript as a first-class language.

Even with Metro, HTML5 is still treated as a proprietary Microsoft thing.  If you want to ensure that your code is not “view-sourced”, your main alternative to straight HTML5 and JS is to work with a WinRT component.

The walls of the fortress just look a bit different in Redmond than they do in Cupertino. 

I wonder if it was Microsoft Research that found HTML5 adoption might hurt Apple’s profits.

HTML5 Adoption Might Hurt Apple's Profit, Research Finds | PCWorld Business Center

Thursday, September 29, 2011

Maximizing SQL Server Throughput with RSS Tuning - Microsoft SQL Server Development Customer Advisory Team - Site Home - MSDN Blogs

Running rings around your network cards.

In the end, we continued to use our “workaround” of scaling network load out to 4 NIC cards, which gave us enough network bandwidth as well as RSS CPUs to handle the heavy network traffic. You can certainly use more powerful 10Gbps NIC, but remember to configure “RSS rings” to a proper value.

Maximizing SQL Server Throughput with RSS Tuning - Microsoft SQL Server Development Customer Advisory Team - Site Home - MSDN Blogs

Tuesday, September 27, 2011

RAMMap

One of the best tools for finding what exactly is consuming memory in Windows Vista or higher.

Have you ever wondered exactly how Windows is assigning physical memory, how much file data is cached in RAM, or how much RAM is used by the kernel and device drivers? RAMMap makes answering those questions easy. RAMMap is an advanced physical memory usage analysis utility for Windows Vista and higher.

RAMMap

Monday, September 26, 2011

SQL and SQL Analysis Services aren’t friends in the sandbox

When you start using Analysis Services with larger dimensions and cubes, you may notice that your SQL Server is performing poorly if it’s on the same server.  By default, Analysis Services, SQL Server, Integration Services and Reporting Services (and perhaps even Full Text Search) are all fighting for valuable memory.  Setting the caps on each of those services, and ensuring that other services aren’t chewing up memory is important for a well-performing SQL Server. 

Greg Galloway points out a great tool above from Sysinternals called RAMMAP. Looks like a great utility for finding out just what is gnawing at your memory, and whether there’s spyware hijacking your system.

I can’t count the number of times I’ve seen a new client’s server in the following state. The server has SQL and SSAS on the same box. Both are set to the default memory limits (which is no memory cap for SQL and 80% of server memory for SSAS). By the time I see the server, SQL and SSAS are fighting each other for memory, causing each other to page out, and performance degrades dramatically. (Remember what I said about disk being a million times slower than RAM?)

Home - Greg Galloway

What's further confusing with SQL configuration is that SQL Server takes its max memory setting in MB, while Analysis Services has just "80", which means use up to 80% of physical RAM.  Typing in a number > 100 allocates an absolute value in bytes of RAM for Analysis Services to use.  So much for consistency.
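For reference, here's a minimal T-SQL sketch of capping the SQL side (the 16 GB figure is just an illustrative number, not a recommendation):

-- Cap the SQL Server buffer pool so Analysis Services and the OS keep some headroom.
-- 16384 MB is only an example value; size it for your own server.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 16384;
RECONFIGURE;

The SSAS cap lives in the server properties (Memory\TotalMemoryLimit), where values of 100 or less are treated as percentages and larger values as bytes, as noted above.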

Don’t forget to clear out the event viewer if you have a mysterious “services.exe” eating up memory.  I have seen a large security log take up 700MB of RAM.

Tuesday, September 20, 2011

SQL# (SQLsharp) Functionality

SQL# (SQLsharp) is a set of CLR functions that perform some interesting tasks not easily available from SQL.  Things like getting Twitter feeds, fetching web pages, or working with the filesystem.

Yep, I can tweet from SQL Server thanks to SQL#. In fact, I sent that tweet out.

I can also pick up my twitter stream

SQL# (SQLsharp): A Review

Automating SSAS cube docs using SSRS, DMVs and spatial data | Purple Frog Systems

 

This article outlines a method of documenting cubes with some stored procedures and Reporting Services reports.  The only flaw, which is more on the SQL side, is the lack of a way to dynamically specify the linked server name without getting into dynamic SQL.
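To illustrate, here's a hedged sketch of the kind of DMV call involved, assuming a linked server to the SSAS instance named SSAS_LINK (a made-up name). Because OPENQUERY takes the linked server name as an identifier rather than a variable, pointing the reports at a different server means resorting to dynamic SQL.

-- List the cubes behind the (hypothetical) SSAS_LINK linked server.
-- The linked server name cannot be parameterized here, hence the dynamic SQL complaint above.
SELECT *
FROM OPENQUERY(SSAS_LINK,
    'SELECT [CATALOG_NAME], [CUBE_NAME], [LAST_SCHEMA_UPDATE]
     FROM $SYSTEM.MDSCHEMA_CUBES');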

This could be useful for managing change in cubes, and providing end-user or technical documentation.

Being a business intelligence consultant, I like to spend my time designing data warehouses, ETL scripts and OLAP cubes. An unfortunate consequence of this is having to write the documentation that goes with the fun techy work. So it got me thinking, is there a slightly more fun techy way of automating the documentation of OLAP cubes…

There are some good tools out there such as BI Documenter, but I wanted a way of having more control over the output, and also automating it further so that you don’t have to run an overnight build of the documentation.

I found a great article by Vincent Rainardi describing some DMVs (Dynamic Management Views) available in SQL 2008 which got me thinking, why not just build a number of SSRS reports calling these DMVs, which would then dynamically create the cube structure documentation in real time whenever the report rendered..

This post is the first in a 3 part set which will demonstrate how you can use these DMVs to automate the SSAS cube documentation and user guide.

Automating SSAS cube docs using SSRS, DMVs and spatial data | Purple Frog Systems

Thursday, September 08, 2011

Defining Dimension Granularity within a Measure Group

When dealing with multiple Measure Groups in a cube, you could have items repeating if they are at a higher level than the detailed rows.  Here are a couple articles and MSDN info identifying best practices around this.

All but the simplest data warehouses will contain multiple fact tables, and Analysis Services allows you to build a single cube on top of multiple fact tables through the creation of multiple measure groups. These measure groups can contain different dimensions and be at different granularities, but so long as you model your cube correctly, your users will be able to use measures from each of these measure groups in their queries easily and without worrying about the underlying complexity.

http://www.packtpub.com/article/measures-and-measure-groups-microsoft-analysis-services-part2

Users will want to dimension fact data at different granularity or specificity for different purposes. For example, sales data for reseller or internet sales may be recorded for each day, whereas sales quota information may only exist at the month or quarter level. In these scenarios, users will want a time dimension with a different grain or level of detail for each of these different fact tables. While you could define a new database dimension as a time dimension with this different grain, there is an easier way with Analysis Services.

Defining Dimension Granularity within a Measure Group

Thursday, September 01, 2011

Merrill Aldrich : Handy Trick: Move Rows in One Statement

Moving data around has never been so easy.

The fact that we can take output from DELETE and feed it to INSERT actually models what we are trying to do perfectly. And, we get some advantages:

  1. This is now a single, atomic statement on its own.
  2. The logic about which rows to move is specified only once, which is neater.
  3. The logic about which rows to move is only processed one time by the SQL Server engine.
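A minimal sketch of the pattern, with hypothetical table and column names:

-- Move old orders into an archive table in one atomic statement.
-- Rows removed by the DELETE are fed straight into the archive via OUTPUT ... INTO.
DELETE FROM dbo.OrdersActive
OUTPUT DELETED.OrderID, DELETED.CustomerID, DELETED.OrderDate
INTO dbo.OrdersArchive (OrderID, CustomerID, OrderDate)
WHERE OrderDate < '20110101';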

Merrill Aldrich : Handy Trick: Move Rows in One Statement

Degenerate dimensions in SSAS « The Official BI Twibe Blog

By default, SQL 2008 sets the dimension error handling property to Custom, which breaks any dimensions that have duplicate attributes.  For instance, if you have a state or province column in your dimension and it’s not unique, it will “report and stop” when handling the error. 

The fix is to select “report and continue”, or set “default” for your error handling, or setup a composite key that makes the attribute unique, or configure relationships that make it unique.

There are some other things that SSAS does behind the scenes.  This article provides some more info.

The reason is that when processing the dimension, SSAS by default does a right trim and this eliminates not only the spaces, but also any of the three special characters (tab, line feed and carriage return) we added! Note how this differs from T-SQL where these characters are not impacted by RTRIM as can be seen here:
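A quick T-SQL check of that last point (using DATALENGTH, since LEN already ignores trailing spaces):

-- RTRIM strips trailing spaces but leaves tab/LF/CR in place, so a key built in SQL
-- can still differ from what SSAS produces after its own trim during processing.
SELECT
    DATALENGTH('abc  ')                AS raw_spaces,   -- 5
    DATALENGTH(RTRIM('abc  '))         AS rtrim_spaces, -- 3: spaces removed
    DATALENGTH('abc' + CHAR(9))        AS raw_tab,      -- 4
    DATALENGTH(RTRIM('abc' + CHAR(9))) AS rtrim_tab;    -- 4: the tab survives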

Degenerate dimensions in SSAS « The Official BI Twibe Blog

Friday, August 26, 2011

SQLBI - Marco Russo : DateTool dimension: an alternative Time Intelligence implementation

 

If the number of blog comments signifies how important a subject is, this blog post takes the cake.

The built-in time intelligence features (YTD, Y/Y, etc) of Analysis Services don’t work very well.

Enter DateTool dimension: an alternative Time Intelligence implementation

SQLBI - Marco Russo : DateTool dimension: an alternative Time Intelligence implementation

Wednesday, August 24, 2011

Monday, August 22, 2011

Reporting Services SharePoint Integration in SQL Server Denali - Prologika (Teo Lachev's Weblog) - Prologika Forums

 

In SQL Server Denali, Reporting Services leverages the SharePoint service application infrastructure and doesn't require installing a Reporting Services server. Not only does this simplify setup, but it also improves performance because there is no more round-tripping between SharePoint and the report server. Configuring Reporting Services for SharePoint integration mode is a simple process that requires the following steps:

Reporting Services SharePoint Integration in SQL Server Denali - Prologika (Teo Lachev's Weblog) - Prologika Forums

Tuesday, August 16, 2011

Monday, August 15, 2011

Analysis Services Best Practise Analyser

I think it’s Best Practice but whatever, looks interesting…

Analysis Services Best Practise Analyser (SqlAsBpa for short) is a tool which checks your live Microsoft Sql Server Analysis Services 2005 against some important best practises, and reports items which violate these best practises.

Analysis Services Best Practise Analyser

Friday, August 12, 2011

Microsoft SQL Server Community Samples: Analysis Services

ResMon exposes the Analysis Services 2008 DMVs as a cube, which could help expose performance problems and find resolutions.

With Analysis Services 2008 and later, data from dynamic management views can be retrieved with SQL query syntax as described here. Therefore, it is possible to build an Analysis Services cube which uses Analysis Services DMV SQL queries as the data source.
The ResMon cube rolls up information about Analysis Services such as memory usage by object, perfmon counters, aggregation hits/misses, and current session stats.
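For a taste of what the cube is built on, these two DMV queries can be run in an MDX query window against the SSAS instance (using the SQL-like syntax mentioned above):

-- Memory used by each Analysis Services object (shrinkable vs. non-shrinkable).
SELECT * FROM $SYSTEM.DISCOVER_OBJECT_MEMORY_USAGE

-- Current sessions on the instance.
SELECT * FROM $SYSTEM.DISCOVER_SESSIONS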

Microsoft SQL Server Community Samples: Analysis Services

Wednesday, August 10, 2011

Tuesday, August 02, 2011

RogerNoble.com

RogerNoble.com: "I needed to un-pivot the values for each month in order to be able to map it to the actual days in the month so I can have a count measure. In addition I also needed to duplicate each row twice as each also represented an ‘In’ and an ‘Out’ transaction. At first glance it would seem that the simple solution calls for some sort of cursor, but having recently seen Jeff Moden’s talk on Numbers Tables at the SQL PASS Summit 2010 I decided instead to solve it using a numbers table but also apply the same logic to the date dimension table that was already in the warehouse. (Jeff has written a great post here: http://www.sqlservercentral.com/articles/T-SQL/62867/ which pretty much covers what he talked about)"

Interesting approach to using CROSS APPLY and a numbers table to distribute and pivot budget data.
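Here's a rough sketch of the idea under made-up table and column names (a monthly budget row exploded into one row per day and per In/Out direction), no cursor required:

-- Inline numbers (tally) table; a warehouse would normally keep a permanent one.
;WITH Numbers AS (
    SELECT TOP (31) ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS n
    FROM sys.all_objects
)
SELECT  b.AccountID,
        DATEADD(DAY, num.n - 1, b.MonthStart) AS TransactionDate,
        dir.Direction
FROM dbo.MonthlyBudget AS b
JOIN Numbers AS num
    -- last day of the month of MonthStart = number of days in that month
    ON num.n <= DAY(DATEADD(DAY, -1, DATEADD(MONTH, 1, b.MonthStart)))
CROSS APPLY (VALUES ('In'), ('Out')) AS dir(Direction);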

Windows: How To Compact A Dynamic VHD

Windows: How To Compact A Dynamic VHD: "This becomes a problem if you make use of multiple VHDs because you are essentially wasting space on files that no longer exist. The solution is to Compact the VHD using Diskpart a tool provided with Windows.."

I have a Hyper-V Windows 2008/SharePoint 2010 VHD which I am dual-booting from Windows 7. It works great once the memory is beefed up to 8GB. However, during execution the VHD mysteriously expands to 132GB. After shutting down it comes back to a more reasonable(?) 55GB.

Might not be applicable in this scenario, but for those who want to shrink a VHD file without the Hyper-V manager, see above.

Monday, August 01, 2011

Recovery Made Simple: Oracle Flashback Query | Oracle FAQ

Windows has had a recycle bin since Version 3.1, perhaps even earlier.  Even DOS had an undelete command.  How come SQL requires you to restore from backup, assuming a backup even exists?

The Oracle feature, Flashback, just sold me on using Oracle for a solution requiring high availability and uptime, provided the budget is there...  Not sure what a red query has to do with it, though.

Sometimes it is a rouge query, sometimes a simple data clean up effort by the users, whatever may the cause be, inadvertent data-loss is a very common phenomenon. Backup and recovery capabilities are provided by the database management systems which ensure the safety and protection of valuable enterprise data in case of data loss however, not all data-loss situations call for a complete and tedious recovery exercise from the backup. Oracle introduced flashback features in Oracle 9i and 10g to address simple data recovery needs.
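The Flashback Query syntax itself is a one-liner; a hedged sketch in Oracle SQL (table name made up, and the undo retention window still has to cover the timestamp you ask for):

-- Oracle: read the orders table as it looked ten minutes ago.
SELECT *
FROM orders AS OF TIMESTAMP (SYSTIMESTAMP - INTERVAL '10' MINUTE);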

Recovery Made Simple: Oracle Flashback Query | Oracle FAQ

Drawing a logo or diagram using SQL spatial data | Purple Frog Systems

Spatial data in SQL 2008 R2 allows you to create freeform diagrams in your Reporting Services Reports.  Some interesting possibilities here with data-driven diagrams.  For instance, what stores on a mall map generate the most traffic?  What part of an automobile has the most frequent damage replacements?  What part of the body is most affected by a lab test? 

The ability to flag these shapes with red/yellow/green traffic lighting makes it a cool proposition for some deep visual reporting.

My session is about using SSRS, SQL spatial data and DMVs to visualise SSAS OLAP cube structures and generate real-time automated cube documentation (blog post here if you want to know more…).

This shows an unusual use for spatial data, drawing diagrams instead of the usual demonstrations which are pretty much always displaying sales by region on a map etc. Whilst writing my demos, it got me thinking – why not use spatial data to draw even more complex pictures, diagrams or logos…
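A tiny example of the raw ingredient, just to show how little it takes to hand-draw a shape (coordinates are arbitrary); run it in SSMS and look at the Spatial results tab, or bind the column to an SSRS map layer:

-- A hand-drawn rectangle as a geometry instance (the polygon ring must close on itself).
SELECT geometry::STGeomFromText('POLYGON((0 0, 10 0, 10 5, 0 5, 0 0))', 0) AS Shape;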

Drawing a logo or diagram using SQL spatial data | Purple Frog Systems

Friday, July 29, 2011

2014–The year a Cray supercomputer becomes self-aware, assumes control

Perhaps it will be sooner?  The new Crays achieve 50 petaflops, so all they need to do is get some biological computing interfaces hooked up to the processor to truly think for themselves as organic beings.

Amazing that in 1984 we were using 4.77MHz machines to draw lines on the screen and 30 years later we’re looking at not only simulating life, but replicating it.

Engineer and inventor Ray Kurzweil believes that the human Brain is capable of a comparable processing power of about 10 PFlop/s. Of course, there are different opinions and a rather scientific experiment to estimate the computational ability of the human mind by IBM engineers concluded that our brain may be closer to about 38 PFlop/s and a memory of about 3584 TB. At the current speed of progress, we may be able to simulate a human brain as early as next year or, by IBM’s estimates, within 3 years.

Supercomputers Challenge Human Brain | ConceivablyTech

Thursday, July 28, 2011

Intel code guru: Many-core world requires radical rethink • The Register

The new 50 and 100-Core CPUs might actually slow down some programs?  Here’s how.

"We think 'Oh, there's something I'm computing. I'll compute it once and put it into a variable'. Well, if you put it in a variable, and then a hundred processors access that variable to get that data, you've got a bottleneck. But if all one hundred of them computed it, no bottleneck," he said.

"Boy, is that foreign to my brain," he confessed.

We suggested that our generation might have to wait for our kids to become programmers before this new way of thinking became the new standard. "I hope not," he chuckled, noting that although today's programmers may have to learn a new mindset, they do have one great advantage over the next generation of code monkeys: experience.

Intel code guru: Many-core world requires radical rethink • The Register

The GPU-processing mindset is quite similar.  Who would have thought that running applications on the GPU would dramatically increase the processing power of otherwise standard hardware?  However, it requires programming specifically optimized for the GPU, which many programmers won't pick up on.

Before you go optimizing for a GPU, watch for the FSA, a GPU+CPU architecture that is transparent to developers.

http://www.extremetech.com/computing/87326-amd-fusion-system-architecture-moves-to-unify-cpus-and-gpus

I wonder how fast email would come up with this one…

http://www.zdnet.co.uk/news/emerging-tech/2011/05/25/cray-taps-gpus-for-50-petaflop-supercomputer-40092887/

The XK6, announced on Tuesday, is made up of multiple supercomputer blade servers. Each blade includes up to four compute nodes containing AMD Opteron CPUs and Nvidia Tesla-architecture GPUs. It marks Cray's first attempt to blend dedicated GPUs and CPUs in a single high-performance computing (HPC) system.

Why does it look like they put in just enough blades to spell CRAY XK6?

The world’s most powerful computer currently runs at 2.5 petaflops.  The Cray beast will run at 50.

save attachments vba - Google Search

This script may be useful to automate the constant email task of saving attachments. It can be applied as an Outlook rule to automatically save attachments for mail messages with a certain subject or sender. It should only be used against trusted senders and with a uniquely identifiable subject so as not to download corrupt or damaging files.

It is saved into the “c:\attach\” directory, however this could be substituted for a network share or Sharepoint folder in the code below.

To setup this rule in Outlook:

1. Tools – Macro – Visual Basic Editor

2. Right-click on Project 1 – Insert Module

3. Paste code below

4. Close window

5. Send yourself a message with the subject and attachment you want, or select an existing message.

6. Right-click on the message – Create Rule

7. Check sender and subject contains and select a part of the subject line that could be used to uniquely identify the email.

8. Click Advanced Options, confirm options and click Next

9. Check 'run a script', click 'a script', and select Project1.SaveToFolder

10. Click OK

To turn off the rule:

1. Tools – Rules and Alerts – Uncheck or Delete the rule.

‘<CODE >

Sub SaveToFolder(MyMail As MailItem)

Dim strID As String

Dim objNS As Outlook.NameSpace

Dim objMail As Outlook.MailItem

Dim objAtt As Outlook.Attachment

Dim c As Integer

Dim save_name As String

'Place the path to save to on the next line. Note that you must include the

'final backslash

Const save_path As String = "c:\attach\"

strID = MyMail.EntryID

Set objNS = Application.GetNamespace("MAPI")

Set objMail = objNS.GetItemFromID(strID)

If objMail.Attachments.Count > 0 Then

For c = 1 To objMail.Attachments.Count

Set objAtt = objMail.Attachments(c)

save_name = Left(objAtt.FileName, Len(objAtt.FileName) - 4)

'save_name = save_name & Format(objMail.ReceivedTime, "_mm-dd-yyyy_hhmm")

save_name = save_name & Right(objAtt.FileName, 4)

objAtt.SaveAsFile save_path & save_name

Next

End If

Set objAtt = Nothing

Set objMail = Nothing

Set objNS = Nothing

End Sub

Private Sub Application_Startup()

'MsgBox "Welcome, " & Application.GetNamespace("MAPI").CurrentUser

Application.ActiveExplorer.WindowState = olMaximized

End Sub

‘</CODE>

Excel Hacks book

save attachments vba - Google Search

Thursday, July 21, 2011

SqlASDoc

Document your Analysis Services databases.

Analysis Services Documenter (SqlASDoc for short) is a tool which creates documentation for your Analysis Services 2005 and Analysis Services 2008 databases. It connects to your live server, and queries the metadata for the name and description which the developers attached to every object (database, cube, dimension, hierarchy, attribute, translation, data source, data source view,...). It then produces an HTML file which contains the name and description of all these objects. But there is more: also other properties, such as last processing time, creation time, sizes of fact tables etc. are written to the HTML file.

SqlASDoc

Trust-building with Lync

Business Intelligence analysis doesn’t have to be just about the numbers.  Unstructured information contains perhaps more valuable insight into day-to-day activities than financial or KPI data.

Here’s one add-in for Lync which lets you analyze your relationships via messenger conversations.

Conversations Analyzer is an application that reviews your Microsoft Lync instant messaging (IM) conversation history and gives you scores for your use of trust-building language in day-to-day communications with business associates and friends. Conversations Analyzer applies trust rules to determine the occurrence of words and phrases that have a positive or negative effect on trust.

Chad McGreanor's Blog

Saturday, July 09, 2011

Updated Analysis Services Whitepaper

http://sqlcat.com/whitepapers/archive/2011/06/02/sql-server-2008r2-analysis-services-operations-guide.aspx

Last month's release of an updated Analysis Services Operations guide.

Abstract: 
This white paper describes how operations engineers can test, monitor, capacity plan, and troubleshoot Microsoft SQL Server Analysis Services OLAP solutions in SQL Server 2005, SQL Server 2008, and SQL Server 2008 R2.

Thursday, July 07, 2011

SQLBI - Marco Russo : Microsoft updates its BI Roadmap - #ssas #bism #teched #powerpivot

Curves, forks, and dead-ends in the roadmap of MS BI.  Avoiding MDX in favor of SQL may no longer be an option when the MS “Universe” data model layer is adopted.

I just hope MS provides some tools that an end-user could use to maintain these BISM schemas, without having to understand DAX and MDX.  And Visual Studio, or anything else over 500MB, is not a client tool that can be easily deployed.

Why not put this stuff into Sharepoint or Office?  Building BISMs in Excel seems to be a good idea to me.

Thus, we'll have a single unified model, called BI Semantic Model (BISM). BISM will have two technologies to model data: the Multidimensional one (formerly known as UDM) and the Tabular one (that will be an enhancement of the current PowerPivot data model experience). You will have two languages, MDX and DAX, to define business logic and to query the model. And you will have two engines, MOLAP and Vertipaq (with their corresponding counterpart for accessing data in passthrough mode, ROLAP and Direct Query). The great and wonderful big news is that in BISM it will be possible to combine these components as you need: DAX query over a multidimensional model, MDX Script over a Tabular, and any client able to access to any model.

SQLBI - Marco Russo : Microsoft updates its BI Roadmap - #ssas #bism #teched #powerpivot

Sunday, July 03, 2011

Fascinating concept - Using Processing for data visualization

Pointclouds and heatmaps are my favourite visualization tools.  Here is one example of how a city (in this case a video game city) could be mapped to show where the most accidents or other incidents occur.

http://www.vizworld.com/2011/05/heatmaps-point-clouds-big-data-processing/

Thursday, June 30, 2011

Microsoft Research - Turning Ideas into Reality

If you are looking for innovation and future ideas, Microsoft Research is a good place to start.  Steve Ballmer is on the Innovation Team for the US Government.  Though spending for MS research has dipped recently, they are spending 90% of their $9.6 billion on the cloud.

  • Windows Azure
  • SQL Azure
  • Office 365

Why is the cloud so important?  Tablet computing.  A tablet doesn’t have enough disk space to house an entire product suite, and IT staff wouldn’t want to be patching and supporting tablets in addition to desktop computers.

The tablet is a rogue PC that is infiltrating corporations organically.  Like the iPhone, tablets are causing IT to take notice and focus efforts on mobile computing.

901 results for cloud on Microsoft Research.  836 results for Windows.

http://research.microsoft.com/apps/dp/search.aspx?q=cloud#p=1&ps=36&so=1&sb=d&fr=&to=&fd=&td=&rt=&f=&a=&pn=cloud&pa=&pd=

Looks like the future is a more ad-hoc, rapidly changing cloud environment.

People - Microsoft Research

Microsoft Research - Turning Ideas into Reality

Thursday, June 23, 2011

SSIS: Using IF(IIF) logic in Derived Column

Of course they couldn’t use IF or IIF (SSIS team had to be different…) so here’s how to do an if statement in a derived column.

Update: So the question was how you would use the substring statement in the body of your IIF statement to check if the column was blank… if it is, then set it to null, otherwise set it to the substring value.
Basically, it would be like this. Please note that I have used a different form to test for a blank field. In my version I use the TRIM and LEN(gth) functions so that the line can be a thousand blank spaces and I will get the same result. My column name in this instance is "Test":
(LEN(TRIM(Test)) > 0 ? SUBSTRING(Test,1,5) : NULL(DT_WSTR,5))

SSIS: Using IF(IIF) logic in Derived Column

Why varchar2?

One reason why you don't name your keywords with size limitations.

varchar2 is an Oracle datatype, equivalent to its (as well as TSQL's) own varchar.

CLARIFICATION

The differentiation between Oracle's varchar and varchar2 seems to have originated back when the ANSI SQL standard was still being formulated. I cannot locate any definitive references as to what exactly the implementation differences, if any, were initially (pre-Oracle 7), as this remains a source of confusion. Suffice it to say that the two have been treated practically as synonyms since Oracle 7. Both saw improvements (maximum size went from 2k to 4k) in release 8. Note that T-SQL's varchar can store 8k.
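A quick check of that last claim in T-SQL (varchar(max) is the way past the 8000-byte ceiling):

-- Plain varchar tops out at 8000 bytes.
DECLARE @v varchar(8000) = REPLICATE('x', 8000);
SELECT DATALENGTH(@v) AS bytes_used;   -- 8000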

http://stackoverflow.com/questions/621439/does-ms-t-sql-include-a-data-type-called-varchar2-and-if-so-whats-the-differen

Wednesday, June 22, 2011

Database triggers are evil

Some concerns around triggers.  Usually there are performance and blocking issues whenever I see triggers in a database.

1) they make things happen "automagically"
2) they are rarely implemented correctly
3) they slow down DML
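For the SQL Server side of the same argument, here's a minimal sketch (hypothetical tables) of the "automagic" pattern: every UPDATE now quietly pays for an extra insert.

-- Audit trigger: fires once per UPDATE statement and silently adds work to every DML call.
CREATE TRIGGER trg_Orders_Audit
ON dbo.Orders
AFTER UPDATE
AS
BEGIN
    SET NOCOUNT ON;
    INSERT INTO dbo.OrdersAudit (OrderID, AuditDate)
    SELECT i.OrderID, GETDATE()
    FROM inserted AS i;
END;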

About Oracle: Database triggers are evil

Wednesday, June 15, 2011

Merge Multiple CSV Files | Zorba the Geek

 

Useful tip of the day – merge multiple files….

You can merge multiple csv or text files with a simple DOS command. Copy the code below into a text editor and save the file with a .bat extension. Save the file in the same folder as the csv files.

copy *.csv importfile.csv

I've used this as part of an import routine. Step one produces a set of csv files from SQL queries on an Oracle database. Step two merges all the csv files together. Step three imports a single file into Sage MMS

Merge Multiple CSV Files | Zorba the Geek

Business Intelligence for IT

Software as a Service and Smart Apps are the future of BI

A flood of vendors have entered the SaaS BI market in recent years, including Actuate, BIRST, GoodData, Host Analytics, QlikView, Oco, Panorama, and PivotLink, while larger BI players like IBM, Oracle, Microsoft, and MicroStrategy are introducing products to take advantage of the move toward self-service BI. And all of them are racing to get a larger presence on smartphones and connected tablets.

Business Intelligence for IT

Thursday, June 09, 2011

MS BI Labs

Microsoft BI Labs consolidates some of the tools and add-ins from Microsoft Labs related to BI.

PivotViewer Extension for Microsoft SQL Server Reporting Services

Microsoft SQL Server Data Mining for the Cloud

Fuzzy Lookup Add-in for Microsoft Excel 2010

MDX and DAX Formatter

Microsoft SQL Server Reporting Services Log Viewer

Home

Wednesday, June 01, 2011

3rd-party BI Tools

Sometimes the built-in tools available from Microsoft or your corporate BI software aren’t enough.  It should not be taboo to look for 3rd-party tools that fulfill a need and provide value. 

Most companies would not entertain use of third-party tools and utilities, for the obvious reasons of cost, developer expertise and corporate policies. But it is not that tools should not be used at all, it's just that there should be correct reasons when you should consider the options of using freeware or third-party licensed tools and utilities. Below are some of the reasons in my viewpoint when usage of tools can be justified and considered.

Siddharth Mehta's Blog

Tuesday, May 31, 2011

Black Clouds on the horizon

Last month’s outage of Amazon may have serious repercussions on the future of Cloud-based services and storage, and even business intelligence offerings in the cloud.

Like rolling blackouts, a disconnect from the cloud can kill business.  Uptime, especially for a large cloud provider, has to be seamless.  Like a hydro plant, it just has to work.

Although downtime for BI tools usually doesn’t equal the urgency of downtime for mail or web hosts, more than a day could be a killer for some apps.

Amazon outage sparks frustration, doubts about cloud - Computerworld

Lars Bjork of QlikTech, on Order vs. Bureaucracy - NYTimes.com

“Order is where you put a process into place because you want to scale the business to a different level.  Bureaucracy is where nobody understands why you do it.”

Lars Bjork of QlikTech, on Order vs. Bureaucracy - NYTimes.com

I would add: bureaucracy is where approvals for daily processes span more than one level of the company organization structure, and the org chart looks more like a family tree than a nicely trimmed hedge.

Friday, May 27, 2011

SQL Server Destination Vs OLE DB Destination and 64-bit Oracle Drivers

For some reason, the SSIS team dropped the ball on 64-bit support in SQL 2005.  Cryptic errors, non-working packages, and command-line workarounds were everywhere.  SQL 2008 added a “use 64-bit” option when scheduling the package.  However, there are still challenges with 3rd-party drivers.

Here’s a workaround to get Oracle 64-bit drivers going.

* Install the Oracle 32-bit and 64-bit clients (11g, I think) – possibly install the 64-bit client first, then the 32-bit
* Install the Attunity Oracle provider, 64-bit
* Copy the Visual Studio directory to a new dir: C:\ProgFilesX86\

Should work now both at design time (BIDS) and runtime (SSIS). Oracle is sensitive to parentheses in the path of the calling app, so everything under Program Files (x86) will fail. After a normal installation of SQL Server, simply copy the VS dir and start devenv.exe from there.

Use Attunity's .NET provider in SSIS. Really fast and stable.

SQL Server Destination Vs OLE DB Destination

Monday, May 23, 2011

Glenn Berry's SQL Server Performance | Semi-random musings about SQL Server performance

Microsoft has always placed licensing-driven limitations on SQL Server editions.  Express edition could only use 2GB of RAM.  Standard edition was a laggard in terms of Analysis Services features.  Most of the time this required our customers to purchase Enterprise edition, and possibly sacrifice a scale-out architecture in favour of scaling up.

Scaling up might not let you get around these limitations with SQL 2008 R2.  Datacenter edition might be the only option for customers needing more than eight processor sockets.

What is new for SQL Server 2008 R2 Standard Edition and SQL Server 2008 R2 Enterprise Edition is a set of more restrictive hardware license limits compared to the SQL Server 2008 versions of both of those editions.

SQL Server 2008 Enterprise Edition had no limit for the number of processor sockets, but was limited to 64 logical processors. SQL Server 2008 R2 Enterprise Edition imposes a new limit of eight physical processor sockets, but will theoretically let you use up to 256 logical processors (as long as you are running on Windows Server 2008 R2). However, this is not possible, currently, since it would require a processor with 32 logical cores. As of April 2011, the highest logical core count you can get in a single processor socket is 20 (if you are using the new Intel Xeon E7 series). Also, the RAM limit for R2 has changed from “operating system limit”, as it was in the 2008 release, to a hard limit of 2TB.

SQL Server 2008 R2 Standard Edition  has a new RAM limit of 64GB. This lowered limit may catch many people by surprise, since it is very easy to have much more than 64GB of RAM, even in a two-socket server. You should keep this RAM limit in mind if you are buying a new server and you know that you will be using Standard Edition. One possible workaround for this limit would be to have a second or third instance of SQL Server 2008 R2 Standard Edition installed on the same machine, so you could use more than the 64GB limit for a single instance. The physical socket limit for SQL Server 2008 R2 Standard Edition is still four processor sockets.

Glenn Berry's SQL Server Performance | Semi-random musings about SQL Server performance
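
Before deciding whether Standard, Enterprise, or Datacenter is required, it helps to check what the current instance actually sees. Here's a minimal T-SQL sketch (column names are as of SQL 2008/2008 R2; physical_memory_in_bytes was renamed in later versions):

-- Edition and version of the running instance
SELECT
    SERVERPROPERTY('Edition')        AS Edition,
    SERVERPROPERTY('ProductVersion') AS ProductVersion;

-- Logical CPUs and physical memory visible to SQL Server
SELECT
    cpu_count,
    hyperthread_ratio,
    physical_memory_in_bytes / 1048576 AS physical_memory_mb
FROM sys.dm_os_sys_info;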

Monday, May 16, 2011

A Glimpse at Project Crescent - SQL Server Reporting Services Team Blog - Site Home - MSDN Blogs

Project Crescent seems to fall into the “shiny tool” space that Crystal Xcelsius occupies.  From a presentation standpoint, it pops.  Will it be a daily reporting workhorse?  Let’s see.

Click on the Crescent logo below to play the Project "Crescent" teaser video and experience Project “Crescent” from the Microsoft company meeting where we first showed over 70,000 people this unique ability to bring data to life:

[Project “Crescent” teaser video thumbnail]

See the complete keynote at PASS Summit 2010 here: http://www.sqlpass.org/summit/na2010/

A Glimpse at Project Crescent - SQL Server Reporting Services Team Blog - Site Home - MSDN Blogs

Coming out of Microsoft, there are a few tools that really made me turn my head a couple of times and go “wow”.  Data Analyzer, an OLAP tool from the early 2000s, was one of those.

It had the ability to save as PowerPoint, which Reporting Services and even Excel couldn’t do.  It was discontinued after Office XP, and it took PerformancePoint to bring that “export to PPT” feature back to life.  I knew no more than two people who actually knew what it was or used it more than twice.

Another tool I found innovative was Site Server 3.0’s Content Management link map.  It was a decision-tree-style 3D interface that let you browse your site like a spider’s web, with the ability to zoom into the web.

Looked something like this but in 3-D!

The Wikipedia article on Site Server just about sums up many Microsoft “Product as a Solution” offerings targeted at the gray area between technical and business users.

On this front, Site Server's main advantage was its low cost. Another feature that might have been a source of confusion was the taxonomy management system. The tools used to maintain item metadata were very basic and required a degree of technical familiarity foreign to most business users.

Site Server was discontinued after its 3.0 release in 1998.

Sounds a lot like PerformancePoint Planning.  Let’s rephrase that.

On this front, PerformancePoint Planning’s main advantage was its low cost. Another feature that might have been a source of confusion was the model management system. The tools used to integrate data were very complex and required a degree of technical familiarity foreign to most business users.

In the end, all things were absorbed by Sharepoint, except perhaps for that cool link visualization tool, and PerformancePoint Planning. 

Perhaps, in the next version of Sharepoint, Microsoft will have another cube model builder with integrated dimension data management and a powerful set of financial reporting tools.  Maybe even a cool 3D hyperlink visualizer too…

Wednesday, May 11, 2011

Canada Post - Householder Counts and Maps / Nombre de chefs de ménage et cartes

Building an application requiring map or postal code info for Canada?  Learn about the number of residences or businesses by postal code using this service by Canada Post.  Also includes PDF boundary maps.

http://www.canadapost.ca/cpc2/addrm/hh/current/indexp/cpALL-e.asp

Another (paid) service provides an updated database of postal codes with detailed address and lat/long information.

http://www.zipcodeworld.com/postalcodegold.htm

Canada Post - Householder Counts and Maps / Nombre de chefs de ménage et cartes

Monday, May 09, 2011

Insights (Business Intelligence) Demo 1 Version 3 - iwdemos - Site Home - MSDN Blogs

Sharepoint Insights demos for BI.

Overview

This SharePoint and Office 2010 demonstration Content Pack provides the content and scripts to support Insights Demo 1.  The goal of this demo is to demonstrate and communicate the value of Business Intelligence (BI) in SharePoint 2010. We highlight some of the “eye-candy” components that pop, and demonstrate some of the Enterprise Content Management (ECM), social media, and enhanced search capabilities of SharePoint 2010 along the way.  Please note that this demo is not designed to “sell” a particular feature, BI solution, or application. Rather, it seeks to demonstrate a sweep of Microsoft BI solutions (except SRS). We aim to show that BI is not merely a backward-looking set of data analysis tools to be used by executive management or financial analysts.  Providing flexible BI solutions with SharePoint makes key business data accessible to mid-tier employees in an organization, to be used in daily decision-making and planning.  It is intended for use with a specific Virtual Machine listed in the Installation & Setup section.

Insights (Business Intelligence) Demo 1 Version 3 - iwdemos - Site Home - MSDN Blogs

Friday, May 06, 2011

Resources for Learning about Microsoft Business Intelligence - SQL Server and the Data Platform in the Field - Site Home - MSDN Blogs

A verbatim repost of a great training linkfest from Clint Kunz.

Resources for Learning about Microsoft Business Intelligence - SQL Server and the Data Platform in the Field - Site Home - MSDN Blogs

Wednesday, May 04, 2011

WMIC - Windows Management

With 6,000+ attributes available to monitor in Windows XP, this command-line tool is very underrated.

WMIC.exe

Windows Management Instrumentation Command.
Read a huge range of information about local or remote computers. Also provides a way to make configuration changes to multiple remote machines.

WMIC - Windows Management

Monday, May 02, 2011

Gartner Predicts the Future

I wonder what BI tool they’re using for that one… must be something in-house.

Four key BI predictions to help organizations plan for 2011 and beyond.

Gartner Business Intelligence Summit e-newsletter

Friday, April 29, 2011

Gartner Magic Quadrant for Business Intelligence Platforms- Microsoft is the leader - My Application Platform Quotes - Site Home - MSDN Blogs

Gartner is the leader in creating Magic Quadrants.  Microsoft is now the leader for Business intelligence, according to Gartner.

[Gartner Magic Quadrant for Business Intelligence Platforms chart]

The niche market is becoming quite cluttered as the big players acquire and knock out specialized tools in favour of mass markets and commoditization.  Panorama, a company I worked with in the past, seems to have fallen a bit out of favour.  MicroStrategy is becoming a leader in the BI space, and I think it’s a big potential acquisition target for one of the bigger vendors - if it were to split its stock, and if the stock weren’t as overvalued as it appears to me.

With the stock on the way to $140/share, MicroStrategy and its iPad 2 strategy appear to be working.  MSTR has a market cap of under $2 billion. Microsoft has $217 billion. IBM - $200 billion. Oracle - $172 billion. Apple - $317 billion(!)

There should be more consolidation in the future… but perhaps MSTR will stay independent for a while longer…

Gartner Magic Quadrant for Business Intelligence Platforms- Microsoft is the leader - My Application Platform Quotes - Site Home - MSDN Blogs

Monday, April 25, 2011

Data Warehouse versus Business Intelligence - Executing a business service with precision - Site Home - MSDN Blogs

Daniel Rubiolo blogs about his view of Business Intelligence, with some great metrics on the components of a BI solution and the percentage of resources dedicated to each.

The traditional definition of “data warehouse, or DW” from late ‘70s/early ‘80s involved the end-to-end solution for business users… from extracting data and aggregating it into special data models, the queries, all thru the applications/reports the users interacted with to consume that data.

According to Daniel, a BI project is 70% getting the data and 30% showing it off.  I would suggest it fits with the 80/20 rule in most cases.  80% of the time and effort in a BI project would be sourcing the data and gathering requirements, designing, developing, testing, and deploying a solution.  20% of the time would be adjusting and presenting reports.

This is probably why PowerPivot is such a favourite with power users: they don’t necessarily require outside IT resources to build their models.  It becomes a 50/50 game with a faster ROI.  However, the scalability and maintainability of a solution that resides on a user’s desktop remain in question.  The “hit by a bus” rule makes desktop-based custom models a potentially risky solution for a business.

Data Warehouse versus Business Intelligence - Executing a business service with precision - Site Home - MSDN Blogs

Friday, April 22, 2011

Business Intelligence Recap of 2010 - Microsoft Business Intelligence - Site Home - MSDN Blogs

Emilie Bridon provides a BI recap of 2010 in the Microsoft space.

The one that has me interested is BISM.  It sounds like it might finally provide a central “one truth” version of your metadata.  However, the limitation of requiring Visual Studio might kill any hope of leveraging it from a pure business perspective.  The users I work with need to be able to manage their data without an IT gatekeeper.

I can’t see anything else that excites me a whole lot in the BI space for 2010, except perhaps social media.  It was truly an acquisition and consolidation year…

Business Intelligence Recap of 2010 - Microsoft Business Intelligence - Site Home - MSDN Blogs

Monday, April 18, 2011

Michael J Swart - Database Whisperer

 

Michael posts his study plan for the 70-433 exam and details on SQL 2005 support.

Yesterday was April 12, 2011, and as far as I know mainstream support for SQL Server 2005 ended. See Aaron Bertrand’s A Quick Note About SQL Server 2005 Mainstream Support.

Michael J Swart - Database Whisperer

Upgrading to SQL 2008 R2

With SQL 2005 support expiring shortly, the move is on to switch to SQL 2008 R2 (why would you switch to just SQL 2008?)

Running Upgrade Advisor before the upgrade is a must.  However, some of the warnings can safely be ignored rather than changing a potentially large codebase.

I have seen many people asking how they should move toward upgrading their SQL Server 2000 databases to SQL Server 2008. I thought of putting together an article based on my experience migrating many servers from SQL Server 2000 to 2008. This particular article is about analyzing incompatible code, which is a MUST step in my view. At the end of the article, I have compiled a list of the various errors and warnings that you may come across after running the analysis tool.

SQLVillage.com
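
Alongside Upgrade Advisor, a couple of server-side checks can flag databases still running at old compatibility levels and any deprecated features actually being exercised. A minimal sketch using standard catalog views and DMVs (SQL 2008 and later):

-- Databases still on SQL 2000 (80) or SQL 2005 (90) compatibility levels
SELECT name, compatibility_level
FROM sys.databases
WHERE compatibility_level < 100;

-- Deprecated features used since the instance last started
SELECT instance_name AS deprecated_feature, cntr_value AS usage_count
FROM sys.dm_os_performance_counters
WHERE object_name LIKE '%Deprecated Features%'
  AND cntr_value > 0;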

Friday, April 15, 2011

Using Quest’s Spotlight?

Watch for the performance effects of running SQL Analysis against a production box.  It runs a firehose client-side trace from the Spotlight server to the SQL Server.

Based on your information, it would appear that someone in your group may be using SQL Analysis or session trace in the product. SQL Analysis is a schedulable utility that initiates a SQL Trace based on the criteria you specify - if it's enabled with the default collection type and only the options on the initial screen specified, you may be running a continuous client-side trace, without any filtering, to capture the various statement/procedure/RPC events. Session-level tracing is initiated if someone clicks into a specific SPID (session_id) and then selects the trace tab, so it's a less likely option.

Spotlight on SQL Server Ent. 701 - ... | SQL Server | Database Management Community | Quest Software
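
To see whether a tool like this really is streaming a trace against an instance, the trace catalog can be queried directly. A quick check along these lines (sys.traces exists in SQL 2005 and later; client-side rowset traces show is_rowset = 1 and a NULL path):

-- List non-default traces currently defined on the instance
SELECT
    id,
    status,        -- 1 = running
    is_rowset,     -- 1 = client-side (rowset) trace, e.g. Profiler or a monitoring tool
    path,          -- NULL for rowset traces
    reader_spid,   -- session consuming a rowset trace
    start_time,
    event_count
FROM sys.traces
WHERE is_default = 0;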