Monday, September 17, 2012

Track report usage



One of the best ways to ascertain how effective your portal is in reaching the users the reports were designed for is to track how many times a report is viewed and who is viewing it.

The report above is a no-frills report I keep in the 'administrator reports' section of our company portal.  It is solely for my use.

It shows the report name and the number of times it has been viewed during the period specified in the input parameters.

The query is as follows:

USE [ReportServerSP]

DECLARE @start DATETIME -- < of course these are the input parameters on the actual report,
DECLARE @end DATETIME   -- < I just show them here for clarity

SET @start = '08/01/2012'
SET @end   = '08/31/2012'

SELECT [ReportPath] AS [Report],
       COUNT(*)     AS [TimesViewedForPeriod]
FROM [dbo].[ExecutionLog2] AS el
WHERE [TimeStart] BETWEEN @start AND @end
  AND [ReportPath] <> 'unknown'
GROUP BY [ReportPath]
ORDER BY COUNT(*) DESC
;
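One refinement worth considering: ExecutionLog2 records more than plain views (exports, sorts, and drill-throughs each get their own row), so you can filter on the ReportAction column if you only want to count renders. This is a sketch, assuming the ReportAction column present in the SQL Server 2008+ ExecutionLog2 view and the same @start/@end parameters as above:

```sql
-- Count only 'Render' actions, so exports and sorts don't inflate the view count.
SELECT [ReportPath] AS [Report],
       COUNT(*)     AS [TimesViewedForPeriod]
FROM [dbo].[ExecutionLog2] AS el
WHERE [TimeStart] BETWEEN @start AND @end
  AND [ReportPath] <> 'unknown'
  AND [ReportAction] = 'Render'
GROUP BY [ReportPath]
ORDER BY COUNT(*) DESC;
```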

The database ReportServerSP is one we created when we integrated SSRS with WSS 3.0; the 'SP' suffix tells us it serves the SharePoint-based portal.

When a report name is clicked, the drill-through report provides a breakdown of how many times each user on the DOMAIN viewed that report.

That linked report's query is:

USE [ReportServerSP]

-- @start, @end and @reportpath are input parameters on the actual report
SELECT [UserName], COUNT(*) AS [UserViewsForPeriod]
FROM [dbo].[ExecutionLog2] AS el
WHERE [TimeStart] BETWEEN @start AND @end
  AND [ReportPath] IN ( @reportpath ) -- IN lets @reportpath be a multi-value parameter
GROUP BY [UserName]
;


These two reports have come in very handy.  There have been times when the President wanted to know, for a particular report, whether it was viewed during a range of dates, and if so, how many times and by whom.  On those days these reports earned their keep.

Even if your SSRS reports aren't integrated with WSS, you should be able to extract the same data by pointing these queries at your ReportServer database.

Have fun with it!

Friday, September 14, 2012

Make Reports Easier to Use



One mark of a successful SSRS install is that, after a short time, there are so many folders and reports that the general user population gets lost.

Try this.  I have found it well received by the user community where I work.

Create a Word document as a directory of your reports in a folder.
Each report has its name as a hyperlink directly to the report on the portal.
I captured an image of the rendered report and embedded it in the document under the report title; this gives the user a 'picture' view of what the report looks like.
Below that I provide a short description of the input parameters the report takes and the data the report provides as output.
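For the hyperlinks, a native-mode report URL can even carry parameters along with it; the server, folder, report, and parameter names below are placeholders, not real ones from my portal:

```
http://yourserver/ReportServer?/Sales/Monthly+Summary&rs:Command=Render&StartDate=08/01/2012
```

In SharePoint integrated mode the link is simply the URL of the .rdl document in the library.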

Each time a report is changed, deleted, or added, I update the document.

I have also shown the user community how to be alerted by email when a change to the document has been made.

In this way, the user community is kept abreast of the reports on the portal, can examine them at a glance, and gets a direct link to the report they want.  This saves a lot of time rummaging around in folders they may not know as well as the people who use a particular report every day.

Also, note the report highlighted above: it is a 'control panel' report.
My service manager wanted a set of analytic reports to examine service on a particular manufacturer's product lines - how many parts have we installed, what % was warranty, how much labor and travel, etc.

We built this control panel report to feed the exact same set of inputs to each report in the series.
In this way, he can run a report, examine it, and export it to Excel should he wish; then press the backspace key to get back to this control panel.  All he need do then is click the next hyperlinked report name or image, and the same set of inputs is fed to the next report in the series.

The point here is that a portal may contain a great set of reports; they may be innovative and extremely useful.  But if they are hard to locate, or the user community is frustrated in their attempts to use them, the portal won't succeed, no matter how brilliant the reports are.

Take a bit of extra time to make the reports as easily accessible as you possibly can.  Solicit feedback from the user community; don't be defensive when they call 'your baby' ugly though; listen to what they are telling you and do your best to incorporate their ideas.  Follow up with them and ask them about your implementation of THEIR IDEAS; you'll find that in itself will get their attention.  Once you have the lines of communication open on using the reports, you'll hopefully find they're being used more often!


Tuesday, September 4, 2012

Monitoring database size

One thing every DBA ought to have (and many do!) is a way to monitor and routinely report on the size of databases and their growth.



This is important when it comes time to discuss the purchase or expansion of a SAN, NAS or other storage subsystem.

Frequently management will make decisions that significantly accelerate database growth, yet the nature of the change makes it almost impossible to predict with any accuracy how fast the growth will be.

Having a good means of tracking, measuring and reporting on this growth is vital to our success when wearing our DBA hat.

The script below is one I use.  On every SQL Server install, the first thing I do after getting the server up on its legs is create a database called 'DBA', which holds all my management and measurement 'stuff'.  The script creates the table tbl_dbsize (shown in the comment).  Then, whenever it runs, the script records each database file on the server and its size in MB.  The [inserted on] timestamp column gives me a way to measure growth over time.

We recently made the decision to start scanning copies of important documents at the company where I am employed; these document scans can be added to a 'file room' tab in our ERP system.  This is great for our ability to attach document copies to transactions in the system.  However, when all our locations start doing this and we have no way of knowing with certainty the size of the scanned files, close monitoring is prudent.

So, I'm ratcheting up the frequency of this job from monthly to weekly right now in order to see what impact these scans will have on growth of the database.
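If the capture runs as a SQL Server Agent job, the frequency change is a single call to msdb's sp_update_schedule; the schedule name below is made up, so substitute your own:

```sql
USE msdb;
-- Switch the size-capture schedule from monthly to weekly.
-- 'DBA - capture db size' is a hypothetical name; use your schedule's actual name.
EXEC dbo.sp_update_schedule
    @name = N'DBA - capture db size',
    @freq_type = 8,               -- 8 = weekly (16 = monthly)
    @freq_interval = 2,           -- day-of-week bitmask: 2 = Monday
    @freq_recurrence_factor = 1;  -- every 1 week
```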

If you don't have a script or job like this, give this one a try.


USE [DBA]
/*
Report:     Database Size
Contents:   size of database files

CREATE TABLE tbl_dbsize (
    [inserted on]   DATETIME,
    [DB Name]       NVARCHAR(255),
    [Logical file]  NVARCHAR(255),
    [Physical file] NVARCHAR(1000),
    [MB]            INTEGER
)
;
*/

INSERT INTO tbl_dbsize
SELECT GETDATE()            AS [inserted on],
       DB_NAME(database_id) AS [DB Name],
       name                 AS [Logical file],
       physical_name        AS [Physical file],
       (size * 8) / 1024    AS [MB]  -- size is in 8 KB pages
FROM sys.master_files
;



SELECT * FROM tbl_dbsize; -- <-- right now I don't need an SSRS report; I dump the answer set into an Excel spreadsheet I use to track the growth of files over time.
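Once a few snapshots have accumulated, a query like this one (a sketch, using the LAG window function available in SQL Server 2012 and later) turns the raw sizes into per-file growth between captures:

```sql
-- Growth of each file between consecutive snapshots in tbl_dbsize.
SELECT [inserted on],
       [DB Name],
       [Logical file],
       [MB],
       [MB] - LAG([MB]) OVER (PARTITION BY [DB Name], [Logical file]
                              ORDER BY [inserted on]) AS [GrowthMB]
FROM tbl_dbsize
ORDER BY [DB Name], [Logical file], [inserted on];
```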