Monday, December 28, 2009

QUOTENAME: A little-known utility function for "bracketing" a value

DECLARE @TEXT VARCHAR(50)
DECLARE @NUMBER INT
DECLARE @QUOTE CHAR(1)

SET @QUOTE = '"'
SET @TEXT = 'hello'
SET @NUMBER = 13

--bracketed with default brackets
SELECT  QUOTENAME(@TEXT), len(@TEXT), len(quotename(@TEXT))
--bracketed with default brackets
--note that an implicit conversion occurs
SELECT  QUOTENAME(@NUMBER)
--bracketed with a double quote
SELECT  QUOTENAME(@TEXT, @QUOTE)
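
For reference, the results should look roughly like this (using the default bracket delimiters):
--QUOTENAME(@TEXT)          returns [hello]  (len 5 in, len 7 out)
--QUOTENAME(@NUMBER)        returns [13]     (the INT is implicitly converted to a string first)
--QUOTENAME(@TEXT, @QUOTE)  returns "hello"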

Tuesday, December 15, 2009

Great SQL Brain Teaser.... how many records are in the table?

Create Table TBL1 (col1 int, col2 int)
Create Table TBL2 (col1 int, col2 int)

--query1
select count(col1) from TBL1 where col2 >= 5
--results = 3

--query2
select count(col1) from TBL1 where col2 < 5
--results = 2

-- can you tell how many records are in the table?
-- how/why?
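
If you want to experiment before answering (spoiler ahead), here is one hypothetical data set that produces exactly those two results while the table holds more than five rows:

-- COUNT(col1) skips rows where col1 is NULL, and a row with a NULL col2
-- matches neither WHERE clause above
INSERT TBL1 (col1, col2) VALUES (1, 9)
INSERT TBL1 (col1, col2) VALUES (2, 7)
INSERT TBL1 (col1, col2) VALUES (3, 5)
INSERT TBL1 (col1, col2) VALUES (4, 4)
INSERT TBL1 (col1, col2) VALUES (5, 1)
INSERT TBL1 (col1, col2) VALUES (NULL, 2)  -- not counted by COUNT(col1)
INSERT TBL1 (col1, col2) VALUES (6, NULL)  -- matched by neither query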

Friday, December 11, 2009

Find the most recently modified table constraints

USE somedatabasename
GO

SELECT DISTINCT CONSTRAINT_NAME
               ,modify_date
FROM INFORMATION_SCHEMA.KEY_COLUMN_USAGE
LEFT OUTER JOIN sys.all_objects
ON CONSTRAINT_NAME = name
WHERE TABLE_CATALOG = N'somedatabasename'
ORDER BY modify_date DESC

--compliments to Igor Malkiman of Qwest Communications for this code

Wednesday, December 9, 2009

When were the stored procedures last modified?

USE [SOMEdbNAME]
GO
SELECT so.name, so.modify_date
FROM SYS.OBJECTS SO
WHERE SO.name in
   (SELECT ROUTINE_NAME
   FROM INFORMATION_SCHEMA.ROUTINES isr
   WHERE isr.ROUTINE_CATALOG='SOMEdbNAME')
ORDER BY so.modify_date desc

--compliments to Igor Malkiman of Qwest Communications for this code

Tuesday, December 8, 2009

Deleting records from a source table as you add them to a destination

Frequently we come across a case where the records in a source table must meet a certain condition in order to be loaded into the destination table.  I recently came across a problem where the records were to remain in the source table if the conditions were not met. 

So the question was: how do we keep track of the inserted records?
  • Query the source table after the insert for records matching the destination table and then delete them?
  • Create a #Temp table of what is to be inserted and join that same table to the source for deletion?
Or:
  • Use the OUTPUT clause (available on INSERT, UPDATE, and DELETE statements) to populate a table of record keys for what has been inserted, and later join that table to the source records for deletion.
Here is an example:

CREATE TABLE #T1(COL1 INT)
CREATE TABLE #T2(COL1 INT)
CREATE TABLE #T3(COL1 INT)

DECLARE @iRec int

TRUNCATE TABLE #T1
TRUNCATE TABLE #T2
TRUNCATE TABLE #T3

-- populate some data 
set @iRec= 1
While @iRec <= 10
BEGIN
   INSERT #T1 VALUES(@iREC)
   if @iRec < 6
      insert #T2 values(@iREC)
   set @iRec = @iRec + 1
END

-- perform the deletion
DELETE #T1
      OUTPUT Deleted.*  INTO #T3
FROM #T1 ONE
JOIN #T2 TWO ON ONE.COL1 = TWO.col1

-- show the results
SELECT '#T1 - remaining recs', * FROM #T1
SELECT '#T2 - match keys    ', * FROM #T2
SELECT '#T3 - deleted recs  ', * FROM #T3

-- cleanup
DROP TABLE #T1
DROP TABLE #T2
DROP TABLE #T3
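
For what it's worth, running the example should leave #T1 holding rows 6 through 10, #T3 holding the deleted rows 1 through 5 (captured by the OUTPUT clause), and #T2 unchanged.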



Thursday, October 15, 2009

Restarting SSIS Packages without processing ALL the records over again

Here is a link to a little-known solution to a common problem. Every time I have restarted an SSIS package that performs transformations on records, all the records get processed over again.

This is a problem in a couple respects:
  • Processing time wasted
  • Code must check for records already processed
  • Additional time is spent maintaining that code and processing
Here is the link:
Restarting SSIS Packages with Checkpoints

While this article uses SQL 2008 as an example, SQL 2005 also employs this same feature.
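
As a quick reference (from memory, so verify against the article), the checkpoint behavior is driven by a handful of package and task properties:
  • CheckpointFileName - the file where SSIS records which containers have completed
  • SaveCheckpoints - set to True at the package level
  • CheckpointUsage - set to IfExists so a restart picks up from the checkpoint file when one is present
  • FailPackageOnFailure - set to True on the tasks/containers that should act as restart points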

Wednesday, October 7, 2009

What is Business Intelligence?

Business Intelligence (BI) refers to skills, technologies, applications and practices used to help a business acquire a better understanding of its commercial context. Business intelligence may also refer to the collected information itself.

BI technologies provide historical, current, and predictive views of business operations. Common functions of business intelligence technologies are reporting, OLAP, analytics, data mining, business performance management, benchmarking, text mining, and predictive analytics.

In a 1958 article, IBM researcher Hans Peter Luhn used the term business intelligence. He defined intelligence as "the ability to apprehend the interrelationships of presented facts in such a way as to guide action towards a desired goal."[1]


In 1989, Howard Dresner (later a Gartner Group analyst) proposed BI as an umbrella term to describe "concepts and methods to improve business decision making by using fact-based support systems."[2]

It was not until the late 1990s that this usage became widespread. More recently, with the release of SQL 2005, Microsoft's marketing department put its own stamp on the term by introducing its "Business Intelligence" toolset, so many believe BI to be something new on the scene. Before the term "BI" gained popularity, terms such as Predictive Analytics, Decision Support Systems (DSS), and Executive Information Systems (EIS) covered earlier iterations of the same science.

[1] H. P. Luhn (October 1958). "A Business Intelligence System" (PDF). IBM Journal. http://www.research.ibm.com/journal/rd/024/ibmrd0204H.pdf. Retrieved 2008-07-10.


[2] D. J. Power (2007-03-10). "A Brief History of Decision Support Systems, version 4.0". DSSResources.COM. http://dssresources.com/history/dsshistory.html. Retrieved 2008-07-10.

Monday, September 28, 2009

Running with Scissors

Business Intelligence, like any other technology, has the propensity to be used in ways that adversely affect people who do not deserve it. Case in point: the recent reports by MSNBC and CNN of people receiving notices from AMEX that their credit limit has been lowered.

The reason why? Among other explanations given: "Our credit experience with customers who have made purchases at establishments where you have recently used your card."

I have the distinct impression that this will not be legally allowed to continue, both because people have the right to shop wherever they want without negative consequences, and because of the negative impact on merchants identified as places that "at risk" shoppers frequent.

Friday, September 18, 2009

This is funny and too often true...

As with anything, I suppose, there are true experts, newbies, and wannabes. How many of us remember the wannabe web programmers who could do little more than a home page? Or those so-called design and user-interface specialists?

Yes, these are real and valuable fields of expertise, but just like anything else there are "posers" as discussed in the following post:

http://darmano.typepad.com/logic_emotion/2009/09/snake.html

Painful as it is, he does put bankers pretty low, and I was in the banking field for many years. Still, I cannot blame anyone for failing to distinguish between the average bank employee and the true posers: those who opted for greed over morals in the mortgage lending business, and those in our federal government who looked the other way for whatever rea$on.

Wednesday, June 24, 2009

You Can't Copyright Common Sense

No, you cannot - unless your name happens to be Thomas Paine, you were born over 200 years ago, and you actually published the pamphlet titled "Common Sense".

The truth is that if you listen to most speakers today, or read one of the plethora of books about sales, service, or whatever, you will find that they are telling you what you probably already know. For example, "A firm handshake and friendly smile go a long way toward making a lasting impression"... well, of course they do.

Is there any true new ground when it comes to human interaction that has not already been written about? Possibly, but not nearly as much as has been rewritten over and over again since the beginning of recorded time. So the question is: why do people keep buying the books, listening to those speakers, or receiving the epiphanies in the middle of the night?

The answer is simple. No, you cannot copyright common sense - but the ideas you put behind what you say are as important as how you put your personal brand on it.

Wednesday, June 10, 2009

Dynamic SQL Passthrough Queries with Parameters

If you need to run a passthrough query against another DBMS and need to create that SQL on the fly because of a parameter, then you may be out of luck. At least, according to MSDN:
http://msdn.microsoft.com/en-us/library/ms188427.aspx

However, there is a nice workaround you can employ that will do the job. Here is an example:

-- variable to contain the passthrough sql statement
DECLARE @SQL VARCHAR(300)
-- variable to contain the dynamic lookup value into the query
DECLARE @KeyLookup INTEGER
-- variable to contain the passthrough query
DECLARE @query VARCHAR(400)
-- populate the keylookup value
SET @KeyLookup = 102
-- prepare the sql statement for the passthrough query
SET @query = 'select * from some_table where key_value = ' + CONVERT(VARCHAR(10) , @KeyLookup)
-- prepare the passthrough execution query
SET @sql = 'select * from openquery(linkedservername, ''' + @query + ''')'
-- option to print the passthrough execution query in full for debug purposes
-- (this is nice because you can paste it into a sql editor and run it to see what is wrong)
PRINT @SQL
-- execute the passthrough
EXEC (@SQL)
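
With the values above, the PRINT statement should produce the full passthrough query, which you can paste straight into a query window:

select * from openquery(linkedservername, 'select * from some_table where key_value = 102')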

Monday, June 1, 2009

Creating System Stored Procedures

So what are the steps to creating a systemwide stored procedure in SQL Server?
  • Create them in the Master Database
  • Name must start with "sp_"
  • Mark them as System Objects using
    • for sql 2000 - master.dbo.sp_MS_upd_sysobj_category
    • For 2005 and later - sys.sp_MS_marksystemobject
For a test in 2008:
  1. Create a stored procedure called sp_test in the master database
  2. Register it as a system stored proc using sp_MS_marksystemobject
  3. Then try to execute it from another database without prefixing it with master.dbo (see the sketch below)
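
Here is a minimal sketch of that test (the procedure body and the second database name are just placeholders):

USE master
GO
CREATE PROCEDURE dbo.sp_test
AS
   -- placeholder body: report which database the caller is in
   SELECT DB_NAME() AS current_database
GO
-- mark it as a system object (SQL 2005 and later)
EXEC sys.sp_MS_marksystemobject 'sp_test'
GO
-- now call it from any other database without the master.dbo prefix
USE SomeOtherDatabase
GO
EXEC sp_test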

Thursday, May 28, 2009

How to send an SMS text when a SQL job fails

/* This is very useful for critical production jobs that may fail at times when you do not have access to your email, but you can still be contacted via SMS text messaging on your cell phone */

set nocount on
declare @message varchar(555)
set @message = 'Process failed in job xyz'


-- different providers have different addresses to send SMS to cell phone --
/* T-Mobile: phonenumber@tmomail.net
Virgin Mobile: phonenumber@vmobl.com
Cingular: phonenumber@cingularme.com
Sprint: phonenumber@messaging.sprintpcs.com
Verizon: phonenumber@vtext.com
Nextel: phonenumber@messaging.nextel.com
where phonenumber = your 10 digit phone number */

exec master.dbo.xp_sendmail @recipients = '5555551212@messaging.sprintpcs.com' , @message = @message , @subject = 'Svr263 Job Failure'
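
Note that xp_sendmail depends on the older SQL Mail setup; if the server uses Database Mail instead (SQL 2005/2008), the equivalent call is sp_send_dbmail (the profile name below is just an example):

exec msdb.dbo.sp_send_dbmail
     @profile_name = 'DefaultMailProfile' ,  -- whichever profile you configured
     @recipients = '5555551212@messaging.sprintpcs.com' ,
     @body = @message ,
     @subject = 'Svr263 Job Failure'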

Thursday, April 30, 2009

Geometric Project Management

At a prominent banking institution, a member of the IT staff made the following sarcastic joke: "In any other part of the world, two people might create a baby in 9 months, but here they throw 18 people at the problem and expect results in 1 month."

Simple math, right? Hardly so. There is no parallel dimension where human development happens at a faster rate. Certain things are just hard constraints with absolutely no variables. In other cases, as in the world of IT, the variables do exist, but only with trade-offs or functions that can be illustrated through geometric and mathematical formulas. Geometry applied to project management - looks like witchcraft!




The usual real-world constraints on any set of Requirements are Cost, Resources, and Time. Conceptually this is simple. For any given project there exists a set of movable laws of nature. A set of requirements can be produced in a significantly shorter time, but only with an increase in cost and/or resources. Cost can be reduced, but only with an investment of time and/or resources. One can argue that Time and Cost are resources, and normally I would agree in the broadest definitions. Consider, though, that an overall reduction or increase in Requirements has a direct relationship with the other inputs of this function.
Just as changing a single attribute of a triangle affects the other attributes, the same holds true for a set of requirements in a project.

Friday, April 10, 2009

Job Notification Random Features

If you've worked with job notifications in the past then you probably already know this, but I thought it was worth discussing. Any time you change the name of the operator you want to notify, the event that the operator is notified upon defaults back to 'On Success'.

There are three different events that will generate notification from the job properties:
  • When the job completes - notification is sent whether the job fails or succeeds
  • When the job fails - notification is sent only upon failure
  • When the job succeeds - notification is sent only upon success

The problem is that, intermittently, when you change the operator to be notified, the event will default back to 'When the job succeeds'. This is a real problem in that you would not be notified in the event of a job failure.
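
A quick way to audit this across all jobs is to query msdb directly; a small sketch using the documented notify_level_email codes (0 = never, 1 = on success, 2 = on failure, 3 = on completion):

-- list each job and the event its email notification actually fires on
SELECT j.name AS job_name ,
       CASE j.notify_level_email
            WHEN 0 THEN 'Never'
            WHEN 1 THEN 'When the job succeeds'
            WHEN 2 THEN 'When the job fails'
            WHEN 3 THEN 'When the job completes'
       END AS notify_event
FROM msdb.dbo.sysjobs j
ORDER BY j.name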

Tuesday, April 7, 2009

Using Try-Catch in Transact-SQL

SQL 2005 introduced the TRY-CATCH construct for trapping errors in SQL code, so the old approach of checking @@ERROR after each statement is no longer necessary. This construct is used in many languages today, including C# and VB.NET to name a few. The basic concept is that you place your code within a TRY block and then use a CATCH block to handle any errors it encounters. Here is a very simple example you can run:

/* Simple Try Catch Example */
DECLARE @Denominator INT
SET @Denominator = 0

BEGIN TRY
SELECT 1/@Denominator AS [Output]
SELECT 'Computation did not fail' AS [message]
END TRY
BEGIN CATCH
SELECT ERROR_NUMBER() AS Error#, ERROR_MESSAGE() AS [MESSAGE];
GOTO EXITOR
END CATCH;
SELECT 'Catch Did Not Find Error'
EXITOR:
PRINT 'Done'  -- give the label a statement to land on



Now set the value of @Denominator to 1 or some other valid value. Notice how the flow of program execution changes.
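
For reference: with @Denominator = 0, the division raises error 8134 (divide by zero) inside the TRY block, the CATCH block reports it, and the GOTO skips the 'Catch Did Not Find Error' select. With @Denominator = 1, the TRY block completes, the CATCH block never runs, and that final select does execute.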

Monday, April 6, 2009

Enable Database Mail Alerts In SQL2008

While setting up our new SQL 2008 boxes, I came across the interesting task of configuring Database Mail. I mean, it would be nice to get an email when your job fails or log files grow too big, right? So I went to the web and started looking for articles describing the task.

The articles very nicely describe how to set up mail profiles and operators on your SQL Server. There is lots of material there, so I won't bother to add non-original content. But they do leave out a critical piece, which is to configure SQL Agent to use the profile you have defined. This is funny because my systems admin and I were trying to figure out how the two were tied together. The answer is in the properties of SQL Server Agent: right-click SQL Server Agent, choose Properties, and open the Alert System page. Once you configure your profiles, set up your operators, and enable the mail profile there, you should be good to go.
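
If you prefer to script the server-side pieces, the switch that lets the mail procedures run, plus a quick status check, can be done in T-SQL; a sketch (this assumes a Database Mail profile has already been created):

-- allow the Database Mail stored procedures to run
EXEC sp_configure 'show advanced options', 1
RECONFIGURE
EXEC sp_configure 'Database Mail XPs', 1
RECONFIGURE
GO
-- confirm the Database Mail queue is started
EXEC msdb.dbo.sysmail_help_status_sp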


Saturday, April 4, 2009

Moving to SQL2008 64-bit

Moving from SQL 2005 32-bit to SQL 2008 64-bit sounded very exciting at first, especially when we managed to budget a box with 32 GB of memory, 24 processor cores, and 2 RAID controllers hitting an array of 16 drives. Our old box was just quad-core, 32-bit, with 12 GB of RAM and a single controller on a 5-disk array. It did the job, but very slowly.

Imagine my surprise when the database was copied to the new server and the jobs that had run fine on the old box started filling up the 160 gigabytes of TempDB every time they were executed!!!!!!

Long story, but here is the short answer: rebuild your indexes - in our experience, the 64-bit system has issues reading fragmented indexes that were created on the 32-bit system.
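
For the record, the rebuild itself can be scripted in one pass; a rough sketch using the undocumented (but widely used) sp_MSforeachtable helper:

-- rebuild every index on every table in the current database
-- (run this in a maintenance window; the rebuild is offline unless ONLINE = ON is specified)
EXEC sp_MSforeachtable 'ALTER INDEX ALL ON ? REBUILD'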