Excellent Consultants...or Why Technology Kingdoms Rise and Fall

The experience of a recent Consultants Guild blog reader drives home the key factors that cause projects and engineering organizations to fail. It is also a good reminder of what it takes to be an Excellent Consultant. So indulge me for a moment and let me tell a little story of Excellent Consultants...or Why Technology Kingdoms Rise and Fall.

His experience was being confronted with antagonistic interviewers who didn't understand a technically superior solution. I had a very similar experience trying to do a Sybase port of the dynamic inline T-SQL aggregation construct that I've used numerous times in SQL Server. I'm first and foremost a SQL Server implementation expert, but I'm also very proficient in other database products. We were forced to implement a SQL Server database on Sybase. I used this construct extensively to avoid laborious and expensive CURSORS. Then the unthinkable happened (well, not really for Sybase): they released a patch that broke the dynamic inline T-SQL aggregation construct in Sybase. Once we finally tracked down why critical portions of our system were broken, Sybase's response was first disbelief that this approach even worked. When I proved that it did, they responded: "Well, that never should have worked in the first place, so we aren't going to support it going forward."

Hmmm. That doesn't seem too intelligent to me. Am I having a nightmare here? Let me do a reality check. So I asked myself a question: "Mark, who is paying Sybase here?" Yep, the answer is me and the customer that I'm implementing Sybase for.

That disrespect for the basic customer-vendor food chain cost us about two weeks of redesign of some key backend stored procedures and processes. Needless to say, it didn't improve my opinion of Sybase. The customer spent between $100k and $200k on the original port from SQL Server to Sybase. What value did it add? ZERO. It cost more, took longer and performed worse. Fast forward: the customer has now decided that their strategic direction is to migrate all Sybase applications to Microsoft SQL Server. So it will cost Sybase millions of dollars from this customer, and from any customer I do business with in the future who is willing to listen to my experience.

Nope, it wasn’t a nightmare after all. At least not mine (maybe Sybase’s though). Whew!

Haacked's experience with interviewers who think they know it all, like my own, taught me an important lesson. As technicians, we need to continually remind ourselves that fear, pride and ignorance are the greatest barriers to success. There is almost always a more elegant, simpler, more cost-effective and higher-performing way to solve a problem. Furthermore, I have found that fear, pride and ignorance are the hallmarks of project failure. These traits can have enormous cost implications for us as consultants, for the tool vendors we utilize and ultimately for the customers we serve.

For the interviewers that thought they knew it all, it cost them a potentially great resource and asset. In my scenario it cost my customer hundreds of thousands of dollars. Ultimately it will cost Sybase millions of dollars.

The lesson I learned is this: technology kingdoms rise and fall primarily due to fear, pride and ignorance. Excellent consultants, at least the ones I am willing to work with, have no room for any of these traits. If you are on a project where fear, pride and ignorance are given free rein, raise the red flag high. Wave the flag with vigor. Get executive sponsorship and root it out. These people are the type that are continually complaining, blaming and failing. An end user cracked me up describing this type of person like this: "Their cubicles are vortexes where all happy thoughts go to die." Sadly, their teams and their organizations are where good projects go to die as well.

What excellent consultants possess, at least the ones that I'm willing to work with, is an increasing measure of boldness, confidence and wisdom. Boldness replaces fear, confidence replaces pride and wisdom replaces ignorance. We in the Consultants Guild are passionate about Experience, Integrity and Delivery. These are the marks of Excellent Consultants.

Excellent Consultants are…

Excellent consultants are bold enough to look for better ways to deliver successful solutions. Whether it's a technical challenge, a business challenge or a process challenge, they always strive for success. They don't shrink away in fear or let pride or ignorance cause them to fail. Typically they will lean on their wealth of experience and/or the deep pool of experience from other excellent consultants they've been in the trenches with. Boldness is the key to successful delivery when the barriers seem insurmountable.

Excellent consultants are confident in their experience that has led them to success in the past. The proof is in the pudding. A key principle of Agile software development is that “Working software is the primary measure of progress.” Given a reasonable amount of customer trust and freedom, I’m fully confident that I will make them successful. This shouldn’t be confused with pride or arrogance because I’ll readily admit when I need the help of others. Confidence is only developed through experiencing repeated success.

Excellent consultants are wise enough to know what they don't know. A common misconception about wisdom is that it means knowing everything. It does have to do with knowledge, but it's much more than that. Wisdom is knowledge applied through the filter of experience. I also heard someone once say that "Wisdom is knowing what you don't know." Just because an ASP.NET application was the perfect solution for one customer doesn't mean that it's the only solution for every customer. We were faced with this challenge on a current project and realized that it was in the best interest of the client to build a .NET Win Forms Smart Client application vs. a traditional .NET Web Forms HTML-based application. Wisdom grows as we do business with integrity by looking to the mutual best interest of our clients and ourselves.

Dynamic SQL INSERT Generator Unleashed!

Okay, so I'm hyping a bit. So give me a break...this little gem took me a few days to write and weeks to refine back in the '90s. Anyway, I updated it to take advantage of the "dynamic aggregator" approach shown in my previous query.

So you are probably saying, "Great, so you can generate a CREATE TABLE statement...Whoopee!". Well, if I could help you fill that table or tables using a script that dynamically generates INSERT statements based on the contents of a table, would that be worth something to you? Great! Read further and let's really impress those SQL jockey friends of yours.

Problem to Solve
Write a query that dynamically generates a series of INSERT statements based on the contents of a table. The output of this query can then be used to fill an empty table or replace a table's contents. It also accommodates preserving IDENTITY column values.

Goals
As always, I try to make my SQL tools reusable and tight. In this query, I opt to blend simplicity (one cursor for the tables) with the powerful "dynamic aggregator" shown in my last blog (for the columns). So without further delay....

Solution to the Problem
Paste the following query into a SQL Server Query Analyzer window and set the current database to the database you want to generate the INSERT script for. Change the WHERE clause of the DECLARE CURSOR query to reflect one or more tables from that database. Set the output format to Text. Then let 'er fly (i.e., run the query)....


-- #tmp collects the generated VALUES fragments for the current table;
-- #tmp2 collects the final script lines in output order.
create table #tmp (
SQLText varchar(8000) )

create table #tmp2 (
Id int identity,
SQLText varchar(8000) )

set nocount on

delete #tmp
delete #tmp2

declare @vsSQL varchar(8000),
@vsCols varchar(8000),
@vsTableName varchar(40)

-- One cursor over the tables to script; adjust the name list in the WHERE clause to the tables you want.
declare csrTables cursor for
select name
from sysobjects
where type in ('u')
and name in ('Customers')
order by name


open csrTables
fetch next from csrTables into @vsTableName

while (@@fetch_status = 0)
begin

select @vsSQL = '',
@vsCols = ''
-- "Dynamic aggregator": build the per-row VALUES expression in a single SELECT.
-- Character, text and date/time types are quoted and have embedded quotes doubled
-- (NULLs become empty strings, fixed up to null at the end); all other types are
-- converted to varchar, with NULLs rendered as the word null.
select @vsSQL = @vsSQL +
CASE when sc.type in (39,47,61,111) then
'''''''''+' + 'isnull(rtrim(replace('+ sc.name + ','''''''','''''''''''')),'''')' + '+'''''',''+'
when sc.type = 35 then
'''''''''+' + 'isnull(rtrim(replace(substring('+ sc.name + ',1,1000),'''''''','''''''''''')),'''')' + '+'''''',''+'
else
'isnull(convert(varchar,' + sc.name + '),''null'')+'',''+'
end
from syscolumns sc
where sc.id = object_id(@vsTableName)
order by ColID

-- Build the matching comma-separated column list.
select @vsCols = @vsCols + sc.name + ','
from syscolumns sc
where sc.id = object_id(@vsTableName)
order by ColID

-- Strip the trailing '+' from the value expression and the trailing ',' from the column list.
select @vsSQL = substring(@vsSQL,1,datalength(@vsSQL)-1)

select @vsCols = substring(@vsCols,1,datalength(@vsCols)-1)

-- Execute the generated SELECT to produce one VALUES fragment per row,
-- then wrap each fragment in a complete INSERT statement.
insert #tmp
exec ('select ' + @vsSQL + ' from ' + @vsTableName)

update #tmp
set sqltext = 'insert ' + @vsTableName + '(' + @vsCols + ') values(' + substring(sqltext,1,datalength(sqltext)-1) + ')'

insert #tmp2
select 'DELETE from ' + @vsTableName

insert #tmp2 values ('GO')

-- syscolumns.status bit 128 flags an identity column; if the table has one,
-- bracket the INSERTs with SET IDENTITY_INSERT so the original values are preserved.
if (select count(id) from syscolumns where id = object_id(@vsTableName) and ((status & 128) = 128) ) = 1
begin
insert #tmp2
select 'set identity_insert ' + @vsTableName + ' on'
end

insert #tmp2
select * from #tmp

if (select count(id) from syscolumns where id = object_id(@vsTableName) and ((status & 128) = 128) ) = 1
begin
insert #tmp2
select 'set identity_insert ' + @vsTableName + ' off'
end

insert #tmp2 values ('GO')

insert #tmp2
select 'update statistics ' + @vsTableName

insert #tmp2 values ('GO')

delete #tmp

fetch next from csrTables into @vsTableName

end

close csrTables
deallocate csrTables

-- Final massaging of the generated INSERTs:
-- a dangling comma before the closing parenthesis becomes an explicit NULL...
update #tmp2
set sqltext = substring(sqltext,1,charindex(',)',sqltext)-1) + ',NULL)'
where not(charindex(',)',sqltext) = 0)

-- ...empty string literals produced by NULL character columns become null...
update #tmp2
set sqltext = replace(sqltext, ',''''',',null')
where not (charindex(',''''',sqltext) = 0)

-- ...including when the NULL character column is the first value after the opening parenthesis.
update #tmp2
set sqltext = replace(sqltext, '(''''','(null')
where not (charindex('(''''',sqltext) = 0)

set nocount off

select sqltext from #tmp2 order by id

go

drop table #tmp
drop table #tmp2

The resulting output will be an auto-generated INSERT script that can be used to migrate data and/or seed a new database/tables.
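
To give you a feel for the shape of the output, here is roughly what the script would produce for a hypothetical two-column table named Widgets that has an IDENTITY column (illustrative only; the table and values are made up, and your actual INSERT lines will depend on your data):

DELETE from Widgets
GO
set identity_insert Widgets on
insert Widgets(WidgetID,WidgetName) values(1,'Sprocket')
insert Widgets(WidgetID,WidgetName) values(2,'Flange')
set identity_insert Widgets off
GO
update statistics Widgets
GO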

Things to grok:

  • HONEST DISCLAIMER: Notice that this has limitations (8000 characters of SQL per INSERT line, NO image data types). It handles about 90-95% of the needs that I use it for, but please understand that I have tweaked and embellished it over time. So use it at your own risk. If you find a bug/enhancement, please let me know.
  • Notice the nifty "dynamic aggregator" queries used to build up the main @vsSQL and @vsCols variables. This pattern can be leveraged over and over to generate powerful SQL.
  • Don't freak out about the batches of single quotes. If you are going to do dynamic SQL, you need to know how to use these. Each literal quote must be represented by two single quotes. For example: SELECT '''' will result in the display of one (1) single quote in the result set. The best thing to do is to experiment with this until you grok it (see the short scratch script after this list).
  • Notice the use of the "EXEC (@sql)" construct to dynamically execute SQL to generate more SQL.
  • Notice the concluding few UPDATE queries that "massage" the INSERT SQL to handle NULLs and final columns in the table. Why punish yourself by embedding all of the conditional logic in the main queries when a few simple UPDATE statements can "massage" the data at the end? I use this pattern routinely. This is one distinct advantage of SQL/set-based operations vs. iterative constructs...so use it!
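
If the quote doubling (and the INSERT...EXEC trick) still looks like line noise, here is a tiny scratch script you can run to see both rules in action. It isn't part of the generator, just plain T-SQL for experimenting:

-- Every literal single quote inside a string constant is written as two single quotes.
select ''''                              -- returns: '
select 'It''s working'                   -- returns: It's working
select 'values(''ALFKI'')'               -- returns: values('ALFKI')

-- The same rule applies when building dynamic SQL, and INSERT ... EXEC captures
-- the result set of the dynamically executed statement into a table.
declare @sql varchar(200)
select @sql = 'select ''hello'' as Greeting'

create table #demo (Greeting varchar(20))
insert #demo exec (@sql)                 -- runs: select 'hello' as Greeting
select * from #demo
drop table #demo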

Coming next on my blog:
Hope you found this blog entry useful. I know that this tool has been indispensable for me. So, you may be wondering..."Is this guy just a SQL junkie?" Well, the answer is yes and no. I am a SQL junkie, but not "just" a SQL junkie. I also specialize in the design, architecture and implementation of enterprise-wide n-tier applications.

So, I'm going to share a bit of a blend in my next blog. See how I leverage the previous pattern to generate VB.NET Business Classes to be consumed in an n-tier .NET Smart Client application..."Dynamic SQL begets .NET Business Classes". My theory is, I like classes and I like SQL, so why code redundant patterns by hand? The answer is, you don't have to!

Now, go and "REALLY" impress your SQL jockey friends!

Dynamic T-SQL in Just Minutes!

Ever feel left behind when one of those SQL Server jockeys breezes through a complex query that generates SQL on the fly? After all, it's a bit over the top to use SQL to generate SQL, isn't it? Well, it's time to level the playing field. I'm going to pass on a simple pattern that you can use to generate the most sophisticated queries. This pattern will lead to an endless supply of SQL/database tools to add to your library.

Problem to Solve
Write a query that generates a CREATE TABLE definition from any table. Then write a query that generates INSERT statements to fill a copy of that table based on the current data.

So you are probably saying to yourself...Wouldn't it just be easier to type the CREATE TABLE definition by hand? Maybe so for the first one, but imagine doing it for an entire database of 300+ tables. Or you may say...Couldn't I just use the T-SQL generator in Enterprise Manager (or equivalent)? Sure, but if you want to take your skills to the next level, it's time to make an investment. Besides, does anyone else hate the excess baggage that those tools include in the generated SQL? Me too. So, dynamic SQL to the rescue.

Goals
I always try to make my SQL tools reusable and tight. I could do a lot of things using brute force CURSORS or WHILE loops in SQL, but I try to avoid that. So I'm going to give you a little gem that will aggregate the statement on-the-fly in one SQL statement. So without further delay....

Solution to the Problem
Paste the following query into a SQL Server Query Analyzer window and set the current database to Northwind. Set the output format to Text. Then run the query....


declare @vsSQL varchar(8000)
declare @vsTableName varchar(50)
select @vsTableName = 'Customers'

select @vsSQL = 'CREATE TABLE ' + @vsTableName + char(10) + '(' + char(10)

-- Append one line per column: name, data type, length for character types, and nullability.
-- Note: sc.Length is the storage size in bytes, so Unicode types (nchar/nvarchar) show doubled lengths.
select @vsSQL = @vsSQL + ' ' + sc.Name + ' ' +
st.Name +
case when st.Name in ('varchar','nvarchar','char','nchar') then '(' + cast(sc.Length as varchar) + ') ' else ' ' end +
case when sc.IsNullable = 1 then 'NULL' else 'NOT NULL' end + ',' + char(10)
from sysobjects so
join syscolumns sc on sc.id = so.id
join systypes st on st.xusertype = sc.xusertype
where so.name = @vsTableName
order by
sc.ColID

select substring(@vsSQL,1,len(@vsSQL) - 2) + char(10) + ')'

The resulting output will be an auto-generated CREATE TABLE statement:

CREATE TABLE Customers
(
CustomerID nchar(10) NOT NULL,
CompanyName nvarchar(80) NOT NULL,
ContactName nvarchar(60) NULL,
ContactTitle nvarchar(60) NULL,
Address nvarchar(120) NULL,
City nvarchar(30) NULL,
Region nvarchar(30) NULL,
PostalCode nvarchar(20) NULL,
Country nvarchar(30) NULL,
Phone nvarchar(48) NULL,
Fax nvarchar(48) NULL
)

Things to grok:

  • Notice the nifty main SQL statement that aggregates the SQL string variable @vsSQL on-the-fly.
  • Notice how to leverage the combination of sysobjects (for tables), syscolumns (for column definitions) and systypes (for data type properties).
  • Notice the use of char(10) to append line breaks to the SQL for decent formatting.
  • Notice the use of the CASE statement to conditionally append the length. This could also be used to handle scale/precision for FLOAT, REAL and DECIMAL data types (see the sketch after this list).
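
To make that last bullet concrete, here is one way the length CASE could be extended to emit precision and scale as well. This is just a sketch of the idea, not part of the original query; it leans on the xprec/xscale columns of syscolumns and uses Northwind's Products table purely as a sample:

-- Sketch: append (length) for character types, (precision,scale) for decimal/numeric,
-- and (precision) for float, using syscolumns.xprec and syscolumns.xscale.
select sc.Name,
st.Name +
case when st.Name in ('varchar','nvarchar','char','nchar') then '(' + cast(sc.Length as varchar) + ')'
when st.Name in ('decimal','numeric') then '(' + cast(sc.xprec as varchar) + ',' + cast(sc.xscale as varchar) + ')'
when st.Name = 'float' then '(' + cast(sc.xprec as varchar) + ')'
else '' end as TypeSpec
from sysobjects so
join syscolumns sc on sc.id = so.id
join systypes st on st.xusertype = sc.xusertype
where so.name = 'Products'
order by sc.ColID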

Coming next on my blog:
See my next blog post on how to "Dynamically generate INSERT statements" from the contents of a table. This is like gold when creating database migration scripts, and especially for transferring data between varying SQL-based platforms (SQL Server, Sybase, Oracle, Informix, etc.).

Now, go and impress your SQL jockey friends!