Windows Communication Foundation

My prediction is that Windows Communication Foundation will turn out to be a huge success for Microsoft, and we plan to utilize it on Ellis (project name).

What is a Service-Oriented Architecture (SOA)?

Much has been written about the latest buzzword to emanate from the cubicles of IT shops and, surprise surprise, it's another acronym. SOA stands for Service-Oriented Architecture, and has been around for many years. In recent months it has taken off, with vendors, professional service firms, webzines, and my plumber Jim all providing their own definitions and explanations, pro and con. In the vendor space, IBM is now packaging 13 (not a typo) WebSphere and Tivoli products around SOA. However, finding a common understanding, let alone a simple definition, of SOA is not easy.

So what is all the buzz about? Wikipedia defines SOA as follows:

In computing, the term Service-Oriented Architecture (SOA) expresses a perspective of software architecture that defines the use of services to support the requirements of software users. In an SOA environment, nodes on a network make resources available to other participants in the network as independent services that the participants access in a standardized way. Most definitions of SOA identify the use of Web services (i.e., using SOAP or REST) in its implementation. However, one can implement SOA using any service-based technology.

And the definition goes on and on like this, smelling very much like the latest brew to come pouring out of the IT department down in the bowels of the company's basement.

As the technical architect on a project that is on the SOA path, I have spent considerable effort trying to determine what SOA truly means to a project, and to the company's strategy and growth plans.

As my current project evolves, I plan to keep an active journal detailing what goes right, and not so right, related to SOA. For this first post, here's a quick summary of what SOA means to me, and how it relates to the success or failure of the engagement:

SOA is not so much an architectural framework, a set of architectural best practices, or a specific vendor solution as it is a guiding principle, or philosophy, around the purpose a software project or packaged vendor solution lives for. The people controlling the purse strings in an organization are very concerned about the real ROI a project is forecast to bring back. What SOA brings to the table, more than anything else, is the opportunity for greatly expanded returns on the investment the project required. This is accomplished through extensibility and ubiquity. An example might help describe this.

If I did everything correctly as the software architect on a project to custom build a solution.. everything "by the book".. and created coarse-grained, stateless interfaces that abstracted the complexity away from the consumer, and incorporated other best practices into the design.. BUT I used .NET remoting as the transport protocol because it offers higher-performing software, the central tenet of SOA has just been violated.

That is, my solution may be solid in every way, except I just forced my new partner, or the latest company merger, to move away from their Sun/Java shop.. and the huge investment associated with it.. and instead to spend lots of money and resources purchasing Windows and .NET, because the "perfect solution" I created requires .NET on both the client and the server. The reach of the service interfaces I created, and the cost of extending them to my business partner, just went up.

Forcing the consumers of your services onto one particular vendor's technology might be a perfectly valid solution.. perhaps the best solution.. but it's not SOA.

By utilizing XML, web services, SOAP, and other vendor-neutral technologies, your services just became available to a far larger pool of consumers. By creating a framework that is consumable by whomever.. on whatever platform is the preferred flavor of the organization.. an investment in a long-term growth strategy can be more easily realized.
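
To make that concrete, here is a minimal sketch of what such a vendor-neutral endpoint might look like in WCF (the contract name, operation, and address are hypothetical, not from any real project). The point is the binding: BasicHttpBinding exposes the same coarse-grained, stateless interface over plain SOAP/HTTP, so a Java or Perl consumer can call it just as easily as a .NET one:

using System;
using System.ServiceModel;

// A coarse-grained, stateless contract: one call does meaningful work,
// and nothing about it requires .NET on the consumer's side.
[ServiceContract]
public interface IQuoteService   // hypothetical contract
{
    [OperationContract]
    decimal GetQuote(string productId, int quantity);
}

public class QuoteService : IQuoteService
{
    public decimal GetQuote(string productId, int quantity)
    {
        // Stubbed pricing logic, for illustration only.
        return 9.99m * quantity;
    }
}

public static class QuoteHost
{
    public static void Main()
    {
        using (ServiceHost host = new ServiceHost(typeof(QuoteService),
            new Uri("http://localhost:8080/QuoteService")))
        {
            // BasicHttpBinding speaks plain SOAP over HTTP, reachable from
            // any platform. Swapping this for .NET remoting would silently
            // lock out non-.NET partners, which is the violation above.
            host.AddServiceEndpoint(typeof(IQuoteService),
                new BasicHttpBinding(), "");
            host.Open();
            Console.WriteLine("Service listening. Press Enter to stop.");
            Console.ReadLine();
        }
    }
}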

On smart clients, adhesion, and strong objects

A challenge presented to developers when designing smart client apps is keeping the solution's assemblies, which must communicate with one another at some point, as loosely coupled as practically possible.

Reaching this goal greatly increases the "intelligence" of the software, because it increases your capability to version the app. The "smart" in smart client application design is, among other things, the capability to keep the code base as feature-rich as possible, with minimal impact both on the developer tasked with creating and managing the source (relatively speaking) and on the end user consuming the updates on the client.

At the same time, designing strongly typed objects can reduce the headache of building, and managing over time, the interaction these objects have with the user interface layer of the application.

How do you design components that need to communicate with each other, while removing the constant dependency inherent in early-bound objects, in a way that is repeatable and easy to manage? The following image illustrates this pattern, with a description below.

EventBroker
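
In case the image doesn't come through, here is a minimal sketch of the idea in C# (the type and topic names are mine, not a specific library's): components publish and subscribe through a shared broker keyed by topic name, so no assembly needs a compile-time reference to any other.

using System;
using System.Collections.Generic;

// A bare-bones event broker: publishers and subscribers know only the
// broker and a topic string, never each other.
public static class EventBroker
{
    private static readonly Dictionary<string, Action<object>> topics =
        new Dictionary<string, Action<object>>();

    public static void Subscribe(string topic, Action<object> handler)
    {
        if (topics.ContainsKey(topic))
            topics[topic] += handler;   // chain additional handlers
        else
            topics[topic] = handler;
    }

    public static void Publish(string topic, object payload)
    {
        Action<object> handlers;
        if (topics.TryGetValue(topic, out handlers))
            handlers(payload);          // notify every subscriber
    }
}

// Usage: a business assembly raises "CustomerSaved" and the UI assembly
// reacts, with neither referencing the other directly:
//   EventBroker.Subscribe("CustomerSaved", o => RefreshCustomerGrid(o));
//   EventBroker.Publish("CustomerSaved", customer);

Because the only shared contract is the topic string (and, in practice, a payload type), individual assemblies can be versioned and redeployed independently, which is exactly the loose coupling the pattern is after.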

Open Source software as a Commodity? An exchange with an expert

Last week I had the following email exchange with Andy Oram, a highly respected editor and author at O'Reilly, specializing in Open Source technologies and the movement in general. I initiated the exchange after a conversation with Keith Steele, who works on the Pisces project at BPA, concerning the word "commodity" and its usage with Open Source software. I think you will find it interesting....

My initial email:

Andy

As someone who is genuinely interested in Open Source software, and the movement, but who comes from a Microsoft background, I have a simple question for you that hopefully you can help me with.

The dictionary defines the word, "commodity" as such:

1. Something useful that can be turned to commercial or other advantage: “Left-handed, power-hitting third basemen are a rare commodity in the big leagues” (Steve Guiremand).

2. An article of trade or commerce, especially an agricultural or mining product that can be processed and resold.

3. Advantage; benefit.

4. Obsolete: A quantity of goods.

Can you explain why this word is now accepted in the lexicon of the Open Source movement? For example, in your article "Linux becomes a commodity at LinuxWorld" (http://www.oreillynet.com/pub/wlg/3614), your first sentence reads, "Perhaps the clearest indication that operating systems are becoming a commodity".

If the modus operandi of the movement is collaboration, sharing, and free software, then why is this word, normally used to describe capitalistic vehicles for wealth generation, being used in this way?

Sincerely,
Kirk Miller

Andy's Response:

That's a really great question! Thanks for writing and pointing out the
odd shift in meaning. When I look up "commoditization" on Wikipedia (a
great example of my meaning of the term, commoditizing encyclopedia
content) I get the formal definition you mentioned in your mail. But
when I look up "commodity," I see some material that supports the common
use in the open source community. Here's a relevant excerpt:

In the original and simplified sense, commodities were things of value,
of uniform quality, that were produced in large quantities by many
different producers; the items from each different producer are
considered equivalent. It is the contract and this underlying standard
that define the commodity, not any quality inherent in the product. One
can reasonably say that food commodities, for example, are defined by
the fact that they substitute for each other in recipes, and that one
can use the food without having to look at it too closely.

Wheat is an example. Wheat from many different farms is pooled.
Generally, it is all traded at the same price; wheat from Joe's farm is
not differentiated from wheat from Jane's farm. Some uniform standard of
quality must necessarily be assumed.

---

I think somebody took the original meaning (which you laid out in your
email) and took a bit of a flight of fancy to come up with the more
controversial meaning.

Open source didn't start it. For a long time I've heard phrases such as,
"IBM-compatible PCs became a commodity." That is, all companies were
pretty much the same and it didn't matter to consumers where they
bought, except for subtle differences in support. And that was credited
with making Microsoft software so valuable, and making it the focus of
changes in the computer field.

And I've heard it said that this kind of "commoditization" of the PC
lowered the prices so much that for the first time, a few years ago, the
price of the software became significant. So companies started to wonder
how they could cut down software license fees--perhaps a boost to open
source.

So, in short, the term seems to have been jerked around a lot to point
to a socio-economic phenomenon. I think the phenomenon is real, but it
might be too bad that a good term had to be reused to describe it.

My Response:

Andy

Thank you for the quick and enlightening response to my inquiry.

As I looked around the net some more I came across an article that described software "Commoditization" in a similar way.

Here's the relevant excerpt:
Open source is helping turn significant chunks of the IT infrastructure into commodities by offering alternatives to proprietary software. (See "Enterprise Ready," left, for a list of these tested open-source applications.) This is software as corn or wheat. As the products become indistinguishable, buyers will choose the cheapest, most reliable supplier they can find, and it's hard to beat open source on price.

That makes sense, but I think the term is misleading for the following reason: some software is surely heading for, or has already become, homogeneous.. or "indistinguishable" from one competitor's version to the next. Web servers come readily to mind. In this respect, the excerpt's first sentence is accurate. However, further "up the stack", as I've also read recently.. referring to the applications that reside on top of "commoditized" software.. it seems to me that the tools and applications will always remain far too specific in what they do, and in what problems they solve, to become generic. In this sense, proprietary vendors will probably always remain one step (or more) ahead of the commoditization that, perhaps, will continue to take place.

Andy's last response:

I agree with you that it's much harder to marshal a lot of volunteers
(or willing vendors) in narrow, specialized spaces. What I wonder is
whether innovation can take place in open source too. On small things,
the proprietary vendors seem to take the lead and do a better job. But
on world-changing ideas such as the Internet and the Web, open source
was there first.

Andy

My final response:

The question I have is how far "up the stack" "Commodified Software" can get.

It seems to me that the more commodification takes place in software, the better the case Open Source advocates can make to the business community that adopting Open Source software makes economic sense. However, I fear that, in the attempt to label an ever-increasing range of applications as "homogenized.. commoditized" with the goal of expanding the presence of Open Source, what could lose out is functional innovation... what is also known as the strategic advantage that makes one product shine when compared to its cousins. Ultimately, even if Open Source wins out in the end, the compromises that victory required might have deleterious effects on the entire industry.

Grouping data using the ComponentOne FlexGroup Control

A popular requirement when displaying data in a custom .NET WinForms application is to structure the data in a hierarchical, tree-view format. However, the built-in support for this type of display is limited. Using the ComponentOne FlexGrid control, one can build a set of data tables into a DataSet object and then relate them together. However, getting this set up and moving the data into the FlexGrid requires considerable effort.

Another way to get this result is to use a FlexGroup control from ComponentOne. It implements Outlook-style grouping and filtering using the FlexGrid. However, the control is not included in any of ComponentOne's developer suites. Instead, they provide the source code to the control, and it's up to you to build an instance that can then be used in your project.

Steps
1. Download the FlexGroup sample code under the FlexGrid control at the ComponentOne website: http://www.componentone.com/pages.aspx?pagesid=113

2. Once the download is complete, extract the compressed files and open the FlexGroup.vbproj VB.NET project.

3. View the properties window for the FlexGroupVB project file and change the Output type from a Windows Application to a Class Library.

4. Re-build the solution. When complete you can close this solution.

5. Open your solution. From the Toolbox, click the Customize Toolbox option. In the .NET controls section, browse to the Bin directory of the FlexGroup folder you downloaded. There, you will find the compiled control named FlexGroup.dll. Add this to your Toolbox. Back at your Form, the FlexGroup control is now a selectable item in the Toolbox. Select it and drag or paint it onto a form.

6. Set the control's Grid.DataSource property equal to a DataSet table. Run the project and display the form. You will see a display that looks like the example shown below.

Additional Note
The DataSet you use should be a flat result set with the parent/child relationships built in.
For example:
Author1 BookA
Author1 BookB
Author1 BookC
Author2 BookX
Author2 BookY
Author3 BookZ
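
For illustration, here is one way such a flat result set might be assembled in code before being bound to the control (the column names and the flexGroup1 variable are hypothetical):

using System.Data;

static class FlexGroupData
{
    // Build a flat table where the parent/child relationship is expressed
    // by repeating parent values, per the Author/Book example above.
    public static DataTable BuildAuthorBooks()
    {
        DataTable books = new DataTable("AuthorBooks");
        books.Columns.Add("Author", typeof(string));
        books.Columns.Add("Book", typeof(string));

        books.Rows.Add(new object[] { "Author1", "BookA" });
        books.Rows.Add(new object[] { "Author1", "BookB" });
        books.Rows.Add(new object[] { "Author2", "BookX" });

        return books;
    }
}

// Per step 6, the table is then bound to the control's inner grid:
//   flexGroup1.Grid.DataSource = FlexGroupData.BuildAuthorBooks();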

There is also a property named .ShowGroups. If this is enabled, you can drag and drop columns into a grouping section located just above the grid, allowing the user to customize the groupings displayed. In my example below, I have chosen to hide this feature so I can control the look and feel of the user interface.

One "gotcha" I found was dealing with sibling columns. For example, City, State, and Zip are attributes of an address and thus siblings to each other. If you allow each of these columns in the dataset, the display of the data appears less fluid and appealing. I got around this by including sibling columns together as one column (e.g. City + state + zip as Address).

C1 - FlexGroup Example

What is it with the Open Source community??

With the Open Source community back in the spotlight (http://www.technewsworld.com/story/40315.html), I'm truly perplexed by what motivates them, and I have mixed emotions about the effect they've had on the software industry.

I respect someone who puts great effort into a software application and gives it away to the world free of charge. There are many amazing tools and programs available to us all. I've found myself relying on Open Source tools during past projects, and find them to be almost uniformly of high quality. Think of Apache for web servers, NAnt for build process management, and similar high-quality applications.

However, my patience starts running thin when Open Source advocates turn from their passion for great software, and the spirit of sharing, to preaching against the evils of the profit motive. It's one thing to actively promote the benefits of free software (http://gnu.netvisao.pt/philosophy/why-free.html); it's an entirely different proposition to alienate and berate those who build and/or use proprietary software.

In my opinion, Open Source developers fall into three categories:

True Believers
This group lives and breathes Open Source, fully grasps the philosophy behind it, and despises most things corporate and all things Microsoft. True Believers don't separate Open Source from the greater world of politics. In their view, Open Source is just the latest tool in the ongoing struggle against the evils of capitalism, corporations, and corruption. Richard Stallman (http://www.stallman.org/) personifies this group.

Profit Seekers
This group is far less vocal than the True Believers but sees an opportunity to make a buck creating products and services around the Open Source concept. To an extent, they speak the language of the True Believers, but observe their actions and it's clear they are building revenue streams to make money. Profit Seekers struggle with a deep-seated conflict: their profit-seeking motive bangs up against the free-and-sharing philosophy of Open Source licensing agreements (http://www.gnu.org/licenses/licenses.html#GPL). Matthew Szulik (http://www.redhat.com/about/corporate/team/szulik.html) personifies this group.

Dabblers
This wide-ranging group cuts across many different and unassociated sub-groups: students checking out new technologies, workers pushed toward Open Source because it's used at their place of work, and the technically adept who couldn't care less about the philosophy of 1s and 0s and simply want to use the best available solution. Dabblers are either unaware of the battle the True Believers are waging, or are aware of it but are practical in their ways and care far more about putting food on the table than about psycho-babble over "technical purity".

Finally, let me conclude with this.. in my view there is room both for Open Source products and services and for people like myself trying to earn an honest buck by "selling out". The war some in the Open Source movement have ignited only serves to bring the entire industry down... with outsourcing, razor-thin competition, and the inherent complexity of technology, the last thing we need is a flame war between developers that serves no practical purpose. In a world where software doesn't pay, where's the motivation to get up in the morning and slog away for ten hours cutting code? It may seem well intended to share your hard work with the world free of charge, but as the old saying goes, the road to hell is paved with good intentions.

Buy or Build??

A common dilemma presented to decision makers in the technology groups of many companies is whether to buy software off the shelf or to build the software from scratch. As someone who has faced this decision, and seen the results from both sides, allow me to present my 2 cents.

There are certainly many situations when buying software makes sense. Buying pre-built software usually makes sense when the application has reached a level of maturity where change is minimal, such as standard accounting applications, database technologies, browsers, and the like. The age-old idiom, "why re-invent the wheel?" fits well here.

However, I would argue that some applications that seem to fit the buy criteria, such as contact management tools, often have persuasive arguments pointing toward the opposite conclusion. For that matter, I would strongly argue that for any application that must fit the unique requirements of a given organization, buying a solution from some vendor is a major blunder, and I've personally seen management careers take a major hit as the disastrous results of this ill-fated decision are eventually realized.

Here are my top reasons why this is so:

Whose Process??
Why should a company or organization be forced to alter the way they are used to doing business in order to fit some new software into the process? Shouldn't it be the other way around? We see this all the time. The company has a standardized process.. albeit, typically, a flawed one. The employees are comfortable, and the work gets done. Then, some slick-tongued software salesperson comes along and convinces management that, if they would only buy the vendor's software, all their problems would magically go away. The software is purchased... often at an extremely high price. A few hundred thousand (or more) dollars are then spent on high-priced consultants to come in and integrate the software. Finally, a new application is launched that sort of works as advertised, as long as the company follows the VENDOR'S way of doing business. Employees are left scratching their heads, quietly berating the new software, and wishing they could have their old Excel spreadsheet back.

Who owns the kingdom??
Software purchased from a vendor does not usually come with the source code, and thus getting "under the hood" is virtually impossible. This always comes as a shock to the decision maker not adept at technology. The software is launched, and tweaking it to fit a new model is very difficult, if not impossible. As the owner of the keys to the source code kingdom, the organization that chose to build its own version of the software can quickly react to changing requirements and control the change management process. The key word here is CONTROL.

How long is that line??
During the sales phase of the process, the slick-tongued salesperson is quick to promise and always short to deliver when it comes to support. During the sale, your company is the center of the salesperson's universe... no one matters more to a good salesperson. Then, two years later, some issue comes along, and you quickly find out that the vendor's universe contains players far larger, and far more important to their bottom line, than you could possibly be. Thus, you find your problems sitting in their support queue along with everyone else's. As the owner of the software, YOU control the support process. YOU decide the priorities. YOU determine how much you will spend on support. As a decision maker in a company, those are assets that make YOU look good.

Let's figure this out
My final reason why software should almost always be developed in-house is the most vital one to the success of the project. In a word... collaboration. When software is developed by your own people, you can involve the end-user community very early in the process, and they will ultimately decide whether the software works FOR THEM. Thus, instead of layering on another "new gadget" for them to fit into their already busy days, you've given them a personal stake in how the software will improve their jobs. This is no small thing, and I've found it makes ALL THE DIFFERENCE in whether a project is ultimately successful or not.

Data Input Validation Using ASP.NET Forms

Introduction

Ask a group of web developers about the critical importance of data validation and you're sure to get many heads nodding, for they know all too well that the security of their own jobs depends on how successful they are at executing this task. Sites that lack data validation features are sure to invite errors downstream, as that information is stored, processed, and ultimately relied on erroneously.

Since the inception of web sites that did more than simply act as fancy brochures, there have been scant resources for the web developer to utilize for this purpose, and often, getting data validation to work properly felt like fitting round pegs into square holes.

Background

One truism that most web developers I've spoken to agree on is that end users of a website are innately fallible and will mess up data entry in a web form if given the chance. It is therefore the job of web developers to make their pages utilize a client-side scripting language, such as JavaScript, to verify the data being entered. What needs to be validated is often up to the web developer, whose duty it is to determine how extensively the web form needs to be fool-proofed. Often you just need to make sure that required fields are entered. Other times you need to ensure that the correct data type is entered; and still other times you need to make sure a user's input conforms to a certain standard (such as telephone numbers, social security numbers, etc.).

Let's look at an example. Say you want to collect information from your users about how they rate your site. You may have a form asking for the following fields:

• Their full name
• Their e-mail address (which you make optional)
• A rank for the site ranging from 1 – 10

We would want to write a JavaScript function to ensure that the name field has a value in it, and that the site ranking has a value between 1 and 10. Let's take a look at what the JavaScript would look like (the helper routines it calls are sketched a bit further below):


<SCRIPT LANGUAGE="JavaScript">

function ValidateData() {
    var CanSubmit = false;
    // Check to make sure that the full name field is not empty.
    CanSubmit = ForceEntry(document.forms[0].txtName, "You must supply a full name.");
    // Check to make sure the ranking is between 1 and 10.
    if (CanSubmit) CanSubmit = ValidRanking();
    return CanSubmit;
}

</SCRIPT>

And the form that submits it:

<FORM NAME="frmSiteRanking" METHOD="GET" ACTION="SiteRanking.asp" ONSUBMIT="return ValidateData();">
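
The ForceEntry and ValidRanking helpers are referenced above but not shown; here is a minimal sketch of what they might look like (the txtRanking field name is an assumption):

<SCRIPT LANGUAGE="JavaScript">
// Alerts and returns false if the given field is empty; true otherwise.
function ForceEntry(field, message) {
    if (field.value == "") {
        alert(message);
        field.focus();
        return false;
    }
    return true;
}

// Returns true only if the ranking field holds a number from 1 to 10.
function ValidRanking() {
    var rank = parseInt(document.forms[0].txtRanking.value, 10);
    if (isNaN(rank) || rank < 1 || rank > 10) {
        alert("Please enter a ranking between 1 and 10.");
        return false;
    }
    return true;
}
</SCRIPT>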

This is one small example of validating data prior to the introduction of ASP.NET. In this world, developers had to write all of their own validation routines and cut and paste them into the various ASP scripts that needed to employ them. Imagine needing to write similar scripts for every last form input an end user might mess up. All in all, it was a real headache. This is where ASP.NET form validation server controls come into play.

ASP.NET to the Rescue

Validation Web controls are ASP.NET Web controls designed specifically to validate form field entries. For example, ASP.NET contains a RequiredFieldValidator control, which, as its name suggests, can be used to ensure that the user enters a value into a form field (such as a TextBox). Specifically, ASP.NET provides the following form field validation controls:

1. RequiredFieldValidator - Checks to make sure the user entered a value.
2. CompareValidator - Compares a form field's value with the value of another form field using relations like less than, equal, not equal, etc.
3. RangeValidator - Ensures that a form field's value is within a certain range.
4. RegularExpressionValidator - Makes sure that a form field's value corresponds to a specified regular expression pattern.
5. CustomValidator - Checks the form field's value against custom validation logic that you, the developer, provide.
6. ValidationSummary - Displays a summary of the results from all validation controls on the page.

By default, page validation is performed when a control such as a Button, ImageButton, or LinkButton is clicked. You can prevent validation from being performed when a button control is clicked by setting the CausesValidation property of the button control to false. This property is normally set to false for a cancel or clear button, as shown in the sketch below.
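
For example, a Cancel button that skips validation might be declared like this (a sketch in the same markup style as the examples below; the id is hypothetical):

<asp:Button id="btnCancel" runat="server" Text="Cancel" CausesValidation="false" />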

Let's take a closer look at one of these controls.

RequiredFieldValidator

Footprint

<asp:RequiredFieldValidator
    id="ProgrammaticID"
    ControlToValidate="ProgrammaticID of control to validate"
    InitialValue="value"
    ErrorMessage="Message to display in ValidationSummary control"
    Text="Message to display in control"
    ForeColor="value"
    BackColor="value"
    runat="server">
</asp:RequiredFieldValidator>

Use the RequiredFieldValidator control to make an input control a mandatory field. The input control fails validation if the value it contains does not change from its initial value when validation is performed. This prevents the user from leaving the associated input control unchanged. By default, the initial value is an empty string (""), which indicates that a value must be entered in the input control for it to pass validation.
Note: Extra spaces at the beginning and end of the input value are removed before validation is performed. This prevents a single space entered in the input control from passing validation.

Example
The following example demonstrates how to use the RequiredFieldValidator control to make a TextBox control a mandatory field:

<form runat="server">
    Name: <asp:TextBox id="Text1" Text="Enter a value" runat="server" />
    <%-- ControlToValidate links the validator to the text box --%>
    <asp:RequiredFieldValidator id="RequiredFieldValidator1"
        ControlToValidate="Text1"
        Text="Required Field!"
        runat="server" />
    <asp:Button id="Button1" runat="server" Text="Validate" />
</form>

Conclusion

Instead of the web developer needing to write extensive client-side JavaScript to validate each field, the validation control does much of the work itself, linking validation rules to any HTML input control requiring similar examination. Significant time is saved, and the developer can focus on what really counts… the actual content being developed, instead of the minutiae of old-style client-side validation code.