Saturday, May 18, 2013

Building a house? Make room for guests (especially the ones that want to stay a while)...

First, a couple of disclaimers: 1) This has nothing to do with real estate. 2) I do not now, nor have I ever, claimed the title of "The Best Software Engineer in History". I do believe, however, that I've been doing this long enough to know which practices are solid, and which are just plain shoddy work. 3) This post originated almost two years ago. Since then I've seen instances where a solid design did exist, but implementation over time left holes in the resulting user experience.

A couple of years ago, I posted on a fairly popular forum lamenting the software design practice of first building a data-driven application in Access and then simply up-sizing it when it got too big to handle. One of the things I didn't talk about in that post was the necessity of a solid database design. Since a company's data is the foundation of its overall business, I thought it might be prudent to flesh out more of my thoughts.

Data-driven applications, whether on the desktop or on the web, all have one thing in common apart from the (obvious) reliance on a database engine: they all must consider scalability, the ability to expand on what the original design expects, if the project is going to succeed over the long haul.

I say this because I believe that, even as a Software Engineer, it is possible to work yourself out of a job. Projects begin and end all the time. Some only last a couple of minutes and never make it out of the meeting. Others last for years, and grow to encompass entire companies over time. The latter projects are the ones that were designed with an eye toward the future, and the trend continues throughout the life of the system. As new features are added and obsolete ones are deprecated, the one constant that remains is the possibility of change, and making allowances for it.

Whether you learned in school or on the job, the one thing that we as engineers must consider is the structure of our product. How easy is it to maintain? Is someone after me going to be able to make sense of this without stepping through code for a month? Am I, as the one who wrote it in the first place, going to remember what was going on in this or that section in a year or two? When a contractor builds a house, no matter what the floorplan, there are standards that have to be met. That's why inspections are required on certain portions before any other work can continue on the house.

At this point in time, with all of the tools at our disposal, we should not only be promoting this concept, but training those who will one day sit down, at their desk or in their cubicle, to make changes to that thing we poured ourselves into: to think through their design first, and then look to see where they can leverage what already exists, without re-inventing the wheel.

I will now yield the floor...

Friday, May 17, 2013

Working the Team with TFS Test Cases... The Teaser

Hi! I've been out of pocket for a while now, and I thought I'd pop in to let you all know that I've not forgotten about you. Fact is, I've not had the opportunity to bang my head against my monitors as much as in times past. But not to worry! There has been much frustration and consternation afoot in the last couple of months. If you've been following me on LinkedIn, then you might know that I'm smack in the middle of creating something that has so far saved me not just hours of work. I'm talking DAYS! And it's using Team Foundation Server, no less!

In the coming posts, I'll be walking through the process of working with a large team of Testers, and assigning each of them the SAME Test Case, all from C# code.  It's in pre-Alpha right now, but as time moves along, I'm hoping to get through it. We'll work together, and hopefully learn some new stuff as well ...
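As a rough preview of the entry point we'll be using, here is a sketch against the TFS client object model. Everything here is an assumption for illustration: the server URL, project name, and test case ID are placeholders, and the actual assignment logic is deliberately left as a comment for the full write-up.

```csharp
// Sketch only: assumes references to the TFS client assemblies
// (Microsoft.TeamFoundation.Client, Microsoft.TeamFoundation.TestManagement.Client).
// URL, project name, and test case ID below are made-up placeholders.
using System;
using Microsoft.TeamFoundation.Client;
using Microsoft.TeamFoundation.TestManagement.Client;

class TestCasePreview
{
    static void Main()
    {
        var collection = TfsTeamProjectCollectionFactory.GetTeamProjectCollection(
            new Uri("http://tfsserver:8080/tfs/DefaultCollection"));
        var service = collection.GetService<ITestManagementService>();
        ITestManagementTeamProject project = service.GetTeamProject("MyProject");

        // Pull an existing Test Case work item by ID...
        ITestCase testCase = project.TestCases.Find(42);
        Console.WriteLine(testCase.Title);

        // ...and this is where the interesting part lives: walking the plan's
        // test points and assigning the same case to each tester on the team.
        // That's the subject of the posts to come.
    }
}
```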

Sunday, September 18, 2011

Scrolling the Textbox Inside a Deeply Nested UpdatePanel

One of the most challenging things (for me at least) is dealing with JavaScript. That's why I love the Ajax Control Toolkit from MS. Using these tools means I don't have to write a lot of JavaScript myself. The downside is that when the time does come to "get my hands dirty", I'm usually floundering around for a while until I hit on a nugget or two that I can use.

Such is the case with the subject of today's notes, so let's get to it, shall we?

The Scenario: a Content page in an ASP.NET 3.5 Web Application that relies HEAVILY on the toolkit for much of its functionality. (The fact that this is a Content Page is important here.) This page has several UpdatePanels, most of which have triggers of some sort. The one we will focus our attention on is nested inside the fourth TabPanel of a TabContainer, and is triggered by an external Timer. This Timer updates the text inside our subject TextBox. I should also mention that the TabContainer itself is ALSO inside an UpdatePanel, so we have something like this (the inner UpdatePanel, its TextBox, and the Timer are the pieces we will concern ourselves with):

<asp:Panel ID="Panel1" runat="server">
    <asp:UpdatePanel ID="UpdatePanel1" runat="server">
        <ContentTemplate>
            <asp:TabContainer ID="TabContainer1" runat="server">
                <asp:TabPanel ID="TabPanel1" runat="server" />
                <asp:TabPanel ID="TabPanel2" runat="server" />
                <asp:TabPanel ID="TabPanel3" runat="server">
                    <ContentTemplate>
                        <asp:Button ID="Button1" runat="server" Text="Begin Long Process" OnClick="Button1_Click" />
                    </ContentTemplate>
                </asp:TabPanel>
                <asp:TabPanel ID="TabPanel4" runat="server">
                    <ContentTemplate>
                        <asp:Panel ID="Panel2" runat="server">
                            <asp:UpdatePanel ID="UpdatePanel2" runat="server">
                                <ContentTemplate>
                                    <asp:TextBox ID="TextBox1" runat="server" TextMode="MultiLine" Width="550px" Height="455px" />
                                </ContentTemplate>
                                <Triggers>
                                    <asp:AsyncPostBackTrigger ControlID="Timer1" EventName="Tick" />
                                </Triggers>
                            </asp:UpdatePanel>
                        </asp:Panel>
                    </ContentTemplate>
                </asp:TabPanel>
            </asp:TabContainer>
        </ContentTemplate>
    </asp:UpdatePanel>
</asp:Panel>
<asp:Timer ID="Timer1" runat="server" Interval="3000" OnTick="Timer1_Tick" Enabled="False" />

Dizzying to say the least, and I wrote it!  OK let's move on.

The task is to initiate a VERY long-running process (NO, WE ARE NOT THREADING AN ASP.NET APPLICATION!) using the Button1_Click event, and then update the status in TextBox1 on a different tab with the Timer1_Tick event. The timer interval is set at 3 seconds, because we have no idea what the external system is going to do, how it will respond, or how long it will take.

As a side note here, the button that kicks this off is on another tab because there are some conditions on that tab that play into the external call.  For this discussion, those conditions are not important.

Now that we have the UI set up, let's look at the server-side code. Remember, we are using the toolkit and its ability to handle most of the JavaScript required to take care of switching tabs and partial page refreshes, while still using the postback of the page to trigger the timer event (jQuery could do the same thing, but hooking to the endpoint of the external system was a bit more cumbersome in this scenario). Because this is a web app, the external system is available server-side by adding a reference, and then hooking to the endpoint when we need to.

The Button Click event:

protected void Button1_Click(object sender, EventArgs e)
{
    TextBox1.Text = string.Empty;
    Timer1.Enabled = true;  // this starts the timer

    // change the ActiveTab property of the TabContainer to the tab that has the TextBox
    try
    {
        // set the Text property of TextBox1 based on the conditions that have been previously met
    }
    catch (Exception ex)
    {
        TextBox1.Text = ex.Message + ex.StackTrace;
        Timer1.Enabled = false;
    }
}
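For the commented tab-switch step, the toolkit's TabContainer exposes an ActiveTabIndex property; a one-line sketch (the container ID here is an assumption, and the index is zero-based):

```csharp
// Assumed ID for the TabContainer; ActiveTabIndex is zero-based,
// so the fourth tab (the one holding TextBox1) is index 3.
TabContainer1.ActiveTabIndex = 3;
```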

The Timer Tick event:

protected void Timer1_Tick(object sender, EventArgs e)
{
    string result = <result of the call to the external system>;
    bool done = <condition for process to exit>;

    if (!done)
    {
        // put something in the textbox
    }
    else
    {
        // put something in the textbox
        // alert the user that the process is complete
    }
}

So far so good. We've got UI and server-side code, but that darn text box keeps snapping back to the top no matter what we put in it, so we have to do something on the client side to show some sort of progress. We are going to do this inside the ContentPlaceHolder that is handling the content for our page, to make sure that the ToolkitScriptManager on the MasterPage is already loaded. Otherwise, we get the dreaded 'Sys.Application is not defined' error, and a hard time (at least I had one) figuring out why.

With the toolkit, all kinds of things are registered with the page behind the scenes. One of them is the Sys.Application object. We are going to use it to register a JavaScript function when the page initializes; that function, in turn, hooks the end of every partial-page request. Because the timer fires every three seconds, our hook runs that often until the timer is turned off. So here we go, and I'll explain a bit as we go along:

<script type="text/javascript">
    // first we tell the application that there is a javascript function that we need to add
    Sys.Application.add_init(JavaScriptFunctionToRun);

    function JavaScriptFunctionToRun(sender) {
        // next we tell the page that we need to do something after each async request is completed
        Sys.WebForms.PageRequestManager.getInstance().add_endRequest(EndRequest);
    }

    function EndRequest(sender, args) {
        // here we are telling the page WHAT to do at the end of the request
        SetScrollPosition();
    }

    // and finally(!) the instructions to move the scrollbar to the bottom of the textbox
    function SetScrollPosition() {
        var textArea = document.getElementById('<%=TextBox1.ClientID%>');
        textArea.scrollTop = textArea.scrollHeight;
    }
</script>

If you've made it to this point in the post, I hope it was worth the effort.  If your process is especially long, you can show the user that the page really is doing something by using the Session, a label and the TimeSpan object in the server-side code.
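That elapsed-time trick can be reduced to a tiny helper. Everything below, names included, is a sketch I'm adding for illustration, not the original code: the idea is to stash DateTime.Now in the Session when the button is clicked, then on each tick format the difference into a Label.

```csharp
using System;

static class ProgressHelper
{
    // Given the start time stashed in the Session at Button1_Click and the
    // current time at each Timer tick, produce "mm:ss" text for a Label.
    // string.Format is used (rather than TimeSpan format strings) so it
    // also works on .NET 3.5.
    public static string FormatElapsed(DateTime start, DateTime now)
    {
        TimeSpan elapsed = now - start;
        return string.Format("{0:00}:{1:00}", (int)elapsed.TotalMinutes, elapsed.Seconds);
    }
}
```

In the page that's roughly `Session["ProcessStart"] = DateTime.Now;` in the click handler, and `LabelElapsed.Text = ProgressHelper.FormatElapsed((DateTime)Session["ProcessStart"], DateTime.Now);` in Timer1_Tick (the Session key and label name are made up for the example).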

TaTa for now, and Happy Coding!

Tuesday, July 12, 2011

ClickOnce ... Through the Looking Glass

By now, I'm sure my developer buddies that are versed in the Microsoft(r) paradigm have at least "heard" of ClickOnce. That wonderful feature, released with VS 2005, that has made many a small project ridiculously easy to toss onto a disc, push out to the company network, or even make available to the general public. Configurable, and fully operational, out of the box.

But what happens when your project domain requires that development happen behind the firewall, and installation of the completed project must escape the friendly confines of the company network, and venture into the hands of an external source control system, that lives beyond the firewall? What is a dev to do? The beauty of the technology is the ability to continually ensure that the end user has the latest bits.  But when the target machine is abstracted by this additional layer of complexity, things get a bit dicey, and a lot more challenging.

The scenario: A project is being developed on an internal company network, with the source code under a company-approved source code repository. Pretty standard stuff. Here is where the wrinkle comes in. Upon deployment, the installation bits must reside in an EXTERNAL repository, which is also version controlled. Sounds crazy? It's not. It really does happen. So then, how can we take advantage of the simplicity of a ClickOnce deployment in a complex environment like this?

First, let's set up the environment.  We'll need:
  1. Source control for the code base.
  2. Separate, external version control for the published artifacts.
  3. IIS running on both the local dev box and the installation machine.

The key lies in the design of the ClickOnce feature set. This type of deployment is designed to function over the network.  Keep that in mind, as we move along.  Something else to keep in mind is that the local machine is part of the network as well (as a developer, I forget that part sometimes). And, as we are talking about developing in a Microsoft-based environment, let's also consider the web server (IIS) on the local dev box ("127.0.0.1" to be specific) as part of the network too.

So far, our environment looks like this:

So here's what we do with this setup:
  1. Configure the local IIS with a virtual directory that points to a physical location on the local machine. This location MUST be the same physical location that the application will be installed from on the client.
  2. In the Publish tab of the project Properties, there is a text box that defines where the application will be deployed to. In that box, set the publish location to "http://127.0.0.1/<project deployment location>/". Make sure IIS on the development machine has a virtual directory of that name, and that it points to the actual physical location the application will look for when it is installed. This is important, because when the application is actually installed on the machine outside the firewall, it will look here to find the files it requires. And if set up properly, the developer can publish to this location on the dev box.

From here, the developer can write whatever code he needs, publish it to the local IIS, and then upload the changes to the versioning repository (no, we are not going to get into the fact that ClickOnce handles its own versioning as part of the publish process).
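The reason the URL discipline above matters is that the installed application phones home to that same location for updates. A hedged sketch of that side, using the standard System.Deployment.Application API (the method name and restart call assume a WinForms app; this only runs when launched from an actual ClickOnce install):

```csharp
// Sketch: requires a reference to System.Deployment, and only does anything
// when the app was launched from a ClickOnce installation.
using System.Deployment.Application;
using System.Windows.Forms;

static void CheckForUpdates()
{
    if (!ApplicationDeployment.IsNetworkDeployed)
        return;  // running from the debugger or a plain copy; nothing to check

    ApplicationDeployment deployment = ApplicationDeployment.CurrentDeployment;

    // The update location is baked in at publish time; with the setup above it
    // resolves to the virtual directory that both machines agree on.
    if (deployment.CheckForUpdate())
    {
        deployment.Update();
        Application.Restart();
    }
}
```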

I'm sure that, with the ability of ClickOnce technology to be used just about anywhere, this use case may seem unnecessary. From my perspective you would be right, and those discussions were many, and often. In the end, this is what the client wanted, it works, and that's all that really matters.

Wednesday, June 29, 2011

On the Outside Looking In

Scenario: You are working in a system where an external process must not only communicate with your subsystem, but must do so across an intranet, using web services as the communication mechanism.  And to throw a little more into the mix, the entire process is automated.

So the Scenario looks like this:

What to do? The client processes must operate in the context of a user that is local to the client and has access to system-level operations on the client machine. By default, the ASPNET user does not have these permissions.

If you are working with IIS version 6 or later, you might say the solution is to use impersonation, and you would be correct. But in a time when enterprises like to hang on to their systems and endure the slow, torturous death of an operating system, we are working with IIS 5, where impersonation is not so easy to accomplish.

There are a couple of solutions to this conundrum, neither of which presented itself at first glance (or at least not to me):
  1. Use the Message Queue to communicate with a central processing module, placing messages on the queue with the web service and removing them with the command module.
  2. Allow both user spaces to remain separate, by using the FileSystemWatcher object of the .NET Framework.
After watching the remote invocation of the web service fail to gain access to some processes that needed to be stopped as part of the required functionality, it became clear that a permissions issue was the source of not only my troubles, but some that had come before me.

The previous solution used the Message Queue to communicate with the system. While it does deal with the issue at hand (communication in a limited-permissions environment), the process was rather cumbersome.

  1. Create and serialize a command object with several properties for transmission across the wire.
  2. Receive the object on the remote machine, and place it on the message queue.
  3. With the command module on the remote machine, remove the object from the queue and de-serialize it to determine what process to run.
  4. Execute the process on the remote machine with the command module.
The code that accomplishes all of this is rather convoluted, so I'm not going to even attempt to post it.  It took me two months to begin to understand what was happening when, and there are still some things I'm not clear on.

However, the solution that I chose is radically simpler, and still accomplishes the same thing.  The command module running on the remote machine is still there, although there is a bit more functionality, and a lot less complexity.

The new process goes like this:
  1. From a web service on the server, make a call to a web service on the client with the name of the command that you want to run.
  2. On the remote machine: Build a web service that will receive the command (a string, mind you), and then run the necessary function to communicate to the command module with a file in a very specific place.
  3. Use the command module to perform the required functions, based on what the dropped file says to do.
The scenario itself is not a common one.  Normally Web Services interact directly with a database of some sort, and then throw data back.  Here we are talking about interaction with a remote host, and not trying to interfere with anything else.

With the web services on the remote machine, it was pretty simple (code is in C#):

string dropDir = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, @"Reclaim\");
string dropFile = Path.Combine(dropDir, "KillProcess.txt");
if (!Directory.Exists(dropDir))
    Directory.CreateDirectory(dropDir);

// append the name of the process to reclaim (a is the App in question)
StringBuilder bldr = new StringBuilder();
if (File.Exists(dropFile))
    bldr.AppendLine(File.ReadAllText(dropFile));
bldr.AppendLine(a.Name);
File.WriteAllText(dropFile, bldr.ToString());

And in the command application that is running under the required permission set, you have this in one method of the application:

string dropDir = @"C:\inetpub\wwwroot\<Application Path>\Reclaim\";
if (!Directory.Exists(dropDir))
    Directory.CreateDirectory(dropDir);

FileSystemWatcher reclaimer = new FileSystemWatcher(dropDir);
reclaimer.NotifyFilter = NotifyFilters.LastWrite;
reclaimer.Changed += new FileSystemEventHandler(reclaimer_Changed);
reclaimer.EnableRaisingEvents = true;  // wire the handler first, then start raising events

and in the event handler for the FileSystemWatcher:

string dropFile = @"C:\inetpub\wwwroot\<Application Path>\Reclaim\KillProcess.txt";
if (File.Exists(dropFile))
{
    MethodCall();  // placeholder, as in the original: the module's setup step
    string[] procNames = File.ReadAllLines(dropFile);
    if (procNames.Length >= 3)
    {
        foreach (string str in procNames)
        {
            App a = AppList.Where(ap => ap.Name == str).FirstOrDefault();
            if (a != null)
            {
                MethodCall(a);  // placeholder: the per-process work
                System.Threading.Thread.Sleep(3000);
                Ano(a);         // placeholder, as in the original post
            }
        }
        File.Delete(dropFile);
    }
}
And that's it! A simple call to a web service on the remote machine, and an application running on that machine, waiting for changes in the file to act on.
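To see the watch-and-react half in isolation, here's a minimal self-contained sketch. The directory, file name, and process names are placeholders I've made up, and it merely signals instead of stopping processes, but the mechanism is the same:

```csharp
using System;
using System.IO;
using System.Threading;

static class DropFileDemo
{
    // Write a command file into a watched directory and report whether the
    // FileSystemWatcher noticed within the timeout. Paths and names are
    // placeholders, not the production ones from the post.
    public static bool RunOnce()
    {
        string dropDir = Path.Combine(Path.GetTempPath(), "ReclaimDemo");
        Directory.CreateDirectory(dropDir);
        string dropFile = Path.Combine(dropDir, "KillProcess.txt");

        using (var signal = new ManualResetEvent(false))
        using (var watcher = new FileSystemWatcher(dropDir, "*.txt"))
        {
            watcher.NotifyFilter = NotifyFilters.LastWrite | NotifyFilters.FileName;
            // Changed often fires more than once per write; setting an event
            // keeps the handler idempotent, which real handlers should be too.
            watcher.Changed += (s, e) => signal.Set();
            watcher.Created += (s, e) => signal.Set();
            watcher.EnableRaisingEvents = true;

            File.WriteAllLines(dropFile, new[] { "SomeProcess", "AnotherProcess" });
            bool fired = signal.WaitOne(TimeSpan.FromSeconds(5));

            watcher.EnableRaisingEvents = false;
            File.Delete(dropFile);
            return fired;
        }
    }
}
```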

Sunday, February 27, 2011

Boise Code Camp ver. 2011

Spent the day at Boise Code Camp 2011, or Geekfest 2011, as my wife calls it. And it was good for her, because she's not been feeling well of late, and she got to rest without me fawning over her and stressing her out even more.

But on to the upside, I learned a lot today. The problem is going to be retaining it all, and then finding ways to implement any of it. Here's some of what I learned:

  1. jQuery has the power to completely rewrite your entire web UI (including the CSS) under the right conditions. Please don't ask me what they are, because I haven't even BEGUN to play with it yet. I'm still trying to wrap my brain around CSS and JavaScript in the first place.
  2. WPF and Silverlight.  Need I say more?
  3. Mobile computing may become less business-friendly in the future.  Why? See my explanation below.
  4. In this country there really ARE kids that are moving into engineering fields. Just ask David Starr of ElegantCode.com. One of them is his son.
OK, this mobile computing thing. With the advances in Android, iOS, and Windows Phone 7, along with the HUGE leap in the available memory on today's phones in general, and smartphones in particular, one would think that it would be easier to have data live on the compact device as opposed to having to access business data from "The Cloud". Not so fast. I had a small discussion with the Microsoft Developer Evangelist for the Northwest area (Mithun is a really nice guy) today, and he gave me the short version of a few things that I had not taken the time to think about.

First is the memory footprint involved. For SQL CE, the size limit on the database is 4GB, and at some point it has to be hooked up to a full install of SQL Server anyway, in order to do anything with all that data. The second thing we talked about was what I felt was a missing piece in VS 2010. I don't know how many have yet to realize this, but Visual Studio 2010 DOES NOT CURRENTLY SUPPORT the Compact Framework (which is really strange, because the WinPhone 7 kernel runs on top of it, and you can only do WinPhone 7 development in VS 2010). Mithun couldn't give me an answer on that one, except to say that he is as hopeful as we are that the issue gets resolved. He couldn't give me a timetable, or even a solid commitment on whether it's even planned.

Bed Time now.
Good-night

Monday, November 29, 2010

Back From Beyond the Blue

Well folks, I'm baaaack. As the song goes, "You can't keep a good man down..." A complete system change, and I'm back, and my notes have landed here at Blogger.com. In the interim, I tried a couple of CMSs on my own, and they didn't quite work out. That, and the fact that I couldn't wrap my brain around the concept of putting UI code in a database. For those of you that can do that, my hat comes off to you.

For now, this is gonna have to be it.  Thanks, and Happy Coding!