Support proxy servers in your applications

Much of the software I use on a day-to-day basis requires an HTTP connection to the Internet. Unfortunately, not all of this software includes reliable Web proxy support for Windows Authentication (NTLM). Whilst many people connect to the Internet from networks without proxy servers, I'm often connecting from corporate networks through Microsoft ISA Server.

Here is some advice for anyone writing software that needs to use the Internet:

  • Include proxy support in your application. You wouldn't believe how many applications get uninstalled because they don't support proxy servers.
  • Ensure that your proxy support handles auto-configuration (.pac) files. If you don't go this far, make it clear how the proxy host name should be specified: whether to include "http://" at the beginning and what port number to use.
  • Provide support for various authentication mechanisms. Many corporate networks use NTLM authentication. If your application runs on the Microsoft CLR, you get support for this authentication via the CredentialCache class. Native applications can use the support available in WinInet or the more recent WinHttp. The latter includes a proxy configuration tool to make life a little easier.
  • Respect user credentials. If a user has to explicitly provide their NT logon credentials to your application, make sure to store them securely.
  • When requests fail, provide useful error messages and server names to the user. This will help them figure out how to make connections work. Setup is often a process of trial and error for users who aren't given information by their network administrators.
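To make the CLR point concrete, here's a minimal sketch of wiring an explicit proxy and the current Windows credentials into a request; the proxy host, port and target URL are placeholders, not real servers:

```csharp
using System;
using System.Net;

class ProxyDemo
{
    static void Main()
    {
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://example.com/");

        // Hypothetical corporate proxy address; in practice let the user
        // configure this (or read it from a .pac file / IE settings).
        WebProxy proxy = new WebProxy("http://proxy.corp.local:8080");

        // DefaultCredentials reuses the current Windows logon for NTLM,
        // so the application never has to store the user's password itself.
        proxy.Credentials = CredentialCache.DefaultCredentials;
        request.Proxy = proxy;

        using (WebResponse response = request.GetResponse())
        {
            Console.WriteLine(((HttpWebResponse)response).StatusCode);
        }
    }
}
```

If you need to prompt for explicit credentials instead, a NetworkCredential can be assigned to the proxy in place of DefaultCredentials.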

Tagged with authentication, ntlm and webproxy.

Data Access Pain

One of the things that I find most frustrating on .NET projects is working with relational data sources. My experience with DataSets in the 1.x days was far from positive: they proved too inefficient and difficult to debug. This has changed in 2.0 with the many improvements to the API and the introduction of visualizers in the integrated debugger. I'm still not sold on this solution, but at least things are improving ;)

My preference has been to develop a layer of custom objects which get called from the upper layers of the application. This is flexible and easy to debug. In addition, you can create these objects before any back end has been developed, which makes prototyping simpler. To be fair, this can be a bit time-consuming, so I have tried to augment it with code generation using CodeSmith. Working this way lets me deal with objects in a fashion native to the .NET platform, take advantage of IntelliSense and simplify unit testing.
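A hand-written object in this layer might look something like the following sketch; the Customer class, table schema and connection string are illustrative, not from a real project:

```csharp
using System;
using System.Data.SqlClient;

// Hypothetical custom data-access object sitting between the UI and the database.
public class Customer
{
    public int Id;
    public string Name;

    // Loads a single customer by key. The "Customers" table and its
    // columns are assumptions for the sake of the example.
    public static Customer Load(int id, string connectionString)
    {
        using (SqlConnection conn = new SqlConnection(connectionString))
        using (SqlCommand cmd = new SqlCommand(
            "SELECT Id, Name FROM Customers WHERE Id = @Id", conn))
        {
            cmd.Parameters.AddWithValue("@Id", id);
            conn.Open();
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                if (!reader.Read()) return null;
                Customer c = new Customer();
                c.Id = reader.GetInt32(0);
                c.Name = reader.GetString(1);
                return c;
            }
        }
    }
}
```

During prototyping, Load can simply return a canned object until the back end exists, which is exactly what makes this style pleasant to work with.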

I'm looking at two other solutions: LLBLGen Pro and NHibernate. LLBLGen seems better suited to my needs at present since it offers a better user experience. Both of these tools map generated objects to the tables in the database, so you can avoid switching back and forth between programming models. Complex queries are expressed using custom syntax, and this is where the story sours for NHibernate and, to a lesser extent, LLBLGen. LLBLGen makes it simple to wrap existing stored procedures, which is potentially useful when the SQL gets complex. Ideally I'd like to rid myself of the relational model and SQL altogether, but I guess we're going to have to live with it forever.
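For a flavour of the custom query syntax, here's a rough sketch of an NHibernate (1.x-era) criteria query; the Order class, its mapping file and the session factory configuration are all assumed to exist elsewhere:

```csharp
using System.Collections;
using NHibernate;
using NHibernate.Expression;

// Minimal stub; in reality this would be mapped to a table via an .hbm.xml file.
public class Order
{
    private string status;
    public string Status
    {
        get { return status; }
        set { status = value; }
    }
}

class QueryDemo
{
    // Returns all open orders using the criteria API rather than SQL.
    static IList OpenOrders(ISessionFactory factory)
    {
        using (ISession session = factory.OpenSession())
        {
            return session.CreateCriteria(typeof(Order))
                          .Add(Expression.Eq("Status", "Open"))
                          .List();
        }
    }
}
```

The property names are passed as strings, which is part of what makes complex queries awkward: typos only surface at runtime.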

On this topic it's worth reading a paper by Ted Neward on the object-relational divide and the various technologies that have been developed to bridge it. The paper was written for MSDN, so it covers the LINQ technology that will likely be part of C# 3.0.

Tagged with databases, llblgen and nhibernate.

Experiences with Atalasoft DotImage controls

I've been evaluating some imaging controls from Atalasoft for a client project. The application uses Windows Forms, which poses licensing issues with many of the imaging components out there. After some searching I ended up on the Atalasoft site and downloaded a trial. What you get in the box is impressive: hybrid managed C++/C# assemblies that don't rely on native code, excellent online help and a number of sample applications that cover useful areas of the API. These haven't been updated to support .NET 2.0 features such as BackgroundWorker, but this is simple, if tedious, to code yourself.
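Retrofitting the samples with BackgroundWorker looks roughly like this sketch; the long-running work is a placeholder standing in for an imaging call, and the console output stands in for a UI update:

```csharp
using System;
using System.ComponentModel;
using System.Threading;

class WorkerDemo
{
    static void Main()
    {
        BackgroundWorker worker = new BackgroundWorker();

        worker.DoWork += delegate(object sender, DoWorkEventArgs e)
        {
            // Long-running work (e.g. decoding an image) happens off the UI thread.
            Thread.Sleep(500);
            e.Result = "thumbnails loaded";
        };

        worker.RunWorkerCompleted += delegate(object sender, RunWorkerCompletedEventArgs e)
        {
            // Raised back on the UI thread, so it's safe to touch controls here.
            Console.WriteLine(e.Result);
        };

        worker.RunWorkerAsync();
        Console.ReadLine(); // keep the demo process alive until the work completes
    }
}
```

The tedium the post mentions is that each sample's synchronous call has to be split into a DoWork body and a RunWorkerCompleted handler by hand.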

Unlike vendors that have carried a product forward from the COM days, Atalasoft have implemented an object model that is close to the framework guidelines. Base functionality in the toolkit is good, but DotImage Pro is where the cool WinForms bits live. It includes ThumbnailView and FolderThumbnailView classes, which can load from custom objects or watch the filesystem respectively. I'd imagine most people just need to load thumbnails from disk, but my application needs to load images from a range of sources.

I used the PDF Rasterizer extension to extract thumbnails from an Acrobat document and was pleasantly surprised by the results. Memory consumption was low and didn't increase massively even with large numbers of thumbnails. You can find out more about the memory management on the Atalasoft site. As I use more features of the toolkit I'll probably post some snippets online.

Tagged with net, atalasoft and imaging.