JATY Bluetooth Hands Free Kit

By Eli Ofek at March 15, 2008 23:54
Filed Under: Personal


This is a somewhat unusual post, as it isn't really professional material.
We recently got a JATY Bluetooth Hands Free Kit for the car, model JBHW-0310C.
There was no manual; it got lost during the installation.

We tried to pair it with my Nokia mobile phone, but failed to make it "publish" itself in pairing mode.
Even when we called Cellcom support and got redirected to their Internet support (*320),
they just told us that it would enter pairing mode when pressing the + button for a few seconds,
which did not work.
I decided to search for the manual on the net, but failed to find it,
even at the Korean manufacturer itself.

I finally managed to find a hard copy of the manual, which solved the problem
(pairing mode is started by pressing "Send" + "End" for 3 seconds).
I decided to scan it and upload it here so the mighty Google will bring it to the next person who needs it.

Some relevant data regarding pairing from the manual:

*What is Pairing?
- Pairing is the process of establishing device addresses
and connection parameters for the devices so that they
automatically connect when required.
Pairing need only be done once for each device.
- So, before you use the Bluetooth Car Kit, you must link it to your phone.
Once the Car Kit and phone are paired, the Car Kit will automatically connect to your phone
every time you start your vehicle or power the kit on.

Pairing and Connecting the Phone.
Before you use the Handsfree Kit, you must link it to your phone.

1) Press the Send and End buttons together for 3 seconds to enter pairing mode;
pairing mode stays active for 60 seconds.
2) Start pairing with your phone (refer to your phone's manual).
3) The password for pairing is "0000".

Note: These instructions may differ slightly for your phone, but the process is the same:
(1) Put the Handsfree kit into Pairing mode as in section above.
(2) In your phone's Menu find Bluetooth and turn On.
(3) Search for a New Paired device.
(4) Select "JBHW-G508" and enter password "0000".
(5) Next, select options for "JBHW-G508" and select Connect (required for some phones only).

Auto Connection
1) When you turn on the Car Kit, it will search for the last 3 paired devices.
It takes 6 seconds for each device.
2) If the auto connection fails, there will be a short beep sound.
3) If there is no paired information stored, the Handsfree Kit will go into Pairing Mode.
4) If you press the End button for 4 seconds, you can disconnect the phone from the Handsfree Kit.
Note: Some phones require authorization for the auto connection.

Deleting Paired Devices

If your kit will not connect or if you pair using the wrong mode,
try deleting all stored paired information and then Re-Pair your devices.

1) Press and hold Volume Up(+) and Volume down(-) button for 5 seconds.
Note: You can pair the JBHW-G508 with up to 8 different phones.
I.e., the JBHW-G508 can store connection information for 8 devices,
but can connect with only one phone at a time.

The full manual can be downloaded here:
JBHW-0310C – PDF Format
JBHW-0310C – Zipped JPEGs


Back me up, Carbonite!

By Eli Ofek at January 16, 2008 23:23
Filed Under: Personal, Professional

Hi all,

About 3 months ago I was privileged to hear David S Platt's "Why Software Sucks" show.
During the show (which is both interesting & fun!) David gave an example about backup software,
telling the story of how his hard drive crashed with no up-to-date backup…
This got me wondering…
When did I last back up all my latest photos, documents and PST to a DVD?
I really couldn't remember, which told me things probably weren't too good…
David showed some screenshots from Carbonite,
an online backup service he has come to admire after using it for quite a while.
When I got home, I googled for Carbonite, and found out that for less than $50/year I could back up
all my data with no limitations…
At first I was a bit skeptical about the idea…
My backup size is above 25GB… but I decided to give it a try.
They give you a 2-week free trial, which is quite enough to get a feel for it…
After 2 weeks I had managed to back up over 10GB of data at low bandwidth priority
over a 1.5Mb/128Kb line…
I decided to subscribe.

Now, 3 months after, I can say I am very happy!
All my documents, photos, mail & even my home videos are regularly backed up without
needing to do much, and I am now over 27GB!
I was impressed that during this time Carbonite never crashed or gave me trouble,
and even the Email support I got to use once for asking a question was prompt.

One thing I must say: if you have a large backup store like mine
(most people don't, in my opinion), bear with the software until it finishes
the first upload… My suggestion is to let it run at full capacity during the night,
and at low priority when you are using the PC.
After the initial backup, only diffs are uploaded, even for files that are in use (like a PST),
which is not noticeable.
It's even better than a DVD, as the data is stored many miles away,
which effectively gives you a nice off-site DRP backup site.

There are other similar solutions on the Internet, at different service levels,
starting with Mozy, and even Amazon's S3, but this was the cheapest service
I could find for large backup stores (over 2GB) that is both friendly & simple.

If you want to give it a try, you can use this referral link.
If you get to purchase a subscription after the trial like I did, you will get an extra month's bonus
period, and so will I. :)


Slow Assembly loading in Intranet environment

By Eli Ofek at November 08, 2007 16:37
Filed Under: .Net, Professional


Lately we ran into a weird problem when using Enterprise Library 3.1 in an intranet environment
where no Internet connection is available.

The symptom was a slow loading of the referenced assemblies during the application startup.
We managed to reproduce the problem using a simple console application:


using System;
using System.Diagnostics;

namespace TestCrlSample
{
    class Program
    {
        static void Main(string[] args)
        {
            Console.WriteLine("Started at:" + DateTime.Now.ToString());
            Debug.WriteLine("Started at:" + DateTime.Now.ToString());

            Stopwatch sw = new Stopwatch();
            sw.Start();
            DoSignedLibWork();
            sw.Stop();

            Console.WriteLine("Total Time elapsed(Milliseconds):" + sw.ElapsedMilliseconds);
            Console.WriteLine("Ended at:" + DateTime.Now.ToString());
            Debug.WriteLine("Ended at:" + DateTime.Now.ToString());
            Console.WriteLine("Press any key to exit...");
            Console.ReadKey();
        }

        // Constructing a type from the signed Enterprise Library assembly
        // forces the CLR to load (and verify) that assembly.
        public static void DoSignedLibWork()
        {
            Console.WriteLine("Before Work at:" + DateTime.Now.ToString());
            Debug.WriteLine("Before Work at:" + DateTime.Now.ToString());
            Microsoft.Practices.EnterpriseLibrary.Data.ConnectionString cs =
                new Microsoft.Practices.EnterpriseLibrary.Data.ConnectionString(
                    "Data Source=DBSRV;Initial Catalog=Repository;Integrated Security=True",
                    "Admin", "Bla");
            Console.WriteLine("After Work at:" + DateTime.Now.ToString());
            Debug.WriteLine("After Work at:" + DateTime.Now.ToString());
        }
    }
}

Here are some test results we got when diagnosing the problem:

Running the program as usual in an Intranet environment, no Internet connection at all:

Started at:22/07/2007 18:16:47
Before Work at:22/07/2007 18:16:56
After Work at:22/07/2007 18:16:56
Total Time elapsed(Milliseconds):9234
Ended at:22/07/2007 18:16:56
Press any key to exit...

Notice how long it took this simple program to run… almost 10 seconds!
We ran Microsoft Network Monitor during the test to check what was going on behind the scenes.

This is the Netmon output:

111    5.019387        DNS    DNS: QueryId = 0xAEAE, QUERY (Standard query), Query for crl.microsoft.com of type Host Addr on class Internet
112    5.024270        DNS    DNS: QueryId = 0xAEAE, QUERY (Standard query), Response - Server failure
113    5.024270        DNS    DNS: QueryId = 0xAEAE, QUERY (Standard query), Query for crl.microsoft.com of type Host Addr on class Internet
123    6.024270        DNS    DNS: QueryId = 0xAEAE, QUERY (Standard query), Query for crl.microsoft.com of type Host Addr on class Internet
124    6.024270        DNS    DNS: QueryId = 0xAEAE, QUERY (Standard query), Response - Server failure
125    6.024270        DNS    DNS: QueryId = 0xAEAE, QUERY (Standard query), Query for crl.microsoft.com of type Host Addr on class Internet
126    6.024270        DNS    DNS: QueryId = 0xAEAE, QUERY (Standard query), Query for crl.microsoft.com of type Host Addr on class Internet
127    6.024270        DNS    DNS: QueryId = 0xAEAE, QUERY (Standard query), Query for crl.microsoft.com of type Host Addr on class Internet
128    6.024270        DNS    DNS: QueryId = 0xAEAE, QUERY (Standard query), Query for crl.microsoft.com of type Host Addr on class Internet
129    6.024270        DNS    DNS: QueryId = 0xAEAE, QUERY (Standard query), Response - Server failure
130    6.024270        DNS    DNS: QueryId = 0xAEAE, QUERY (Standard query), Response - Server failure
131    6.024270        DNS    DNS: QueryId = 0xAEAE, QUERY (Standard query), Response - Server failure
184    10.024270        DNS    DNS: QueryId = 0xAEAE, QUERY (Standard query), Query for crl.microsoft.com of type Host Addr on class Internet
227    14.025102        DNS    DNS: QueryId = 0xB6AF, QUERY (Standard query), Query for crl.microsoft.com.myorg.com of type Host Addr on class Internet
228    14.030961        DNS    DNS: QueryId = 0xB6AF, QUERY (Standard query), Response - Name Error
229    14.030961        DNS    DNS: QueryId = 0xBFAC, QUERY (Standard query), Query for crl.microsoft.com.myorg.com of type Host Addr on class Internet
230    14.035844        DNS    DNS: QueryId = 0xBFAC, QUERY (Standard query), Response - Name Error


Notice how the DNS requests for "crl.microsoft.com" took almost 9 seconds!
This was very odd, so we googled for the symptom.

Internet research shows these results:

Support Certificates In Your Applications With The .NET Framework 2.0:

Microsoft, VeriSign, and Certificate Revocation:

How Office Performs Certificate Revocation:

This one is talking about IE slowness:

Management Studio slowness:
talking about the similar symptoms, only when using MS SQL Management Studio.
The cause of the problem is the same…

FAQ, Why does SSMS take 45s to start up?

Why does the .NET Runtime Optimization Service keep trying to use the internet:

So, what have we got here?
It seems that Microsoft added a mechanism to the .NET CLR that checks every signed assembly, when loading it,
against an online certificate revocation list (CRL).
When working in an intranet environment, with IP segments
that are not defined as Local Intranet, the mechanism tries to locate the crl server for 9 seconds before it
gives up, delaying the assembly loading.

When checking the Enterprise Library Common assembly,
we see that it is indeed signed using a Microsoft certificate:

How do we work around it, you say?
Well, there are several options.
The easiest one is to disable the crl check. Oddly, this is done from the Internet Options dialog, available from Internet Explorer:
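If changing Internet Options on every machine is impractical, later .NET servicing releases (2.0 SP1 and 3.5) added a per-application switch that skips publisher-evidence generation, and with it this CRL lookup, at assembly load time. This is a config sketch; it assumes your framework version supports the generatePublisherEvidence element:

```xml
<!-- app.config: skip Authenticode publisher-evidence generation
     (and the online CRL lookup it triggers) when loading assemblies.
     Requires .NET 2.0 SP1 or later. -->
<configuration>
  <runtime>
    <generatePublisherEvidence enabled="false"/>
  </runtime>
</configuration>
```

Unlike the Internet Options setting, this travels with the application, so it works the same on every node it is deployed to.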

Now, let's run our test program again:

Running the program after removing the check:

Started at:22/07/2007 18:17:47
Before Work at:22/07/2007 18:17:47
After Work at:22/07/2007 18:17:47
Total Time elapsed(Milliseconds):147
Ended at:22/07/2007 18:17:47
Press any key to exit...

Notice that this time we are down to a total of 147 milliseconds, which is much more reasonable.
What about the Netmon output? Well, since there is no check,
the Netmon output is empty!

What about security, you say?
If you cancel the crl check, you are exposed to bogus certificates.
My answer to that is: if you are already disconnected from the Internet,
you are just as exposed as before, only this time you are not slowing down your applications for nothing.

A question might be asked about situations where an Internet connection is partly available through a firewall.
In this situation you might consider asking the network administrator to allow connections to the crl server,
or, if you wish, to make it fail the requests immediately, so you don't need to configure the Internet Options on every node.

Another optional workaround is to define the crl address in the etc/hosts file, pointing it to localhost,
which will quickly fail every crl request it gets.
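For reference, the entry looks like this (the path shown is the standard Windows location; adjust for your system):

```
# %SystemRoot%\system32\drivers\etc\hosts
# Resolve the CRL host locally so the lookup fails fast
# instead of timing out against DNS:
127.0.0.1    crl.microsoft.com
```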

Let's run our test again, this time without removing the revocation check, but defining the crl address in the hosts file:

Running the program after redirecting the dns name to localhost:

Started at:22/07/2007 18:18:37
Before Work at:22/07/2007 18:18:37
After Work at:22/07/2007 18:18:37
Total Time elapsed(Milliseconds):219
Ended at:22/07/2007 18:18:37
Press any key to exit...

Notice that this time we used 219 milliseconds, which is a bit more than the first workaround, but still reasonable.
What about the Netmon output? Well, since there is no outside communication,
the Netmon output is empty in this case too!


When working with signed assemblies and no Internet connection to crl.microsoft.com,
you need to take the assembly loading delay into account, or work around it as suggested above.

Good Luck,


Team System - Phases 4,5 - Source Control & Bug Tracking

By Eli Ofek at August 25, 2005 13:05
Filed Under: Team System, Professional
Since we decided to stop the pilot for now, we did not check the source control & bug tracking thoroughly.
We already know that the performance is bad.
From a functionality point of view, I can say that we used almost all the features at least once, and everything worked.
We did not try use cases where two people work on the same code and merge.
Since performance is one of the big issues of phases 4 & 5, there is no point in testing it now.
We can only say for now that the functionality & integration are nice.
I will post new information when the pilot resumes after the release version comes out.

Team System - Phase 3 - Migrating the code from 1.1 to 2.0

By Eli Ofek at August 24, 2005 16:09
Filed Under: Professional, Team System
After migrating all the source files, it's time for the actual code migration.
As stated before, the pilot is implemented on a real project.
The project is divided into one infrastructure subsystem and four other subsystems.
For now, we decided to migrate only the infrastructure subsystem
and one of the other subsystems, which contains both server & client code, so it's a good case to work on.
The conversion was done on a VMware machine with one 2.7GHz CPU & 512MB RAM, running Windows 2003 Server SP1.
We asked Ziv (one of the senior developers on the system) to migrate the code.
Ziv did not have prior experience with CLR 2.0. Most of the findings here were translated from an internal report Ziv wrote.
The target was to make the code compile with no errors or warnings whatsoever,
and then make all the unit tests pass as green (NUnit 2.2.2, see prior post).
It took 2 days to make the code compile, 2 more days to solve runtime errors, and another day to make all the unit tests pass.
No QA tests were done on the system beyond that.
All the code changes were made within special regions & were well commented,
for easy merging into the real dev branch in the future.
It took a total of 5 work days for 1 senior developer to complete the mission.
General impressions from the code migration stage:
The new IDE is pretty stable for a Beta 2 version; during the working week we encountered only a few exceptions/aborts.
The newly added features, such as code refactoring, are great.
The main problem was performance. The performance is unacceptable for normal development.
Compilation time increased about 5-fold compared to VS 2003, and the IDE often freezes for several seconds, which is pretty annoying...
Another general problem we encountered is that the new IDE makes the bad assumption that the developer has an Internet connection.
A big part of the links in the IDE point to the Internet and not to the local MSDN installation.
In our case the developers work in a closed intranet environment, so it's a bit frustrating.
During the migration, the compiler often tries to help by giving a link to MSDN
explaining how to convert the code, which is great, but again, most of the links refer to the Internet,
even when the information is also available in the local MSDN installation.
Sometimes the explanation was available only on the Internet, which forced us to go to another network and print some pages there...
The new MSDN look & feel is nice; it takes time to get used to for us "old" developers,
but it seems like an improvement, as the search tab now tries to behave like a search engine.
The big problem with the new MSDN was that it was very buggy.
The search engine constantly returned false results. For example, if I search for the word "XML" I get X documents,
and when I search for "XML and Schema" I get Y documents, and Y was greater than X, which is logically impossible.
It gave us a bad time while searching for migration information, especially since the original compiler help links did not work for us.
In addition to that, the new MSDN Archive Manager, which is a nice idea, gave us trouble too.
We selected several topics to be included, but when we saved and reopened the MSDN tool,
the list of topics we got there was different from the one we selected...
The integration with the Team Foundation Server is very nice & user friendly, but its performance is terrible too.
The Team Foundation server has memory leaks which cause performance degradation, and after 3 days of work with 1 developer
I had to restart the services in order to be able to continue testing in a reasonable time...
List of Source Code changes we made during the migration phase:
  • Changes we made in order to make the code compile with no Errors or Warnings:
    • Major Changes:
      • The usage of XML Schemas has changed. The new framework uses a class called XmlSchemaSet which represents a collection of schemas which compile together to a single logical unit.
      • The usage of XML Readers has changed. The new framework makes the XmlValidatingReader obsolete. Instead, we now need to work directly with the XML Reader, supplying it with an object of type XmlReaderSettings which defines some attributes on the reader we have created.
      • Compile-time identification of unmanaged resource usage problems. I must say we liked this new feature a lot. The new framework identifies bad usage of finalizers at compile time and reports on it. We encountered only one instance of this in the system and replaced it with a correct implementation. Coolness.
    • Minor Changes:
      • The Assembly signing method has changed. In CLR 1.1 we signed the assemblies using the AssemblyKeyFile attribute, usually found in the AssemblyInfo.cs file. Instead, the new framework makes us use a project-level definition. This is not so bad, only it was frustrating to change it manually for our 70 assemblies. In VS 2003 we could select multiple projects and change a property for all of them at the same time. In 2005 the GUI did not let me select multiple projects for the project properties dialog. I wonder why.
      • The usage of Path.InvalidPathChars was replaced with the Path.GetInvalidPathChars() method.
      • The Certificates usage was changed. The new framework defines a delegate method on ServicePointManager for Certificate validations instead of the class usage which replaces the standard behavior of the Certificates Control.
      • Form.AutoScaleBaseSize used to be an int value. In the new framework it was replaced with Form.AutoScaledimensions which is a float.
      • The AppDomain.GetCurrentThreadId() method was replaced with Thread.CurrentThread.ManagedThreadId.
      • The SmtpMail class is now obsolete; a new class (SmtpClient) replaces it. We decided to remove the entire code that used SMTP, which implemented special critical error log reporting via email, since we never used it in the past 3 years.
      • The Dns.GetHostByName() method was replaced with the Dns.GetHostEntry() method.
      • Parameter.Add() method was replaced with Parameter.AddWithValue() method.
  • Changes due to runtime errors:
    • This is a cool new feature of the runtime. The new framework identifies at runtime when an object tries to access a GUI-thread object from a non-GUI thread without a proper Invoke, and throws a proper exception. We identified only one such case in the system (bad code in the client's splash screen). This is very cool, because such code can often create random results and hard-to-reproduce bugs. Other instances of the problem might come up when the system is sent to QA for a proper regression test.
    • Microsoft scores again. Another cool feature! The runtime now identifies inconsistencies between the XSD schemas of DataSets and the real database types! When the runtime encounters an inconsistency, it throws an exception. In the system we encountered 2 instances of the problem, and the schemas were corrected according to the DB types. I guess it happened when someone changed the DB without changing the DAL code properly.
  • Changes made due to Unit Test Failures:
    • The new framework adds the `xml:space="preserve"` Attribute when it finds an empty node in the XML Document. We added the new attribute to the proper places in the XML files.
  • Some Unresolved Issues we decided to let go for now:
    • The client application had a special mechanism that used a special "home-made" semaphore (CLR 1.1 didn't have a built-in one; CLR 2.0 does). The implementation was bad and we had known it for some time; it caused some bugs in the GUI, but they were very rare. When running on CLR 2.0, for some reason, the mechanism stopped working almost completely. We did not try to fix it or investigate why, as the proper action here is to rewrite the entire mechanism correctly.
    • The server application started to load very slowly. A short investigation discovered a problem in the assembly loading phase that causes some kind of a deadlock. The runtime somehow releases the deadlock after about 20 seconds and everything goes on well. It's mainly annoying at debug time to wait these extra seconds each time we start the server process...
    • The Client Splash Screen started to appear in the top left corner of the screen instead of the center. We didn't bother to check why for now.
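To illustrate the first major change in the list above, here is a minimal sketch of the .NET 2.0 validation pattern that replaces the obsolete XmlValidatingReader; the inline schema and the "item" element are made up for the example:

```csharp
using System;
using System.IO;
using System.Xml;
using System.Xml.Schema;

public class XmlMigrationDemo
{
    // .NET 2.0 style: compile schemas into an XmlSchemaSet and pass the
    // validation options to XmlReader.Create through XmlReaderSettings,
    // instead of wrapping a reader in the obsolete XmlValidatingReader.
    public static bool Validate(string xml)
    {
        // Hypothetical inline schema, for illustration only.
        const string xsd =
            "<xs:schema xmlns:xs='http://www.w3.org/2001/XMLSchema'>" +
            "  <xs:element name='item' type='xs:string'/>" +
            "</xs:schema>";

        XmlSchemaSet schemas = new XmlSchemaSet();
        schemas.Add(null, XmlReader.Create(new StringReader(xsd)));

        XmlReaderSettings settings = new XmlReaderSettings();
        settings.ValidationType = ValidationType.Schema;
        settings.Schemas = schemas;

        try
        {
            using (XmlReader reader = XmlReader.Create(new StringReader(xml), settings))
            {
                while (reader.Read()) { } // validation errors surface during Read()
            }
            return true;
        }
        catch (XmlSchemaValidationException)
        {
            return false;
        }
    }

    public static void Main()
    {
        Console.WriteLine(Validate("<item>hello</item>"));  // valid against the schema
        Console.WriteLine(Validate("<item><bad/></item>")); // element content where a string is expected
    }
}
```

The same XmlReaderSettings object can be reused for every reader you create, which is part of why the old per-reader wrapper class was retired.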
Pilot Milestone Conclusions:
Due to these findings we have decided to stop the Pilot process at this time.
We understood that the migration process is not very heavy,
so we don't need to start it too far ahead of when we actually want to move to 2.0.
We estimate that, given the knowledge we gained during this work week,
the entire team (4 people) can migrate the whole system in one more week.
One of the possible ways we thought of going here before the pilot
was to have the dev team start developing Version 2.0 of the system, which starts in less than a month, on Beta 2
with the new Team Foundation, and only transfer them to the release version in November after the Whidbey final release,
as Version 2.0 of the system is not expected to reach production until the middle of 2006.
Sadly, since the development experience with the current performance is not acceptable,
we have decided to wait for the final release in November,
hoping that Microsoft will fix the performance problems & server leaks by then.
We saved the converted source, awaiting November 7.
Phases 4 & 5 of the pilot will not happen for now;
I will write some conclusions in a separate post, but it won't be as informative as I planned.

Team System - Phase 2 - Migrating VSS 6.0d to Team Foundation

The environment is ready;
I even played with it a bit, creating new Team Projects, and all is well.
Experience taught me a long time ago that if something works for several files,
it won't necessarily work just as well for hundreds or thousands.
The mission in this phase was to import the real project's VSS DB (6.0d)
with all files in all versions & labels into the newly created Team Project.
The VSS DB is currently 2.6 GB, after working on it for a few years now.
I figured that it's a good challenge for the Team Foundation.
For the mission I needed a disconnected copy of the VSS DB, which I could use with the VSSConverter tool
in order to migrate it to the Team System project.
This request uncovered some unpleasant information:
all the Legato backups from the last year kept a corrupted DB, since users were never disconnected from the DB before each backup...
I notified the person in charge, and asked for an archive file, in order to restore my own copy to work with on the side.
Alas, I found out that for some weird reason, the archive tool (both the GUI & command line) could not archive the DB properly.
The archive file always came out suspiciously small,
and when opening it we found that it archived only a small portion of the projects, despite the fact that we specified backing up everything.
With all this trouble, I decided not to wait any longer
(this procedure took several days; it takes lots of time to archive & restore a 2.6GB VSS DB for each "try"),
and decided to change my target.
I decided to let go of all the file versions & labels, and just import a pure latest revision into the Team Foundation server.
The challenge is now easier, but still respectable.
We are talking about a total of 140MB for all files, around 4,400 of them...
I achieved that by doing a "Get Latest" on all the projects recursively to a folder in the file system.
Now I had to disconnect all the projects from source control, but because there are so many of them, I didn't want to do it manually,
so I used a batch tool to do the job for me - the "SourceSafe Binding Remover tool" from GotDotNet.
The tool seemed to work well, reporting no errors. (I removed the RO attributes from all the files before running it.)
Later on, I opened each solution file (there were about 5 of them) using the VS 2005 IDE,
and used the "Add To Source Control" menu item.
The wizard reported that the project files contained some "leftovers" from a previous source control system, and offered to correct the files.
I accepted the offer, and it worked like magic.
It took about 5 hours to import all the files, but it finished with all files safely in the Team Project.
I checked the SQL DB "Data" folder, and the size was ~200MB, which seemed pretty reasonable considering I had just added about 140MB of data to it...
This concludes Phase 2 of the experiment. It didn't go exactly as I planned, but I got enough to work with for the next phase.

Team System - Phase 1 - Installing the environment - Cont.

By Eli Ofek at August 22, 2005 14:54
Filed Under: Professional, Team System
Hi friends,
Again, I didn't have time to blog my findings until now, but I will try to catch you up on the experiment.
While continuing to install the environment, I ran into some more problems, most of them easily fixed after googling for them.
The first problem was getting the "Initialization for plugin "Microsoft.Pcw.wss" failed"
Message in the portal site.
The problem is easily solved by going into the IIS manager and moving the "Report" & "Reports Server" virtual applications
to a new pool, so they can run under ASP.Net 2.0 by themselves.
The full explanation can be found at Microsoft Technical Forums, here.
Another problem I ran into happened after creating a new Team Project to play with, which later on I wanted to delete.
Surprisingly, I found out that there is no obvious way to do that from the IDE.
I did not expect an "easy" way, as no one wants such a destructive action to be triggered by people with "an easy finger on the trigger".
Googling for this again brought me to a command line solution that is working pretty well.
You can find the "How To" right here on Buck's Blog.
Another problem, discovered later on, was an exception in the "Work Items" Web Part.
This was easily solved too, by adding a new work item so the work items repository would not stay empty.
This concludes the environment installation.
Next: Importing the code from VSS to Team Foundation.

Fighting Blog Spammers

By Eli Ofek at August 05, 2005 09:15
Filed Under: Personal, Professional
Hi People !
Lately I have been getting a lot of spam on my blog, from spammers posting ads
via the blog comment system or adding trackbacks to it.
To fight this I have turned on dasBlog's CAPTCHA system.
It means that when you add comments,
you will have to enter a string of chars copied from an image near the save button.

Playing with Team System Beta 2

By Eli Ofek at June 06, 2005 20:32
Filed Under: Professional, Team System
I started testing Microsoft's new Team System Beta 2 (Visual Studio 2005 & Team Foundation).
Over the coming weeks I will test them with a test case of a real project,
the same one I used to convert Unit Test from Nunit 1.0 to Nunit 2.2.2.
The test will consist of the following phases:
1) Installing the environment.
2) Loading all code from VSS 6.0d into Team Foundation Source Control.
3) Converting all code from Framework 1.1 to 2.0, with no warnings and all unit tests green.
4) Testing the source control (labeling, branching, etc.).
5) Testing the bug tracking.
From time to time, I will publish interesting stuff I encounter during the process, so stay tuned...

Been Busy

By Eli Ofek at June 06, 2005 20:15
Filed Under: Personal
Hi friends,
I opened my blog with tons of ideas to write about,
but all you got was 2 months of silence.
I've been busy at work, going on some trips, scuba diving, and lots more.
(I will upload some new photos to my site's album soon.)
Hopefully I will have some more free time in the coming weeks to write...

Migrating from Nunit 1.0 to Nunit 2.2.2

By Eli Ofek at April 03, 2005 21:39
Filed Under: Professional
Hi all,
One of our projects at work began 3 years ago.
At that time we searched for a unit testing tool for .Net.
After trying some options, we made the right decision and chose NUnit 1.0 as the unit testing tool for the project.
NUnit GUI 1.0 was not perfect for our requirements at that time, so we used the benefits of open source and
inserted some light changes, which later on became a natural part of the public version of NUnit.
As NUnit grew and developed it became an excellent tool,
but we could not easily switch to it, as NUnit 1.0 was not attribute based, and we had tons of "legacy"
unit testing code on our hands and a tight project deadline...
So it came to be that after 3 years of coding, the project stayed with NUnit 1.0.
Only recently did the team decide that enough is enough and it's time to move on.
After consulting with Roy Osherove, we decided on a general plan of converting the code
by using an evaluation version of ReSharper & modifying the NUnit 1.0 open source interfaces
to become as close as possible to version 2.2.2.
Galia and I took responsibility for the mission and transformed the code in less than a day of work.
We decided to share the experience in case there are other people around stuck with the same problem...
Here is the recipe for transforming Nunit 1.0 Tests into Nunit 2.2.2 Tests
A bit of technical information about the project:
  • Over 50 Assemblies
  • Written in C#.
  • Using Visual Studio 2003.
  • Visual Source Safe 6.0d Source Control
Here are the steps we went through:
   "Transformation milestone X" is defined as the following actions:
  1. Check in all your code (keeping it checked out, or checking it out again), commenting it with the changes made so far.
  2. Create a label "Nunit Upgrade - X".
  • Coordinate a day with the whole development team so that you can check out all the code without interference
    • (Unless you are working with a Source Control System that allows nice branching)
  • Install Resharper in the Workstation.
    • If you have $99 to spare, it will be money well spent in my opinion.
    • If not, use the evaluation version as we did, and uninstall it when you are done.
  • Make sure all code is checked into source control.
  • Clear your working folder on the hard drive.
  • Get all code from source control and rebuild all projects.
  • Run all Unit tests in all the projects to make sure all is green and that we are starting on a "clean surface".
  • In case you have multiple projects spreading over a few solutions, create one solution with all the projects.
  • Create a Label in the Source Control named "Before Nunit upgrade"
  • Check out all code in the new solution.
  • Rename all assert method calls to begin with "Assertion."
    • This conversion was done using the IDE's Find & Replace dialog.
    • Replace all "TestCase." instances with "Assertion.". Then, for each method name, find instances with one white space before the name and "(" after it, and prefix them with "Assertion.".
       Depending on your coding conventions, you might need to handle different spacing, tab prefixes and so on.
    • Here are the strings we needed to replace (for some projects you might need more):
      • "AssertEquals("
      • "AssertNotNull("
      • "AssertNull("
      • "AssertSame("
      • "Assert("       - This one is done last, since it is a prefix of all the others!
    • If the code uses .NET's Debug.Assert, the blanket replace will have mangled it; correct it using the transformation: Debug.Assertion -> Debug
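The whole-word replaces above can be sketched as a small script. This is a hypothetical Python illustration, not part of our actual process (we used the IDE's Find & Replace dialog); it only matches calls preceded by whitespace, which also keeps Debug.Assert out of harm's way:

```python
import re

# Order matters: "Assert(" must come last, because it is a prefix of the
# other method names.
NAMES = ["AssertEquals(", "AssertNotNull(", "AssertNull(", "AssertSame(", "Assert("]

def prefix_with_assertion(source: str) -> str:
    """Prefix bare Nunit 1.0 assert calls with 'Assertion.'."""
    for name in NAMES:
        # Only match calls preceded by whitespace, so qualified calls
        # such as Debug.Assert( or Assertion.Assert( are left alone.
        source = re.sub(r"(?<=\s)" + re.escape(name), "Assertion." + name, source)
    return source
```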
  • Create Transformation milestone "After renaming methods to static formation"
  • Add Nunit 1.0 source code to the solution.
    • You can download the source from here.
    • You only need to add the NunitCore project.
    • Rename the project from NunitCore to NunitCoreOld
  • Change all assembly references from the binary file to the added projects.
    • We know... it's dirty work, but we didn't think it was worth the time to automate this process...
    • Save & Build all. Check that there are no compilation errors.
      • If done correctly, none should be.
      • If there are, fix them.
  • Using Resharper, auto-rename all public methods of the "Assertion" class in the NunitCoreOld project.
    • These are the transformations we used. We found that not all the methods were actually in use, so we got lazy and converted only the ones we needed; you might need to add some more to the list. Simply check the new API documentation for differences.
      • Assert -> IsTrue
      • AssertEquals -> AreEqual
      • AssertNotNull -> IsNotNull
      • AssertNull -> IsNull
      • AssertSame -> AreSame
    • Rename "ExpectException" attribute class to "ExpectedException"
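The rename list above can also be expressed as a plain mapping. A hypothetical Python sketch of its effect on call sites (the real work was done by Resharper's semantic rename, which updates every call site safely):

```python
# The Assertion-class method renames listed above, as a simple mapping.
RENAMES = {
    "Assertion.Assert(": "Assertion.IsTrue(",
    "Assertion.AssertEquals(": "Assertion.AreEqual(",
    "Assertion.AssertNotNull(": "Assertion.IsNotNull(",
    "Assertion.AssertNull(": "Assertion.IsNull(",
    "Assertion.AssertSame(": "Assertion.AreSame(",
}

def rename_calls(source: str) -> str:
    # Longest names first, so "Assertion.Assert(" cannot clip a longer name.
    for old in sorted(RENAMES, key=len, reverse=True):
        source = source.replace(old, RENAMES[old])
    return source
```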
  • Save & compile the code to make sure there are no errors; if there are any, fix them.
  • Now we need to change the parameter order in the "Assertion" class according to the new API, using Resharper.
    • The rule is: "For all public methods that have a string as the first parameter, move it to be the last."
    • For methods that have 3 string parameters, sample the transformed code to verify it was done correctly. We found one problem, but didn't bother to check whether it was a software error or a human error.
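For simple call sites, the reordering rule can be illustrated with a regex. This is a hypothetical sketch that assumes no nested parentheses and no commas inside the message literal (Resharper's refactoring handles the general case properly):

```python
import re

# Move a leading string-literal message argument to the end of the call,
# e.g. Assertion.AreEqual("msg", expected, actual)
#   -> Assertion.AreEqual(expected, actual, "msg")
CALL = re.compile(r'(Assertion\.\w+\()\s*("(?:[^"\\]|\\.)*")\s*,\s*([^)]*)\)')

def move_message_last(source: str) -> str:
    return CALL.sub(r"\1\3, \2)", source)
```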
  • Save & compile the code to make sure there are no errors; if there are any, fix them.
  • Using Resharper, rename the "Assertion" class to "Assert".
  • Save & compile the code to make sure there are no errors; if there are any, fix them.
  • Create Transformation milestone "After Resharper transformation".
  • Install Nunit 2.2.2
    • You can download the installation package here.
  • Change all project references from NunitCoreOld to the new nunit.framework 2.2.2 binary in the GAC.
  • Remove NunitCoreOld from the Solution.
  • Create Transformation milestone "Referencing Version 2.2.2".
  • Now we need to deal with adding the new Attributes of the 2.2.2 framework to the tests.
    • We did all the transformations in this section using the IDE's Find & Replace dialog with the Regular Expressions checkbox enabled.
    • Here are all the transformations & the regular expressions used to perform them:
      • TestCase inheritance -> [TestFixture]
        • public class {:c+}.*TestCase -> [TestFixture]\n\tpublic class \1
      • SetUp method -> [SetUp]:
        • protected override void SetUp -> [SetUp]\n\t\tpublic void SetUp 
      • TearDown method -> [TearDown]
        • protected override void TearDown -> [TearDown]\n\t\tpublic void TearDown
      • Test method -> [Test]
        • public void Test -> [Test]\n\t\tpublic void 
        • (Notice the single white space at the end of the replace string!)
      • Removing Inheritance:
        • \([Ss]tring name\).*\n*.*base\(name\) -> \(\)
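The patterns above use Visual Studio 2003's regex syntax. As a hypothetical illustration, here is roughly the same set of transformations in Python's (Perl-style) regex syntax, simplified to single-line matches:

```python
import re

# Approximate Python re translations of the VS2003 Find & Replace
# patterns above (hypothetical, simplified to single-line matches).
RULES = [
    # TestCase inheritance -> [TestFixture], dropping the base class
    (r"public class (\w+)\s*:\s*.*TestCase", "[TestFixture]\n\tpublic class \\1"),
    # SetUp / TearDown overrides -> attributed public methods
    (r"protected override void SetUp", "[SetUp]\n\t\tpublic void SetUp"),
    (r"protected override void TearDown", "[TearDown]\n\t\tpublic void TearDown"),
    # "Test..." methods -> [Test] attribute; the "Test" name prefix is dropped
    (r"public void Test", "[Test]\n\t\tpublic void "),
    # Constructors that only forwarded the fixture name to the base class
    (r"\([Ss]tring name\)\s*:\s*base\(name\)", "()"),
]

def apply_rules(source: str) -> str:
    for pattern, repl in RULES:
        source = re.sub(pattern, repl, source)
    return source
```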
  • Save & compile the code to make sure there are no errors; if there are any, fix them.
  • Create Transformation milestone "Changing to Attribute formation".
  • Run all tests using the new Nunit 2.2.2 GUI.
    • You might find some errors created by bad transformations. We found only 2, and it took 5 minutes to solve them.
  • Create Transformation milestone "Ready for Nunit 2.2.2".
  • Delete the old Nunit source code from source control & file system. (Normal delete, not permanent.)
  • Delete the old Nunit 1.0 binaries from source control & file system. (Normal delete, not permanent.)
  • Optionally, install "Test Driven .Net" for easy usage in the development process.
That's all! You're done!
And it took us less than a day.
It should take you even less, because you have the whole process figured out already, right here :-)
Good luck!
Eli & Galia.

Welcome to my blog !

By Eli Ofek at March 16, 2005 21:14
Filed Under:


This is my new blog.
You are welcome to monitor it & visit from time to time...





    I work for Microsoft Israel as a Senior Premier Field Engineer.
    The opinions expressed here are my own personal opinions and do not represent my employer's views in any way.