News

What is the point of Unit Testing?

I don’t normally add two entries to my blog on the same day but something attracted my interest. Somebody asked the question: “What is the point of Unit Testing?”
Actually, they asked two questions:
  • Why do we perform Unit Testing, even though we are going to do System testing?
  • What are the benefits of Unit Testing?
To which my immediate thought responses were:
  • Would you deliberately make something from parts you knew were broken? Or
  • Would you make something from parts which you suspected were broken?
And then I thought “that might sound a little rude” and reconsidered…
Modern development methods have lots of benefits, but sometimes in flexible and rapid methods something gets lost. People forget why things are done. Or, if they’ve never been told, they wonder if they are worth bothering with.
Now, you should always question everything, but sometimes things are there for a good reason. If you plan to take something away:
  • Understand why it was there in the first place,
  • If it is no longer needed, explain why it is no longer needed, and
  • Understand (and be prepared to live with) the consequences of taking it away.

An old-fashioned view of a System Development Process

(by the way, you’ll notice that some of this material has been recycled from elsewhere)
If we take a rather old-fashioned view of systems development using a “waterfall” model, then we will have a number of phases (the example is an old IBM development process, but the specifics don’t really matter).
  • Each phase produces something, and the phase below it expands it (produces more “things”) and adds detail.
  • The “Requirements” specify what things the system needs to do. They also identify the things that need to be visible on the surface of the system.
  • For each of the things that need to be visible on the surface, we need an “External Design”.
  • The “External Design” specifies the appearance and functional behaviour of the system.
  • For everything we have in the “External Design” we need a design for the “Internals”.
  • And finally someone needs to “Build” what we have specified.
You can view this as a waterfall, down the side of a valley. The process is one of decomposition.
I don’t especially recommend “Waterfall” as a way of running a development project, but it is a simple model which is useful as an illustration.

The Testing Process

On the other side of the valley we build things up.
  • Units are tested.
  • When they work they are aggregated into “Modules” or “Assemblies” or “Subsystems”, which are tested.
  • These assemblies are assembled into the System which is tested as a whole.
  • Finally the System is tested by representatives of the Users.
The process is one of developing “bits”, testing the bits, assembling the bits, and then testing the assembly.
The assembly process (in the sense of “putting things together”, not compiling a file written in “assembler”) costs time and effort. Parts are tested as soon as practical after they are created and are not used until they conform to their specification. The benefit is that we are always working with things that we think work properly.
In a well-organised world, you would like to think that the Users are testing against the original requirements!

Development and Testing should be mutually supportive

What should happen is that at every level, each component or assembly should have some sort of specification (it may be a very rudimentary specification, but it should still exist) and it should be tested against that.
In fact, there is a thoroughly respectable development approach called “Test Driven Development”. The idea here is that the (Business) Analyst writes a “Test” which can be used to demonstrate that the system, at whatever level, is doing what it is supposed to be doing. Of course, the Analyst may need help to write an automated test, but the content should come from the Analyst.
This approach is really useful all the way through the development process. It’s a really good idea if a developer writes tests for the code s/he is writing before writing the code itself! In fact, I have known places where they insisted that a test was written for a bug before the developer attempted to fix the bug. That way, demonstrating the fix was easy: run the test without the fix – the test demonstrates the bug. Apply the fix and run the test again – the test passes, demonstrating the fix.
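To make that concrete, here is a minimal sketch in Python’s unittest style. The module name (invoicing), the helper calculate_vat() and the rounding bug it illustrates are all hypothetical, not taken from any real project; the point is simply that the test is written first, fails while the bug is present, and then stays in place as a permanent regression test once the fix is applied.

    # test_vat.py – a hypothetical “write the test before the fix” example.
    import unittest
    from decimal import Decimal

    from invoicing import calculate_vat  # hypothetical module and helper under test


    class TestCalculateVat(unittest.TestCase):
        def test_vat_is_rounded_to_the_nearest_penny(self):
            # 20% VAT on 10.99 is 2.198; the reported (hypothetical) bug was that
            # this came back as 2.19 instead of the correctly rounded 2.20.
            self.assertEqual(calculate_vat(Decimal("10.99")), Decimal("2.20"))


    if __name__ == "__main__":
        unittest.main()

Run before the fix, the test fails and documents the bug; run after the fix, it passes and guards against the bug coming back.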

The Cost of Not Doing Unit (or other low-level) Testing

Without Unit Testing, all bugs are found at the topmost level, which means that they are found after the product has been assembled or “built”, and then we have to work out where the error actually originated.

The Benefits of Unit Testing

  • Bugs are found sooner, and they are found closer to the point at which they are created.
  • Unit testing lends itself to automated testing which can be integrated with the build process. Ask a professional Java developer about “JUnit” or a Python developer about “unittest” (one word); there is a small sketch of a build-time test run at the end of this section.
  • Automated testing increases the chances of trapping “regression bugs” as code is enhanced and bugs are fixed.
All of the above mean that well-planned and executed Unit Testing results in:
  • Reduced overall cost
  • Improved product quality
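As an illustration of that build integration, here is a minimal sketch using Python’s unittest discovery. The tests directory name and the build set-up around it are assumptions; the idea is just that the build runs every unit test automatically and stops if any of them fail, which is how regression bugs get trapped.

    # run_tests.py – a hypothetical build step that runs every unit test.
    import sys
    import unittest

    # Discover every test module under ./tests (an assumed project layout).
    suite = unittest.defaultTestLoader.discover("tests")
    result = unittest.TextTestRunner(verbosity=2).run(suite)

    # A non-zero exit code tells the build (Make, a CI job, etc.) to stop here.
    sys.exit(0 if result.wasSuccessful() else 1)

The command-line equivalent is simply “python -m unittest discover”, which a build script or CI job can call directly.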

Oracle and Courses

My personal development time last week was spent completing an online course “Oracle DBA for absolute beginners”.

I wouldn’t have described myself as an “absolute beginner”, but I found plenty to enjoy in the course and came away having learned quite a bit about what is going on inside Oracle and, I assume, most other database management systems.

Circumstances influence what we do in life and so far I have had much more exposure to DB2 and MS SQL Server than to Oracle. That hasn’t been a decision on my part, simply a result of the choices that were made for the projects I was involved in.

In a similar way, I’ve spent much more time “dealing with users” as a Business Analyst, than I have working out how to manage the space requirements and performance of a database. It does me good to learn just a little about the things a DBA has to consider. I don’t have to let those considerations govern what I consider the requirements to be, but at least I can understand where other people are coming from.

Taking the course led me to what you might consider “meta” thinking: thinking not about the content of the course, but about the way it was presented and about Udemy, the platform on which it was presented.

I find Udemy interesting. It seems to work well. It certainly worked for me.

Udemy seem to be aiming to be a “neutral marketplace”. The courses belong to the course instructors. Of course Udemy have standards for courses, but beyond the usual “fit to print” conditions, they are mostly technical standards (quality of video and sound) rather than subject matter related. In a similar spirit, Udemy promote the platform, but the promotion I have seen seems to be fairly neutral with regard to individual courses. On the other hand, instructors or course owners are completely free to advertise their wares elsewhere and direct potential customers into Udemy. It’s a simple model which I think I will investigate further.

Another video – Splitting an Access database

When I’m making something I sometimes learn new things. They say “you should never stop learning”. I agree with “them”, whoever they may be.

While I was working on the SOPAG project, I investigated “splitting” the Access database into two components:

  • Logic and presentation, and
  • Data (database definitions and data values).

I knew this could be done, but I had not spent much effort on it before.

The splitting itself was a straightforward enough exercise. Most of the work is done by Access itself. However, you might want to confirm that all the decisions it has made are sensible!

I decided to document the results for my own benefit, and then decided to convert a scrappy PowerPoint presentation into something a little more presentable to upload to YouTube.

Here it is: Splitting an Access database

I had fun doing the work to find out how it worked, and fun making the video. I hope you get something from watching it.

Collaboration software and badgers on the internet

A little earlier this week I attended a webinar on PBWorks’ ProjectHub product.

Here is a link to a recording of the webinar:

I have used PBWorks’ wiki product for years (anyone who remembers me from LCCH may remember me setting up a reference site which was based on PBWiki).

Right now, I’m using the “Freemium” version of ProjectHub to manage a small project I’m running. The project team is split across two countries (Ireland and Wales), so the opportunities of meeting face to face are limited, but thankfully we’re all in the same timezone and all speak more-or-less the same language!

It’s all going reasonably well, and I started to ask myself “why?”

I’ve used various bits of collaboration software on various different platforms for years now. Sometimes it works well for the project, sometimes it works less well.

With ProjectHub, I like the way that I can switch quickly from a top-level view, to a short-term “what is the next focus” view, to an individual task view.

One of the things that helps my team is that we’ve known one another for quite a while and we’ve agreed:

  • the way that we are going to use the tool,
  • what our roles are, and
  • what our individual responsibilities are.

In order to keep things running smoothly, we have a role which I call “the badger”. The badger’s job is to spot when people have forgotten to complete updates (usually because they’ve been doing something more important and more interesting) and remind them. The good thing about something like ProjectHub is that the actual effort of adding a couple of one-line comments on a task, or ticking the “complete” box, is easier than arguing, so things remain reasonably up to date. The badger doesn’t have to be the PM (right now it is me, though); in fact, it’s better if it is someone else, because even the PM needs to be badgered sometimes!

Almost my first video on Youtube

Sometimes things don’t go quite as I intend. A little while ago, someone approached me with a potential project. Unfortunately I was too busy at the time to take it on.

The idea had tickled my fancy. It bubbled away in the back of my mind and, as I had odd moments, I created bits of it as what I would describe as a proof of concept. It was a useful exercise in that it reminded me of a few things and taught me some things about what Microsoft Access is good at, and some things it is not so good at. Inevitably, there are some things I would do differently if I did it again. That’s all right, because after all it was only a proof of concept, and there was no real input in the form of “requirements” anyway.

Having produced the thing, I wanted to show it to an acquaintance. I messed about with a few things and, after a couple of iterations, produced this:
SOPAG – A simple Access Application

Having produced the video, and having decided to write a “business-related blog”, it seemed appropriate to share it here.

I wouldn’t claim that either SOPAG or the video is marvellous, but I’ve learned a lot from both of them. In fact, I have set up a little project to take them both a little further.

But that is for the next instalment!