Richard Develyn
25 Oct

The SCRUM Methodology

Suddenly everyone seems to be talking about SCRUM. In fact it's become (rather amazingly, given what it is) a requirement for employment in a lot of the jobs I am currently looking at. I therefore thought it best to investigate what SCRUM was to see whether I could sensibly put "has SCRUM experience" on my CV.

Following a short article by Robert Martin comparing methodologies I decided that I could, though after looking at it in some detail myself (see the references below) I'm not so sure it's such a good thing.

I have a bit of a problem with it - not so much with what the methodology says (which isn't very much), but with what it omits.

If we cast our minds back to XP, those of us who were around at the time might recall how it was stated that XP's principles enabled one another and worked off one another when they were all being put to use at once.

I wonder now whether XP was just a bit too big a pill for most management teams to swallow, so that SCRUM was presented as a sort of "acceptable face". But it's all very well talking about a light-weight method: SCRUM simply doesn't work without the one thing that all of these methodologies need, the one thing which accounts for 99% of the difficulty and effort of working in this fashion:

Flattening the cost of change curve.

Ken Schwaber in his presentation below, bless him, does allude to the fact that your speed might just slow down as time goes on, but he makes sure he describes this as happening between releases (i.e. outside SCRUM) rather than between Sprints. I'd like to know what Scrum Masters do when they start plotting their Burndown Chart and find that they have a curve, rather than a line, and that that curve rather disturbingly decreases in gradient as time goes on.
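The point can be made with a toy model (all numbers, and the decay rate in particular, are invented for illustration): if the cost of change isn't flat and velocity decays a little each Sprint, the burndown becomes exactly the sort of curve described, with a gradient that shrinks as time goes on.

```python
# Toy burndown model (all numbers invented). With decay = 0 the chart
# is the ideal straight line; with a small per-sprint decay in velocity
# the remaining-work plot becomes a flattening curve instead.

def burndown(total_work, velocity, decay, sprints):
    remaining = [float(total_work)]
    for _ in range(sprints):
        remaining.append(max(0.0, remaining[-1] - velocity))
        velocity *= (1 - decay)   # cost-of-change curve isn't flat
    return remaining

flat = burndown(100, 10, 0.0, 10)    # straight line: finishes on time
curved = burndown(100, 10, 0.1, 10)  # decaying velocity: work left over

print(flat[-1], round(curved[-1], 1))  # -> 0.0 34.9
```

Ten Sprints that "should" clear 100 units of work leave a third of it undone with only a 10% per-Sprint slowdown - which is the Scrum Master's curve-not-a-line problem in miniature.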

You could rather blithely answer that it is the Scrum Master's job, as QA guy or whatever, to make sure that code maintains its beautiful integrity across Sprints. This, however, is where *all* the effort in software engineering happens, and it had better be your best engineer who is Scrum Master if that's going to be his particular job.

And even then I don't think anyone ever believes you will flatten the curve completely. Rules of interoperability (cf. The Mythical Man-Month) still state that as time goes on the same step-wise increment in functionality will take longer and longer to implement. Keeping the code's integrity clean will stop you getting into a mess, but it won't prevent your code getting more complex.

And even if you could still somehow do it you still have to factor in requirements-churn, which is not going to be constant throughout the lifetime of the project.

And (and I'm sorry about all the "and"s) you also need to measure how much extra deprecated work you are having to produce in order to sustain your short iterations and rapid releases. I do most certainly believe that some of this has to happen, but I do not believe that time should be your only consideration in determining your iteration life-cycles. You have to consider the certainty of requirements and the risk of implementation as well. If you're sub-optimal, you could end up doing a lot of unnecessary refactoring.

Everyone wants predictability and everyone wants increased productivity. SCRUM and XP both recommend the mega-sensible approach of taking vertical slices through a product and developing iteratively. I'm not quite sure about SCRUM insisting on "release" - sometimes that's simply not possible - just "completion" is enough for me. Time-boxing meetings just seems like a bit of a sales-pitch to all those managers who think their engineers waste a lot of time arguing (those managers should read Peopleware or start managing something else). I certainly wouldn't want anyone ringing the bell and saying time's up if I'm still not sure what I should be doing that day. Of course, you need structure, but SCRUM doesn't have to lay that down - decent team leaders do that already in whatever way suits them and their team best.

After that, you have to look at XP or other techniques for the best way to improve your engineering practices.

Ask yourself: if 65% of software doesn't make it out the door, why is that? Is it because programmers aren't typing fast enough? Is it because programmers are spending too much time chatting about TV or something? Is it because project management is up the spout?

Or is it because programmers are producing too many defects?

SCRUM, well, ok, maybe it's a good way to get XP through the door. Introducing iterative development is fantastic, but you really need to go elsewhere to make the biggest improvements in productivity and predictability. I strongly suspect that where claims of SCRUM's great successes are being made, it's actually something else - not part of SCRUM itself, but something that SCRUM has helped introduce - that is making all the difference.

Richard

References:

Robert Martin's article can be found here.

The original SCRUM white paper can be downloaded from here.

The author himself talks about it here.

And Jeff Sutherland adds his views to it here.

16 Oct

Requiem

When my mother died a few years ago her husband played some Joan Baez music at the end of the funeral as a way of communicating the sort of person that she was and the way she would like to be remembered.

Ever since then I've been wondering to myself what I would like to be played when the time comes for everyone to say goodbye to me.

Music is very important for me - I listen to music a lot while I'm working.

Actually, I think that music is a necessity for programmers, in a way that a lot of non-programmers struggle to understand. The reason for this is that programming is an activity that alternates between the highly cerebral and the dreadfully boring. Ramp up someone's mind to fever pitch and then give them something dull to do and you are in danger of causing depression. Music is the way that programmers rescue themselves from that. When you're thinking hard you turn it off; when you're committing your thoughts to keyboard you turn it back on again.

Despite the intervening decades I remain principally into 70s music.

The thing about growing up in the 70s was that there was so much good stuff around back then that you couldn't possibly be into it all at the time. I'm catching up with a lot of it now, particularly with the Rolling Stones (loved Black and Blue) and more recently Genesis.

And I think I have found in "Dusk", from the album "Trespass", my Requiem song (assuming Messrs Gabriel and co don't mind me pinching it off them as I'm sure they wrote it for themselves).

Music works, of course, when it engages you emotionally, and when you're considering something of this importance it's going to be very personal. "Dusk" gets it spot on for me because it communicates the beauty of the tragedy of human existence (if that makes any sense to you).

The beauty of existence comes from its intensity, even if only for a few short moments:

"The scent of a flower,
The colours of the morning,
Friends to believe in,
Tears soon forgotten,
See how the rain drives away, another day."

The finality of life gives it its pathos. I will never see the wonders of the 22nd century and beyond and I wish I could. We are, as the song says in its final line, "passers by, born to die."

And when, finally, a false move by God does destroy me, there'll still be another day, albeit not for me. The leaf may have fallen but the tree isn't broken. I was what I was, loved life as much as I could, even though I knew I wasn't going to be here very long. I wish I could have stayed longer but the option wasn't offered. The world belongs to our children, and then they'll have to pass it on to theirs.

Richard

2 Oct

The Principles of Software Engineering #4: Object

At first glance this may not seem like such a big deal. After all, we had methods, we had data structures - what is an object if not just a simple collection of methods, each of which takes a pointer to a data structure as a parameter, all gathered together somewhere nice and convenient?

Indeed, from a basic programming mechanics point of view that is all it is, at least until you get into polymorphism and the next principle "Interface" (and more on that in the next post).

However the concept of the Object and Object Based Programming introduced a whole new approach to programming which was, in my opinion, a far greater Paradigm Shift than the later much more lauded Object Oriented Programming.

Objects brought with them Object Design, and Object Design allowed designers to construct solutions on the basis of a model of the problem domain, rather than approaching the problem from the point of view of, well, a computer.

The programmer's job prior to the arrival of, let's call it, "Object Based Thinking", was along the lines of a well-known university textbook of the 80s, Niklaus Wirth's "Algorithms + Data Structures = Programs". Although a programmer would quite sensibly use names from the problem domain in the solution domain in order to make the code easier to understand, the two domains were still very much separate. The solution solved the problem using its own artifacts - i.e. data structures and functions. Programming was procedural in the main because people generally think procedurally - i.e. in order to solve the problem we do this thing first, then the next, and so on. Keeping track of what was going on came down to Flow-Charts and Data-Flow Diagrams, and there was always a "main loop" somewhere, processing events and farming the work out.
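That procedural world can be caricatured in a few lines (all names here are invented for illustration): data structures on one side, functions on the other, and a main loop dispatching the work between them.

```python
# A caricature (invented names) of the procedural style described above:
# plain data structures, free-standing functions that have data passed
# to them, and a central main loop farming the work out.

customers = {"alice": {"credit_ok": True}}   # plain data structure
events = [("loan_request", "alice", 1000)]   # pending work

def handle_loan_request(name, amount):
    # the function receives the data it operates on from outside
    return customers[name]["credit_ok"]

def main_loop():
    results = []
    while events:
        kind, name, amount = events.pop()
        if kind == "loan_request":           # central dispatch
            results.append(handle_loan_request(name, amount))
    return results

print(main_loop())  # -> [True]
```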

Object Based Thinking turned that on its head completely. Now the solution domain consisted of interacting objects bouncing off each other like billiard balls on a table. No more main loops. No more data flows. Stack-tracing, that vital debugging technique of the procedural days, more or less lost its value as objects got smaller and better defined (cohesive and loosely-coupled) and as the number of billiard balls knocked and re-knocked around the table in response to one external input increased.

Indeed, in Object Design, you never think beyond what is using you and what you use - i.e. your immediate neighbours. Later on (next principle) you stopped even thinking of them (and them of you) beyond your interfaces. Reusability was just around the corner.

And with Object Design came the rather lovely realisation that this flexibility allowed you to build your solution as a close model of your problem. If the problem domain has a Customer asking for a Loan from a Broker, then the solution domain could have a Customer object call a Loan_Request function (now called method) from an object called a Broker. If the Broker has to check bank references on the customer before agreeing, again the software can have that Broker object call a Bank object's Check_References method passing in the Customer object. And so on.
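A minimal sketch of that scenario (the Customer, Broker and Bank names are from the paragraph above; everything else is invented for illustration) might look like this:

```python
# The same loan scenario modelled as interacting problem-domain
# objects: the solution domain mirrors the problem domain directly.

class Customer:
    def __init__(self, name, credit_ok):
        self.name = name
        self.credit_ok = credit_ok

class Bank:
    def check_references(self, customer):
        # vet the Customer object passed in
        return customer.credit_ok

class Broker:
    def __init__(self, bank):
        self.bank = bank

    def loan_request(self, customer, amount):
        # the Broker asks the Bank to check references before agreeing
        return self.bank.check_references(customer)

broker = Broker(Bank())
alice = Customer("Alice", credit_ok=True)
print(broker.loan_request(alice, 1000))  # -> True
```

Each object only ever talks to its immediate neighbours, which is exactly the "never think beyond what is using you and what you use" discipline described below.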

And this makes software so much easier to understand that it is probably the most important development in software engineering that we have ever had.

But - there's something rather counter-intuitive to the engineer, who likes to think in terms of machines sorting out problems, which has definitely caused a slowness in the acceptance of problem-domain modelling as a technique. Although it is clear that it cannot be taken to the nth degree, the temptation to use algorithmic artifacts rather than problem artifacts in design is quite clearly strong (as I have witnessed in my career). However, in my opinion, Problem-Domain-Objects are the linchpins on which the whole future of software development hangs - a future based on the steady increase of the ready-comprehensibility of the software that we write.

Richard

The Principles of Software Engineering: start previous

25 Jul

√42

Well, QuSheet was finally released yesterday, so it's time to break my silence here, and how better than with that little thorny question called "The Meaning of Life".

It seems to me to come down to a simple conflict between two incompatible principles: "Causality" and "Free Will".

Causality adherents claim that Free Will is an illusion. In The Science of Discworld II, the authors actually devote a chapter to this called "Free Wont". I'm not sure why they, and the scientific community in general as far as I can see, throw their hats in so whole-heartedly with Causality. I, personally, favour Free Will.

And I do think it's a matter of favouritism, because neither can be proved. Free Will cannot be proved because it is impossible to re-run time and see whether someone would do something different. Causality cannot be proved because we cannot work backwards from what we see the universe is doing to whatever formula it is (if there is one) that is making it happen.

Consider this little thought experiment. Imagine two computers. One is infinitely powerful (really); the second one is finitely powerful. Call the first one Omega and the second one Squeak. Although Omega is a lot more powerful than Squeak, it doesn't actually know how powerful Squeak is. All that Squeak does is put out digit after digit of some irrational number, say the square root of some non-perfect square. Omega catches those digits and is tasked with figuring out what Squeak is actually up to.

Assuming Omega guesses that it is getting a square root, its problem is that at any point in time there are always an infinite number of possibilities as to what this irrational number is the square root of. The square root of 42 begins 6.4807406984, but then so do the square roots of 42.0000000001, 42.00000000011, and infinitely many more. And where the next digit of the square root of 42 is 0, with 42.0000000001 the next digit is 1.

So Omega can never predict what the next digit will be, since it can never know exactly what Squeak is taking the square root of. Even with Squeak being finite and Omega infinite, Omega doesn't know just how many 0s occur between the 42 and the 1 (and then the next 1, or 2, or whatever).
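The digits quoted above are easy to check; here's a minimal sketch using Python's decimal module (the precision setting of 30 is arbitrary):

```python
# Check the quoted digits: sqrt(42) and sqrt(42.0000000001) agree on
# their first eleven significant digits and then part company.
from decimal import Decimal, getcontext

getcontext().prec = 30   # work with plenty of digits

a = str(Decimal("42").sqrt())
b = str(Decimal("42.0000000001").sqrt())

print(a[:13])  # -> 6.48074069840
print(b[:13])  # -> 6.48074069841
```

Omega, watching the digits 6, 4, 8, 0, 7... arrive one at a time, has no way of telling which of these two numbers (or of the infinitely many others sharing the prefix) Squeak is working through.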

And given that, Omega can't even be sure that the numbers coming out of Squeak follow any formula at all. They might be random (as in truly random). Maybe Squeak has free will and is choosing what numbers to put out (and if you re-ran time it would choose different ones).

And if Squeak is our universe, where irrational numbers abound, then Omega will never be able to figure out how the universe works. Or even whether it actually works at all in accordance with logic and formulae.

Unless the day comes, of course, when someone figures out the one and only formula which could possibly work. It would have to be a formula which explained *everything*. Until then, there will always be more than one way to explain our observations, indeed an infinite number of ways, and while that is the case we can never be sure that the universe isn't choosing what it's up to.

Which in my opinion puts Causality and Free Will on an even footing.

And I choose to believe in the latter because, although I can see Causality at work when I throw a stone into a lake and watch it cause those nice little ripples, I know my senses can be fooled, whereas Free Will is a lot closer to home, if you see what I mean.

Richard

2 May

It's been a little quiet here for a while

The reason for this is that I'm on the last furlongs of getting QuSheet finished and I haven't wanted anything to distract me.

Things are likely to be a little quiet here for a couple more weeks until I get QuSheet out to my Beta testers.

Just out of interest, after more or less finishing the development work on QuSheet (I've got about a day's worth of bits to clear up), I still had the following to do:

1) Write all the help text.
2) Sort out installation (using ClickOnce)
3) Sort out activation code so that installations are legitimate
4) Sort out icon and logo
5) Script and write tutorials (still doing this - massive job in lieu of writing user manuals (another massive job))
6) Sort out web-site (to-do)
7) Script and write over-view presentation

Then there's beta testing. I'm hoping I'm not going to get a lot of problems thrown up by this, as I've been quite meticulous testing out all my functionality as I wrote my help text and tutorials.

Time ticks on and I'm rapidly running out of funds. I'm pretty confident I'll finish before I run out completely, though then it'll be time to find some work, funding or quick sales.

All the best

Richard
