
Prototyping and the SDLC

The Prototyping Model Applied to the SDLC

Embarking on any development project in a new supplier/customer relationship can be a daunting experience for all parties involved. There is a lot of trust to be built in what is usually a fairly short time, and it is sensible to select an approach that improves the chances of the project starting up successfully and progressing as planned to the next phase.


In my experience, there is no single ‘correct’ method for doing this, though clear dialogue and experience with project management methodologies can help immensely. Depending on the school of thought, the project type and the customer requirements, any one of a number of project management methods can be employed, and it usually falls to the project manager to select an approach that also best suits the needs of the business case.

One such approach that has worked well for me in the past is the ‘prototyping model’ approach to the software development lifecycle (SDLC). Software prototyping itself, of course, isn’t a new concept, but it is becoming more popular and should be seriously considered when starting out on a new project where it is recognised that there are other risk factors involved, such as fresh partnership agreements, subjective designs and/or new technologies.

An obvious reason prototyping is becoming more popular is its relatively low-risk nature. In a short space of time, a customer has an opportunity to perform a comprehensive skills assessment of the supplier before deciding to move forward with (or withdraw from) the main project. This substantially reduces cost and quality risks at the outset.

In turn, a supplier can ascertain whether the customer has a decent grasp of their product vision and can specify a clear set of requirements, so the prototype partnership is usually a mutually beneficial one. If prototyping proves a positive experience for both parties, there is good reason to remain confident in the project partnership going forward.

There are a number of benefits to prototyping which can suit either party, but one of particular benefit to the customer is using prototyping as a vehicle for choosing between a number of suppliers who are all bidding for the same project. Again, there is less risk, certainly for the customer and potentially for the supplier as well, since neither party would wish to continue with a project that has failed at the first hurdle.

So really, what I am saying here is that prototyping is a cheap, powerful assessment tool and, depending on the approach, could form the foundation of the main project. Code developed in the prototype phase can be reused, so the time taken to complete the prototype is not lost from the overall project timescale.

Additionally, prototyping is a tool for building successful working relationships quickly, and it can prove invaluable as a yardstick of supplier capability. Generally speaking, the prototyping SDLC model has one overriding advantage over other SDLC models: it doesn’t rely on what is supposed to happen, i.e. what has been written in technical design documentation. Instead it canvasses the users directly and asks them what they would really like to see from the product. Gradually, the product is developed through a number of iterations and directly addresses the needs of the users in that phase of the project.

The Prototyping SDLC Model

The prototyping model starts out with an initial phase of requirements gathering, much like any other software development process; however, it quickly moves to development once an initial, simple design has been produced. A first iteration is released to the customer for review and feedback, which in turn may elicit further requirements as well as changes to the design and function of the application.

This process continues until the customer accepts the prototype as complete and the project moves to the next phase of development. At this point the prototype can become redundant, it can continue to be used as a tool for examining various design options and/or functional changes, or it can be incorporated into the main project as it is.
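To make the loop concrete, here is a minimal sketch in Python of the cycle described above: build an iteration, let the customer review it, feed any new requirements back in, and stop when the prototype is accepted or the time-box runs out. All of the names here (Prototype, Feedback, run_prototype_phase, the toy review callback) are hypothetical placeholders for illustration only, not part of any real framework.

```python
"""A minimal sketch of the prototyping feedback loop (illustrative only)."""

from dataclasses import dataclass, field


@dataclass
class Feedback:
    accepted: bool
    new_requirements: list[str] = field(default_factory=list)


@dataclass
class Prototype:
    iteration: int
    requirements: list[str]


def build_iteration(iteration: int, requirements: list[str]) -> Prototype:
    # Stand-in for the actual development work on this iteration.
    return Prototype(iteration, list(requirements))


def run_prototype_phase(initial_requirements, review, max_iterations=5):
    """Iterate until the customer accepts the prototype, or the time-box expires."""
    requirements = list(initial_requirements)
    prototype = None
    for iteration in range(1, max_iterations + 1):
        prototype = build_iteration(iteration, requirements)
        feedback = review(prototype)
        if feedback.accepted:
            # Hand over to the next phase: discard the prototype, keep evolving it,
            # or incorporate it into the main project as it is.
            return prototype
        # Customer feedback may elicit new requirements and alter the design.
        requirements.extend(feedback.new_requirements)
    # The time-box guards against over-investing in the prototype phase.
    return prototype


if __name__ == "__main__":
    # Toy customer: accepts once a 'reporting' feature has been built in.
    def toy_review(p: Prototype) -> Feedback:
        if "reporting" in p.requirements:
            return Feedback(accepted=True)
        return Feedback(accepted=False, new_requirements=["reporting"])

    result = run_prototype_phase(["login", "dashboard"], toy_review)
    print(f"Accepted after iteration {result.iteration}: {result.requirements}")
```

The time-box is the important design choice here: it stops the iterative stage from absorbing more effort than the prototype is worth, which is one of the pitfalls noted in the disadvantages below.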

Since the prototype is developed in what is largely an agile process, there is no reason the main application cannot be developed in the same way. Purists may argue that the prototyping model is an inherently waterfall approach, but I would argue that any purist approach to the SDLC can cause issues; practically speaking, one should adopt an approach that suits the customer and the project, i.e. flexibility is key.

Prototyping – Pros and Cons

Prototyping offers a number of advantages:

1. Time-boxed development ensures a minimum viable product
2. Users are encouraged to participate in the design process
3. Changes can be accommodated quickly and easily
4. The application evolves through a series of iterations with corrective feedback loops; generally, this leads to a much more widely accepted product
5. Defects can usually be detected and fixed early on, possibly saving resources later on
6. Areas of functionality that are missing and/or confusing to users can be easily identified and remedied

There are, however, a few disadvantages:

1. Most prototypes are never fully completed and may carry technical debt that is never fully addressed. This is especially important if the prototype is used as the basis for the real application
2. Documentation, if required, can be a bit of a nightmare since changes are usually frequent and can be substantial
3. There can be a reluctance among developers to move away from the initial prototype design. Instilling a new design mindset can be difficult
4. If integration is required and the prototype is unstable, this can cause other issues
5. Over-enthusiasm in the iterative stages can result in too much time being invested in this phase

Will it work for you?

Before you can answer that question, it will probably be a useful exercise to ask yourself what you need to get out of the prototype and what you intend to do with it afterwards.

The prototype model can be exercised in a few different ways, and this can have a substantial impact on the project as a whole. More often than not, prototypes fall into one of four categories:

1. Prototypes that are meant to be thrown away after the prototype phase
2. The evolutionary prototype, where iterations continue and the prototype evolves into larger and more functional prototypes and, ultimately, the final application
3. The incremental prototype, which takes a more modular approach: each subsystem is prototyped and then all of the subsystems are integrated to build the final application
4. A web-based technique known as extreme prototyping, where only the web pages are developed initially and the functional aspects are added in the final stage; an intermediate stage is used to simulate the data processing as the prototype evolves.

The technology used to develop the application can also have an impact on how development proceeds. For example, with mobile apps most IDEs have built-in simulators to help with rapid design and demonstration of the app, so prototyping is almost implicit in the overall build approach.

Whichever SDLC approach you choose, prototyping should be considered a useful tool in application development. It can save time and cost, and it can be an invaluable indicator of the future success of your project. Priceless!

“I Am Not A Number, I Am A Free Man!”

No one likes to be numbered, categorised, graded, pigeonholed, ranked, classified or otherwise grouped into a systematic or similarly defined logical set. We’re all individuals, right? Especially us IT, techie-type individuals. We’re as unique as virtual grains of sand on a synthetic beach, washed up by the tides of cyberspace. We like to flaunt our egoistic personalities by adopting avatars, Twitter handles, pen names and the like, and we do it all in the name of individualistic pursuits or anonymity on the web. We like to distinguish and uniquely identify ourselves in the blogosphere, on forums, in games arenas and chat rooms, or wherever it is we may electronically hang out and virtually socialise.

Well, I have news for you. According to this article, we’re ALL going to be reduced to one of only three functional possibilities:

1. Consultants
2. Project managers
3. Developers

Now that the post is nearly one year old, is this still actually the case, or was it ever the case? Do we look around our offices and see only Consultants, Project Managers and Developers? And if we do, what then? Does that mean we’re more efficient or less creative? Now, call me boring, but I’m all for seeing and utilising the differences in people, and whilst we may broadly fit into categories, I really don’t see the benefit of this mundane, over-simplistic approach. After all, do we not already relish shoe-horning ourselves into rather pointless categories of height, size, race, religion and so on? And where exactly has that got us?

But this is not a philosophical debate, it’s a practical one, and I personally think that in the IT industry there are more roles than just Consultants, Project Managers and Developers. I mean, if we’re going to be pedantic about it, what about the role of scrum master? Granted, it’s a managerial-type role, but it’s mainly a servant-leader role; it is usually not development-orientated, nor does it necessarily involve managing projects as a whole. The fact that people have wide and varied skills is a good thing; it allows for, and is conducive to, creative thought processes, compromising attitudes and entrepreneurial ideas. If we were all the same, or even broadly the same, what a dismally boring place we would have on our hands and how pathetically bland our work environments would be.

I could write more on this here, but it might appear as a rant and I am reminded this is a blog, so let’s consider and respond to the points raised on the website mentioned above.

Consultants first then…

1. Consultants

“Lets face it, all but the largest enterprises would prefer to not to have any IT professionals on staff, or at least as few as possible. It’s nothing personal against geeks, it’s just that IT pros are expensive and when IT departments get too big and centralized they tend to become experts at saying, NO. They block more progress than they enable.”

Now, I’m not sure what world this particular writer lives in, but generally in my world, if you have spent a number of years learning, practising and becoming well recognised as an industry professional in your arena, then I believe this entitles you to a decent salary. Furthermore, as a consultant myself, I am used to making the effort to say “YES” to my clients. Obviously there are times when one simply cannot achieve what is being asked, usually within tight deadlines and/or cost estimates, and at those times it is best to say “NO”, but to suggest that IT departments generally say “NO” because they have attained a degree of power within a company is not a point of view that I would willingly go along with.

Next up, commenting on Project Managers, our writer suggests that…

2. Project managers

“Most of the IT workers that survive and remain as employees in traditional companies will be project managers. They will not be part of a centralized IT department, but will be spread out in the various business units and departments. They will be business analysts who will help the company leaders and managers make good technology decisions. They will gather business requirements and communicate with stakeholders about the technology solutions they need, and will also be proactive in looking for new technologies that can transform the business. These project managers will also serve as the company’s point of contact with technology vendors and consultants. If you look closely, you can already see a lot of current IT managers morphing in this direction.”

I may not disagree entirely here, but I think it is a gross generalisation to say that the employees who remain will become project managers. Perhaps this is true in very large organisations, but in small to medium-sized companies there are simply not enough positions to justify everyone moving into a project-management-type role. Additionally, the idea of a project manager gathering business requirements doesn’t quite ring true. Most projects that I have worked on have had a dedicated Business Analyst to do that, and the idea of a Project Manager communicating technical solutions to stakeholders is also rather flaky. Usually an architect, or a team of architects, will design and present a technical solution to a client, since many IT project managers have neither an IT nor a technical background and are wholly incapable of operating in this area. The role of the Project Manager, in a nutshell, is overall responsibility for the successful planning, execution, monitoring, control and closure of a project. They may perform other duties, but it is unlikely that these will include requirements capture, system design or any of the usual technical functions. There are of course exceptions, and I have managed projects wearing a number of hats (scrum master, developer, team lead, project manager etc.), but on the whole this has not been my experience in the IT world.

Finally our writer considers the Developers, a bunch dear to my heart.

3. Developers

“By far, the area where the largest number of IT jobs is going to move is into developer, programmer, and coder jobs. While IT used to be about managing and deploying hardware and software, it’s going to increasingly be about web-based applications that will be expected to work smoothly, be self-evident, and require very little training or intervention from tech support. The other piece of the pie will be mobile applications both native apps and mobile web apps. As I wrote in my article, We’re entering the decade of the developer, the current changes in IT are shifting more of the power in the tech industry away from those who deploy and support apps to those who build them. This trend is already underway and it’s only going to accelerate over the next decade.”

OK, so there are some good points here, I’ll admit, especially regarding the expansion of the mobile app world, but I think the writer falls short of fully understanding what a revolution this has actually been. In the past, the developer role was a skilled one that usually took several years to master in any given language. Since iOS, Android and Windows CE were introduced, coupled with free IDEs, mobile emulators, online source code repositories and the like, it has never been easier for the average Joe to create a decent application.

Coding is no longer the exclusive preserve of those who can afford the software tools, and with many big software vendors offering ‘lite’ or free versions of their flagship software, it really has taken the development world by storm. The writer also fails to consider the other massive revolution which definitely took hold last year and which continues to boom; that, of course, is the ‘cloud’. Just put ‘estimated cloud growth’ into Google to see what the results are. It certainly doesn’t take a monkey with two heads to figure this one out. It’s massive.

So, lessons learned?

1. Forget about generalising people, even in terms of the function they perform.
2. Focus on why people are different and how that can help your business.
3. Respect talent and give credit where it is due.
4. Give project managers a break; they really can be useful (OK, a bit tongue-in-cheek).
5. Stick it out with longer blogs, they can be worth it.
6. Live long and prosper.