Series: Software Engineering Fundamentals | Part 3 of 19 > Delivered at Universidade Potiguar (UnP) in 2010
In the third lecture of the Software Engineering course at Universidade Potiguar (UnP), we dove into defined process approaches — with a focus on the well-known waterfall model. But more than memorizing stages and sequences, this class was about context, reflection, and adaptability.
When does a strict sequence make sense?
We opened the lecture with a direct provocation: when does it make sense to follow a rigid sequence of steps? For that, we used a simple analogy: how do you add two numbers?
The idea was to clarify the difference between defined and empirical processes. A defined process assumes we understand the steps, risks, and outcomes well. It’s effective in highly predictable contexts. On the other hand, in uncertain environments, the empirical approach — based on inspection and adaptation — proves to be more appropriate.
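The contrast can be sketched in code. This is a minimal, hypothetical illustration (the function names are mine, not from the lecture): a defined process executes a fully understood sequence of steps and trusts the result, while an empirical process cannot plan the answer up front, so it inspects the outcome after each step and adapts.

```python
# Defined process: the steps are fully understood, so we just execute them.
def add(a: int, b: int) -> int:
    # Adding two numbers is the lecture's example of a defined process:
    # same inputs, same steps, same result -- no inspection needed.
    return a + b

# Empirical process: the outcome is uncertain, so we inspect and adapt.
def empirical_sqrt(target: float, tolerance: float = 1e-6) -> float:
    """Approximate sqrt(target) by guessing, inspecting the error, and adapting.

    We cannot 'plan' the exact answer up front; we converge by feedback
    (here, a Newton's-method step after each inspection).
    """
    guess = target / 2 or 1.0
    while abs(guess * guess - target) > tolerance:   # inspect
        guess = (guess + target / guess) / 2         # adapt
    return guess

print(add(2, 3))                        # defined: always 5
print(round(empirical_sqrt(2.0), 4))    # empirical: converges to ~1.4142
```

The defined function is a recipe; the empirical one is a feedback loop — which is exactly the distinction the classroom activity asks students to recognize in real scenarios.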
Activity: Classifying contexts
The first activity of the class involved students working in groups to analyze real or fictional scenarios and answer: is this a defined or empirical process? Each answer needed justification based on the components of the environment.
This dynamic works great in companies too. Just adapt the scenarios to internal products or workflows and promote a discussion: where does our current approach fit? The goal is to stimulate discernment between control and adaptability — something vital for both students and professional teams.
The waterfall model, with a critical lens
In the second part, we presented the classic software lifecycle model: requirements, analysis, design, implementation, testing, and maintenance — step by step, like a recipe.
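The "recipe" shape of the model can be made concrete with a small sketch. The phase names follow the lecture's list; the transformation each phase performs is a hypothetical stand-in (it just wraps the previous artifact's name) — the point is the strict sequencing, with no loop back to an earlier phase.

```python
from functools import reduce

# Phase names from the lecture's waterfall sequence.
PHASES = ["requirements", "analysis", "design",
          "implementation", "testing", "maintenance"]

def run_phase(artifact: str, phase: str) -> str:
    # Each phase consumes the previous phase's artifact in full;
    # in pure waterfall there is no going back to revise earlier work.
    return f"{phase}({artifact})"

# The whole project is one pass through the sequence -- like a recipe.
final = reduce(run_phase, PHASES, "idea")
print(final)
# -> maintenance(testing(implementation(design(analysis(requirements(idea))))))
```

The nesting makes the model's assumption visible: every later phase is built entirely on the frozen output of the one before it.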
But we went beyond the sequence. We discussed Big Design Up Front (BDUF) and the historical reasons behind this approach. We introduced Barry Boehm’s curve on the cost of changes throughout a project, showing how early decision-making made sense in an era of months-long projects and rare deployments.
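Boehm's argument can be shown with a toy model. The multipliers below are illustrative stand-ins for the kind of growth his curve describes, not his published figures: the later an error is discovered, the more completed work it invalidates, so the cost to fix it climbs steeply.

```python
# Toy cost-of-change model -- illustrative numbers only, not Boehm's data.
# The point is the shape: a requirements error found in production costs
# orders of magnitude more than one caught during requirements.
PHASES = ["requirements", "design", "implementation", "testing", "production"]

def cost_to_fix(phase: str, base_cost: float = 1.0, growth: float = 4.0) -> float:
    # Assume the cost grows geometrically with each phase the error survives.
    return base_cost * growth ** PHASES.index(phase)

for phase in PHASES:
    print(f"{phase:>14}: {cost_to_fix(phase):>6.1f}x")
```

Under these assumed multipliers a production-stage fix costs 256x the requirements-stage fix — which is why, in an era of months-long projects and rare deployments, front-loading decisions (BDUF) looked rational.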
We also used analogies like “building a house” to explain the rationale behind the model’s rigidity. Still, we made it clear that rigidity and clarity are not the same thing — and that modern projects demand flexibility without sacrificing planning.
Real-world challenges
In the third segment, our debate turned to how the model fails in real environments. Can you move a building once it’s constructed? Does software need to wait until everything is ready to start delivering value?
We brought excerpts from classic literature that expose the model’s limitations in unpredictable contexts. We also reflected on the role of active supervision (“the ox only fattens under the eye of the owner”), questioning the overreliance on documentation and fixed timelines.
How to facilitate with depth
If you want to deliver a class or workshop with this content, start with simple, progressive provocations. Use everyday analogies to explain technical concepts. Encourage activities that contrast models with the actual experience of your team or students.
And above all, don’t turn the waterfall model into a villain. Show that it is one of the tools in the software engineer’s toolbox — useful in certain contexts, limited in others. We’re aiming to cultivate critical thinkers, not just followers of method.
Posted as part of the lecture journal for the Software Engineering course. Today, we reflected on recipes, plans, and the importance of thinking critically before following any process.
Series Navigation
- Introduction: Part 1 - Why Software Engineering?
- Previous: Part 2 - Taming Complexity
- Current: Part 3 - Waterfall Model
- Next: Part 4 - Evolutionary Models
- Complete series: Why Software Engineering? | Taming Complexity | Waterfall Model | Evolutionary Models | Agile Mindset | Scrum Productivity | Scrum Cycle | XP Quality & Courage | XP Principles & Practices | XP in Practice | Domain-Driven Design