We work in iterative cycles. Our methodologies are XP and kanban. Our preferred sprint length is 1 week. We do quarterly goal planning & review.
Kanban refers to keeping track of small, accomplishable tasks by means of cards, and moving the cards through lanes. Our typical lanes are “todo”, “doing”, and “done”, although this varies as complexity grows. If the work is split by team function, that split is reflected in the lanes: there may be additional lanes for “content”, “Q/A”, “graphics”, “marketing”, and “sales”. Our iterative cycle consists of the following steps:
- Step 1: Scope & Mockup
- Step 2: Develop & Test
- Step 3: Deploy & Automate
- Step 4: Measure & Monitor
- Step 5: Benchmark & Pentest
- And back to step 1
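The card-and-lane model above can be sketched as a minimal data structure. This is an illustration, not the schema of any particular kanban tool; the lane names and `Card` fields are assumptions.

```python
# Minimal kanban sketch: cards move left to right through lanes.
# Lane names and Card fields are illustrative, not a real tool's schema.
from dataclasses import dataclass

@dataclass
class Card:
    title: str
    lane: str = "todo"

class Board:
    LANES = ["todo", "doing", "done"]

    def __init__(self):
        self.cards = []

    def add(self, title):
        card = Card(title)
        self.cards.append(card)
        return card

    def advance(self, card):
        """Move a card to the next lane, if there is one."""
        i = self.LANES.index(card.lane)
        if i < len(self.LANES) - 1:
            card.lane = self.LANES[i + 1]

board = Board()
card = board.add("Implement login form")
board.advance(card)  # todo -> doing
board.advance(card)  # doing -> done
```

A team split by function would simply extend `LANES` (e.g. inserting “Q/A” before “done”) without changing the rest of the model.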
Kanban and sprint planning help us split the work into accomplishable tasks. We would rather deliver early than allow scope creep and deliver late. It is important to note that the work is iterative, as opposed to waterfall. After some scoping and mockup work is done, and some software is built, deployed, and accepted, there is typically more scoping and more development. This relieves the stress on the stakeholders, who aren’t required to specify the scope correctly at the onset of the project. In prototyping and delivering viable early-stage products, it is critical to keep the scope flexible, as it inevitably changes during the project run.
Step 1. Scope & Mockup
We are good at estimating how long it will take to deliver code. It is a part of our job. In order to arrive at accurate estimates, we break down the work into accomplishable, measurable chunks. Optionally, we calculate the velocity of the team (how quickly stuff gets done) to adjust future estimates.
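The velocity calculation is simple arithmetic: average the points completed over recent sprints, then divide the remaining backlog by that average. A minimal sketch, with made-up numbers:

```python
# Velocity sketch: average story points completed per sprint,
# used to forecast how many sprints the remaining work will take.
# All numbers below are made up for illustration.
import math

def velocity(points_per_sprint):
    """Average points completed per sprint over recent history."""
    return sum(points_per_sprint) / len(points_per_sprint)

def sprints_needed(backlog_points, points_per_sprint):
    """Forecast sprints to finish the backlog, rounded up."""
    return math.ceil(backlog_points / velocity(points_per_sprint))

history = [8, 13, 10, 9]            # points completed in the last four sprints
print(velocity(history))            # 10.0
print(sprints_needed(42, history))  # 5 sprints at ~10 points each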
We separate graphic design from backend development. We understand that frontend and backend work involve very different ways of thinking, and strive to hand these categories of tasks to different team members, although in a small team that is not always possible. We prefer to sign off on mockups before they are implemented, to relieve the stress on the developers.
Step 2. Develop & Test
The majority of work happens in this step.
Although Quality Assurance (Q/A) is often a separate step from development, we think of it as one. The developer writes the tests for her own code, in a process called test-driven development (TDD). Depending on client needs, we may also implement behavior-driven development (BDD). If we add a separate Q/A step to the development cycle, as per client needs, we still keep development test-driven.
Step 3. Deploy & Automate
Delivering software can be more difficult than making it. That is why we build and use automation tools as a part of our job. Seamless, automatic deployments relieve stress on the team and reduce mistakes and bugs. We balance application complexity against automation complexity: where it is appropriate, we increase automation; at other times, we keep the automation simple and handle the complexity in the application layer.
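A deployment pipeline is, at its core, an ordered list of commands that stops at the first failure. A minimal sketch; the commands and the `deploy.sh` script are hypothetical placeholders, and `dry_run` lets us inspect the plan without executing anything:

```python
# Deployment pipeline sketch: run each step in order, stopping at the
# first failure. The commands and deploy.sh are hypothetical placeholders.
import subprocess

STEPS = [
    ["make", "build"],
    ["make", "test"],
    ["./deploy.sh", "production"],
]

def run_pipeline(steps, dry_run=True):
    """Run deployment steps in order; dry_run only reports the plan."""
    executed = []
    for cmd in steps:
        if not dry_run:
            result = subprocess.run(cmd)
            if result.returncode != 0:
                break  # stop the pipeline at the first failing step
        executed.append(" ".join(cmd))
    return executed

print(run_pipeline(STEPS))  # the plan, without touching production
```

Real pipelines live in CI/CD tooling rather than a script like this, but the shape is the same: ordered steps, fail fast, never deploy untested code.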
Step 4. Measure & Monitor
Monitoring here refers to automatic health checks that are built into the infrastructure. Backups and disaster recovery are also done during this step. Collecting feedback is more human: we sit with stakeholders and end users and discuss how the software is performing, what should be changed and improved upon. Collecting and acting on feedback from the actual users is a critical component of our development cycle.
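An automatic health check can be as simple as polling an HTTP endpoint and reporting whether it answers in time. A minimal sketch, using only the standard library; the URL and timeout are placeholders:

```python
# Health-check sketch: poll an HTTP endpoint and report its status.
# The URL and timeout values are placeholders, not a real service.
import urllib.request

def check_health(url, timeout=5):
    """Return True if the endpoint answers HTTP 200 within the timeout."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except OSError:
        # Covers DNS failures, refused connections, and timeouts.
        return False
```

A monitoring system runs checks like this on a schedule and alerts when one fails repeatedly; the alerting and backup-verification machinery sits on top of the same idea.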
Step 5. Benchmark & Pentest
Depending on client needs, we benchmark the performance of the application. This is an early step in getting it ready to scale. Some scaling decisions are made during the development step: we always strive to make scalable technology decisions. In order to actually scale, we measure throughput and application performance, and tweak settings and components. We make informed decisions, backed by data, when addressing specific scaling issues.
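Measuring throughput can start as a simple timing loop: run an operation many times and report operations per second. A minimal sketch with an illustrative workload; real benchmarks warm up, repeat runs, and use dedicated tooling:

```python
# Micro-benchmark sketch: measure throughput (operations per second)
# of a function, as a baseline to compare tweaks against.
# The workload below is illustrative.
import time

def benchmark(fn, iterations=100_000):
    """Return operations per second for fn over a fixed iteration count."""
    start = time.perf_counter()
    for _ in range(iterations):
        fn()
    elapsed = time.perf_counter() - start
    return iterations / elapsed

ops = benchmark(lambda: sum(range(100)))
print(f"{ops:,.0f} ops/sec")
```

The number on its own means little; it becomes useful when the same measurement is repeated after a settings or component change, so the decision to scale is backed by data.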
Privacy and security are a part of our job. We perform penetration testing and security testing according to client needs and latest industry standards.
At the end of step 5, we are ready to sit with stakeholders to address the current needs and outline the tasks that go into the next sprint. Our preferred sprint length is 1 week.