Notes for SDL Instructors
Overall structure of SDL:
- Fall semester: establish scope & epics, build prototypes, run four
sprints. The goal will be to deliver an initial, trial version of the system
for use by the product owner. The sprint that crosses Thanksgiving week
should include one class period in which students present their work,
focusing on system design; this evens out the sprint lengths.
- Spring semester: five more sprints, incorporating feedback from evaluation
version. A final version of the system is delivered to the client by the
end of the school year. The expectation is that the team will carefully
document the system design (in enough detail that a following team can
pick up the project) and that the final deliverable will include a
comprehensive, automated test suite.
The expectation is that by the end of each term, all code meets standards,
designs are documented in diagram form, all work in progress has been
merged into the main branch, and all other branches have been closed.
One thing that is not included is a formal user evaluation. Student teams
in SDL do not have the resources for such tests. They may be able to run
informal tests with small groups of students, but full evaluation with
larger groups must be done by the client.
The bulk of the students take SWE 3710 in the fall and SWE 3720 in the
spring. However, students start the sequence in either semester and can have
gaps between the courses. As a result, SWE 3710 students may be taking the
course at the same time as SWE 3720 students. Generally, students
follow the schedule of the dominant group of students on a team. This means
that all students will follow the 5-sprint schedule in the spring semester
even if some students are enrolled in SWE 3710. The assumption is that
students will change roles (as guided by the instructor). Students who
follow the 5-sprint calendar in their SWE 3710 will then bring
experience with finalizing a project to the start of the project in the
following term.
In theory, it may be possible for a student to take SWE 3720 while studying
abroad. Our experience is that this does not work for students at this level. It
is too difficult to monitor engagement when the student is in a very
different time zone. The expectation is that such students will take
advantage of opportunities in their host country and re-engage with SDL
when they return.
Notes on Project Selection
- At a minimum, each project description should include the PO, the
  system goals, key epics for the upcoming year, and the key technologies.
- Six projects is too many for one instructor, largely because it is
difficult to meet six teams during lab time and because it is difficult
to track the details of that many projects.
- Teams should typically have four people and must have at least
  three. When a team has more than five, allocate additional time (at least
  an extra 25%) to monitor the backlog and ensure that all members are
  always working on useful items. If a team has seven, split it into two
  teams with clearly demarcated boundaries.
General Instructions
- The primary responsibilities of each instructor are keeping the team
  engaged in the project, performing ceremonies as appropriate, and
  motivating the team to be responsive to the client. Instructors also
  validate the work to ensure high quality.
- Instructors organize the first meeting with clients. Teams will
organize later meetings throughout the year. The expectation is that
teams will meet with clients at least once a sprint and often twice a
sprint, especially in the fall. Video conferencing can be used for
meetings if it is more convenient for and preferred by the client.
- When clients come on campus, call campus security close to the date
  they will arrive and give them the client's license plate number.
- Generally, the product owner proxy (POP) role is assigned for one
semester. The preference is to rotate the POP role at the semester break,
but in some cases (perhaps for 4-person teams) it might be better to keep
the same POP for the full year.
- All other team roles are rotated on a regular basis so that each of
the other team members receives experience with each of the roles (with
the possible exception of POP).
- The team, rather than the product owner (PO), is expected to write
most PBIs. The notable exception is at the start of the project; it can
be useful to allow the PO to write initial stories. The general
expectation is that POs review and prioritize PBIs. As usual, assigning
story points to each PBI is the responsibility of the team.
- Teams are sometimes tempted to assign 5 or 8 points to simple
  PBIs. This leads to sprints with 70- or 80-point targets. This is an
antipattern: it seems reasonable, but the consequence is that a point
corresponds closely to an hour of work, so the estimates essentially
degrade to counting hours of effort.
- A key method for confirming contributions by each team member is to
  examine commit and time logs. It is important that both provide
  meaningful information so that instructors have visibility into what was
  accomplished. In particular, quality commit messages will help
  instructors distinguish between trivial changes and substantial commits
  that need to be reviewed.
- It is expected that spike PBIs in the second fall sprint will focus
on the technologies involved in the project. They should generally not
include elements of the project; that is, they should not relate to any
project-specific PBIs that are the subject of the third fall sprint.
- Some instructors prefer to create a group for each SDL team, giving
all students Maintainer access. Creating groups ensures each team can set
up epics (since they are global to the group). The group should be in
gitlab.com/msoe.edu/sdl/ and then moved to an archive when the project is
inactive. Alternatively, create a single repository for the team with
Developer access for each team member. Set expiration dates to the
following summer. If there is a single repository, have the team record
epics in the Wiki.
- Within each group, create a project with a README. Students need
  Developer access. It is advisable to set the expiration date to July of
  the following year so it is clear who is contributing to the project.
- I typically make sprint reports/reviews due at least a few days after
the end of the sprint. Students should not do the review or
retro during lab time once the sprint is finished; that time needs to be
reserved for finalizing the sprint backlog and getting started on the
next sprint.
- Starting no later than sprint 3, have the team schedule time during
the sprint to groom the backlog, where grooming includes full acceptance
criteria and pointing. Starting the sprint should consist of simply
moving the top X PBIs to the sprint backlog.
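The commit-log review described above usually comes down to two git
commands. A minimal sketch, built in a throwaway repository so the commands
can be run end-to-end (the author names and messages are invented):

```shell
# Sketch: review per-student contributions with git.
# A throwaway repo stands in for the team's actual clone.
set -e
tmp=$(mktemp -d) && cd "$tmp"
git init -q demo && cd demo
git -c user.name="Alice" -c user.email="a@example.edu" \
    commit --allow-empty -qm "Add login endpoint with validation tests"
git -c user.name="Bob" -c user.email="b@example.edu" \
    commit --allow-empty -qm "wip"

# Commit counts per author, most active first:
git shortlog -sn HEAD

# Full messages and per-commit detail for one student:
git log --author="Alice" --stat
```

In practice, run the last two commands in the team's clone; a log full of
messages like "wip" is itself a signal that the commit history needs
attention.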
SWE 3710
- Overall expectations for SWE 3710:
- Review all PBIs, preferably prior to the start of each sprint.
Students typically struggle to write effective acceptance criteria in
the first term, so it is important to reinforce this through feedback.
- Check time logs for specificity. It is easy for students to get
into the habit of logging time in hour-long chunks with inadequate
descriptions.
- Check contributions in repositories. Ensure the committed material
is commensurate with the claimed time.
- Ensure each student completes their personal spike with sufficient
evidence to support assessment.
- Check that teams are recording meeting minutes, discussions with
their client, and discussions around decisions that are made in the
first term.
Additional expectations are implied by the following discussions.
- First week:
    - Organize meetings with clients. I talk to the students for the
      first hour of the first day, so I arrange the meetings for the second
      hour or the second day. Send email with the client's vehicle make,
      model, and license plate number to PublicSafety-Staff (with a reason)
      so they can park on campus.
    - Ensure all teams are using the same project tracking
      tools. Any differences will consume instructor time in finding
      the appropriate materials.
    - Ensure all teams are using the same repository tool, again
      for consistency. Clients are free to move the work to
another repository when the project is finished. Clients should not
be committing changes to the repository without your approval, so any
access should be read-only. If clients do work, it can be hard to assign
credit and we have had cases in which clients introduce errors or
complexities that make it harder for the team to take ownership.
- Ensure there is a single repository for each project. Multiple
repositories simply mean instructors have more places to look for
project and process evidence. This increases grading effort with little
benefit. Separate repositories for spikes can be helpful if approved by
the instructor.
- Technology assignments:
- These are to be individual; students can help each other, but all
code will be independent and not related to the project.
- If projects include concurrency, prioritize building concurrent
applications in the target language. This likely includes discussing
the concurrency issues with teams, even if that material will also
be covered in Operating Systems.
- Have each student document the goal, technologies tried and used,
how to build the system and test that it works, and lessons learned.
- It is sometimes useful to swap the team-based spike and the
individual spikes. You may also wish to move the individual spikes to
the second sprint, especially when the requirements are relatively
unclear and it is not obvious which spikes are needed early in the
first sprint.
- The bulk of the work will be done outside SDL time. This works for
the first sprint in the fall because most other PBIs will be ones that
the team needs to work on together during lab time.
    - Each student is to schedule demonstration time in the instructor's
      office. No demos during SDL lab time, and students should not
      demonstrate these to the product owner.
    - Grade 30% on functionality, 30% on getting the work done (on
      time, by the end of the sprint; late penalties may be allowed), and
      30% on documentation.
- 2nd sprint:
- 25% of the grade is to be on the lessons learned around the spike
solutions, and these lessons must be documented and included
(verbatim) in the sprint report. Quality of this documentation is a
major part of the sprint grade.
- 10% of the grade is to be on PBIs for sprint 3:
- Is the description sufficient to understand the problem area?
- Do PBIs for stories use the standard form, "as a ___, I would
like to ___ so that ___." ?
- Do PBIs have acceptance criteria using the given/when/then format?
- Is the PBI sufficiently constrained that teams cannot keep adding
new criteria?
Note we are not checking for automation at this point.
- 10% of the grade is to be based on the quality of diagrams.
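As a grading reference, a story in the standard form with given/when/then
acceptance criteria might look like the following (the feature and wording
are invented for illustration):

```
As a student organizer, I would like to filter events by date
so that I can find sessions I am able to attend.

Acceptance criteria:
  Given the event list contains events on multiple dates
  When I select a single date in the filter
  Then only events on that date are shown
  And the filter can be cleared to show all events again
```

Note the criteria are closed-ended: a reviewer can check each clause, and
there is no obvious place to keep adding new conditions.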
- Sprint 3
    - Ensure that the people assigned to tasks are not those who worked
      on that task for the mockups.
- Emphasize that a significant portion of their sprint 3 grade will
be based on their contributions and pull requests. People without pull
requests will not get As or Bs.
- Emphasize that PBIs are not assignments:
- They must submit pull requests when they have
significant portions done, even if the full PBI is not ready for review.
- POPs are absolutely expected to validate PBIs to move
them to Done; only show PBIs to clients if the team believes
they are done. The primary role of the product owner is to attend the
review and set priorities for future work.
- Have teams demonstrate all functionality to the
instructor before demonstrating it to clients. Ask the team
what can be improved about each PBI.
    - Last-minute commit-storms will result in a 5% grade reduction.
    - The quality of the sprint report will be 30% of the sprint 3 grade.
      The report must make clear what was accomplished and by whom, and
      how many hours each student worked (accurately; students are also
      responsible for recognizing overlapping times and time entries with
      no description). Successes must be noted by individual, and process
      goals must include numbers indicating measurable improvements.
    - Hold a formal, 45-minute retrospective with each team outside the
      SDL hours. (Teams will often run their own retrospectives later in
      the year, but this one must be done with the instructor.)
- Sprint 4 (or whatever sprint crosses Thanksgiving break): this
  sprint will run four weeks, but part of that is the break. Students are
not expected to put in time during the break. Each team is to present
their project and a high-level design to the class. Encourage other
class members to suggest solutions to design issues and otherwise share
information.
- By sprint 4, sprint reports should be concrete enough
that they are sufficient to understand the state of the team and project.
Give the team specific directions for report improvement.
- By sprint 4, students should be writing useful acceptance
criteria. Instructors should still review PBIs, but it's often difficult
to do this evaluation before the sprint starts. It is still useful to do
it while the sprint is ongoing.
- Push teams to establish effective CI toolchains by the end of the
term. Check that all tests
pass. Often the early tests will not be effective at finding lots of
errors; working on this can be a goal for SWE 3720.
- An initial version of the system must be deployed for use by the
product owner by the end of the term. This will help the PO provide
useful, concrete feedback on system operation.
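The CI toolchain pushed for at the end of the term can start very small.
For teams on gitlab.com, a minimal .gitlab-ci.yml along the following
lines is often enough; the image and commands are placeholders for the
project's actual stack:

```yaml
# Illustrative pipeline: build and test on every push.
stages:
  - build
  - test

build-job:
  stage: build
  image: node:20        # placeholder; use the project's runtime image
  script:
    - npm ci
    - npm run build

test-job:
  stage: test
  image: node:20        # placeholder
  script:
    - npm test          # the pipeline fails if any test fails
```

Once a pipeline like this passes reliably, making the tests themselves
effective at finding errors becomes a natural goal for SWE 3720.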
SWE 3720
The model established in the third sprint in SWE 3710 applies to this term
as well. Additional notes:
- Work with teams to establish effective process improvement
goals. Teams will often start with "estimate better" - this is typically
a weak process improvement goal because good estimates require the
experience the students are getting by taking the course. Push towards
other process improvement goals, working with the teams to ensure they
have measurable criteria.
- Overall, an instructor's focus typically moves to ensuring the team
is meeting design and communication challenges rather than closely
monitoring process. For example, there is less need to be closely
involved in reviews and retrospectives.
- Push for multiple releases over the semester. There will be a final
release at the end of the term, but intermediate releases provide
important information.
- Ensure the CI system is effective at identifying errors and that the
  team achieves high code coverage. Encourage teams to develop a
behavior-driven development (BDD) model; that is, tests written in a
natural language that can be understood by non-developers.
- If possible, instructors should build the system for themselves and
  run it locally. Doing this regularly increases the likelihood that the
  final release will be stable. Relying on student builds means that key
steps are likely to be undocumented. Doing additional exploratory testing
is also critical to ensuring there are not major faults that would make
the system unusable by clients.
- The last sprint must be focused on design documentation, ensuring
there is a stable build, and fixing only the most critical
issues. Strongly discourage new functionality; it will likely not work,
and time spent on adding functionality will degrade important
documentation. Inadequate documentation increases the chances of projects
being rejected by clients and significant portions of the projects being
rewritten by later teams.
- Ensure all branches are closed by the end of the year and that all
code going forward is in the main branch. If a branch
captures incomplete work that might be useful to future teams, have the
team document that with a PBI and close the branch without
merging. Typically, future groups will not find such code
useful, so it might be good to warn teams that unfinished work is likely
to never be used. The focus should be on polishing features rather than
adding new features, even if those features are considered critical by
the client.
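Closing an unmerged branch is mechanical once its useful content has been
recorded in a PBI. A sketch in a throwaway repository (the branch name is
invented):

```shell
# Sketch: delete an unmerged branch after documenting its contents in a PBI.
set -e
tmp=$(mktemp -d) && cd "$tmp"
git init -q demo && cd demo
git -c user.name="T" -c user.email="t@example.edu" \
    commit --allow-empty -qm "initial commit"
git branch spike/report-export       # leftover, unmerged work

# After recording anything useful in a PBI, force-delete without merging:
git branch -D spike/report-export

# If the branch also exists on the server:
#   git push origin --delete spike/report-export

git branch --list                    # only the default branch remains
```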