Professional Computing CITS3200
Definition of Deliverable C
This Deliverable has multiple parts. All are to be submitted
in electronic form via cssubmit.
Please use a single archive (Zip or tar/gzip) containing all the required parts.
Please ensure your Group ID appears clearly in the filenames.
Please also make sure that every document/file has proper authorship and version information.
The final system plus any user documentation, including instructions on how to run the application
should also go to the client.
Once that has been done, acceptance testing will need to be undertaken and documented, with
a copy of the results going to the client.
Each Group needs to submit:
- The source code of the completed system and any user documentation (see below).
(This can be varied if, for example, the source is very large, or a component of a very
large system.
The main thing is that your client has the source code for your system.)
- Instructions on how to run your system in an environment to be
agreed between the team and Client, typically the Client's system.
- Show how your Group handled version control, typically via a source-code version repository
(e.g. RCS/CVS/Subversion). This should
be in the form of a trace/log of version operations (i.e. a text file), including time stamps.
- Show how your group handled issue tracking, e.g. a Trac (or similar) trace/log (text file, with timestamps)
- Documentation of your testing process, planning and results. LIMIT 30 PAGES.
- A checklist of Acceptance Tests, from Deliverable B, and results of tests conducted with the Client.
- Time analysis and discussion
- Ethics discussion
The first two of these go to the Client.
Apart from the single submission for the entire Group, each group member is to complete a Group Relative Effort Computation.
There is a separate per-student cssubmit line for this component, separate from the per-group DelC submission.
There will be a penalty of 5 marks from a student's project mark if a Group Relative Effort Computation has
not been submitted.
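A minimal sketch of producing the version-operations trace mentioned above. It is shown with Git purely as an illustration (the repository path and commit are placeholders); the repositories named in this handout have equivalent commands, e.g. `svn log -v > version-log.txt` for Subversion or `cvs log > version-log.txt` for CVS.

```shell
# Illustrative only: create a throwaway repository, make one commit,
# then export a timestamped, one-line-per-revision log to a text file.
cd "$(mktemp -d)"
git init -q .
echo "hello" > README.txt
git add README.txt
git -c user.name="Demo" -c user.email="demo@example.com" commit -qm "Initial commit"
# ISO-dated log, suitable for submission as the required trace text file
git log --date=iso --pretty=format:"%ad  %an  %s" > version-log.txt
cat version-log.txt
```

For a real submission you would run only the final `git log` (or `svn log`) command inside your project repository and include the resulting text file in your archive.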
Looking more closely at these:
- Testing
- Documentation of the attempts you made during the Deliverable C milestone to perform high quality verification and validation.
- System testing specification consisting of a series of test cases and expected results, and evidence of actual test results (several pages).
Include details of any automatic testing process you used.
You may wish to follow the format suggested in Bruegge & Dutoit Section 9.5.2
as appropriate, but ensure that you use mainly tables and lists of tests actually planned,
performed and repeated. Credit will only be given for credible work! (Give brief evidence
that you actually did run these tests).
Do not repeat material from Part B, although you may show test results.
- Unless there has been prior agreement to the contrary (e.g. for security reasons),
testing is to be carried out on the client's system.
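One way to show evidence of an automatic testing process is a script that runs each test and appends a timestamped PASS/FAIL line to a results log. The sketch below is a hypothetical example: `echo-smoke` is a placeholder test, and you would replace the command and expected output with invocations of your own system.

```shell
# Illustrative only: a tiny test harness that records timestamped results.
cd "$(mktemp -d)"
run_test () {
  name="$1"; cmd="$2"; expected="$3"
  actual=$(eval "$cmd")
  if [ "$actual" = "$expected" ]; then result=PASS; else result=FAIL; fi
  # One line per test: UTC timestamp, test name, outcome
  echo "$(date -u +%Y-%m-%dT%H:%M:%SZ)  $name  $result" >> test-results.log
}
run_test "echo-smoke" "echo hello" "hello"
cat test-results.log
```

The accumulated `test-results.log` then serves as the brief evidence, asked for above, that the tests were actually run.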
- Time Analysis. Summarise your time data, showing differences between planned and actual values.
Show where those hours of toil went.
Give the total hours spent on the project as a whole.
This is really a distillation and analysis of the information that was included in your timesheets.
Provide a brief explanation of any major variations between estimated and actual times. Perhaps indicate briefly how your
estimates were constructed.
- Ethics Discussion. Using the ACS Code of Ethics and ACS Code of Professional Conduct,
provide a short (100-200 words) discussion of any ethical issues encountered in the course
of the project.
To help you think about it, please use the
spreadsheet.
For each item indicate the degree to which it was observed, on the scale: Observed, Partially observed, Violated, Not applicable.
Give it some real thought, but don't invent something just to tick the box.
- The Group Relative Effort Computation is an Excel spreadsheet which is found
here.
The aim of the spreadsheet is to get your view of the relative effort contributed to the
project by each member.
It is not expected that all members of a group will make exactly the same
contribution to the project; you have different roles in the group and come with a range of skills -
it is a team effort!
This is a management tool to get the big picture and see if there are any significant discrepancies.
As such, it may influence the apportionment of marks for the project, though the default
remains that every member will get the same mark.
The source code should contain:
- A very brief document called README.txt which details a little bit about the project, the people involved and how the source directory is organized and how a user would go about running the application.
- A document called INSTALL.txt which details how to build, install and run the software including running the acceptance tests. Other information such as login names and passwords, plus other essential requirements, e.g. some movies in your database for an initial search test. These instructions are to be very brief indeed, in point form, taking no more than HALF A PAGE maximum; they are not full documentation.
- Any user documentation as specified by the Client. This will be assumed to be part of the deliverables to be marked by the Client under the Acceptance Testing process. (You are recommended to get a clear specification from the Client as to what, if anything, is required here. Perhaps even add the documentation as a separate Acceptance Test section.)
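As an illustration only, a half-page, point-form INSTALL.txt might look like the sketch below. Every name in it (the application, commands, account details) is a placeholder, not a real value; substitute the details of your own system.

```
INSTALL.txt -- ExampleApp (Group NN)

1. Build:    make               (from the top-level source directory)
2. Install:  make install
3. Run:      ./exampleapp
4. Acceptance tests:  ./run-acceptance-tests.sh
5. Test account: user "demo", password "demo"
6. The database is pre-loaded with sample records for the initial search test.
```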
If the platform is projects.csse.uwa.edu.au, leave a working version on that machine in addition to the version in
CVS.
Ensure that file permissions are appropriate for the unit coordinator.
School of Computer Science & Software Engineering
The University of Western Australia
Last modified: July 17 2017
Modified By: Michael J Wise