2.2. Technical Design Report
Each team is required to submit a Technical Design Report (TDR) that describes the team's design principles and competition priorities. The report should explain the rationale for the autonomy challenge tasks the team has chosen to attempt and how this competition strategy influenced the design decisions for the airframe and subsystems. Teams must follow the TDR instructions provided below. To be eligible for full points, teams must submit their TDR by the deadline found in Section 1.4: Competition Schedule and Timeline.
A strong TDR provides a coherent narrative, addresses the elements of the rubric as fully as possible, and cites the references used. The competition strategy should justify the choice of autonomy challenge tasks, and design decisions should trace back to those task choices. The report should also identify which software tools allow the team to accomplish the chosen tasks.
The TDR is worth a total of 150 points. The outline of each content section below includes a scoring weight; guidance on scoring considerations is provided to the judges during evaluation.
2.2.1 Deliverable Requirements
The content of the written paper shall include the following sections, described in detail below: Abstract, Acknowledgments, Competition Strategy, Design Strategy, Testing Strategy, and References, plus an optional Test Plan & Results appendix.
The format of the written paper shall adhere to the following guidelines:
6 page limit (excluding References and Appendices)
8.5 x 11 in. page size
Margins ≥ 0.8 in.
Font: Times New Roman 12pt
Header on every page including team name and page number
Submitted in .pdf format
Optional Formatting: Teams may choose to follow the two-column, editorial-style format for IEEE Conference Proceedings: www.ieee.org/conferences/publishing/templates.html.
RoboNation Tip: It is recommended that papers be peer-reviewed prior to submission. For example, teams can utilize resources at their institution, fellow students, or professional editing services.
Formatting Scoring Metrics (5% of score)
2.2.2 Abstract
The abstract is a short summary of the main points in the paper. The abstract should summarize the linkage between overall competition strategy and system architecture, design, and engineering decisions.
Abstract Scoring Metrics (10% of score)
2.2.3 Acknowledgments
Participating in the competition, as in all research projects, involves leveraging resources and support beyond the efforts of individual team members. This support can take many forms, such as technical advice, labor, equipment, facilities, and monetary contributions. Acknowledging those who have supported the team's efforts is important.
Acknowledgments Scoring Metrics (5% of score)
2.2.4 Competition Strategy
The paper must include details on the team’s strategy for the competition, including the plans for approaching the tasks and how the vehicle design relates to this approach. The mission consists of multiple tasks with associated points for accomplished behaviors. The more tasks a vehicle is designed and engineered to accomplish, the more complex the overall vehicle system will be.
Discuss the team's strategy on trade-offs between system complexity and reliability. For example, teams have a limited number of working hours to prepare for the competition; this time could be spent adding new capabilities or testing and improving the reliability of an existing capability. As system complexity grows, changes in subsystems can propagate in unmanageable ways when time is limited. Based on the team's history and systems engineering talents, include a description of the team's strategic vision.
Competition Strategy Scoring Metrics (25% of score)
2.2.5 Design Strategy
Given the strategy for success at the competition and the approach to managing complexity, the paper must include a description of the system design that meets the goals the team established for the competition. Justification for design choices should be clear. Discuss how components and subsystems were selected and integrated on the vehicle. Teams working with a previously designed vehicle should discuss how that design meets the current competition strategy and any modifications needed at the component, subsystem, and/or integrated-system levels. Describe the team's experience in making both architectural/design decisions and systems engineering decisions.
This section should not include detailed descriptions or specifications of components that are not of the team's original design.
Design Strategy Scoring Metrics (25% of score)
2.2.6 Testing Strategy
Testing and experimentation are crucial steps in preparing and refining a system design, and they correlate strongly with competitive performance in the arena. The paper must include the team's testing strategy, including the various test plans, both physical and in simulation.
Discuss how the team weighed the time needed to test thoroughly enough to meet its goals against the demands of design and engineering.
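Many teams automate parts of their simulated testing so that regressions are caught as the design evolves. As one hypothetical illustration only, not a required or endorsed approach, the following sketch shows a self-contained Python check of a toy depth-control algorithm against a crude simulated plant; the controller, dynamics model, gain, and tolerance are all placeholder assumptions.

# Hypothetical sketch: automated regression check of a toy depth controller
# against a crude simulated plant. The controller, dynamics, gain, and
# tolerance are placeholders, not part of any competition requirement.

def depth_controller(target_depth, current_depth, kp=2.0):
    """Toy proportional controller returning a vertical thrust command."""
    return kp * (target_depth - current_depth)

def test_depth_controller_converges():
    depth, velocity, dt = 0.0, 0.0, 0.1
    for _ in range(600):  # 60 simulated seconds of forward-Euler integration
        thrust = depth_controller(2.0, depth)
        velocity += (thrust - 0.5 * velocity) * dt  # crude linear drag model
        depth += velocity * dt
    assert abs(depth - 2.0) < 0.1  # vehicle should settle near the 2 m target

if __name__ == "__main__":
    test_depth_controller_converges()
    print("depth controller regression check passed")

Because a check like this is self-contained, it can run on every code change without access to hardware or a full simulator.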
Testing Strategy Scoring Metrics (25% of score)
2.2.7 References
As with any technical publication, ideas and content not originated by the paper's authors should be properly cited. References should follow the IEEE Conference Proceedings citation style.
References Scoring Metrics (5% of score)
2.2.8 Test Plan & Results (Optional Appendix)
Based on the testing approach outlined in the paper, this appendix showcases the test plan that was developed and the detailed results of that testing. Teams should present their plans for testing, including algorithm testing in a virtual environment, component testing in a laboratory setting, subsystem testing in a relevant environment, and full-system testing in a pseudo-competition environment. The test setup should be included and the results presented. Any design modifications or changes in competition strategy that resulted from testing should be discussed.
While this appendix is not required, excellence in this section can make a team eligible for a special judges' award.
The appendix may include detailed documentation covering the following areas (an illustrative sketch follows this list):
Scope: Objectives and test cases (this may also specify what was not included in tests)
Schedule: Start/end dates and deadlines
Resources and Tools: Resources and tools needed to conduct tests and assess results
Environment: Description of the test environment, configurations, and availability
Risk Management: Outline potential risks that could occur throughout testing
Results: Detailed outcomes of test cases
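Although the appendix itself is a written document, some teams may find it convenient to keep test records in a structured form and generate the appendix content from that data. The sketch below is one hypothetical way to do so in Python; the TestCase fields, names, and example values are illustrative assumptions, not a required format.

from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class TestCase:
    # Hypothetical record mirroring the documentation areas listed above.
    objective: str                                       # Scope
    environment: str                                     # Environment
    scheduled: date                                      # Schedule
    resources: List[str] = field(default_factory=list)   # Resources and Tools
    risks: List[str] = field(default_factory=list)       # Risk Management
    outcome: str = "not run"                             # Results

# Illustrative entry only; all values are made up for this example.
gate_detection = TestCase(
    objective="Detect the gate at 5 m range in simulation",
    environment="simulator",
    scheduled=date(2025, 5, 1),
    resources=["simulated camera feed", "recorded test footage"],
    risks=["simulated lighting may not match the real venue"],
)
gate_detection.outcome = "pass (18 of 20 trials)"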