Skipped Questions: Impact on VA Forms and Review Process
Introduction
This article covers the discovery work for enabling skipped questions on in-progress forms within the Department of Veterans Affairs (VA) system, specifically on va.gov. As engineers, our primary goal is to test this functionality rigorously behind a feature flag. The core of the exploration is understanding how the review and submit page behaves when users skip questions in a Benefits Delivery at Discharge (BDD) flow. The feature aims to let veterans move through forms more flexibly, but it must not compromise data integrity or the submission process. That means careful testing, detailed documentation, and adherence to the team's engineering standards. Done well, this feature promises a more adaptable and user-friendly form-filling experience for veterans, in line with the VA's commitment to modernizing its digital services. This article covers the user need, description, tasks, acceptance criteria, definition of done, engineering considerations, code review process, and refinement checklist.
User Need
The user need driving this discovery comes from the engineering team: they must thoroughly test the impact of enabling skipped questions for in-progress forms and validate how the review and submit page behaves in that scenario. This testing ensures that introducing skipped questions does not degrade the user experience or the integrity of the submitted data. Gating the feature behind a feature flag lets engineers isolate and examine its effects in a controlled environment, so the behavior can be tested and debugged before the feature is rolled out to all users. Skipping questions can meaningfully improve the experience by letting veterans focus on the questions relevant to their situation, but the system must handle skipped questions gracefully so that no critical information is missed or misinterpreted. The goal is a more flexible, user-friendly form-filling experience that still produces accurate and complete submissions.
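To make the feature-flag gating concrete, here is a minimal React sketch (written in TypeScript) of a question component that only offers a skip action when the flag is on. The toggle name `skipped_questions_bdd`, the component, and the lookup function are illustrative assumptions for this article, not the actual vets-website feature-toggle API.

```tsx
import React from 'react';

// Hypothetical toggle lookup. On va.gov the value would come from the
// platform's feature-toggle service; a hard-coded map keeps this sketch
// self-contained.
const toggles: Record<string, boolean> = {
  skipped_questions_bdd: true,
};

function isToggleEnabled(name: string): boolean {
  return toggles[name] ?? false;
}

interface QuestionProps {
  label: string;
  onSkip: () => void;
}

// When the flag is off, the question renders exactly as before; when it is
// on, a "Skip this question" action appears.
export function SkippableQuestion({ label, onSkip }: QuestionProps) {
  const skipEnabled = isToggleEnabled('skipped_questions_bdd');
  return (
    <fieldset>
      <legend>{label}</legend>
      {skipEnabled && (
        <button type="button" onClick={onSkip}>
          Skip this question
        </button>
      )}
    </fieldset>
  );
}
```

Because the skip affordance is rendered only when the flag resolves to true, turning the flipper off restores the previous behavior without a code change, which is what makes the controlled testing described above possible.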
Description
This discovery task follows from the implementation of a flipper for enabling skipped questions in the BDD flow, outlined in issue #125115. The objective is to understand and document how the review and submit page behaves when that flipper is active for in-progress forms, covering the user interface, data handling, and overall workflow. The team must assess how skipped questions appear on the review page and how they affect submission: required fields should still be identified correctly, and users should be prompted to complete them before submitting. The team must also confirm that skipped questions are reflected accurately in the submitted data and that this information is accessible to VA personnel. The resulting documentation should clearly describe the expected behavior, potential issues, and any necessary workarounds, so that the full rollout of skipped questions rests on a solid understanding of the review and submit flow.
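As a concrete illustration of the data-integrity check described above, the sketch below shows one way a review page could identify required questions that were skipped before allowing submission. The helper function, field names, and form-data shape are assumptions for illustration and do not reflect the actual va.gov forms-system schema handling.

```ts
// Shape of an in-progress form: answered questions map to values; skipped
// questions are simply absent or undefined.
type FormData = Record<string, unknown>;

// Hypothetical helper: returns the required questions that were skipped.
// A real review page would derive requiredFields from the form's schema.
export function findSkippedRequiredFields(
  formData: FormData,
  requiredFields: string[],
): string[] {
  return requiredFields.filter(
    field => formData[field] === undefined || formData[field] === null,
  );
}

// Example: the veteran skipped the separation date, so the review page
// should prompt for it before allowing submission.
const inProgress: FormData = {
  fullName: { first: 'Jane', last: 'Doe' },
  separationDate: undefined,
};

console.log(findSkippedRequiredFields(inProgress, ['fullName', 'separationDate']));
// -> ['separationDate']
```

A check of this kind is what lets the review page surface skipped-but-required questions instead of silently submitting incomplete data.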
Tasks
To achieve the objectives of this discovery, the following tasks must be completed:
- Check what happens on review and submit for an in-progress form when the flipper is turned on: Activate the feature flag, work through an in-progress form while skipping certain questions, and then proceed to the review and submit page, observing and documenting any changes in behavior or appearance. A systematic pass over every part of the page is needed so nothing is missed (an end-to-end test sketch follows this list).
- Document what happens on the review and submit page: Produce a record of the page's behavior when skipped questions are enabled, including screenshots, descriptions of observed changes, and any issues or concerns. The write-up should be clear, concise, and understandable by both technical and non-technical stakeholders, since it will serve as a reference for future development and maintenance.
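The first task can also be scripted. Below is a rough sketch in the style of a Cypress end-to-end test that stubs the feature-toggle endpoint, skips a question, and inspects the review page. The form URL, toggle name, stubbed response shape, and on-page text are placeholders that would need to match the actual flag and form under test.

```ts
/// <reference types="cypress" />
// Hypothetical Cypress spec: selectors and routes are illustrative only.
describe('Review and submit with the skipped-questions flipper on', () => {
  it('surfaces skipped questions on the review page', () => {
    // Force the flag on by stubbing the feature-toggle endpoint.
    cy.intercept('GET', '/v0/feature_toggles*', {
      data: {
        features: [{ name: 'skipped_questions_bdd', value: true }],
      },
    });

    cy.visit('/disability/file-disability-claim-form-21-526ez/introduction');

    // Resume the in-progress form, skip a question, and continue to review.
    cy.contains('Continue your application').click();
    cy.contains('Skip this question').click();
    cy.contains('Continue').click();

    // The review page should flag the skipped question rather than hide it.
    cy.url().should('include', 'review-and-submit');
    cy.contains(/incomplete/i).should('exist');
  });
});
```

Stubbing the toggle endpoint keeps the test deterministic regardless of the flipper's state in the environment where the test runs.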
Acceptance Criteria
The acceptance criteria for this discovery task are as follows:
- In-progress forms with the BDD flow are tested with skipped questions: Testing should be comprehensive, cover the relevant scenarios, and follow the BDD flow so it reflects the user's actual path through the form.
- Review and Submit page behavior for such cases is documented: This criterion ensures that the findings of the testing process are properly recorded and communicated. The documentation should be clear, concise, and easily accessible.
Meeting these acceptance criteria is essential for ensuring that the discovery task is successful and that the engineering team has a clear understanding of the impact of enabling skipped questions for in-progress forms.
Definition of Done
The definition of done for this discovery task includes the following:
- Meets acceptance criteria: This ensures that all the objectives of the task have been achieved and that the findings have been properly documented.
- Reviewed and approved by product and/or design: This ensures that the findings have been validated by the relevant stakeholders and that they are aligned with the overall product vision.
Engineering
From an engineering perspective, several key considerations must be addressed:
- All tests pass: This ensures that the new functionality does not introduce any regressions or break existing functionality. Unit tests, integration tests, and end-to-end tests should be executed to verify the correctness of the code.
- New functionality is covered by unit tests: The new code is properly tested so that future changes do not silently break it; unit tests should cover edge cases and boundary conditions (a sketch follows this list).
- Logging and monitoring are implemented (if applicable): This ensures that the system can be monitored for any issues and that any errors can be quickly identified and resolved. Logging should be implemented to capture relevant information about the system's behavior.
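For the unit-test criterion above, a Jest spec along these lines would cover the skipped-required-field helper sketched earlier in this article; the import path and helper remain hypothetical.

```ts
// Hypothetical Jest spec for the findSkippedRequiredFields sketch above;
// the import path is a placeholder.
import { findSkippedRequiredFields } from './findSkippedRequiredFields';

describe('findSkippedRequiredFields', () => {
  it('returns required questions that were skipped', () => {
    const formData = { fullName: 'Jane Doe', separationDate: undefined };
    expect(
      findSkippedRequiredFields(formData, ['fullName', 'separationDate']),
    ).toEqual(['separationDate']);
  });

  it('returns an empty array when nothing required was skipped', () => {
    const formData = { fullName: 'Jane Doe', separationDate: '2025-01-01' };
    expect(
      findSkippedRequiredFields(formData, ['fullName', 'separationDate']),
    ).toEqual([]);
  });
});
```

Covering both the skipped and fully answered cases keeps the review-page behavior protected against regressions as the flipper rolls out.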
Code Review & Pull Requests
The code review and pull request process must adhere to the following guidelines:
- PR includes Local testing steps: This ensures that the reviewers can easily test the changes and verify their correctness.
- PR includes Flipper/testing state details (if applicable): This provides context for the reviewers and helps them understand the impact of the changes.
- PR includes Author's local proof of submission screenshot: This provides visual confirmation that the changes are working as expected.
- Copilot review completed and feedback addressed: This ensures that the code has been reviewed by an automated tool and that any potential issues have been addressed.
- Internal reviewer approved: This ensures that the code has been reviewed by a senior engineer and that it meets the required standards.
- Internal reviewer added local proof of submission screenshot: This provides additional visual confirmation that the changes are working as expected.
- Code functionality verified on Staging after merge: This ensures that the changes are working correctly in a staging environment before they are deployed to production.
Refinement Checklist
The refinement checklist ensures that the discovery task is properly defined and scoped:
- Added description, tasks, and acceptance criteria
- Added estimate
- Labeled with Practice Area (engineer, design, product, data science)
- Labeled with issue type and characteristics of the ticket (bug, accessibility, request, discovery, documentation, research, content, ux testing, front-end, back-end, datadog, etc.)
- Added any other relevant project fields (team, OCTO priority...)
- Added an Epic or Super Epic
By adhering to this checklist, the engineering team can ensure that the discovery task is well-defined and that all the necessary information is available.
Conclusion
In conclusion, enabling skipped questions for in-progress forms is a meaningful step toward improving the form-filling experience in the VA system, but its impact on the review and submit page must be thoroughly tested and documented first. By following the tasks, acceptance criteria, and engineering guidelines outlined in this article, the engineering team can ensure a successful implementation and keep va.gov a user-friendly and reliable platform for veterans. This discovery work is essential for maintaining the integrity of the VA's digital services.