Before I get into the actual content of the email, I'd like to introduce myself. My name is Vale Tolpegin and I am a student participating in this year's GCI competition. I am very interested in Haiku and have really enjoyed my time working on it so far!
One of the projects I worked on required the creation of a web app that can show a rating and compatibility overview of hardware for Haiku. The goal is to eventually have a system, hosted by Haiku, that will allow people to submit hardware test reports.
At the moment, a lot of work has been done to prepare the system and get an initial version working. The backend is designed around Django, on top of which I am using only plain CSS & HTML, which keeps the pages very lightweight. In addition, I am using the same style as the main Haiku website. The technical specs of the current site's front-end pages are as follows:
- Page size is roughly 135 KB with all files hosted locally (no jQuery required)
- The CSS is Bootstrap combined with Shijin4
As I stated earlier, the backend is written in Python using the Django web framework. I have not done any load testing yet, in part because we do not have a test instance of the final server set up. Most of the requirements for the final version are already implemented, though in very simple forms. Features implemented include:
- Login & Logout. The backend authentication is currently just Django's built in authentication system, but I have enabled support for LDAP, so the implementation of LDAP will be simple on the final production-ready version of the server.
- Submitting tests. Once logged in, you can submit a test for a component or a device.
- Editing tests. Almost all fields of a component or device test can be edited.
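
For anyone curious, here is roughly what the switch to LDAP could look like. This is just a sketch, not my actual settings: it assumes the django-auth-ldap package, and the server URI and search base below are placeholders.

    # settings.py -- minimal LDAP sketch (assumes django-auth-ldap;
    # the URI and search base are placeholders, not real values)
    import ldap
    from django_auth_ldap.config import LDAPSearch

    AUTH_LDAP_SERVER_URI = "ldap://ldap.example.com"
    AUTH_LDAP_BIND_DN = ""        # anonymous bind for the user search
    AUTH_LDAP_BIND_PASSWORD = ""
    AUTH_LDAP_USER_SEARCH = LDAPSearch(
        "ou=users,dc=example,dc=com",
        ldap.SCOPE_SUBTREE,
        "(uid=%(user)s)",
    )

    # Try LDAP first, then fall back to Django's built-in accounts.
    AUTHENTICATION_BACKENDS = [
        "django_auth_ldap.backend.LDAPBackend",
        "django.contrib.auth.backends.ModelBackend",
    ]

Keeping ModelBackend in the list would mean existing local accounts keep working while LDAP is rolled out.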
Although I have done a lot of work so far, a lot more will be required to get the server to a production-ready state. This includes:
- Code refactoring. I need to break up the views for the site and separate them properly to create an easy-to-understand system that others can maintain.
- More in-depth hardware analysis options in the form. Currently, the form only lets the user briefly state whether each component passed or failed; there is no way to give a lengthy explanation of what happened during the test. A rough sketch of this change follows this list.
- Automatic hardware analysis. This is a little more complicated and will take some more work, but it is the preferred path for the production version. I'm currently planning on using mmu_man's HardwareChecker script to accomplish this.
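
To make the second point above concrete, the change could be as small as adding a free-form text field next to the existing pass/fail summary. A hypothetical sketch (the model and field names here are illustrative, not the actual schema):

    # models.py -- illustrative only; model and field names are hypothetical
    from django.conf import settings
    from django.db import models

    class TestReport(models.Model):
        RESULT_CHOICES = [("pass", "Passed"), ("fail", "Failed")]

        submitter = models.ForeignKey(settings.AUTH_USER_MODEL,
                                      on_delete=models.CASCADE)
        component = models.CharField(max_length=100)  # e.g. "GPU", "Wi-Fi"
        result = models.CharField(max_length=4, choices=RESULT_CHOICES)
        summary = models.CharField(max_length=200)    # the brief note we have today
        details = models.TextField(blank=True)        # the planned lengthy write-up
        created = models.DateTimeField(auto_now_add=True)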
Thoughts? Comments? Suggestions? Ideas?