Photo by [WOCinTechChat](https://www.flickr.com/photos/wocintechchat/25926651781/in/album-72157664006621903/) / [CC BY 3.0](https://creativecommons.org/licenses/by/3.0/)
Milestone 4 - "Managing data" - has been an important one for us. Finishing it means Offen Fair Web Analytics is now close to being feature complete in the scope of our initial plans, and we can start transitioning into a Beta state, meaning we can finally offer a stable product for users to use in production environments.
Before removing the Alpha label, we'd still like to have external audits in Milestone 5, but we are already in touch with potential users and are starting to see installations in the wild. Exciting times ahead!
During Milestone 4 we have released the following versions:
Account management for users has been a part of Offen Fair Web Analytics for a while, but in this milestone we took the time to bring it to a level where it satisfies the needs of real-world teams. There are now read-only users, fine-grained controls for sharing access and other management options. We are now looking for feedback on how this works out for setups like smaller dev shops or agencies.
In Milestone 5, we want to add integration test coverage for all of our user-facing features. To prepare for this, we did research on what tools we can use and how to integrate them into our existing development and CI setup.
We ended up choosing and implementing a setup using [Cypress](https://www.cypress.io/), which is a popular MIT-licensed tool that can run tests in multiple browsers like Chromium and Firefox.
Another great thing about this setup is that it allows us to run automated accessibility and performance audits (for example using [Lighthouse](https://developers.google.com/web/tools/lighthouse) or [Pa11y](https://pa11y.org/)).
This has been implemented in PRs [362](https://github.com/offen/offen/pull/362), [365](https://github.com/offen/offen/pull/365) and [368](https://github.com/offen/offen/pull/368).
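For a rough idea of what such audits could look like once wired up, here is a minimal sketch of a Cypress spec. It assumes the third-party `cypress-audit` plugin is installed and registered in the Cypress plugins file, which adds `cy.lighthouse()` and `cy.pa11y()` commands; the plugin, the page under test and the threshold are assumptions for illustration, not part of our current setup.

```js
// Sketch only: assumes the cypress-audit plugin is installed and its tasks
// are registered in the plugins file, adding cy.lighthouse() and cy.pa11y().
describe('Audits', function () {
  it('passes accessibility and performance budgets on the index page', function () {
    cy.visit('/')
    cy.pa11y() // run an accessibility audit against the current page
    cy.lighthouse({ performance: 85 }) // fail if the performance score drops below 85
  })
})
```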
For self-hosted software like Offen Fair Web Analytics, it is important to give potential users an idea of what the software looks like without requiring them to do a proper install. Many projects do this by sharing the credentials for a demo account on their website, but in the case of Offen Fair Web Analytics we do not want to do this, as it would expose the usage data of our real-world users - which is exactly what Offen Fair Web Analytics is trying to protect.
This is why we built a downloadable demo of Offen Fair Web Analytics that you can run on your local machine. This demo has existed for a while now, but with Milestone 4 we made major improvements to this feature:
- A demo is now populated with randomly generated usage data at startup, so that users get an idea of what an install that is already in use looks like, instead of having to generate usage data themselves beforehand (the sketch after this list illustrates the idea).
- We added a dedicated landing page for demo users that explains how to use the demo from both a user's and an operator's perspective.
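For those wondering what "randomly generated usage data" means in practice, here is a minimal, purely hypothetical sketch of the idea. All page paths, field names and helper functions are made up for illustration and do not reflect the actual data model or implementation of Offen Fair Web Analytics.

```js
// Hypothetical sketch of seeding a demo with random usage data.
// Field names and values are illustrative only.
const PAGES = ['/', '/about', '/blog/hello-world', '/contact']
const REFERRERS = [null, 'https://duckduckgo.com/', 'https://www.ecosia.org/']

function randomItem (list) {
  return list[Math.floor(Math.random() * list.length)]
}

function generateDemoEvents (numEvents, daysBack) {
  const now = Date.now()
  const events = []
  for (let i = 0; i < numEvents; i++) {
    events.push({
      type: 'PAGEVIEW',
      href: randomItem(PAGES),
      referrer: randomItem(REFERRERS),
      // spread timestamps over the given range so charts look like real usage
      timestamp: new Date(now - Math.random() * daysBack * 24 * 60 * 60 * 1000).toISOString()
    })
  }
  return events
}

console.log(generateDemoEvents(3, 14))
```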
An ongoing part of our work on Offen Fair Web Analytics is implementing features and fixes that come from our experience of running our own Offen Fair Web Analytics instance. This is why Milestone 4 contains a few UX improvements and fixes regarding the operator-facing Auditorium. Among other things, we improved the referrer stats, improved the mobile UX for tabular data and fixed issues with the user flow for resetting your password.
Already this week, we will have an Accessibility Audit by Stichting Accessibility. We look forward to implementing the feedback we receive and making Offen Fair Web Analytics accessible for all users.
Securing user data is a key aspect of Offen Fair Web Analytics, so it's important to make sure we did not accidentally leave any unwanted loopholes in our system architecture. This is why Milestone 5 also includes a Security Audit by "Radically Open Security". We'll look for proper use of cryptography and a hardened HTTP interface for the server specifically, but if we can pick up other improvements along the way we won't hesitate to implement these.
Now that we have built a solid foundation for Offen Fair Web Analytics, we want the public to be able to hack on it and participate in its development. To make sure this is a safe and enjoyable journey, we'll add comprehensive integration test coverage for all major user stories, so that we can always be sure the software keeps working as intended when we review and merge patches and features by others (and ourselves, of course).
Closely related to the above, we will also do a thorough check to make sure Offen Fair Web Analytics is ready for external contributions. Is documentation up to date? Does our development setup work reliably across different OSes and hardware? Is it easy to open an issue and get in touch with us? We're definitely looking forward to having the community become a part of our efforts.
Offen Fair Web Analytics tries to be a slim and lightweight solution but nevertheless, crucial user flows can break unexpectedly and cause frustration for users, operators and developers alike. To prevent such breakages we'll focus on adding integration tests in the next milestone. In case you're curious, why not have a peek at what this looks like right now?
Offen Fair Web Analytics collects data only after opt-in. In addition to the consent banner that is shown on websites that embed Offen Fair Web Analytics, the Auditorium itself allows users to manage their consent status. As an exercise, let's write a test where a user first grants consent, reviews the Auditorium and then opts out again, seeing that data has been deleted.
As noted above integration tests are written using [Cypress](https://www.cypress.io/) which has a `mocha`-esque DSL for writing tests. In the `offen/offen` repository, create a new file called `integration/cypress/integration/consent.spec.js`. We're ready to write a basic test now.
*N.B.: these examples use `.contains('some text')` for selecting elements as this is easy to follow in the context of an example. Our real-world tests will use dedicated `data-testid` selectors for targeting DOM elements.*
```js
describe('Consent', function () {
  it('displays options for opt-out and opt-in to a user with no decision set', function () {
    cy.visit('/') // visit the index page
    // next, we check whether both options are present and enabled (button copy is assumed)
    cy.contains('Opt in').should('be.enabled')
    cy.contains('Opt out').should('be.enabled')
  })
})
```
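The exercise above also asks for the full flow: opting in, reviewing the Auditorium and opting out again. Continuing the sketch, such a test could look roughly like the following; all button labels and page copy here are assumptions made for illustration rather than the actual markup, so treat it as an outline of the approach, not a finished test.

```js
// Rough sketch of the full opt-in / opt-out flow from the exercise above.
// All button labels and copy are assumptions made for illustration.
describe('Consent', function () {
  it('lets a user opt in, review their data and opt out again', function () {
    cy.visit('/')
    cy.contains('Opt in').click()
    // after opting in, usage data should show up in the Auditorium
    cy.contains('Your data').should('be.visible')
    // opting out again should delete the collected data and reset the choice
    cy.contains('Opt out').click()
    cy.contains('Opt in').should('be.enabled')
  })
})
```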
Assuming you have a running development environment set up, you can now use `make integration` to see the test runner run the tests you just added.
Of course, there are a lot of other cases to cover for this feature, but you probably have an idea of how the setup works by now and how it allows us to ensure we ship working software.
If you have any feedback, comment or bug report on this milestone release, we'd love to hear from you. [Open an issue](https://github.com/offen/offen/issues) or send us an email at [hioffen@posteo.de](mailto:hioffen@posteo.de).