Abstract

I've identified four main areas for improvement: scripts running during HTML parsing, HTTP/1.1 requests, unnecessary client-side rendering of Markdown, and unnecessary requests being made. Delaying scripts until HTML parsing was complete improved time to first paint by 127ms on the issues page and 138ms on the homepage. Upgrading to HTTP/2 made the first paint of the homepage and the new issue page about 100ms faster on all devices tested. Removing unnecessary use of markdown-it reduced the download size by 154kB, which improved the total download time by about 1s (measured on 3G) and cut HTML parsing time by 100ms. Removing unnecessary requests from the homepage decreased the loading time by 310ms and the download size by 113kB. Putting all of these changes together, I got a total first paint improvement of about 250ms on both the homepage and the new issue page; the total time to finish loading decreased by 1689ms on the homepage and 550ms on new issue; the DOM loaded 870ms faster on the homepage and 625ms faster on new issue; and the load time decreased by 772ms on the homepage and 480ms on new issue.

Introduction

This document is structured as follows:

The first four sections of Running tests & finding bugs correspond to the first four sections of Results: they contain the tests and/or detailed explanations of how I obtained my results. The last section of Running tests & finding bugs, General analysis, covers a variety of tools I used to run tests on Webcompat and their results.

The Results section explains in detail what was mentioned in the Abstract.

The Future implementations in details section consists of things I thought I could improve but didn't have the time to implement.

The Timeline section contains what I intend to produce if I am accepted into Outreachy.

Motivation

Performance is a key characteristic of a webpage. According to research published by Google a few years ago, 53% of mobile users abandon sites that take longer than 3 seconds to reach their first paint, and desktop users aren't willing to spend much longer watching a screen load either: according to Fiona Fui-Hoon's work, the average user will wait approximately 2 seconds.

This work intends to improve the rendering of webcompat.com by exploring a number of tools and solutions on both the frontend and the backend.

Running tests & finding bugs

One thing I realized while running tests is that you can get a hot cache from GitHub, which means that if I test a lot of things in sequence I'll get better results than if I test one thing, wait a while, and then test it again. I tried to minimize the difference between these two states by testing everything against GitHub's hot cache. Also, before each new test I refreshed the page a couple of times and looked at the general statistics so I could remove outliers (for better or for worse). For example, if a page usually takes 1s to load but one of my tests took 5s, I'd know something out of the ordinary happened and I wouldn't use that result in my analysis. I know it's not the ideal scenario, but it was the best solution I could come up with so far (profiling is hard!).
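
As an illustration only (the results below were filtered by eye, not by a script), here is a minimal Python sketch of how such outlier filtering could be automated; the tolerance value and the sample numbers are assumptions, not data from my tests:

```python
import statistics

def summarize(samples_ms, tolerance=0.5):
    """Drop samples that differ from the median by more than `tolerance`
    (50% by default), then report the mean and stdev of what is left."""
    median = statistics.median(samples_ms)
    kept = [s for s in samples_ms if abs(s - median) <= tolerance * median]
    return statistics.mean(kept), statistics.pstdev(kept)

# Five hypothetical first-paint measurements (ms), one obvious outlier.
print(summarize([1083, 1101, 1074, 5012, 1090]))  # e.g. (1087.0, ~9.9)
```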

HTTP 2

These are the results I obtained running Webcompat on an Nginx 1.15.5 server with uWSGI 2.0.17.1, comparing HTTP/1.1 and HTTP/2 requests:

Mac homepage

| Analysis      | HTTP/1.1 | HTTP/2 | HTTP/2 + push |
|---------------|----------|--------|---------------|
| Load requests | 490ms    | 463ms  | 402ms         |
| First paint   | 508ms    | 472ms  | 430ms         |

Mac new issue

| Analysis      | HTTP/1.1 | HTTP/2 | HTTP/2 + push |
|---------------|----------|--------|---------------|
| Load requests | 424ms    | -      | 352ms         |
| First paint   | 444ms    | -      | 317ms         |

Nexus homepage

| Analysis      | HTTP/1.1 | HTTP/2 | HTTP/2 + push |
|---------------|----------|--------|---------------|
| Load requests | 1153ms   | 1288ms | 950ms         |
| First paint   | 1343ms   | 1740ms | 1105ms        |

Nexus new issue

| Analysis      | HTTP/1.1 | HTTP/2 | HTTP/2 + push |
|---------------|----------|--------|---------------|
| Load requests | 450ms    | -      | 429ms         |
| First paint   | 1060ms   | -      | 650ms         |

Sony homepage

| Analysis      | HTTP/1.1 | HTTP/2 | HTTP/2 + push |
|---------------|----------|--------|---------------|
| Load requests | 760ms    | 780ms  | 680ms         |
| First paint   | 811ms    | 755ms  | 730ms         |

Sony new issue

| Analysis      | HTTP/1.1 | HTTP/2 | HTTP/2 + push |
|---------------|----------|--------|---------------|
| Load requests | 701ms    | -      | 452ms         |
| First paint   | 634ms    | -      | 474ms         |

I experimented with three different approaches. The first one, plain HTTP/1.1, emulates the real Webcompat server. Then I tried a plain HTTP/2 approach, in which I simply added the word http2 to the listen directive (listen 443 ssl http2;). Finally, I thought it would be a good idea to push common files like webcompat.min.css and webcompat.min.js, so I added the following lines to the server block:

```
http2_push /css/dist/webcompat.min.css;
http2_push /js/dist/webcompat.min.js?bd3a7d264933c78b0dbd5f35d1f077c3;
http2_push_preload on;
```

These changes can be checked in this commit.
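
For reference, here is a minimal sketch of how these directives fit together in an Nginx server block. The hostname, certificate paths and uWSGI socket are assumptions for a local test setup, not the configuration from the commit above:

```nginx
server {
    # Adding "http2" to the TLS listener is all that is needed for plain HTTP/2.
    listen 443 ssl http2;
    server_name webcompat.local;                          # assumed local test hostname

    ssl_certificate     /etc/nginx/certs/webcompat.crt;   # assumed self-signed cert
    ssl_certificate_key /etc/nginx/certs/webcompat.key;

    # Push the assets every page needs before its first paint.
    http2_push /css/dist/webcompat.min.css;
    http2_push /js/dist/webcompat.min.js?bd3a7d264933c78b0dbd5f35d1f077c3;
    http2_push_preload on;

    location / {
        include uwsgi_params;
        uwsgi_pass 127.0.0.1:8080;                        # assumed uWSGI socket
    }
}
```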

The Load requests row in the tables refers to the time the website took to finish all of the requests necessary to render the first paint. On the homepage and on the new issue page the necessary files were the same: the Open Sans font, webcompat.min.css, ga.js, analytics.js, webcompat.min.js, untriaged.js and the .svg leaves.

While setting up the server on my local machine I had some problems with GitHub authentication. I didn't have much time, so I decided to test pages that didn't depend much on it; because of this issue I wasn't able to fetch any GitHub issues.

I ran the tests on the homepage first and realized that the configuration with push always performed better, so when I tested the new issue page I didn't run the plain HTTP/2 tests, since I knew they would perform worse.

Markdown-it

These tests were made using a Nexus 5 with the same setup described in the List of issues section and with the Fast 3G option in Chrome DevTools enabled.

| Subpage           | Total load time | DOM content load | Load time | TTFP   | Parse HTML |
|-------------------|-----------------|------------------|-----------|--------|------------|
| Homepage original | 7080ms          | 4777ms           | 4775ms    | 2200ms | 166ms      |
| Homepage modified | 6102ms          | 3757ms           | 3752ms    | 2112ms | 99ms       |
| Issues original   | 6782ms          | 5040ms           | 5026ms    | 2238ms | 477ms      |
| Issues modified   | 6012ms          | 3776ms           | 3768ms    | 2240ms | 343ms      |

Box plots generated with data gathered on issues:

[Image: lighthouse_homepage box plots]

I only collected 5 samples for each table row, which is a small number and could mean these results are not as precise as they could be. Either way, I obtained a standard deviation of 134ms on issues using the original code and 194ms using the modified code. In the homepage's case the standard deviation was 82ms for the unmodified code and 215ms for the modified one.
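
As an aside, one way to avoid client-side Markdown rendering entirely (not necessarily the change measured above, where markdown-it was simply dropped from pages that didn't need it) is to render the Markdown on the server. A minimal Flask sketch, with a hypothetical helper standing in for the GitHub fetch:

```python
import markdown                 # server-side Markdown renderer (pip install markdown)
from flask import Flask

app = Flask(__name__)

def get_issue_body(number):
    # Hypothetical stand-in for fetching the raw issue text from GitHub.
    return "**Issue %d**: layout breaks on example.com" % number

@app.route("/sketch/issue/<int:number>")
def issue_html(number):
    # The Markdown is converted to HTML here, so the page that displays it
    # never has to ship markdown-it to the client.
    return markdown.markdown(get_issue_body(number))
```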

List of issues

The following table was generated from data sampled across 70 tests: 35 using my modified code and 35 using the original. I removed outliers, and this is what I got:

| Status      | First painting | Parse loading (ms) | Parse loading (%) | XHR loading (ms) | XHR loading (%) |
|-------------|----------------|--------------------|-------------------|------------------|-----------------|
| Non-optimal | 1083ms         | 620ms              | 40%               | 399ms            | 32%             |
| Optimal     | 956ms          | 304ms              | 20%               | 416ms            | 28%             |

This data was generated using a Nexus 5 connected to a MacBook running Chrome/69.0.3497.100. To be able to do that, I had to add a few lines of code, which can be found in this PR.

I've noticed that there are some differences between the development and production environments; for example, a lot of files that aren't minified in dev are minified in production. So what I did was change the environment using export DEVELOPMENT=1 and change my secrets.py file to if True instead of if env.LOCALHOST, so I could test the production environment while running the server on my computer. What I learned from doing that is that my results were consistent: it doesn't matter whether I'm on the development environment or on production.

Homepage

I ran 30 tests, 10 for each row of the following table, using the same method described in List of issues.

| Status                    | First painting |
|---------------------------|----------------|
| No changes                | 1079ms         |
| bugform.js                | 941ms          |
| bugform.js & untriaged.js | 1159ms         |

Image showing 30 requests being made on the homepage: