In our first blog, we used network speed test data captured by Ookla (www.speedtest.net) to analyze how internet speeds changed in Canada from 2008 to 2014. While upload and download speeds are critical factors for people looking to participate in the digital economy, other factors such as network quality should not be overlooked, as they can have a dramatic effect on how usable the internet is for consumers. For example, excessive latency can be the cause of that annoying lag in your videoconference call. As such, our goal for this blog was to determine how network quality has changed in Canada over time and how our results compare to those published in other reports.
The three quality indicators we used (based on the available Ookla dataset) are:
- Latency: a measure of how long it takes a data packet to get from your computer to the test server and back. This includes the time needed for the server to process the data packet and send a reply message.
- Jitter: the variation observed in latency, or the variation in how long it takes to receive the return packets.
- Packet loss: the likelihood that a data packet is lost during transfer from your machine to the test server, expressed as a percentage of the total number of packets sent.
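All three metrics can be derived from a series of round-trip ping measurements. As a minimal illustration (the sample RTT values below are made up; the Ookla dataset reports these metrics directly, so this is only a sketch of what each number means):

```python
# Derive latency, jitter and packet loss from a series of ping results.
# rtts holds round-trip times in milliseconds; None marks a lost packet.
# (Sample values are hypothetical -- the Ookla data reports these directly.)
from statistics import mean, stdev

rtts = [52.1, 48.9, None, 50.3, 47.8, 49.5, None, 51.0]

received = [r for r in rtts if r is not None]

latency = mean(received)        # average round-trip time (ms)
jitter = stdev(received)        # variation in round-trip time (ms)
packet_loss = 100 * (len(rtts) - len(received)) / len(rtts)  # % lost

print(f"latency: {latency:.1f} ms, jitter: {jitter:.1f} ms, "
      f"loss: {packet_loss:.1f}%")
```

Here two of the eight packets never came back, giving a 25% packet loss, while latency and jitter are computed only over the packets that did return.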
Figure 1. Boxplots of annual average jitter, latency and packet loss. For more info on boxplots, see our first blog.
Network Quality in Canada
Overall, the Ookla quality dataset is much sparser than the Ookla speed dataset. It spans 2009 to 2014, with data from five provinces for 2009 – 2011 and four provinces for 2012 – 2014. This is less complete than we would have liked, as only a portion of the country is represented. In the data that is present, however, we observed increases in latency and jitter overall, but only limited annual change since 2011 (Figure 1). For example, mean jitter values for Canada have ranged between 40 and 50 milliseconds, while mean latency has increased from 67 to 105 milliseconds. Between 2011 and 2013, the mean latency was ~93 milliseconds. This increased to 105 milliseconds in 2014 due to the New Brunswick contribution. If we exclude the New Brunswick 2014 data, the mean latency for Canada was ~91 milliseconds, similar to that observed between 2011 and 2013. Packet loss decreased slightly in 2014, but overall values remained near the 2% level.
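The sensitivity of a national mean to one province's extreme records is easy to reproduce. A small sketch (the per-record values below are illustrative placeholders, not the actual Ookla figures):

```python
# Recompute a national mean latency with and without one province's records.
# Values are illustrative placeholders, not actual Ookla measurements.
records = [
    ("QC", 70.0), ("ON", 90.0), ("AB", 85.0), ("ON", 95.0),
    ("NB", 600.0), ("NB", 650.0),  # a few extreme records shift the mean
]

def mean_latency(rows, exclude=None):
    """Mean latency (ms) over all rows, optionally excluding one province."""
    vals = [ms for prov, ms in rows if prov != exclude]
    return sum(vals) / len(vals)

overall = mean_latency(records)                  # pulled up by NB outliers
without_nb = mean_latency(records, exclude="NB")
```

Even a small number of very high-latency records can pull the overall mean well above what most provinces experience, which is why we report the 2014 figure both with and without New Brunswick.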
Figure 2. Bar graphs of annual average jitter, latency and packet loss for each province. National averages are shown as horizontal lines for each year.
By diving deeper into the data, we can see how individual provinces compare with respect to network quality. Among provinces, Quebec consistently ranked as having the highest quality internet, recording the lowest mean jitter, latency and packet loss values from 2009 to 2013 (no data was available for 2014). Alberta ranked second on the network quality metrics for 2012 and 2013, and jumped to first place in 2014 in the absence of Quebec. On the opposite end of the spectrum, the largest mean values for jitter and latency were seen for New Brunswick in 2014, at 184 milliseconds and 622 milliseconds, respectively. These values are dramatically higher (~4-fold for jitter and ~7-fold for latency) than those of the other provinces over the same timeframe. It is not clear why New Brunswick has such high values for jitter and latency. However, it should be noted that these values were obtained from a relatively small sample size (112 records), so they may not represent the network quality for all of New Brunswick over the given timeframe.
We also note that during this time period, New Brunswick had the highest mean download and upload speeds in Canada (link), despite its apparent poor network quality. The largest packet loss values were seen for Nova Scotia in 2010 (4.74%) and 2011 (3.62%), and Ontario in 2013 (3.15%). Again, from the available data, it is not clear what the source of these packet loss increases is. For Nova Scotia (2010 and 2011), it should be noted that the mean distance between the test location and the Ookla test server was more than 1.5 times that of the other provinces.
As in our speed blog, we also attempted to examine how location (urban or rural) affects network quality. Unfortunately, the number of records available for a rural analysis was small (only 11 communities across the country were represented), so we could not pursue this analysis further.
Other Published Reports
We then compared our Ookla network quality results with other published network quality data. The April 2016 CIRA report on Canada’s Internet Performance is based on data collected from CIRA’s Internet Performance Test from May-December 2015. CIRA reported a 2015 national average of 96.43 ms for ping test round-trip times, which is similar to the 2014 round-trip latency result of 105.89 ms (91.44 ms when New Brunswick is excluded). However, it should be noted that differences in latency times can also be attributed to other factors, including the type of test employed (e.g. TCP, HTTP, ICMP, UDP).
As well, as highlighted by the CRTC's SamKnows Analysis of Broadband Performance in Canada from 2015 and 2016 (which is based on the voluntary deployment of SamKnows Whiteboxes to Canadian households), latency is highly dependent on the speed and connection technology used to access the internet. In both studies, fibre services had the lowest latency results (2016: < 10.2 ms), followed by cable (2016: 15.7 – 21.2 ms) and DSL (2016: 17.7 – 25.5 ms) technologies. Interestingly, for DSL (Digital Subscriber Line, or telephony) technologies, the highest latencies were seen in the lowest speed bucket (5 – 9 Mbps), while for cable technologies, the highest latencies were seen in the 40+ Mbps speed bucket (attributed to an ISP that delivered high-speed services to remote areas). Overall, the SamKnows studies reported latency that was much lower than what we observed with the Ookla data. For example, their highest reported latency value for 2015 (29.6 ms) was roughly one-quarter of the national latency average (105.89 ms) observed in the 2014 Ookla data.
In terms of jitter, the 2015 national average from the CIRA study (304.66 ms) was ~6-fold higher than that of the 2014 Ookla data (46.65 ms). Interestingly, the CIRA report was able to perform a rural vs urban community analysis, with urban areas reporting a lower average jitter value (287.08 ms) than rural communities (358.6 ms). Although the CIRA study did not report on packet loss, data from the SamKnows studies showed a 2015 national average of between 0.04 – 0.23%, depending on connection technology and speed bucket, with the highest packet loss reported for DSL technology in the 5 – 9 Mbps speed bucket (0.54%). Similar results were reported in the 2016 SamKnows study. Our analysis of the 2014 Ookla data revealed a national packet loss average of 1.44%, which is significantly higher than SamKnows' reported values.
The varying reported results for network quality factors such as jitter, latency and packet loss are not surprising, but the magnitude of the differences for some of the reported values is unexpected. Differences in network quality are likely related to a number of factors including the type of test employed, the connection type, and distance between the user and the testing server. For example, Ookla round-trip latency results are HTTP-based and are performed 10 times, with the lowest value determining the result. In contrast, SamKnows Whitebox latency and packet loss results are UDP-based and are continuously running in the background, where each test records the number of packets sent per hour (typically around 600 per hour). SamKnows also uses the 99th percentile when calculating the summarized minimum, maximum and average results.
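The summarization choice alone can produce very different headline numbers from the same underlying measurements. A small sketch with simulated RTT samples, comparing an Ookla-style best-of-10 result against a SamKnows-style trimmed average (the sample data and trimming details are assumptions for illustration; only the broad approach follows the descriptions above):

```python
# Compare two ways of summarizing the same RTT measurements.
# Samples are simulated; the summarization logic loosely follows the
# descriptions above (Ookla: best of 10 pings; SamKnows: trim the top
# 1% of continuous measurements, then average).
import random

random.seed(42)
# Simulated RTTs: a 20 ms floor plus occasional congestion spikes.
samples = [20 + random.expovariate(1 / 5) for _ in range(600)]

# Ookla-style: run 10 pings, report the minimum.
ookla_style = min(samples[:10])

# SamKnows-style: drop samples above the 99th percentile, then average.
cutoff = sorted(samples)[int(len(samples) * 0.99) - 1]
kept = [s for s in samples if s <= cutoff]
samknows_style = sum(kept) / len(kept)

print(f"best-of-10: {ookla_style:.1f} ms, trimmed mean: {samknows_style:.1f} ms")
```

A best-of-N summary will always sit at or near the floor of the distribution, while an averaging approach reflects the congestion spikes, so the two methodologies can legitimately disagree even on identical measurements.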
This study looked at national and provincial networking trends from 2009 to 2014. Nationally, since 2011, jitter, latency and packet loss have remained relatively stable, with the notable exception of the large increase in latency in 2014 due to the contribution of the New Brunswick data. Among the five provinces with available data, Quebec had the best network quality, with the lowest mean jitter, latency and packet loss values.
Although our results varied widely from other published reports, it should be noted that we only had data for a small number of provinces, which limits any effective comparison we can make to other, broader sampling studies. Overall, we saw large differences between the Ookla results and 1) the 2015 and 2016 SamKnows studies (with respect to national latency and packet loss), and 2) the 2015 CIRA study (with respect to jitter).
Despite the fact that the tools and timeframes differ, these studies all report on Canadian data, and the clear lack of consistency between them is both fascinating and troubling. The average Canadian consumer would have difficulty knowing which internet report to trust. Our results indicate that there is no clear, definitive answer on network quality, and you certainly cannot form a strong opinion based on a single study.
Our advice to the average consumer looking to research how their network results compare to others is to look at the different tools, study results, and options to aggregate results (e.g. averaging results across studies) before drawing any conclusions. Reference articles such as the OECD Communications Outlook 2013 and Measurement Lab's (M-Lab) "Understanding Broadband Speed Measurements" provide excellent insight into how the various speed and quality tests are conducted. As well, tools like M-Lab give consumers direct access to their raw results, which can help them do their own data exploration and analysis.