Fershad IraniWritings about web sustainability, and how we can take steps towards a greener web.2024-02-20T13:25:46Zhttps://fershad.com/Fershad Iraniitsfish@fershad.comWhy are you estimating digital carbon emissions?2024-02-20T13:25:46Zhttps://fershad.com/writing/why-are-you-estimating-digital-carbon-emissions/<div><p>I've had a fair few conversations about calculating digital carbon emissions recently. It's great to see so much interest in this space; it can only be good for the development of the tools and methodologies we have access to. One thing, though, which I've noticed has been missing from a lot of those conversations is a discussion around <em>why</em> folks might want to estimate digital carbon emissions.</p><p>In this post, I want to look at the two different carbon accounting models used for estimation, and in what scenario you might use one over the other. In this way, I hope to provide a framework people can refer to when they are first approaching the topic, or even if they're deep in the weeds.</p><h2>Semantics</h2><p>I've used the word <em>estimate</em> very carefully here. We're not going to cover <em>measuring</em> digital carbon emissions, because that's a very different thing.</p><p>Measuring is a precise practice. It requires setting up detailed instrumentation and tooling. The results of <em>measuring</em> digital carbon emissions will give you the actual emissions produced from a given digital service or activity. When measuring digital carbon emissions, you'd look to use tools that allow you to record the <em>actual</em> power or energy used by a process - such as the <a href="https://fershad.com/writing/co2e-estimates-in-firefox-profiler/">Firefox Profiler</a>.</p><p>Estimating is an imprecise practice. It relies on models, methodologies, calculations, some assumptions, and some generalisations. Depending on the type of methodology you use, <em>estimating</em> digital carbon emissions will give you results that might under-represent or over-represent the emissions produced from a given digital service or activity.</p><h2>Why does <em>why</em> matter?</h2><p>Understanding why you are estimating digital carbon emissions is very important because it will determine the kind of estimation model you'll end up using. And the kind of model you use will end up determining where you sit on the over- or underestimating emissions spectrum.</p><p>There are two main reasons why you might want to estimate digital carbon emissions:</p><ol><li>You need to capture and include these emissions for compliance and reporting purposes</li><li>You are looking at where you can make optimisations and improvements to reduce the emissions of your digital product or service</li></ol><h2>Estimating for reporting</h2><p>If you're estimating for reporting, then you're going to be <em>looking backwards</em> over a period of time to estimate emissions.</p><p>In most cases, you'll be reporting emissions on an annual basis. You'll probably also be required to use some kind of reporting framework, like the Greenhouse Gas Protocol (GHG Protocol). To the best of my knowledge, all of these reporting frameworks are based on <em>attributional accounting methods</em> to estimate emissions.</p><h3>What is an attributional model?</h3><p>Attributional accounting models are backwards looking. They look at the total emissions of a system, and then try to attribute a part of those emissions to the usage of that system. 
The simplest example is the emissions per passenger on a plane: the total emissions of the plane divided by the average number of passengers.</p><p>They are useful for understanding the impact of long-lived interventions on the structural evolution of the electricity system. This makes them suitable for reporting, carbon accounting, and raising awareness of the scale of general impacts.</p><p>The nature of attributional models leads to a likelihood that emissions are overestimated. Take our plane example, where each passenger is counted as one passenger. There's limited nuance there - no accounting for weight, or number of carry-on bags, or what flight class they're seated in.</p><h3>Examples of attributional models</h3><p>In the web development space, the <a href="https://sustainablewebdesign.org/calculating-digital-emissions/">Sustainable Web Design</a> (SWD) model is perhaps the most well-known attributional model. Prior to SWD, the <a href="https://theshiftproject.org/en/lean-ict-2/">OneByte</a> model was another example.</p><p>Both models estimate the emissions of a website across three segments - hosting, networks, and user devices. Both allocate a percentage of the total annual energy used by the internet to each segment. Data transfer (in bytes, megabytes or gigabytes) is then used as the "usage" metric for estimating the emissions resulting from loading a given webpage or online content.</p><h2>Estimating for optimisation</h2><p>When you're estimating for optimisation, you're likely <em>looking at the present and future</em> to estimate emissions. Put another way, you want to see what <em>consequences</em> a change to part of the system might have on overall emissions. For this, you'll use <em>consequential accounting methods</em>.</p><h3>What is a consequential model?</h3><p>A consequential accounting model is forward looking. It looks at the role individual parts of a system play in the overall emissions produced by that system. To take the plane example again, looking at the incremental emissions of a single passenger boarding a plane would take in many more inputs - such as flying class, number of carry-on bags, the weight of these bags, the weight of the passenger, the weather conditions during the flight and so on. These inputs would all have either emissions factors set for them, or a specific calculation to estimate their resulting emissions.</p><p>Consequential models are much more detailed than attributional models. As you can see from the example above, they require much more information to be fed into the model in order to produce a meaningful result. This makes them suitable for investigating more detailed behaviours within systems, and facilitating nuanced discussion, so that the system can be optimised.</p><p>This level of detail might sound like a good thing. But consequential models are still models, and to reuse a line from previous posts: <a href="https://en.wikipedia.org/wiki/All_models_are_wrong"><em>all models are wrong, but some are useful</em></a>. Due to their detailed nature, consequential models are far less approachable than attributional models. The information gathering required, and the level of detail that goes into them, can be daunting to those looking to get started with estimating emissions. This might lead to consequential models including many assumptions, generalisations, or default values to fall back on. 
It also leads to a greater risk that someone using the model might enter incorrect or inappropriate data.</p><p>Consequential models are more likely to produce results that underestimate the emissions of a given activity. The number of inputs involved also makes them far less suitable for measuring the long-term impacts of system changes.</p><h3>What are some examples of consequential models?</h3><p>Consequential models capture the short-term, marginal impact on energy or emissions caused by a change in a particular activity. A good example of this is the Power Model that Carbon Trust introduces in their <a href="https://ctprodstorageaccountp.blob.core.windows.net/prod-drupal-files/documents/resource/public/Carbon-impact-of-video-streaming.pdf">Carbon impact of video streaming analysis</a> (2021 - see section 5, pages 54 - 67). This model exposes how a change in viewing patterns (device used, duration, quality of stream etc) impacts the energy required to deliver that content to the viewer.</p><h2>Wrapping up</h2><p>I hope that this post has helped you better understand the differences in the accounting models used to estimate carbon emissions, and why you might use them.</p><p>As more organisations start to explore ways to measure, report, and reduce their carbon emissions, understanding why you're doing so is an important first step. Not only will it help to clarify your own thinking, but it will help guide you towards the kind of model you should look to use for your purpose.</p></div>Thinking about a way to estimate website energy use2024-02-20T13:25:46Zhttps://fershad.com/writing/thinking-about-a-way-to-estimate-website-energy-use/<div><p>In January, I had some fun <a href="https://fershad.com/writing/adapting-cloud-carbon-footprints-methodology-to-website-carbon-estimates/">rejigging the Cloud Carbon Footprint (CCF) methodology</a> to see how it might be adapted to calculate server-side emissions in a website carbon estimation model. It was a great little thought experiment, and something that I learnt a lot from in the process.</p><p>One of the things I picked up, thanks to Tom Greenwood, was a better understanding of the <a href="https://qt.fershad.com/writing/thinking-about-attributional-consequential-models/">differences between attributional and consequential models</a>. The tl;dr is that attributional models take the full energy used by a system & allocate part of that energy to a unit of whatever is being measured. Consequential models are far more granular, estimating energy for individual parts of the system.</p><p>Through conversations with a few folks, especially <a href="https://github.com/camcash17">Cameron Casher</a>, a maintainer of the <a href="https://github.com/cloud-carbon-footprint/cloud-carbon-footprint">CCF project</a>, I began to understand that it falls more on the consequential side of the spectrum. It is better suited to examining a system (cloud infrastructure in this case) in detail, and surfacing areas for potential optimisation to reduce those impacts.</p><h2>About this post</h2><p>In this post, I want to continue building out an incremental model, but rather than focusing on emissions calculations I want to create a model to estimate energy use. My hope is that beginning to think about an incremental model like this will help give developers a way in which they can identify parts of their technology stack which could be optimised to reduce energy use.</p><p>I will revisit the work I did on the data center segment in my previous post. 
I will then work on the two other segments that make up operational energy use for a website - networks and user devices. As I did in the previous post, I'm going to lean on the work of others here to speed things along. Of course, I'll do my best to provide links and explanations of the things I change along the way.</p><h3>Why only energy & not emissions?</h3><p>Grid intensity (the emissions attributed to the mix of fuels that are powering a grid) can vary so much from region to region. In that way, it can have a considerable impact on the result of calculations, though it is often outside of the developer's control.</p><p>Focusing on energy allows us to concentrate on individual parts of the system that developers might actually have some control over. It allows us to highlight areas that could be optimised, and gives a consistent means to measure these changes.</p><h2>Segment: Revisiting Data Centers</h2><p>Let's start by quickly revisiting the calculation I formed when looking at how Cloud Carbon Footprint might be used to estimate website server emissions.</p><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">Read first</p><p></p><p>I strongly recommend that you read the previous post - <a href="https://fershad.com/writing/adapting-cloud-carbon-footprints-methodology-to-website-carbon-estimates/">Adapting Cloud Carbon Footprint's methodology to website carbon estimates</a> - before you read this one. It will set some context which I'm skipping over here.</p><p></p></div><p>In that post, I arrived at a formula for data center emissions that was:</p><pre class="language-text"><code class="language-text">Data center CO2e (grams) = (Compute Kilowatt-Hours + Memory Kilowatt-Hours) * PUE * Local grid intensity + (Storage Kilowatt-Hours + Network Kilowatt-Hours) * PUE * Global grid intensity</code></pre><p>So the data center segment has several sub-segments - Compute (CPU), Memory, Storage (HDD/SSD), and Networking. Since we only want to calculate energy use, we will remove the grid intensity parts of the calculation. We'll also split the calculation up so that it's easier to read for each sub-segment.</p><pre class="language-text"><code class="language-text">Data center Energy (kWh) = (Compute Kilowatt-Hours * PUE) + (Memory Kilowatt-Hours * PUE) + (Storage Kilowatt-Hours * PUE) + (Network Kilowatt-Hours * PUE)
Compute Kilowatt-Hours = (2.292 * server process time (seconds)) / 1000 / 3600
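(e.g. for 100 ms (0.1 s) of server process time, an assumed example value: (2.292 * 0.1) / 1000 / 3600 = 0.0000000637 kWh)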
Storage Kilowatt-Hours = 0.0000000009 kWh/MB * data transfer MB * number of CDN regions
Network Kilowatt-Hours = 0.000001 kWh/MB * data transfer MB * number of CDN regions
Memory Kilowatt-Hours = 0.000000392 kWh/MB * data transfer MB</code></pre><h2>Segment: Networks</h2><p>Network energy is interesting, especially as research now suggests that <a href="https://fershad.com/writing/website-carbon-beyond-data-transfer/#data-transfer-network-energy-usage">network energy doesn't scale linearly with data transfer</a>. That is to say, transferring more data over a network does not mean that the network is using significantly more energy than it otherwise would. Likewise, transferring less data doesn't result in a reduction in network energy consumption.</p><p>The best work I've seen reflecting this comes from a <a href="https://ctprodstorageaccountp.blob.core.windows.net/prod-drupal-files/documents/resource/public/Carbon-impact-of-video-streaming.pdf">Carbon Trust report</a> into the impacts of video streaming. In the report they cite <a href="https://online.electronicsgoesgreen.org/wp-content/uploads/2020/10/Proceedings_EGG2020_v2.pdf">research by Jens Malmodin (2020)</a> (pages 87 - 96) which looks at the power consumption of mobile & fixed data networks.</p><p>The Carbon Trust report allocates a fixed baseline power load to fixed and mobile networks. It then also provides a means to account for the marginal increase in energy use caused by long-duration, data-intensive activities like streaming. The folks over at Scope3 have broken down these figures, and I've used their summary below. <a href="https://methodology.scope3.com/data_transfer#power-usage-by-time-and-bandwidth-power-model">Read their docs</a> to understand how they reached these figures and the changes they've made to Carbon Trust's assumptions.</p><pre class="language-text"><code class="language-text">Fixed Network Energy = 9.55W + 0.03W/Mbps
Mobile Network Energy = 1.2W + 1.53W/Mbps</code></pre><p>Now there is a baseline and a dynamic portion to the energy calculations for both fixed and mobile networks. The question now becomes: what input should we use for the dynamic portion? The calculation asks us for the bitrate (Mbps), but we don't really measure websites in that way.</p><p>For this part, we can use values from the Malmodin research mentioned above. In it, a value of 0.4 Mbps is given for "web surf". So let's plug that in.</p><pre class="language-text"><code class="language-text">Fixed Network Energy (kWh) = 9.55W + 0.03W x 0.4Mbps = 9.56W / 1000 = 0.00956kWh
Mobile Network Energy (kWh) = 1.2W + 1.53W x 0.4Mbps = 1.81W / 1000 = 0.00181kWh</code></pre><h3>What about non-web surfing data transfer?</h3><p>The Malmodin research does provide figures for other activities like YouTube, Netflix, and file download. The Carbon Trust report also provides figures specific to video streaming quality. As do Scope3, who have figures for video and digital audio. So go ahead and pick your poison. You can swap those figures into the calculation above if you need to. Since this post is focused on website carbon estimates, we'll stick to using the figures we can find for website transfer.</p><h3>One small change to these figures</h3><p>There's one more change I'm going to make to the Scope3 figures above. The value of 9.55W for Fixed Network Energy includes 1.3W allocated to CDNs (taken from the Carbon Trust report, which labels it as <em>data centers and content delivery networks</em>). However, it doesn't seem this figure is applied to Mobile Network Energy, so I will include it there.</p><p>And, as a final step, we'll convert these values to kilowatt-hours per second. This allows us to multiply the factor by the <code class="language-markup">transfer time</code> for the web page or content being measured.</p><pre class="language-text"><code class="language-text">Fixed Network Energy (kWh) = 9.55W + 0.03W x 0.4Mbps = 9.56W / 1000 / 3600 = 0.0000026556kWh/s x transfer time (s)
Mobile Network Energy (kWh) = 2.5W + 1.53W x 0.4Mbps = 3.11W / 1000 / 3600 = 0.0000008639kWh/s x transfer time (s)</code></pre><p>So now we've got figures to use for the network segment of our calculation. You would only use one of these figures in a calculation, based on the kind of network environment that is being tested for.</p><h2>Segment: User Devices</h2><p>For user devices, it makes logical sense that the time spent on the device would be a factor in overall energy use. I've decided to settle on a mix of device power figures from <a href="https://methodology.scope3.com/consumer_devices">Scope3</a> and the methodology work done by <a href="https://dimpact.org/">DIMPACT</a> - a project involving over 20 media, entertainment, and technology companies, as well as members from the University of Bristol.</p><h3>Calculating a single web page view using DIMPACT</h3><p>The DIMPACT model relies on quite a few user inputs, and is designed to calculate emissions for a large number of sessions. For the purpose of the model we're creating, we'll simplify it and strip it back to just calculating a single session/page view.</p><p>We'll use the DIMPACT publishing module, which is suitable for calculating emissions for static content like a website. They also have a video streaming module for dynamic content, but that's something to explore another day.</p><p>For each type of device (tablet, computer, laptop, mobile), we'll take the following inputs:</p><ul><li>Duration of the page view (seconds)</li><li>Estimated average power of the device type in use. We'll use Scope3's figures here, which are primarily US-centric but give a breakdown by mobile & desktop which we can use. <ul><li><strong>Desktop/Laptop (incl. Monitor):</strong> 53.2W</li><li><strong>Tablet:</strong> 3W</li><li><strong>Smartphone:</strong> 0.77W</li></ul></li></ul><p>Now using DIMPACT's calculations:</p><pre class="language-text"><code class="language-text">Device Energy (kWh) = Device active power draw (W) / 1000 / 3600 x Duration of pageview (s)</code></pre><p>Now we're normally not viewing web pages for hours. Maybe minutes, but probably seconds. So we've made a slight change to DIMPACT's calculation to reflect this.</p><p>To turn the device energy result into a carbon estimate, we then multiply it by the grid intensity of the local grid where the device is being used.</p><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">What about standby power?</p><p></p><p>DIMPACT does have additional calculations for allocating the impact of device standby energy usage. However, they state that it is to be excluded for device types such as computers, tablets, and smartphones. Since we're dealing solely with those devices, we'll get to keep this post a bit shorter by skipping that part.</p><p></p></div><h2>What this model is & what it isn't</h2><p>We've now got calculations for all three system segments, and I'll wrap things up soon by putting them together into a single calculation for the model. But first, I want to take a moment to clarify what a model like this is good for, and what it isn't suitable for.</p><h3>Good for</h3><p>The thoughts outlined in this model would best serve developers who want to estimate the energy used by their website or web app.</p>
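<p>To make that concrete, here's a rough sketch, in TypeScript, of how the segments above could be wired together. The function and input names are my own assumptions rather than anything from the sources used in this post, and the figures are the ones derived above - treat it as a starting point, not a reference implementation.</p><pre class="language-typescript"><code class="language-typescript">// Sketch of the website energy model described in this post.
// All names and defaults here are illustrative assumptions.
const PUE = 1.58; // Uptime Institute 2023 average, used as a fallback

interface PageView {
  serverProcessTimeSeconds: number; // e.g. taken from Server Timing headers
  dataTransferMB: number;
  cdnRegions: number; // default to 1 if unknown
  transferTimeSeconds: number; // time spent transferring the page over the network
  pageViewDurationSeconds: number;
  network: "fixed" | "mobile";
  device: "computer" | "tablet" | "smartphone";
}

function dataCenterEnergyKWh(view: PageView): number {
  const compute = (2.292 * view.serverProcessTimeSeconds) / 1000 / 3600;
  const storage = 0.0000000009 * view.dataTransferMB * view.cdnRegions;
  const network = 0.000001 * view.dataTransferMB * view.cdnRegions;
  const memory = 0.000000392 * view.dataTransferMB;
  return (compute + storage + network + memory) * PUE;
}

function networkEnergyKWh(view: PageView): number {
  // kWh per second factors derived earlier in this post
  const factor = view.network === "fixed" ? 0.0000026556 : 0.0000008639;
  return factor * view.transferTimeSeconds;
}

function deviceEnergyKWh(view: PageView): number {
  // Average device power draw in watts (Scope3 figures)
  const watts = { computer: 53.2, tablet: 3, smartphone: 0.77 }[view.device];
  return (watts / 1000 / 3600) * view.pageViewDurationSeconds;
}

function websiteEnergyKWh(view: PageView): number {
  return dataCenterEnergyKWh(view) + networkEnergyKWh(view) + deviceEnergyKWh(view);
}</code></pre>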
<p>The model also serves as a handy means for developers to measure and compare the short-term impact of changes to their code or platform on energy use.</p><h3>Not good for</h3><p>A consequential model, such as this one, is not suitable for use as the basis for any carbon accounting or reporting. For that, you are better off using an attributional model such as Sustainable Web Design. As mentioned at the beginning of this post, attributional models look at the total energy used by a system and then attribute part of that to the website being measured.</p><p>A model like this is also not suitable for forecasting, or making any predictions of the long-term impacts of specific changes to code or platforms. The impacts it measures are short-term.</p><h2>A final calculation</h2><p>Bringing all the work above together, we arrive at the below calculation combining all three system segments.</p><pre class="language-text"><code class="language-text">Website Energy Use (kWh) = Data Center Energy (kWh) + Network Energy (kWh) + User Device Energy (kWh)</code></pre><h3>Data Center Energy</h3><pre class="language-text"><code class="language-text">Data center Energy (kWh) = (Compute Kilowatt-Hours * PUE) + (Memory Kilowatt-Hours * PUE) + (Storage Kilowatt-Hours * PUE) + (Network Kilowatt-Hours * PUE)
Compute Kilowatt-Hours = (2.292 * server process time (seconds)) / 1000 / 3600
Storage Kilowatt-Hours = 0.0000000009 kWh/MB * data transfer MB * number of CDN regions
Network Kilowatt-Hours = 0.000001 kWh/MB * data transfer MB * number of CDN regions
Memory Kilowatt-Hours = 0.000000392 kWh/MB * data transfer MB</code></pre><h3>Network Energy</h3><p>Choose the calculation appropriate to the network being tested.</p><pre class="language-text"><code class="language-text">Fixed Network Energy (kWh) = 0.0000026556kWh/s x transfer time (s)
Mobile Network Energy (kWh) = 0.0000008639kWh/s x transfer time (s)</code></pre><h3>Device Energy</h3><p>Choose the calculation appropriate for the device being tested.</p><pre class="language-undefined"><code class="language-undefined">Computer Device Energy (kWh) = 0.0000147778 kWh/s x Duration of pageview (s)
Tablet Device Energy (kWh) = 0.0000008333 kWh/s x Duration of pageview (s)
Smartphone Device Energy (kWh) = 0.0000002139 kWh/s x Duration of pageview (s)</code></pre><h2>Have a play around with it</h2><p>You can play around with the calculation above in <a href="https://observablehq.com/d/1b0b36a7cf6619d8">this interactive Observable Notebook</a>. This environment will allow you to adjust the different inputs, and see how they impact the overall energy use being measured.</p><div><iframe width="100%" height="788" frameborder="0" src="https://observablehq.com/embed/1b0b36a7cf6619d8@266?cells=inputsHeading%2Cviewof+dataCenterInputs%2Cviewof+networkInputs%2Cviewof+userDeviceInputs%2CresultsHeading%2CresultsTable"></iframe></div><h3>Also see</h3><p>If you’re interested in this, then also check out <a href="https://greenframe.io/">GreenFrame from Marmelab</a>, which similarly brings together research to estimate device emissions for website use.</p><h2>Let me know what you think</h2><p>This post is aimed at being a conversation starter. I've tried to use existing data and research and looked at how they might be applied to estimating energy used by a website. I'm interested to know what you think about it. If you know of any other data/research I might have missed, please also share it!</p><p>Get in touch: <a href="https://www.linkedin.com/in/fershad/">LinkedIn</a>, <a href="https://indieweb.social/@fershad">Mastodon</a>, or by <a href="https://fershad.com/contact">email</a>.</p><p>Finally, a big thank you to the folks at Cloud Carbon Footprint, Scope3, DIMPACT, Carbon Trust, and Jens Malmodin & his co-authors for all the work they've done to further this area of research.</p></div>Adapting Cloud Carbon Footprint's methodology to website carbon estimates2024-02-20T13:25:46Zhttps://fershad.com/writing/adapting-cloud-carbon-footprints-methodology-to-website-carbon-estimates/<div><p>There's been some conversation lately about the <a href="https://www.debugbear.com/blog/website-carbon-emissions">shortcomings of different website carbon estimation models</a>. It's something I've <a href="https://fershad.com/writing/is-data-the-best-proxy-for-website-carbon-emissions/">touched on here</a> <a href="https://calendar.perfplanet.com/2023/why-web-perf-tools-should-be-reporting-website-carbon-emissions/">and elsewhere</a> in the past too. I think there's a consensus in the community on the need for more accurate models that are representative of how the modern web works. This entails a <a href="https://fershad.com/writing/website-carbon-beyond-data-transfer/">shift away from data transfer</a> as the sole proxy for all the system segments in website emission calculations.</p><p>In this post, I will look at a possible alternative approach to calculating the emissions of the server (data center/hosting) segment in website carbon calculations. I hope that this can be the first in a series of posts where I lay out possible alternate approaches to estimating emissions for each segment in website carbon calculations. When (read: if ever) the other posts are ready, I will link to them below.</p><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">Update</p><p></p><p>I <em><strong>did</strong> </em>get around to writing that post! You can read it at <a href="https://fershad.com/writing/thinking-about-a-way-to-estimate-website-energy-use/">Thinking about a way to estimate website energy use</a>.</p><p></p></div><h2>About this post</h2><p>This post aims to be a conversation starter. 
In it, I will look at the open methodology published by <a href="https://www.cloudcarbonfootprint.org/docs/methodology">Cloud Carbon Footprint (CCF)</a>, and how it might be adapted for use in website carbon estimate calculations.</p><p>I have limited my scope to operational emissions only, and I acknowledge further work would be needed to include embodied emissions factors.</p><p>I have tried my best throughout to be clear about the changes and assumptions I've made in adjusting the CCF methodology. You might agree with some, you might disagree with others. That's cool; I'd love to hear your feedback regardless. The best way to do so is on <a href="https://www.linkedin.com/in/fershad/">LinkedIn</a>, <a href="https://indieweb.social/@fershad">Mastodon</a>, or by <a href="https://fershad.com/contact">email</a>.</p><h2>The problem with models</h2><blockquote><em>All models are wrong, but some are useful. </em><br />George E. Box</blockquote><p>We need to acknowledge this from the get-go. No matter how detailed we try to make our models, there will inherently be some part of the model that has to rely on generalised assumptions, averages, or estimates. That is part of what makes a model a model.</p><p>If you're after 100% accurate figures for website carbon emissions, then you need to be able to measure the emissions attributed to processes running on a server, the energy usage of the internet required to transfer data, and the energy used by the actual device that views the website.</p><p>That's not feasible for the large majority of people who want to work out the carbon impact of their websites. And that's why we need models to fill in the gaps.</p><h2>A time & location based approach to estimating website server emissions</h2><p>The scale and complexity of hosting a website can vary wildly. It can be as small as <a href="https://scott.ee/project/solar-hosting-raspberry-pi/">a single solar-powered server</a> at home right through to complex, globally distributed infrastructure. Hosts can perform simple tasks like delivering static HTML pages right through to hosting full applications that build pages and run database queries on demand.</p><p>While there are ways to get server utilisation metrics directly, these require at least some setup and configuration on the servers in question. For many, this isn't possible. So what metric could be used instead as a proxy for server utilisation?</p><h3>Server Timing Headers</h3><p>Ideally, we'd be able to get data about the individual processes responsible for serving a web page & how much power they consumed.</p><p>It is possible to set up a server so that it responds with information about how long a given task or function took to complete. This can be done through <a href="https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Server-Timing">Server Timing Headers</a>.</p><p>This is a great starting point, but to make these time values useful for carbon estimates we need a way to represent them as energy measured over a period of time (kilowatt-hours).</p>
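<p>As a rough illustration of where those time values come from (the metric name and value below are made up for the example): a server can report its own processing time in a <code class="language-markup">Server-Timing</code> response header, and that value can then be read back from the browser via the Performance API.</p><pre class="language-typescript"><code class="language-typescript">// Example response header sent by the server (illustrative values):
// Server-Timing: app;dur=53.5;desc="App processing time"

// Reading the reported timings back in the browser
const [navigation] = performance.getEntriesByType("navigation") as PerformanceNavigationTiming[];

for (const timing of navigation.serverTiming) {
  // duration is reported in milliseconds
  console.log(timing.name, timing.duration, timing.description);
}</code></pre>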
<p>These timings also don't factor in the energy associated with hosting sites on distributed platforms like CDNs.</p><p>Since we're not able to get actual power figures from the server without some custom setup or tooling installed, we need to create a model which allows us to estimate this conversion.</p><h2>❗An important note before starting</h2><p>The approach I outline below is a consequential model, in that it tries to look in detail at parts of a system to work out their individual impacts on the overall system's footprint. Such a methodology is suitable for those seeking ways to identify areas for improvement in their site's emissions profile. However, it would not be suitable for carbon accounting or reporting against standards like the Greenhouse Gas Protocol.</p><h2>Borrowing from Cloud Carbon Footprint</h2><p>Cloud Carbon Footprint is an open source tool for measuring the emissions of cloud compute use. It works with data from Amazon Web Services (AWS), Google Cloud Platform (GCP) and Microsoft Azure.</p><p>Since a lot of the modern web runs on services that sit on top of one of these three providers, I feel that using Cloud Carbon Footprint's data and methodologies makes sense in helping shape a general model that would be applicable to most of the web.</p><h3>Model scope</h3><p>First, let's scope out what we'll measure:</p><ul><li>Compute - the energy used by server compute to deliver a web page.</li><li>Storage - the energy used to store the web page.</li><li>Networking - the energy used to move data between data centers.</li><li>Memory - the energy used by memory to deliver a web page.</li></ul><p>The CCF methodology includes a GPU segment as well, but since we're dealing with web servers I have not included it in this post.</p><p>Let's now look at each individual segment to see what each calculation might look like in the context of a website. We'll aim to create a profile of an "average web server" based on the calculations and data in the CCF methodology.</p><h3>Compute</h3><p>CCF's compute calculation looks at virtual CPU utilisation for a moment in time. Its methodology for calculating this borrows from <a href="https://codeascraft.com/2020/04/23/cloud-jewels-estimating-kwh-in-the-cloud/">Etsy's Cloud Jewels</a> methodology.</p><pre class="language-undefined"><code class="language-undefined">Average Watts = Min Watts + Avg vCPU Utilization * (Max Watts - Min Watts)</code></pre><p>In our case, since we're looking to build an average server profile, we can use the <code class="language-markup">Min Watts</code> and <code class="language-markup">Max Watts</code> values that <a href="https://www.cloudcarbonfootprint.org/docs/methodology#appendix-i-energy-coefficients">CCF's calculation defaults to for Google (GCP), Amazon (AWS), and Microsoft (Azure)</a>. We'll also use the default value of 50% (0.5) for <code class="language-markup">Avg vCPU Utilization</code>.</p><pre class="language-undefined"><code class="language-undefined">Average Watts GCP: 0.71 + 0.5 * (4.26 - 0.71)
Average Watts AWS: 0.74 + 0.5 * (3.5 - 0.74)
Average Watts Azure: 0.78 + 0.5 * (3.76 - 0.78)</code></pre><p>We then take the average of these calculations (2.292 Watts) and multiply it by a time value to get a total in Kilowatt-Hours for server utilisation.</p><pre class="language-undefined"><code class="language-undefined">Compute Kilowatt-Hours = (2.292 W * server process time in Hours) / 1000</code></pre><p>In this case, server process time would be the value we get via Server Timing Headers. Anyone using this calculation would also need to remember to convert the server timings (sent in milliseconds) to an hourly figure. <em>This</em> is done by multiplying the server timings by 0.00000027777778.</p><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">What if server timing information is missing?</p><p></p><div><p>Server timing information might not always be available. In those cases, we'd need to have a default to fallback on. I've got no idea what a sensible default would be here, especially given the different web architectures that exist.</p><p>Picking a number out of thin air, lets go with 100 milliseconds (0.00002778 hours) until someone can share something more compelling.</p></div><p></p></div><p>For reference, here's a link to the <a href="https://www.cloudcarbonfootprint.org/docs/methodology#compute">CCF docs for the compute segment</a>.</p><h3>Storage</h3><p>CCF have calculations to produce coefficients for both HDD and SSD storage types. These are in Watt-hours per Terabyte. For our website server calculation we'll convert these to Kilowatt-hours per Megabyte instead, and use the average of HDD and SSD.</p><p>Based on the numbers used in CCF's calculations for these coefficients, <a href="https://www.cloudcarbonfootprint.org/docs/methodology#storage">which you can see here</a>, we get:</p><pre class="language-undefined"><code class="language-undefined">HDD Kilowatt-Hours per Megabyte-Hour = (6.5 W / 10,000,000) / 1000 = 0.00000000065 kWh/MB
SSD Kilowatt-Hours per Megabyte-Hour = (6 W / 5,000,000) / 1000 = 0.0000000012 kWh/MB</code></pre><p>The average of these figures gives us a storage coefficient of <code class="language-markup">0.0000000009 kWh/MB</code>. In our calculation, we'd use this as:</p><pre class="language-undefined"><code class="language-undefined">Storage Kilowatt-Hours = 0.0000000009 kWh/MB * data transfer in MB.</code></pre><h3>What about CDNs?</h3><p>The calculation above would work for a page hosted on a single server. But on today's internet, many sites have content replicated across multiple servers, either for redundancy, performance, or both. This is often done by using Content Delivery Networks (CDNs). It's worth thinking about how we'd make it possible to reflect this in our calculation as well.</p><p>Unless this is something you've set up yourself, it's hard to know exactly how many servers a site has been replicated across. Information we might have access to, though, is the number of regions a CDN provider is operating. For example, <a href="https://vercel.com/docs/edge-network/regions#region-list">Vercel list 18 regions that make up their CDN edge network</a>. If this information is unknown, we can use a default of <code class="language-markup">1</code> in the calculation below.</p><pre class="language-undefined"><code class="language-undefined">Storage Kilowatt-Hours = 0.0000000009 kWh/MB * data transfer in MB * number of CDN regions used</code></pre><p>For reference, here's a link to the <a href="https://www.cloudcarbonfootprint.org/docs/methodology#storage">CCF docs for the storage segment</a>.</p><h3>Networking</h3><p>The replication mentioned above requires data to be transferred between multiple data centers. For this, we'll use the coefficient CCF have for their calculation because, as they point out:</p><blockquote>There have not been many studies that deal specifically with estimating the electricity impact of exchanging data across data centers. Cloud Carbon Footprint docs (read more)</blockquote><p>The only change we'll make to the CCF value is to convert it into kilowatt-hours per megabyte for consistency with the rest of our calculation. This value then gets multiplied by the amount of data transfer being measured, and the number of CDN regions used (again defaulting to <code class="language-markup">1</code> if this is unknown).</p><pre class="language-undefined"><code class="language-undefined">Network Kilowatt-Hours = 0.000001 kWh/MB * data transfer in MB * number of CDN regions used</code></pre><p>For reference, here's a link to the <a href="https://www.cloudcarbonfootprint.org/docs/methodology#networking">CCF docs for the network segment</a>.</p><h3>Memory</h3><p>In writing this post, I really wasn't sure if a memory coefficient should be included in the final calculation. 
I've settled on including it because I think that it could be useful when estimating emissions for server-rendered sites.</p><p>My assumption is that serving static assets (HTML, images, etc) uses a negligible amount of memory (close enough to 0 to be 0), but that a site which is server-side rendered probably does use some amount of memory that it's worth counting.</p><p>Once again, we'll use the CCF value and convert it to kilowatt-hours per megabyte.</p><pre class="language-undefined"><code class="language-undefined">Memory Kilowatt-Hours = 0.000000392 kWh/MB * data transfer in MB</code></pre><p>This value can be set to <code class="language-markup">Memory Kilowatt-Hours = 0</code> if the webpage or asset being measured is known to be a static file.</p><p>You'll also notice here that we're not using the number of CDN regions as a multiplier. That is because (like the compute segment) memory would only be used in the one region that is serving the web page.</p><p>For reference, here's a link to the <a href="https://www.cloudcarbonfootprint.org/docs/methodology#memory">CCF docs for the memory segment</a>.</p><h2>Putting it together</h2><p>Having gone through the exercise above, we've now got calculations for different segments of a web server. We have also taken into account the use of distributed hosting or CDNs. Bringing these together, we have:</p><pre class="language-undefined"><code class="language-undefined">Compute Kilowatt-Hours = (2.292 * server process time) / 1000
Storage Kilowatt-Hours = 0.0000000009 kWh/MB * data transfer in MB * number of CDN regions used
Network Kilowatt-Hours = 0.000001 kWh/MB * data transfer in MB * number of CDN regions used
Memory Kilowatt-Hours = 0.000000392 kWh/MB * data transfer in MB</code></pre><p>Combining these into one monster calculation, we get:</p><pre class="language-undefined"><code class="language-undefined">Server Kilowatt-Hours = ((2.292 W * server process time in Hours) / 1000) + (0.0000000009 kWh/MB * data transfer in MB * number of CDN regions used) + (0.000001 kWh/MB * data transfer in MB * number of CDN regions used) + (0.000000392 kWh/MB * data transfer in MB)</code></pre><h2>Factoring in Power Usage Effectiveness (PUE)</h2><p>No data center is 100% energy efficient: not all of the power a data center draws goes to its servers alone. The factor used to represent how effectively a data center uses its power is Power Usage Effectiveness (PUE). By including PUE in our calculation, we're able to capture some of that data center inefficiency in our final emissions estimate.</p><p>Let's incorporate this into our calculation:</p><pre class="language-undefined"><code class="language-undefined">Server Kilowatt-Hours = ((2.292 W * server process time in Hours) / 1000) * PUE + (0.0000000009 kWh/MB * data transfer in MB * number of CDN regions used) * PUE + (0.000001 kWh/MB * data transfer in MB * number of CDN regions used) * PUE + (0.000000392 kWh/MB * data transfer in MB) * PUE</code></pre><h3>What's a suitable PUE value?</h3><p>The question is, what value should our model use for PUE? If you're operating a single server location, then you might be able to find out the PUE for that location from your hosting provider. Likewise, if you're running exclusively on AWS, GCP, or Azure, then you can use the values from CCF for those providers:</p><ul><li><strong>AWS:</strong> 1.135</li><li><strong>GCP:</strong> 1.1</li><li><strong>Azure:</strong> 1.185</li></ul><p>As a fallback, if PUE is unknown, then I think a good value to use would be that produced by the <a href="https://journal.uptimeinstitute.com/global-pues-are-they-going-anywhere/">Uptime Institute's annual data center survey for 2023</a>. This gives an average PUE value of <code class="language-markup">1.58</code>.</p><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">Uncertainty</p><p></p><p>I'm uncertain if including PUE in the calculation is the right approach here, and would be keen to hear what others think.</p><p></p></div><h2>Turning it into a carbon estimate</h2><p>To turn this into a carbon estimate, we'd then multiply this figure by a value for the energy intensity of the grid on which the host server operates (the grid intensity in grams CO2e per kilowatt-hour).</p><p>To do this, we'll need to split that calculation up, so that the segments which only relate to the server delivering a page (Compute and Memory) can use a location-specific grid intensity value, while those which include factors for some kind of cross-regional data replication (Storage and Network) use the global average grid intensity. 
Doing this, our calculation now looks like:</p><pre class="language-undefined"><code class="language-undefined">Server CO2e (grams) = (Compute Kilowatt-Hours + Memory Kilowatt-Hours) * PUE * Local grid intensity + (Storage Kilowatt-Hours + Network Kilowatt-Hours) * PUE * Global grid intensity</code></pre><p>Or in long form:</p><pre class="language-undefined"><code class="language-undefined">Server CO2e (grams) = (((2.292 W * server process time in Hours) / 1000) + (0.000000392 kWh/MB * data transfer in MB)) * PUE * Local grid intensity + ((0.0000000009 kWh/MB * data transfer in MB * number of CDN regions used) + (0.000001 kWh/MB * data transfer in MB * number of CDN regions used)) * PUE * Global grid intensity</code></pre><p>Where local grid intensity for a server is unknown, a regional value can be used as a fallback; otherwise, the global grid intensity can be used in its place.</p><p>When no CDN is used (i.e. there is no data replication, <code class="language-markup">number of CDN regions used = 1</code>), then the global grid intensity can be replaced with local grid intensity if it is known.</p><h3>What if it's a green web host?</h3><p>In the case that a site is hosted on a verified green web host or a green CDN is used (checked against the <a href="https://www.thegreenwebfoundation.org/tools/green-web-dataset/">Green Web Foundation's Green Web Dataset</a>), then the corresponding grid intensity values can be set to 0 gCO2e/kWh accordingly.</p><h2>Comparing results with the current Sustainable Web Design model</h2><p>As a point of comparison, and so that I can see the calculation I've landed on in use, I'll compare it to the results for the data center segment that are produced when using the Sustainable Web Design (SWD) model.</p><p>The values used in our calculations are:</p><ul><li><strong>Data transfer:</strong> 1 MB</li><li><strong>Server compute time:</strong> 100ms (0.000027777778 hours)</li><li><strong>CDN regions:</strong> 1</li><li><strong>Global average grid intensity:</strong> 436.33 gCO2e/kWh</li><li><strong>PUE:</strong> 1.58</li></ul><p>I am looking at the first time a page loads, so have removed the assumptions the SWD model makes about caching and return visitors from that calculation.</p><pre class="language-undefined"><code class="language-undefined">Sustainable Web Design Server Segment = 0.001 * 0.81 * 436.33 * 0.15 = 0.05301 grams
Server CO2e = (((2.292 * 0.000027777778) / 1000) + (0.000000392 * 1)) * 1.58 * 436.33 + ((0.0000000009 * 1 * 1) + (0.000001 * 1 * 1)) * 1.58 * 436.33 = 0.00100 grams </code></pre><p>One thing we can do now with this calculation is see how changes (other than adjusting file size) might impact the emissions estimates we get. For example, let's say we put our 1 MB page onto Vercel (18 CDN locations). The resulting calculation produces:</p><pre class="language-undefined"><code class="language-undefined">Server CO2e = (((2.292 * 0.000027777778) / 1000) + (0.000000392 * 1)) * 1.58 * 436.33 + ((0.0000000009 * 1 * 18) + (0.000001 * 1 * 18)) * 1.58 * 436.33 = 0.01273 grams</code></pre><h2>Let me know what you think</h2><p>This post is aimed at being a conversation starter. I've tried to use an existing open methodology for calculating cloud compute server emissions & looked at how it might be applied to estimating server-side emissions as part of website carbon estimates. I'm interested to know what you think about it. If you know of any other data/research I might have missed, please also share it!</p><p>Get in touch: <a href="https://www.linkedin.com/in/fershad/">LinkedIn</a>, <a href="https://indieweb.social/@fershad">Mastodon</a>, or by <a href="https://fershad.com/contact">email</a>.</p><p>Finally, a big thank you to all the folks who've contributed to Cloud Carbon Footprint for what they have built and for working in the open.</p></div>2023 in review2024-02-20T13:25:46Zhttps://fershad.com/writing/2023-in-review/<div><p>Well, 2023 is pretty much done and dusted. A solid year work-wise, and a pretty big one personally. I’m really not great at keeping journals or notes of things that happen, so this review might be a bit scattered. Something to improve on in 2024, I suppose.</p><h2>Fixing my eye</h2><p>I’ve had eye problems for the last 4 years. My right eye has been turning inwards since around about the start of 2020, and after trying a few different approaches to correct it I was told that surgery was the best (and last) option available to me.</p><p>So, that’s how I ended my 2023. Minor surgery on my right eye, which left me looking like Terminator for the better part of a week. My eye’s started to heal now, and the whites are beginning to reappear. What’s even better, at 1 week post-op, is that the visual discomfort I had been experiencing for the last few years is now gone!</p><h2>Marriage</h2><p>I got married to my partner here in Taiwan in October. We both wanted to keep things simple, and avoid the headaches of arranging a full-blown Taiwanese wedding banquet. Despite the best efforts of both our families to make things more complicated, we just about got away with it.</p><p>Another upside of this was that my parents got to visit Taiwan. It was my Mum’s second time, but my Dad’s first. It was cool to show them around the place I’ve called home for the last 11 years.</p><h2>They’re using CO2.js in where now?</h2><p>Towards the end of 2022, we began seeing <a href="https://github.com/thegreenwebfoundation/co2.js">CO2.js</a> being used in an increasing number of projects. That continued into 2023, and it was wild seeing some of the tools that were picking up the library.</p><p>Of note, CO2.js now contributes to the <a href="https://fershad.com/writing/co2e-estimates-in-firefox-profiler/">Firefox Profiler’s Power Profiling feature</a>. 
It was also a personal highlight to see it being used by the team at <a href="https://fershad.com/writing/carbon-control-by-webpagetest-first-look/">WebPageTest in their Carbon Control test suite</a>. I also helped the team at PianoD <a href="https://www.thegreenwebfoundation.org/news/estimating-website-emissions-in-the-italian-context-sitigreen-co2-js/">add it to Sitigreen</a>, their Italian website testing tool.</p><h3>What next for CO2.js</h3><p>I’ve got two big goals for CO2.js in 2024:</p><ol><li>Add additional models to the library to give developers more choice. There are open issues for adding the <a href="https://github.com/thegreenwebfoundation/co2.js/issues/145">GreenFrame</a> and <a href="https://github.com/thegreenwebfoundation/co2.js/issues/141">DIMPACT</a> models.</li><li>Give developers a way to pull in data from third-party grid intensity services like Electricity Maps, WattTime, or others. This should hopefully reduce the friction to perform nearer to real-time carbon estimates with current data (rather than yearly averages).</li></ol><h2>Meeting the people I work with</h2><p>In March I got to travel to Germany to meet up with Hannah, Michelle, Katrin, Oliwia, Justine, and Chris whom I work with at the Green Web Foundation. It was fantastic to meet them all in person, after having spent the better part of nine months getting to know them through pixels on my screen.</p><p>Not going to lie, though, it was freezing! It was also exhausting. We had two days together as a team, and I was still running on Taipei time which meant I was running on fumes by the late afternoon.</p><p>That said, wouldn’t change any of it for the world. Below’s a picture of everyone riding into the sunset on Tempelhofer Feld, a former airfield that is now a sprawling public space.</p><p></p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/01fb918083079acdc50790dc485eea6c101bf34c-1676x1676.jpg?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/01fb918083079acdc50790dc485eea6c101bf34c-1676x1676.jpg?auto=format" alt="6 people riding into the sunset on a wide tarmac." loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">The Green Web Foundation team riding on Tempelhofer Feld in Berlin.</figcaption></figure><h2>Making this website carbon aware</h2><p>At the beginning of the year, I wrote a bit of code to make <a href="https://fershad.com/writing/making-this-website-carbon-aware/">this website carbon aware</a>. The idea is to understand the grid intensity of a user’s location, and then adjust what kind of website they receive based on that.</p><p>This is very much me playing around with tech and some ideas. A few folks have commented that perhaps the “low carbon” version of this website should be the default. It probably could be, and nothing would be lost really. One thing I wanted to show through doing this is how carbon awareness could be added to an existing website or app, rather than a greenfield project.</p><p>Currently, the code I’ve written works in Cloudflare’s edge compute environment. I’m hoping to get some time at the beginning of next year to write starters for other providers as well.</p><h2>Another COP website audit</h2><p>COP28 happened, and so did <a href="https://fershad.com/writing/cop28-uae-a-low-carbon-website-review/">another COP website audit</a>. Unlike previous years, though, this one picked up a bit of traction, with a couple of media folk reaching out and asking me to comment on the site <a href="https://spectrum.ieee.org/internet-carbon-emissions">for different stories</a>. I guess that’s more down to the controversial nature of this year’s COP host nation rather than the quality of my writing per se.</p><p>It was especially nice to be <a href="https://www.abc.net.au/news/2023-10-31/un-cop28-climate-summit-accused-greenwashing-website-low-carbon/103020978">interviewed by the ABC in Australia</a>. Finally getting some bang for my tax dollars.</p><h2>I’ll probably never finish TOTK</h2><p>I was so amped for The Legend of Zelda: Tears of the Kingdom to come out. I’m glad Nintendo has such long release cycles between these games, because I’m not sure I’ll get around to finishing it anytime during 2024. The ease with which I get sidetracked in the game probably means I’m never going to find Zelda and will just fork out my money for the DLC or next release in a few years.</p><h2>Two more international refereeing caps</h2><p>In September, I was lucky enough to travel to Japan to referee in a series of Men’s Open International Touch Test Matches between Japan and Singapore. It was my first time refereeing internationally since 2018, and my first time refereeing outside of Taiwan since 2019.</p><p>It was some seriously fast Touch, with both teams well into their preparations for the upcoming 2024 World Cup. I was glad for the run, and picked up a few things I need to continue working on if I’m to pursue higher refereeing levels in the future.</p><p>For now though, my eyes are on making the Taiwan team as a player for the 2024 World Cup. The first 8 months of the new year will be revolving entirely around that. 
As one of the older players in the squad, it’ll be a challenge, but one that I’m keen to get through.</p></div>Why web perf tools should be reporting website carbon emissions2024-02-20T13:25:46Zhttps://fershad.com/writing/why-web-perf-tools-should-be-reporting-website-carbon-emissions/<div><p><em>This post</em> <em>was originally published as part of the</em> <a href="https://calendar.perfplanet.com/2023/why-web-perf-tools-should-be-reporting-website-carbon-emissions/"><em>2023 Web Performance Calendar</em></a>.</p><p>Recently, a post from the web performance monitoring tool DebugBear about <a href="https://www.debugbear.com/blog/website-carbon-emissions">why they won’t report website carbon emissions</a> in their platform caught my attention. It’s a very good post, pulling together information from a lot of sources, and presenting the reasoning behind their decision. As I read it, I found myself nodding along to parts and furrowing my brow at others in equal measure.</p><p>This post is partly in response to DebugBear’s piece (so maybe read that first if you haven’t already). It is in no way a criticism of the post, which raises some very valid points about the current state of website carbon emissions calculations. However, contrary to the conclusion made in that post, I believe that, despite the current shortcomings of different models, the data they provide <em>should</em> be presented to users in a way that allows them to begin to contextualise the emissions associated with their website and digital assets.</p><p>In this post, I aim to provide a counter take that looks at the present and the future. I hope to emphasise how web performance tools can start integrating website carbon emissions calculations into their platforms.</p><h2><strong>For transparency</strong></h2><p>Before we start, I should disclose that at the time of writing I work as a contractor for <a href="https://www.thegreenwebfoundation.org/">Green Web Foundation</a>, a Dutch non-profit working for a fossil-free internet by 2030. As part of my role, I am a maintainer of CO2.js, a JavaScript library that enables developers to use different estimation models to calculate the CO2e produced from data transfer. Through that, I have worked to build carbon estimations into a few website carbon calculators and performance tools.</p><h2><strong>Website carbon emissions in 2023</strong></h2><p>The biggest criticism levelled at most website carbon estimation tools is their reliance on data transfer as a proxy for emissions. The article on the DebugBear website covers this extensively in relation to the <a href="https://sustainablewebdesign.org/">Sustainable Web Design</a> (SWD) model. It’s a criticism which <a href="https://fershad.com/writing/is-data-the-best-proxy-for-website-carbon-emissions/">I’ve written about</a> in the past as well.</p><p>Other models do exist which take into account factors beyond data transfer. The DebugBear article touches on <a href="https://greenframe.io/">GreenFrame</a> and <a href="https://dimpact.org/">DIMPACT</a>, two examples that can be used for website emissions estimates (DIMPACT can also be used to estimate video streaming). <a href="https://methodology.scope3.com/lifecycle">Scope3</a> is a model mainly for advertising and publishing, but is another example of one that takes multiple varying inputs into its carbon calculations.</p><h2><strong>Understanding the models we have today</strong></h2><p>To get our bearings, it will help to get an overview of how the GreenFrame, DIMPACT, and SWD models work. 
This will highlight some of their limitations, as well as start to surface areas in which they might be able to be improved. I’ll also touch on alternatives that can be used to measure website carbon emissions, both today and potentially in the future.</p><h3>GreenFrame</h3><p>GreenFrame runs in a Dockerised environment, collecting usage measurements from its container to feed into its carbon calculation. It also includes measurements for a theoretical server container as part of its total calculations.</p><p>GreenFrame runs scripted Playwright scenarios, and so can be used to estimate emissions for not just page load but specific page interactions. There is an <a href="https://github.com/marmelab/greenframe-cli">open source CLI tool</a> which could be easily added to existing tooling or CI/CD pipelines, although at the time of writing it is covered by a somewhat restrictive license.</p><h3>DIMPACT</h3><p>The DIMPACT model relies on inputs that are less readily available in real time, especially in calculating server and user device emissions. Unlike GreenFrame, the DIMPACT model does not perform measurements on an actual web page. Instead, it takes user-inputted data for things like total GHG emissions from data center processes, total data served by CDN, and user-specific information like device type, location, and total data served. Not all of this information is available in real time, with most figures likely being presented as monthly, quarterly, or annual data.</p><h3>Sustainable Web Design</h3><p>The SWD model’s main input is data transferred. It uses this as the basis for calculating carbon emissions of all the segments that make up a website system – hosting, networks, user devices, and manufacturing. It’s worth noting that manufacturing is captured in the scope of the SWD model, while it is omitted from the scope of the GreenFrame and DIMPACT models.</p><p>As the DebugBear post points out, data transfer as a metric has a poor correlation to server utilisation and network energy use. There might be some correlation with device energy use, but not every byte of data is equal (more on this in a bit). So, yes, data transfer is a weak proxy for website carbon emissions. To fix this, <a href="https://www.youtube.com/watch?v=DXX4hkV7XOI">“we need a ‘green’ perf. metric”</a>. We need browser APIs that expose more about a webpage and (ideally) energy use so that we can make really accurate emissions measurements. Until those appear, we’ll have to live with an accessible proxy like data transfer or something similar.</p><h4>Not all bytes are equal</h4><p>There’s long been a belief that 100kb of JavaScript or video content has a much larger device-level impact than 100kb of HTML. Ideally there’s a future where we’ve got a model that allows us to dissect a web page’s content to this level and come up with an even more nuanced carbon emissions estimate for it, based on this information. Having research to back that up will allow for updated models to be created with this information baked in. There’s already <a href="https://websitesustainability.com/cache/files/research23.pdf">a paper by Alexander Dawson</a> exploring this, but significantly more work would be required to get this to the stage where it’s suitable to include in a carbon estimation model.</p><h3>About Firefox Profiler</h3><p>It is also possible to capture the energy profile of a webpage in the Firefox Profiler. The profile will also show a CO2e estimate, which is something I helped roll in earlier this year.</p>
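<p>At its simplest, turning an energy figure into a CO2e estimate is a multiplication by a grid intensity figure. A minimal sketch of that arithmetic (the values below are placeholders, not the profiler’s defaults):</p><pre class="language-typescript"><code class="language-typescript">// Energy measured for the profiled page, in watt-hours (placeholder value)
const energyWh = 0.2;

// Grid carbon intensity, in grams of CO2e per kilowatt-hour (placeholder value)
const gridIntensity = 442;

// Convert watt-hours to kilowatt-hours, then multiply by the intensity
const co2eGrams = (energyWh / 1000) * gridIntensity;

console.log(co2eGrams.toFixed(3), "g CO2e");</code></pre><p>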
It should be noted that you can now adjust the energy intensity that is used for this CO2e estimate to customise the results for your scenario.</p><p>Running the profiler programmatically would allow extremely accurate synthetic carbon measurements to be captured, since it measures actual energy used rather than relying on a model. The folks at Sitespeed are looking into how that can be done. Follow along in <a href="https://github.com/sitespeedio/sitespeed.io/issues/3944">this GitHub issue</a>.</p><p><a href="https://share.firefox.dev/green-coding-summit-2023">This talk by Florian Queze</a> of Mozilla at the SDIA Green Coding Summit (November 23, 2023) is a good introduction to the Firefox Profiler, as well as other power measurements tools in use today.</p><h3>Other possible approaches in the future</h3><p>If we want a way to measure website and digital carbon in realtime, then we need that information to be available in the browser and/or the platform. Ideally, that information would be actual energy utilisation similar to what can be recorded in the Firefox Profiler. Other proposals include <a href="https://www.ietf.org/archive/id/draft-martin-http-carbon-emissions-scope-2-00.html">HTTP headers</a> or <a href="https://www.thegreenwebfoundation.org/publications/extending-ipv6-to-support-carbon-aware-networking/">utilising parts of the IPv6 protocol</a> to capture this information. Right now, though, data transfer serves as one of the more readily accessible bits of information we can access in the browser without requiring a special environment to be configured.</p><h2><strong>All models are wrong, some are useful</strong></h2><p>I took this line from my colleague <a href="https://www.linkedin.com/in/mrchrisadams">Chris Adams</a>, but let’s <a href="https://qt.fershad.com/writing/change-gco2kwh-firefox-profiler/">attribute it to statistician George Box</a> shall we.</p><p>I hope what I’ve written about above helps to give you a better understanding of where each methodology might fall short and how they could be bettered. All models have inherent biases that skew their results one way or the other. There will always be a part of a model that cannot accurately capture the complexity or reality of what it is trying to measure. DIMPACT, GreenFrame, and Sustainable Web Design all have assumptions baked into their calculation methodologies which in turn impact their outputs in different ways. Measuring actual energy usage along the entire tech stack is the only way to get truly accurate website carbon emissions figures. By doing that, you no longer rely on models and the assumptions they make.</p><p>Even if updated versions of those models were to land, they’d still have their shortcomings. What’s important is that we understand these, and communicate them effectively when we do use these models in our work.</p><p>We need to ask ourselves if we are willing to let a pursuit of the perfect model stop us from making progress. You might be able to guess which camp I sit in.</p><h2><strong>The start of the journey</strong></h2><p>I’d argue that the field of measuring website carbon emissions is today in the same spot that the web performance field was a decade ago. I wasn’t around in the early days of folks measuring website performance, but I know that the metrics they relied on are a far cry from the ones that are referenced today. I can’t speak for those early web perf. 
pioneers, but I’m guessing at the time they were working with what they had access to through APIs, and whatever limited research was around at the time.</p><p>We’re at that point with website carbon calculations today. Of the research available into the impacts of ICT, there’s a small portion that’s focused on websites and digital media. There’s some data that we can capture to use in calculations, but most of this is aggregated in annual reports or requires a specific environment setup in order to be measured. The research into, and our understanding of, the correlation between various metrics and CO2 emissions is still very much in its infancy. The Environment Variables podcast recently aired an episode with Dr Daniel Schien (<a href="https://podcast.greensoftware.foundation/e/4n9v2qr8-the-week-in-green-software-new-research-horizons"><em>The Week in Green Software: New Research Horizons</em></a>) which provides a summary of research over the past decade, some of the hurdles that have been faced, and also the possible directions for future research.</p><p>But, as with web performance, the only way we can keep moving things forward is by implementing what we have and iterating on it. This allows us to capture data that can be used in research. It enables different hypotheses to be tested in the real world. It facilitates a greater awareness around the area of website and digital carbon emissions, which can then further fuel progress.</p><h2><strong>Customers will keep requesting it</strong></h2><p>Customers are going to keep on requesting providers include emissions reporting in the tools they use. They’ll probably be doing so with an eye to incoming legislation in Europe (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:32022L2464">Corporate Sustainability Reporting Directive (CSRD)</a>) and California (<a href="https://leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=202320240SB253">Climate Corporate Data Accountability Act (CCDAA</a>)) which will require companies to report on the carbon emissions from their operations well beyond what they are currently mandated to do. These bills will start taking effect in the coming years.</p><p>There’s also an increasing number of companies that have set Net Zero or Climate Neutral targets which they hope to achieve within this decade. In order for them to do that, they need to start measuring the emissions of their operations and that would include their website or digital assets.</p><h2><strong>Adding carbon emissions to web perf tools</strong></h2><p>In adding website carbon emissions to web performance tools, we should acknowledge to users that this is a young area of research and that the models used today will evolve as research and understanding in this space evolves. It’s also important to highlight the models being used for calculations, note their limitations, the fact that they are <em>estimates</em>, and to be transparent about any adjustments that have been made to them as part of an implementation.</p><p>For the rest of this section, I’ll be focusing on the SWD model, and how its results can be made more accurate despite the limitations of the model. That said, the general themes here are applicable to other models as well.</p><h3><strong>Start general and refine</strong></h3><p>The Sustainable Web Design model is a general model for website carbon emissions calculation. 
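</p>
<p>To make that general starting point a bit more concrete, here is roughly what a page-level SWD estimate looks like using CO2.js (there’s a note about the library a little later in this post). Treat it as a sketch: the byte count is a made-up figure, and the result is an estimate in grams of CO2e, not a measurement.</p>
<pre><code class="language-js">// A rough sketch of a Sustainable Web Design estimate with CO2.js.
// Install the library first: npm install @tgwf/co2
import { co2 } from "@tgwf/co2";

// The SWD model is the library default; naming it here just makes that explicit.
const swd = new co2({ model: "swd" });

// Illustrative only - pretend the page transferred roughly 2 MB of data.
const bytesTransferred = 2_000_000;

// The second argument flags whether the data was served from a verified green host.
const gramsOfCO2e = swd.perByte(bytesTransferred, false);

console.log(`Estimated emissions: ${gramsOfCO2e.toFixed(3)} g CO2e`);
</code></pre>
<p>Changing the byte count or the green hosting flag changes the estimate, which is exactly the kind of quick, rough feedback this model is good for.</p>
<p>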
The SWD model definitely isn’t a 100% accurate measurement of your website’s carbon emissions, <a href="https://www.wholegraindigital.com/curiously-green/issue-48/">nor has it ever tried to come across as being this</a>. It is, though, probably the best starting point we have today for introducing the idea of estimating website carbon emissions to an individual without overwhelming them with too much detail and complexity.</p><h3><strong>Swap in real data when it’s available</strong></h3><p>Operational emissions in the SWD model are broken up into individual system segments (data centers, networks, and user devices). This means that the calculated emissions for one segment could be replaced with actual data for that segment, if it is available.</p><p>Say, for example, that a user’s hosting provider gives them an annual report showing the carbon emissions associated with hosting their website. Awesome, what a fantastic host! That actual data could be used in the SWD calculation in place of the estimated value for that system segment. Heck, even if you only had access to energy usage data from the hosting provider, you could still multiply that by the energy intensity of the server location to get an emissions value which can be swapped into the calculation.</p><h3><strong>Get specific about energy intensity</strong></h3><p>The default SWD calculation uses the global average annual grid (energy) intensity to calculate carbon emissions from energy. Changing this value to that of a specific country will give a result that’s more tailored to that country.</p><p>Ideally, though, you’d want to do this separately for each individual system segment. That is because your web host might be in one country, while your users could be in another. By using the appropriate energy intensity for each segment, we’re able to run calculations that include values that more realistically reflect how the website is set up in the real world.</p><p>The SWD calculations page gives guidance on how to work out the energy attributed to each segment. These values can then be multiplied by respective energy intensity figures to calculate emissions. This capability is also built into <a href="https://www.thegreenwebfoundation.org/co2-js/">CO2.js</a>, where energy intensity figures can be manually entered or imported from country-level annualised data that is also available in the library. The Italian website carbon calculator <a href="https://sitigreen.it/">Sitigreen</a> is an example of this approach in action.</p><h4>A note about CO2.js</h4><p><a href="https://github.com/thegreenwebfoundation/co2.js">CO2.js</a> is a JavaScript library that can be used to calculate digital carbon emissions. <strong>It is not an estimation model in its own right.</strong> Instead, it provides access to multiple estimation models that developers can use. Currently, the library contains two models – Sustainable Web Design and OneByte. <strong>We would love to have more!</strong> There are open issues for adding the <a href="https://github.com/thegreenwebfoundation/co2.js/issues/145">GreenFrame</a> and <a href="https://github.com/thegreenwebfoundation/co2.js/issues/141">DIMPACT</a> models to the library, but as a small non-profit we rely on grant funding and donations to make this possible. 
You can <a href="https://www.thegreenwebfoundation.org/donate">support our work</a> or <a href="https://www.thegreenwebfoundation.org/services/">engage us for a project</a>, which will allow us to set aside time to improve this and other open source tools we maintain.</p><h3><strong>Request level emissions</strong></h3><p>All website carbon calculators I know of perform calculations on total page weight.</p><p>However, another way to approach this which improves the accuracy of results is to pass the bytes transferred for each request into the model, and then sum the equivalent carbon estimations. In this way, variables in the calculation can be adjusted for each request (e.g. green hosting, server location energy intensity). This, in turn, leads to a more refined overall emissions estimate for a page. <a href="https://webpagetest.org/">WebPageTest</a> takes this approach in its Carbon Control feature, while website analytics tool <a href="https://statsy.com/">Statsy</a> also does this in realtime. Both these tools use CO2.js under the hood to enable this.</p><h2><strong>What to do if you still don't like the models we have today</strong></h2><p>I understand that this topic might be polarising. So hey, if I’ve failed to convince you that web perf tools should report on website carbon emissions then that’s okay. Thank you for taking the time to read through the thoughts above. Before wrapping up, I’d like to leave you with a few other things I think all web performance tools can do that go beyond showing website carbon emissions to users.</p><ul><li>Show whether requests are being <a href="https://www.thegreenwebfoundation.org/tools/green-web-dataset/">served from green web hosts</a>. Yes, <a href="https://www.thegreenwebfoundation.org/support/im-using-a-cloud-provider-why-is-my-site-showing-as-grey/">CDNs can get in the way</a> of this check but most monitoring tools should be able to check and flag this.</li><li>See if there’s a way to leverage the actual power measurements taken from the Firefox Profiler in any dashboards or reports. Reminder that the folks at Sitespeed are looking at <a href="https://github.com/sitespeedio/sitespeed.io/issues/3944#ref-pullrequest-2011025018">how to automate this</a>.</li><li>As an organisation, fund additional research and projects that further our understanding of the relationship between our tech stacks and climate change.</li><li>Engage with browser vendors to add power profiling and better APIs to the platforms we use. Here’s a <a href="https://fershad.com/writing/microsoft-propose-sustainability-section-in-edge-devtools/">proposal from Microsoft Edge</a> exploring this.</li><li>Join the <a href="https://w3c.github.io/sustyweb/">W3C Susty Web Working Group</a> which has a team focused on improving metrics.</li><li>Call for more transparency from browser vendors based on the telemetry they collect. To improve the models we have today, we will need to move away from generalised device data towards anonymised power usage models for device power usage. Carefully curated open data, like we see on the server side (see <a href="https://github.com/green-coding-berlin/spec-power-model">SPECPower Model</a> or <a href="https://github.com/cloud-carbon-footprint/cloud-carbon-coefficients">Cloud Carbon Footprint</a>) could help massively here.</li></ul><p>In closing, I believe that we should embrace progress over perfection. Implement, understand, and iterate. 
Because if we wait for the perfect model to come along, we might be waiting until it’s way too late.</p></div>COP28 UAE: A Low Carbon Website Review2024-02-20T13:25:46Zhttps://fershad.com/writing/cop28-uae-a-low-carbon-website-review/<div><p>COP28 is rolling around soon. It’s around about a month away at the time of writing. So … yep, that means it’s time to take a look at the COP28 website and see how it stacks up in terms of web sustainability. If you’re interested in what my past reviews of COP websites have uncovered, you can read my takes on the <a href="https://fershad.com/writing/cop26-a-quick-sustainability-check/">COP26 (Glasgow)</a> and <a href="https://fershad.com/writing/cop27-egypt-a-webpage-sustainability-review/">COP27 (Sharm El-Sheikh)</a> websites.</p><h2>What we’ll look at</h2><p>This blog post is very much a sequel to Michelle Barker’s excellent <a href="https://css-irl.info/greenwashing-and-the-cop28-website/"><em>Greenwashing and the COP28 Website</em></a> post. I’ll refer to it a few times through this review, and I strongly urge you to give it a read.</p><p>This review will differ from my previous COP website reviews. It won’t focus on how the website performs, what its Core Web Vitals are, or try to estimate the carbon emissions of the site. I won’t even be checking the site against the recently published <a href="https://w3c.github.io/sustyweb/">Web Sustainability Guidelines (WSG) 1.0</a> (cool idea though, any takers?).</p><p>This review will focus nearly entirely on the implementation of just one … uh … feature - the “Switch to Low Carbon Version” toggle that is found in the site header.</p><p>The website I’ll be looking at is <a href="https://www.cop28.com/en/">https://www.cop28.com/en/</a>, and my analysis was performed between October 27 and October 31, 2023.</p><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">Update - Dec. 6, 2023</p><p></p><div><p>Now that we're about a week into COP28, I thought I'd go back and take a quick look at the website to see if anything had changed. I've captured the changes in callout boxes just like this throughout the post, so anyone who's already read this article can just skip through 🙂.</p><p>With previous COP website reviews, I've noticed that once the event starts, content on the website balloons. With a "low carbon design" having been present on the site since its inception, I was curious to see how new content was being treated in this design.</p><p>For the quick tl;dr version, it seems like new content has just been slapped onto the site with no thought given to the low carbon version.</p></div><p></p></div><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/0f8e6921dfacd0dd3996d4151000596d6252754e-2900x1404.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/0f8e6921dfacd0dd3996d4151000596d6252754e-2900x1404.png?auto=format" alt="A screenshot of the COP28 homepage. It features a large background image of the edge of a rounded building. The overlaying heading text reads: Actions. Delivers. Hope. In the top left of the page is a toggle which says " Switch="" to="" Low="" Carbon="" Version"."="" loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">The COP28 website homepage, with the “Switch to Low Carbon Version” toggle located in the top left of the page header.</figcaption></figure><h2>Credit where it’s due</h2><p>Firstly, hey, credit to the team behind the COP28 website for actually giving <em>some</em> thought to making a sustainable website. Unlike the past two COP websites I’ve checked, this one does show as being “green hosted” when checked on the <a href="https://www.thegreenwebfoundation.org/green-web-check/?url=https://www.cop28.com/en/">Green Web Foundation’s Green Web Check tool</a>.</p><p>The fact that there’s a “low carbon version” of the site might also be seen as a good start. I was honestly surprised when I first checked this site a couple of months ago and saw the toggle there. But, after looking into it a bit more at that time I was let down by how little the toggle actually did. Michelle has done a great job of capturing those sentiments in her post (did you read it yet?).</p><h3>Website updates</h3><p>Since Michelle’s post went live, there have been some updates made to the way the low carbon toggle functions on the COP28 site. When Michelle examined the page, the toggle simply hid images visually, though they were still being downloaded. A recent change (as highlighted in this <a href="https://www.abc.net.au/news/2023-10-31/un-cop28-climate-summit-accused-greenwashing-website-low-carbon/103020978">ABC News article</a>) now results in the page reloading when the toggle is triggered. This means that images are no longer downloaded in the background on the “Low Carbon Version” of the site.</p><p>As we’ll see through the post, despite the changes mentioned above which address <strong><strong>some</strong></strong> of the shortcomings highlighted in Michelle’s post, the COP28 website still leaves a lot to be desired when it comes to showcasing low carbon web design and development.</p><h2>The Full Experience</h2><p>When a visitor first lands on the COP28 website, they are presented with what the website calls the “Full Experience”. This is apparently the feature-rich version of the site, with the primary feature seeming to be the presence of images on the page (this bit of sarcasm will make sense soon). Most of the homepage is text, images, and a slider-carousel thing at the top. A couple of videos are available to view, but don’t load anything without user interaction. A nice touch.</p><p>Other pages on the site are mostly text and images.</p><p>In order for this “Full Experience” of text and pictures to happen, the user is made to download the site’s main JavaScript bundle. When I checked, the bundle was 527 kB in size. That’s half a megabyte just to read some text and see some photos. A lot of the rest of this post focuses on the main JS bundle.</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/992ebd4f2394063d96d598de2e716def1a67d382-2398x1710.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/992ebd4f2394063d96d598de2e716def1a67d382-2398x1710.png?auto=format" alt="A screenshot of part of the COP28 homepage showing the " Full="" Experience".="" Image="" of="" a="" man="" speaking="" at="" lectern="" with="" video="" overlay="" button="" appears="" next="" to="" some="" text.="" Below="" it,="" there="" are="" four="" news="" articles,="" each="" images="" above."="" loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">Part of the COP28 homepage, showing the “Full Experience” of the site with images.</figcaption></figure><p></p><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">Update - Dec. 6, 2023</p><p></p><div><p>Checking the website again, the size of the JS bundle has now jumped up to 612kB. There's also some additional JavaScript that's been added for an accessibility overlay that the site uses.</p><p>However, the total transfer size of the homepage is now an eye watering 35.6 megabytes. Yep, megabytes on page load ... without scrolling. Why? YouTube.</p><p>Two new carousel sections have been added to the homepage, each containing multiple embedded YouTube videos. While these carousel's <em>do not</em> automatically scroll through, they content for each YouTube embed they contain seems to be downloaded when the page first loads.</p></div><p></p></div><h2>Switching to the Low Carbon Version</h2><p>Someone coming to the website for the first time might see the toggle at the top saying “Switch to Low Carbon Version”. Curiosity could get the better of them while the page is loading, and so they click on the toggle. If they’re lucky, it responds to the click. If they’re on a slower network connection or a low-spec device, though, it probably won’t.</p><p>That’s what happened to me the first time I visited the site. As the page was loading, I saw the toggle at the top and you bet I clicked it. I took this action before the images in the carousel slider were loaded. But nothing happened. So I clicked it again. Nothing still. Finally on the third time of trying, it worked! But why did it take so long?</p><p>It turns out, that all the logic for the toggle and its subsequent behaviour (setting/unsetting the <code class="language-markup">low-carbon-mode</code> key in local storage) is baked into that 500+kB main JavaScript file. So, for the “low carbon toggle” to be useful at all any visitor must wait for their browser to download, parse, and execute that JS file. And if you’ve got JavaScript disabled (or the download fails), then you better get your COP28 information from somewhere else!</p><h2>An underwhelming experience</h2><p>Now, we’re finally on the “Low Carbon Version” of the website. Getting there requires the page to reload, but thankfully caching avoids us having to re-download the main JavaScript file. That <a href="https://webperf.tips/tip/cached-js-misconceptions/">doesn’t mean the parsing and execution get skipped</a> though.</p><p>Anyway, boy oh boy is the low carbon site a let down. It feels as though little-to-no creative thought has gone into designing a low-carbon web experience here. I feel like an arse for saying that, especially given my lack of digital design chops, but that is genuinely how I feel when looking at the low carbon version of this website. This was a chance to showcase creative, engaging <a href="https://sustainablewebdesign.org/">Sustainable Web Design</a>. Instead the low carbon site might be best described as bland, lifeless, and dull.</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/6bde3d144632fdc280eff77be3bd812761de710a-2216x1722.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/6bde3d144632fdc280eff77be3bd812761de710a-2216x1722.png?auto=format" alt="A screenshot showing part of the COP28 homepage in the " Low="" Carbon="" Version".="" A="" green="" gradient="" background="" has="" some="" text="" on="" the="" left="" with="" a="" video="" play="" button="" placed="" over="" no="" content="" right.="" Below="" this="" section="" is="" block="" of="" four="" news="" articles,="" each="" plain="" cream="" coloured="" boxes="" above="" them."="" loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">Part of the COP28 homepage, showing the “Low Carbon Version” of the site with image replaced by gradient coloured boxes.</figcaption></figure><p>The colour images that are <em>a key feature</em> of the “Full Experience”, are replaced with boxes of coloured gradients. The page is pretty much just blocks of text floating in space. <strong>It gives the impression that something is broken with “Low Carbon Version” of the site.</strong> A metaphor, perhaps, for the COP process? I digress.</p><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">Update - Dec. 6, 2023</p><p></p><div><p>The "Full Experience" of the homepage now loads well over 30MB of content for YouTube embeds. How about on the "Low Carbon Version"?</p><p>It loads the same.</p><p>The "Low Carbon Version" of the homepage now downloads 34MB of content. Again, we've got the same 612kB main JS bundle and some code for the accessibility overlay. But again, the overwhelming majority is a result of the YouTube embeds that are present on the "Full Experience" version of the site.</p><p>What's really disappointing here is that is looks as though the content has just been slapped onto the "Low Carbon Version" without any thought or consideration given to actually improving performance (let alone considering sustainable web design principles).</p><p>On face value, it looks as though there was no thought given to <em>how</em> the "Low Carbon Version" of the site would be managed once the event started and new content needed to be added. This adds more fuel to the feeling that this is just greenwashing, that the "Low Carbon Version" was just made to appease folks without actually spending time to plan and execute it properly.</p></div><p></p></div><h3>It doesn’t have to be this way</h3><p>Low carbon web design is a burgeoning part of the digital design community. And the challenge to build eye-catching, and interesting low carbon websites hasn’t got in the way of many web designers and developers. Just take a look at some of the stunning low carbon designs featured on <a href="http://lowww.directory/">lowww.directory</a> and <a href="https://lowwwcarbon.com/">lowwwcarbon</a>.</p><h2>Nothing else really changes</h2><p>So, the low carbon version of the website strips out most of the colour images. But (spoiler in the heading) nothing else on the page seems to change all that much.</p><p>We’re still using the same main JS and CSS bundles (~527 kB and ~30kB respectively). We’re still loading in all the font files (~380 kB). If I click the YouTube video on the homepage, it still loads the whole YouTube embed experience which is another bunch (~1 MB) of JavaScript to download. So really the toggle’s wording could change from “Switch to Low Carbon Version” to simply “Remove Images”.</p><h3>What difference does it make?</h3><p>Removing images on the homepage sees about a 1.2 MB reduction in the size of the page (including lazy-loaded images). On other sub-pages, using <a href="https://www.cop28.com/en/green-zone"><em>What is the Green Zone?</em></a> as an example, the reduction is closer to 400 kB. Remember, to achieve this 400 kB saving we’re downloading 520 odd kilobytes of JavaScript. That feels out of whack.</p><h3>Let’s talk about caching</h3><p>In reality, we’re not <em>downloading</em> the main JS file each time we navigate to a new page on the site. 
The file does come with a <code class="language-markup">cache-control: public, max-age=31536000</code> header, and gets stored in browser cache after it’s first downloaded. As we navigate the “Low Carbon Version” of the site we’re not redownloading it, but our devices still have to deal with the computation required to make that JavaScript usable. This happens on each page navigation.</p><p>Where caching won’t help, though, is when the website is changed. The main JS file comes with a hash in the filename. At the time I checked the site it was called <code class="language-markup">main.4f6b1b16.js</code>. Since this looks like a React site, it is very likely that the hash value would change if a change is made to the website code. In this way it “busts” the cache for that file, meaning it gets downloaded again by any returning visitor.</p><p><strong>And that sucks if you’re a returning visitor who last left the site with <code class="language-markup">low-carbon-mode</code> turned on.</strong> Returning to the site would mean you’re again downloading the entire JS bundle, just to get that sweet, sweet low carbon experience of text floating in a digital void.</p><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">Update - Dec. 6, 2023</p><p></p><p>Returning to the homepage as someone with <code class="language-markup">low-carbon-mode</code> turned on now sucks even more, because all the YouTube content you probably aren't going to watch just gets downloaded anyway.</p><p></p></div><h2>How could it be better?</h2><p>Time for a change of mindset. So far, I’ve raised concerns about:</p><ul><li>The “Switch to Low Carbon Version” toggle and how it is implemented.</li><li>The main JS bundle that comes with the site.</li><li>The design of the low carbon version.</li></ul><p>For the rest of this post, I’ll be touching on a few ways these could be bettered to deliver a more meaningful low carbon website experience. I’ll go in reverse order of the list above.</p><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">Update - Dec. 6, 2023</p><p></p><div><p>The bloat on the homepage caused by the multiple YouTube embeds that are downloaded is easily solved. There are solutions out there that play well with React-based websites such as <a href="https://github.com/justinribeiro/lite-youtube">justinribeiro/lite-youtube</a> or <a href="https://github.com/paulirish/lite-youtube-embed">paulirish/lite-youtube-embed</a>. There's even this <a href="https://css-tricks.com/lazy-load-embedded-youtube-videos/">sick little trick</a>, courtesy of Arthur Corenzan, to consider. Heck, it can <a href="https://fershad.com/writing/youtube-facades-with-cloudflare-workers/">even be done in an edge worker</a>.</p><p>Of course, giving some thought and planning to how the "Low Carbon Version" of the website will actually be maintained and updated with new content is also a very important (first) step.</p></div><p></p></div><h3>Give some thought to low carbon design</h3><p>I’ve already touched on this earlier, but there’s a bit more to say. A low carbon website doesn’t have to be a wall of text. Just check out the two gallery sites I’ve linked to above. Low carbon website designs can be equally as impactful as the so-called “full experience”.</p><p>One of the things noted in Michelle’s post is that the site downloads multiple TTF font files. These could be replaced with WOFF2 versions for starters, which would reduce the file size. 
But why not go a step further and see how creative the design team could get with a single variable font, or even <a href="https://systemfontstack.com/">just using System Fonts</a>?</p><h3>Ditch (most of) the JavaScript</h3><p>There is absolutely nothing obvious on this website that screams out as needing over 500 kB of client-side JavaScript to function. Sure, there are elements (like our good friend the low carbon toggle) that require a <strong>sprinkling</strong> of JavaScript to be usable. But the site overall could very easily be built using low- or no-JS builders like <a href="https://astro.build/">Astro</a> or <a href="https://www.11ty.dev/">Eleventy</a>, rather than React.</p><p>This, in turn, would lead to some interesting opportunities to improve the low carbon toggle as well.</p><h3>Move low carbon detection off the device</h3><p>If we want to remove most of the client-side JS, how would low carbon mode detection still work? A combination of cookies and edge compute would make it possible to still have a working toggle for a fraction of the client-side JavaScript.</p><p>This is made possible because when a user toggles low carbon mode, the page reloads from the server. As a result, it’s possible to intercept those requests and deliver a low carbon page directly to the browser. Here’s how it might work:</p><ul><li>The client-side code for the toggle sets a cookie to indicate the user wants low carbon mode. This could probably be done in less than a dozen lines of JS (there’s a rough sketch of this at the end of this section).</li><li>When the toggle is changed, the page reloads.</li><li>During this process, the response to the browser is intercepted by an edge worker (think Cloudflare Workers, Vercel Edge Functions, etc.)</li><li>The edge worker looks for the cookie and, based on its content, returns the HTML for a “low carbon” or “full experience” page to the browser.</li></ul><p><strong>But how’s this any better for sustainability?</strong></p><p>Surely running all this stuff at the edge for tens of thousands of requests produces more CO2? Fair point. Here’s why <em>I</em> <strong><em>think</em></strong> this approach is more sustainable.</p><ul><li>We have no control over user devices. By removing the JS needed to check and deliver the low carbon site from the user’s device we’re benefiting <em>all users</em>. Those on better devices get an even faster experience, and those on low-spec devices don’t get bogged down as the page tries to load.</li><li>Edge workers are hyper-optimised for this kind of task. Running our compute there would take a fraction of the time (and therefore energy) compared to running it on the client.</li><li>We can improve this further by choosing edge providers who are taking steps to ensure as much of their network as possible is powered by green energy, or who are mitigating the emissions of their operations. We’ve got much more control over this decision than we do over what energy is powering the devices our users have.</li></ul><p>It’s worth noting that <em>this is just my (slightly informed) opinion</em>. The second point, especially, isn’t based on any hard data I’ve seen. 
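</p>
<p>To make the approach above a little more concrete, here are two rough sketches. They’re illustrative only: the element IDs, selectors, and the worker behaviour are placeholders rather than anything taken from the COP28 site, and the cookie name simply mirrors the <code class="language-markup">low-carbon-mode</code> key the site already uses. First, the toggle itself really can be a handful of lines of client-side JavaScript.</p>
<pre><code class="language-js">// Rough sketch: the entire client-side footprint of the toggle.
// The #low-carbon-toggle ID is hypothetical.
const toggle = document.querySelector("#low-carbon-toggle");

toggle.addEventListener("change", (event) => {
  const enabled = event.target.checked ? "true" : "false";
  // Persist the preference for a year so the edge can read it on every request.
  document.cookie = `low-carbon-mode=${enabled}; path=/; max-age=31536000`;
  // Reload so the edge can respond with the matching version of the page.
  window.location.reload();
});
</code></pre>
<p>Then, at the edge, something along the lines of a Cloudflare Worker (other providers have similar request/response hooks) could read that cookie and return a lighter page:</p>
<pre><code class="language-js">// Rough sketch of a Cloudflare Worker that serves a lighter page when the
// low carbon cookie is set. HTMLRewriter is specific to Workers; other edge
// platforms have their own ways of transforming a response.
export default {
  async fetch(request) {
    const cookies = request.headers.get("Cookie") || "";
    const wantsLowCarbon = cookies.includes("low-carbon-mode=true");

    // Always fetch the page from the origin first.
    const response = await fetch(request);

    if (!wantsLowCarbon) {
      return response;
    }

    // For the low carbon version, drop images and embedded iframes from the HTML.
    // A real implementation would be far more considered than this (placeholders,
    // captions, facades and so on), but it shows the shape of the idea.
    return new HTMLRewriter()
      .on("img", { element(el) { el.remove(); } })
      .on("iframe", { element(el) { el.remove(); } })
      .transform(response);
  },
};
</code></pre>
<p>Whether the energy savings of doing it this way stack up the way I think they do is exactly the kind of thing that needs measuring.</p>
<p>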
If you do have data or research around this, <a href="https://fershad.com/contact/">I’d love to hear about it</a>!</p><h3>Low carbon by default? Nah, not yet.</h3><p>In Michelle’s post, she mentions that the low carbon version of the site should be the default experience. I share this opinion in principle. But. The low carbon experience of the COP28 website as it is today <strong><em>is not suitable</em></strong> to be the default version of the website. It would give a lot of users a very wrong impression of what low carbon web design is all about. For mine, that would be more detrimental than helpful.</p><p>Hopefully, though, between now and the start of COP28 we see further changes to the site and a more thoughtful, well designed “Low Carbon Version”.</p></div>Power consumption of JPEG, WebP, and AVIF2024-02-20T13:25:46Zhttps://fershad.com/writing/power-consumption-jpeg-webp-and-avif/<div><p>Earlier this year, I worked on getting <a href="https://fershad.com/writing/co2e-estimates-in-firefox-profiler/">carbon emission estimates into the Firefox Profiler</a>. Since then, I’ve had an itch to use the Profiler to test out the actual power usage of some <em>stuff in the browser</em>. In this post, I’ll be scratching the first of those itches - image formats.</p><h2>Testing image formats</h2><p>For a while, I’ve wondered just how much power gets used <strong>on a device when it loads an image</strong>. There’s plenty of guidance for developers to use modern image formats like WebP and AVIF which generally have a smaller file size compared to older formats like JPEG. That makes sense from a performance perspective. The smaller the image file, the less time it will take to transfer over the network, and the sooner it can be made visible on a user’s device.</p><p>What I’ve been wondering about, though, is what happens <strong>when the image actually reaches the user’s device</strong>. Are modern formats, with smaller file sizes, also less power hungry as the user’s browser downloads, decodes & renders them?</p><h3>What we’re testing</h3><p>To test this out, I’ve created a set of simple web pages that load a single (large) image of a different image format. The formats I’ve ended up testing are:</p><ul><li>JPEG</li><li>WebP</li><li>AVIF</li></ul><p>Each page contains a single <code class="language-markup"><img></code> tag that loads the image in that particular format. The <code class="language-markup"><img></code> tag also has some alt text, as well as height and width attributes. There is no other CSS or content on the page. This would allow me to see power consumption when the images are loaded as they would be on any regular webpage, via a HTML <code class="language-markup"><img></code> tag.</p><p>The original image was in JPEG (.jpg) format. The WebP and AVIF versions were generated in the <a href="https://squoosh.app/">Squoosh</a> app, using the default settings for each of those formats.</p><p>I also created an index page to link to each of those image pages. You can find <a href="https://github.com/fershad/ff-profiler-img-format-test">all the source code on GitHub</a>. Finally, I uploaded the pages to a server on the internet using <a href="http://surge.sh/">Surge.sh</a> so that the tests could be run over a real network. You can find it at <a href="https://ff-profiler-img-type-test.surge.sh/">https://ff-profiler-img-type-test.surge.sh/</a>.</p><p>As an aside, the image I used for testing is shown below. It’s a banger of a shot I took during a recent holiday in Tasmania, Australia. 
The night mode on the Google Pixel 7 is something else man.</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/a64ce007cd46a86e45b88d609b1ca7d94159bd01-3072x4080.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/a64ce007cd46a86e45b88d609b1ca7d94159bd01-3072x4080.png?auto=format" alt="A streak of stars and gas stretch diagonally across a shot of the night sky." loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">The Milky Way galaxy, as seen from near Cradle Mountain in Tasmania.</figcaption></figure><h3>How tests were run</h3><p>All tests were run on my local machine using a version of Firefox Nightly browser. Details of the platform are below:</p><ul><li><strong><strong><strong><strong><strong><strong><strong><strong>Browser:</strong></strong></strong></strong></strong></strong></strong></strong> Firefox Nightly 118.0a1</li><li><strong>OS</strong>: macOS 13.4.1</li><li><strong>ABI</strong>: aarch64-gcc3</li><li><strong>CPU model</strong>: Apple M2</li><li><strong>CPU cores</strong>: 8 physical cores, 8 logical cores</li></ul><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">Process isolation with Apple Silicon</p><p></p><div><p>The fact that I’m running the test on an Apple Silicon powered Macbook is important to note here. Power profiling in Firefox works in different ways depending on the operating system and hardware being used. </p><p>On Linux and Windows (Surface devices), the power profile will capture power usage for the entire browser (including inactive tabs, and extensions). On Apple Silicon (M1 and M2), though, the profiler is able to show power usage of individual processes. In this way, I can easily isolate the specific tab in which I was performing the test and get only it’s power consumption results.</p></div><p></p></div><p>I used the Firefox Profiler’s default power profile settings to capture each test run. I won’t go through how to do that, since I’ve already covered that in the <a href="https://github.com/fershad/ff-profiler-img-format-test">post linked to at the start</a> of this article.</p><p>When running each test, webpages and image files were all served from my local machine (localhost). Each test run followed this sequence of steps:</p><ul><li>Open the index page (<a href="https://ff-profiler-img-type-test.surge.sh/">https://ff-profiler-img-type-test.surge.sh/</a>)</li><li>Start the Firefox Profiler</li><li>Open the JPEG page, and let the image load</li><li>Go back to the index page</li><li>Open the WebP page, and let the image load</li><li>Go back to the index page</li><li>Open the AVIF page, and let the image load</li><li>Go back to the index page</li><li>Stop the Firefox Profiler</li></ul><p>I ran three tests to capture performance of uncached images. For these tests, I cleared the browser cache between each test run. I also ran three tests to capture performance on images served from cache, but I closed the browser between each of those runs.</p><h3>My hypothesises</h3><p>I did go into this exercise with some ideas about how power hungry each image format would be. In a nutshell, I expected the newer formats (WebP and AVIF) to use more power than JPEG. I expect AVIF to be the most energy intensive, because I remember reading something about <a href="https://www.notion.so/7ffd63400ccf40ea8a6481b6a511402c?pvs=21">it needing more CPU power to decode</a>.</p><h2>The results</h2><p>After running the tests, I ended up with six profiles (three cached, three uncached). I’ve made those public, and you can find links to them below. First up, here’s a TL;DR of the key findings:</p><ul><li>Across all tests, loading the WebP page had the lowest energy consumption.</li><li>Across all tests, loading the AVIF page had the highest energy consumption.</li><li>JPEG was close to WebP in most tests.</li><li>The uncached tests are fairly consistent. 
Testing when images are cached, however, has some wild variability. More testing is probably needed there. </li></ul><div class="callout"><p></p><p>When reading these test results, please remember that the tests are very rudimentary and have a very small sample size. Further testing is required in this space to fully understand in detail the energy consumption of these (and other) image formats.</p><p></p></div><h3>Results in detail</h3><p>Running each test resulted in a timeline that looked like the one below. You can see three bumps in the <strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong>Process Power</strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong> track. These correspond to loading each image page.</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/72d12e9d218f637ef641f42db2a11066dfe7fc19-3280x370.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/72d12e9d218f637ef641f42db2a11066dfe7fc19-3280x370.png?auto=format" alt="A screenshot of a profile timeline, showing Network, Memory, and Process Power tracks." loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center"></figcaption></figure><p>In order to determine how much power was consumed loading each page individually, I selected each part of the timeline beginning at the pointerup DOMEvent that triggered my navigation to the image page, and ending at the <a href="http://www.devdoc.net/web/developer.mozilla.org/en-US/docs/Web/Reference/Events/MozAfterPaint.html">MozAfterPaint</a><a href="http://www.devdoc.net/web/developer.mozilla.org/en-US/docs/Web/Reference/Events/MozAfterPaint.html"> DOMEvent</a> that fired after the image image finishing loading.</p><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">Tiny numbers</p><p></p><p>Before getting to the results, it should be noted that we’re dealing with some really small numbers here. Figures are shown in microwatt hour, or µWh. Alongside them, I have presented the equivalent kilowatt hour (kwh) value.</p><p></p></div><p>In looking at the results, I have only examined the energy consumption shown in the profiler and <em>not</em> the carbon emissions estimate that’s presented alongside it. The reason for this is that the carbon emissions estimate are calculated using global average grid intensity, and would not accurately reflect the <strong><strong><strong>actual</strong></strong></strong> emissions at my location (Taiwan) when running these tests. In any case, energy consumption is a good proxy for emissions in this case - the more energy used, the greater the resulting emissions, no matter what the grid intensity is.</p><h4><strong><strong><strong><strong><strong><strong>No cache</strong></strong></strong></strong></strong></strong></h4><p><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong>Profiles:</strong></strong></strong></strong></strong></strong></strong></strong></strong></strong> <a href="https://share.firefox.dev/3EjXzSl">Test run 1</a>, <a href="https://share.firefox.dev/3QUmqUo">Test run 2</a>, <a href="https://share.firefox.dev/44sJzAC">Test run 3</a> </p><p>Transfer sizes:</p><ul><li>JPEG: 1.51 MB</li><li>WebP: 421 Kb</li><li>AVIF: 254 Kb</li></ul><p><strong><strong><strong><strong><strong>JPEG</strong></strong></strong></strong></strong></p><ul><li>Test 1: 103 µWh (0.000000103 kwh)</li><li>Test 2: 110 µWh (0.000000110 kwh)</li><li>Test 3: 109 µWh (0.000000109 kwh)</li></ul><p><strong><strong><strong><strong>WebP</strong></strong></strong></strong></p><ul><li>Test 1: 103 µWh (0.000000103 kwh)</li><li>Test 2: 106 µWh (0.000000106 kwh)</li><li>Test 3: 100 µWh (0.000000100 kwh)</li></ul><p><strong><strong><strong><strong>AVIF</strong></strong></strong></strong></p><ul><li>Test 1: 137 µWh (0.000000137 kwh)</li><li>Test 2: 137 µWh (0.000000137 kwh)</li><li>Test 3: 133 µWh (0.000000133 kwh)</li></ul><p>Looking at these results there’s not much between JPEG and WebP. In 2 out of the 3 tests, WebP uses marginally less energy, but not by any significant margin. AVIF, though, is consistently using about 30 microwatt-hours more electricity than WebP across all three tests.</p><p>It’s also really interesting to see the power consumptions as the images load. The below image compares the three formats. It’s curious that AVIF has two spikes in consumption, while JPEG and WebP have one main spike and a series of much smaller bumps. Could this be because of how the AVIF file is decoded before it is rendered on the screen?</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/f5f8b422ae9a29c845386d50afd0531766f2d0c7-919x475.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/f5f8b422ae9a29c845386d50afd0531766f2d0c7-919x475.png?auto=format" alt="Power usage graphs for JPEG, WebP, and AVIF. JPEG and WebP both have a single spike in energy usage before tailing off. AVIF has one large spike, then a drop off followed by another large spike." loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center"></figcaption></figure><h4><strong><strong><strong><strong><strong><strong>Cached</strong></strong></strong></strong></strong></strong></h4><p><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong>Profiles:</strong></strong></strong></strong></strong></strong></strong></strong></strong></strong> <a href="https://share.firefox.dev/45MkNfE">Test run 1</a>, <a href="https://share.firefox.dev/3stcZRQ">Test run 2</a>, <a href="https://share.firefox.dev/47LrZdZ">Test run 3</a> </p><p><strong><strong><strong><strong><strong>JPEG</strong></strong></strong></strong></strong></p><ul><li>Test 1: 79 µWh (0.000000079 kwh)</li><li>Test 2: 141 µWh (0.000000141 kwh)</li><li>Test 3: 118 µWh (0.000000118 kwh)</li></ul><p><strong><strong><strong><strong>WebP</strong></strong></strong></strong></p><ul><li>Test 1: 116 µWh (0.000000116 kwh)</li><li>Test 2: 119 µWh (0.000000119 kwh)</li><li>Test 3: 69 µWh (0.000000069 kwh)</li></ul><p><strong><strong><strong><strong>AVIF</strong></strong></strong></strong></p><ul><li>Test 1: 158 µWh (0.000000158 kwh)</li><li>Test 2: 151 µWh (0.000000151 kwh)</li><li>Test 3: 74 µWh (0.000000074 kwh)</li></ul><p>What a mess! There’s next to no consistency when images are served from cache. I wonder if that’s got to do with something that’s happening as the browser tries to access the file system? Definitely should have run some more tests here!</p><p>Looking at the average of these results, WebP again uses less energy across the three tests. JPEG isn’t too far off as well. And once more it’s daylight to AVIF, which consumes the most energy on average.</p><p>In the cached tests, I was also seeing more of the “double spikes” of power consumption, similar to what I highlighted about for AVIF. However, this time these double spikes were happening across all three formats, but did not occur consistently for all formats in all three tests.</p><h2>What conclusions can be drawn?</h2><p>This was a fun little exercise to carry out. I managed to scratch an itch, and have come out the other side with even more thoughts, ideas, and itches to be scratched. </p><p><strong>These tests are also on a very small sample size and so any conclusions drawn from them should ideally be tested further.</strong> The Firefox browser, and how it works, probably impacts the results somewhat, and I wonder how they might differ on Chrome and Safari. Alas, right now there’s no way to test energy consumption in those browsers.</p><p>Also, I’m far from an expert in images or browser engines. It would be nice to gain a better understanding of how both of those work. This <a href="https://cloudinary.com/blog/contemplating-codec-comparisons">Jon Sneyers post on Cloudinary’s blog</a> looks like a good place to start.</p><h3>Should you switch to WebP?</h3><p>To be honest, I wasn’t expecting WebP to be the most energy efficient format when I started out. I am really impressed by how consistently it was less power hungry compared to JPEG and AVIF.</p><p>So, should you switch your site to use WebP based on <em>these</em> results? For sites that only have a handful of image, I’d say nah your time is definitely better spent elsewhere. From a website performance perspective, though, if you’re using JPEG then by all means look for ways to move to one of the newer formats like WebP or AVIF. It might help boost some of those performance metrics which Google cares about.</p><p>For sites with a lot of images (news websites, galleries etc.) 
a switch to WebP might be a something worth looking into with a bit more research. It’d be interesting to see if/how these results change when there are multiple images on the page, and maybe a bit of CSS to resize them. That’s an unscientific test for me to run on another day though.</p></div>The Week in Green Software: Open Data with Fershad Irani2024-02-20T13:25:46Zhttps://fershad.com/writing/the-week-in-green-software-open-data-with-fershad-irani/<div><p>A couple of weeks ago I had the pleasure of jumping on the Environment Variables: This Week in Green Software podcast with Chris Adams. I work with Chris at the Green Web Foundation, but it was fun to nerd out about news across the sustainable tech space on the podcast.</p><p>You can listen to the podcast here: <a href="https://podcasts.bcast.fm/e/vnwrxy28-the-week-in-green-software-open-data-with-fershad-irani">https://podcasts.bcast.fm/e/vnwrxy28-the-week-in-green-software-open-data-with-fershad-irani</a></p><p>Below is a transcript of our conversation, and links to the articles and resources we talk about.</p><p><strong>Chris Adams:</strong> Hello and welcome to Environment Variables, brought to you by the Green Software Foundation. In each episode, we discuss the latest news and events surrounding green software. On our show, you can expect candid conversations with top experts in their field who have a passion for how to reduce the greenhouse gas emissions of software.<br /><br />I'm your host, Chris Adams.<br /><br />Hello and welcome to another episode of The Week in Green Software, where we bring you the latest news and updates from the world of sustainable software development. I'm your host, Chris Adams. In today's episode, we're looking at open data about how green the power we use is, the wonders of HotCarbon, some cool new projects from the cloud native computing foundation and the Green Software Foundation, and a really cool technology called gzip.ai. Finally, we have some fantastic opportunities for you to be part of the Green Software Foundation because they're hiring. But before we dive in, let me introduce my guest today, Fershad Irani, an independent web sustainability consultant and maintainer of CO2. js. Fershad, I'll hand over to you to introduce yourself, if that's okay.<br /><br /><strong>Fershad Irani:</strong> Cheers, Chris. Thanks, man. I know you've been trying to get me on this podcast for a while, so it's exciting to be finally here. Yeah. Hi folks. As Chris mentioned, my name is Fershad Irani. I'm a web sustainability consultant and I live in Taipei, Taiwan. Most of the time I spend these days is working with Chris and a bunch of other amazing people at the Green Web Foundation.<br /><br />I think I've been over... It's over a year now, hasn't it, since I've been there, Chris? Or close to.<br /><br /><strong>Chris Adams:</strong> I think it has been indeed. Yeah.<br /><br /><strong>Fershad Irani:</strong> Yeah, during that time, we've, we've done a heck of a lot, I think, um, doing a bit of writing and a bit of coding. Chris did mention co2.js, which is where I've spent a big chunk of the last year. It's an open source carbon estimation library.<br /><br />I think Chris has mentioned it once or twice on this podcast, just snuck it in there. 
It's been really cool watching that project grow over the year and now it's being picked up by some other quite large projects itself, such as the Mozilla Firefox browser and web page test, which is just mind blowing to think that some code I've written is in those projects.<br /><br />But outside of web sustainability, I do have a bit of a life. I help organize and play in a local touch rugby, or touch football for all the Aussies out there. We have a group here in Taiwan and we play weekly and try and send teams internationally whenever we can.<br /><br /><strong>Chris Adams:</strong> Cool. Thanks. Thanks, Fish. Oh, and I'm calling just for context when we work together, Fershad said Chris, you can call me Fish instead of Fershad. So if I call Fershad Fish at any point, it's, it's basically just, uh, a, a shortened version of the, of his name that he's comfortable with us using. All right.<br /><br />So that's who I'm referring to when I ask. What do you think, Fish? I do not have any actual Fish in, uh, the podcast today.<br /><br /><strong>Fershad Irani:</strong> Cat's out of the bag.<br /><br /><strong>Chris Adams:</strong> Yeah. Yeah. Actually, is the cat also in the room as well? Like we, we gonna have a incursion from her today.<br /><br /><strong>Fershad Irani:</strong> She likes her video calls, so she might jump in on this one. Eventually, it's food time at the moment.<br /><br /><strong>Chris Adams:</strong> All right. So we may have a third guest as we record today. All right. I should just briefly introduce myself before we start. I mentioned my name is Chris Adams. I am the executive director at the Green Web Foundation, which is one of the member organizations of the Green Software Foundation. The Green Software Foundation, I work there at, as the chair of the policy working group, and that's basically the thing I'd probably share with you now, actually.<br /><br />So I think Fish. I guess we should probably start looking through, you're familiar with the format. So should we do the usual thing of running through some of the stories that caught our eye and then basically share a bit of context for them?<br /><br /><strong>Fershad Irani:</strong> Let's go, mate.<br /><br /><strong>Chris Adams:</strong> All right. Okay. So I think what's the first story that we have here was actually a story from Electricity Maps.<br /><br />They're one of, they're another member of the Green Software Foundation. And earlier on this month, they released some open data and like a significant amount of open data, actually. Fish, I might let you start on this one because there's, I think it's worth people understanding why this is interesting and it might be worth you sharing some of this because you've worked with a number of different providers of carbon intensity data now.<br /><br /><strong>Fershad Irani:</strong> Yeah, this data that Electricity Maps has released is just a huge data dump from 2021 2022. You have almost 55 countries in their data set and it's just such an awesome amount of historical data with so much granularity, not only yearly data, which we're used to working with most of the time, but now to have monthly, daily, hourly historical data available for free.<br /><br />That's something that's really going to be handy for a lot of people building out tools and analysis around carbon emissions and all that type of stuff. Yeah. Until now, for the most part, we have been working with annual. 
grid intensities, like the data that we've got from EMBA and we've put that into co2.js in the last year.<br /><br />I can't wait to play around with, with this data set and see, um, what we can get in there in the future from monthly, daily, even hourly figures if we're able to. They do say that they are going to release 2023's data. I'm not too sure when.<br /><br /><strong>Chris Adams:</strong> So as I understand, this is basically a push to essentially increase the floor of data quality that is available in the public domain for people to use so they can, Oh, there's our cat coming in. Yep. The goal is to increase the level of data quality as a floor so that rather than only having to use annual data, which often occludes and hide some of the information to provide a much higher resolution. So for example, you can see if you're going to like decide to move computing jobs to different parts of the world at different times, you can see the impact of this. The other thing that's also interesting about this is that it's actually released using the open database license, which means that you're able to build on this commercially or use it in all kinds of projects.<br /><br />Now, what I have done when I found out about this is I had a go at this and I've used a tool called Observable, which makes it really easy to build little tools, little exploratory notebooks. And we've got a couple of links to essentially the hourly carbon intensity data for a series of countries that we found.<br /><br />So we've got one for Germany and Finland, but basically they have one for almost every single sub grid in America, which is what people might refer to as balancing authorities in America. And this is cool because I think one of the things that I realized when I started playing with this data is that this lets me see, okay, if I did a bit of, if I had a computing job last year, where else could I run at the same time?<br /><br />Or where could I move that to? over the whole kind of geographic space and time last year to see how I could have reduced the emissions for that. That's something that I haven't really been able to do before. And it's nice to see this. The other thing that is worth bearing in mind is that this, there is a commitment from the organization, Electricity Maps to publish on a yearly basis every time going forward.<br /><br />So at the end of 2023, they'll be publishing the data from 2023 as open data for anyone to use. And you're able to use the data real time from them. Basically as a commercial product, right? And that's essentially what you can see this being used for. It's a way to increase the data quality used for the last year.<br /><br />But if you want to do something real time, then you may need to use electricity maps or what time or some of the other tools, depending on what your specific use cases, but this being out is a really cool thing. I'm really happy to see it.<br /><br /><strong>Fershad Irani:</strong> And just on that last point, like, we do need more of this and as much of it as possible to be open source in terms of monthly, daily, hourly emissions data. And if that can come from governments or from other private entities, that just helps all of us in this space. 
Like it, it helps drive decisions like you were saying about carbon aware computing and stuff like that. That also helps improve the accuracy and transparency of the carbon estimates that we're producing, that we're going to start relying on for reporting and other legislation that comes in the future.<br /><br /><strong>Chris Adams:</strong> Yeah, like, I feel this is quite useful because this publication, if you're looking at the carbon intensity of anything you did last year, you've basically gone from, essentially you made something 8,760 times more accurate because you've actually got hourly figures for this stuff, which has been really hard or really expensive to get access to previously in this kind of way.<br /><br />So that's cool. One thing that's worth bearing in mind is that you have to ask yourself, how many times do you need to pay for this data? Because you may not be familiar with how electricity grids work, but in many parts of the world, there is a small levy built into the hourly rate you pay for any power, usually between 10 to 20% a lot of the time, which is essentially allocated to potentially funding a transition to renewables for this stuff. And this information is collected anyway for this. So the idea that it's actually visible and that it's available in the public domain is a really good thing and really long overdue. So it's nice to see this. So yeah, good news story. We've shared a link to the data portal and it's free for anyone to use and fetch the data from. And hopefully we should see this turning up in places like the CarbonAware SDK and any other tools like CodeCarbon and so on, so you can start making more responsible decisions about when and where you run any kind of computing jobs.<br /><br />Let's look at the next story. The next story is about HotCarbon, which is an online and hybrid conference that's initially based in America, but the cool news is that it's basically sustainability, ICT, nerd Christmas, there's a bunch of really good papers that have been released, and there's also now the recorded videos of all the talks from this.<br /><br />And if you are trying to find out what the state of the art is in discussions around digital sustainability, this is one of the places to look for the kind of technical discussions about this. And there's a couple of talks that I think have really caught both Fish's eye and mine. Fish, I'll let you go first.<br /><br />And then I'll come in with my one actually, because I think there's one that you quite liked, right?<br /><br /><strong>Fershad Irani:</strong> Yeah, my one's actually from last year's HotCarbon. There's actually a paper from Romain Jacob and Laurent Vanbever, both from ETH in Zurich. And in last year's HotCarbon, they published this paper, which has got just a beautiful name, if nothing else, The Internet of Tomorrow Must Sleep More and Grow Old.<br /><br />And that was, I think, probably the first, if not one of the first times that I've personally really started thinking about whether data transfer is the best proxy for website carbon emissions and how we calculate them. That kind of began a rabbit hole for me and I'm still going down that rabbit hole as I think a lot of us are.
But it was really interesting and presented some of the the ways that networks operate and function presented that really clearly and the video for that is is a really good short 20 25 minute watch I think it is but they've also got a paper this year with kind of a less pretty sounding name, but Chris, you want to talk to that one?<br /><br /><strong>Chris Adams:</strong> Yeah. So first of all, before I talk about this one in particular, I'll just let people know that last year we did an interview specifically with Romain Jacob about the paper that he shared last time. So we will share a link to that to go into more detail about it. But the general thrust of the paper from last year.<br /><br />Was that the internet is basically provisioned in its current state for availability above all else, which means there's lots and lots of the time we've massively over provisioned for it. So big, it's like having the biggest possible computer you can imagine just for when most of the time it isn't actually used that much.<br /><br />This time, he's actually, Romain Jacob is the, one of the lead authors, along with Jackie Lim and Laurent Venbever, I think, from ETH Zurich. They're talking about, are there ways to do something about this? And it's not such a poetic name. But the general thrust of this paper is that given that we know that most of the time we're not using the entire capacity of the internet, is it possible to kind of power down parts of it as it were, is it possible to make parts of the internet sleep so that you can make meaningful reductions in the energy usage and as a result, the carbon footprint of this stuff.<br /><br />And the argument basically is that yes, you can do some things like this. There are savings in the order of tons based on looking at a open data set from OVH called the weather map data set where OVH, which is another cloud provider have basically shared the traffic that they have running inside their own networks.<br /><br />And they basically explored this and said, given what we know about how the internet is used and what kind of usage patterns we have, is it plausible to selectively power down parts of the internet and still maintain like the same level of quality of service basically. And it's super nerdy. But it's a really nice cool paper and it's a fun read.<br /><br />It's one of the first times I've seen people actually work with real data from a real organization, because one of the thrill struggles you have is actually having access to this information. So this is really cool to see this. There are other more ones. There are many more papers as well, but I think what we might need to do is run through the list and see if we can get some of the people from HotCarbon at 2023 to speak about this, because there was a number of really exciting looking papers and there is 20 videos and 20 papers to read through.<br /><br />So if you want to. Basically see what's happening at the real cutting edge. That's a place to look.<br /><br /><strong>Fershad Irani:</strong> Just to be sure that HotCarbon's already happened, right?<br /><br /><strong>Chris Adams:</strong> Yeah, it happened a couple of weeks ago, but the videos were literally published. I think last night or two nights ago or something like that. So there's a bunch of, that's the place to look.<br /><br /><strong>Fershad Irani:</strong> So hot off the press.<br /><br /><strong>Chris Adams:</strong> Yeah, absolutely. HotCarbon, hot off the press indeed. 
And if you want something even hotter, there is a mailing list called the E Impact mailing list, which I'll share a link to, where there are ongoing and robust conversations about all this stuff here.<br /><br />So Fish, you spoke to this idea about, okay, is data transfer a good proxy for understanding website carbon emissions or anything like that? That's the place that I am usually following to see what the conversations are, going back and forth on that stuff. And it's a really useful place to learn from essentially world experts for free about what's happening there.<br /><br /><strong>Fershad Irani:</strong> Do you want to do a spicy take and give an answer to that one? Is data transfer the best proxy for website carbon emissions? Chris Adams.<br /><br /><strong>Chris Adams:</strong> I'm not going to have a spicy take on this one yet, because I'm still trying to figure this one out. Because I feel that there's lots and lots of evidence that basically shows that the network part doesn't change all that much based on what you send over the wire. So you can make the argument, rather than thinking about it like a kind of road and cars driving, it might be more useful to think about networks as like a cycle lane where you have people using it.<br /><br />So. You know, if you in aggregate, look to all the people cycling on a cycle lane, you might see a small change in usage, but you're not going to see a massive changes if you had like loads of cars driving along it. And I think this is an issue of us having mental models or not the correct mental model when we think about this stuff.<br /><br />That's about as spicy as I can really take. Cause I don't think I know enough about it, but Fish, we should probably share a link to your piece, because this is one thing that we've had. Bunch of time talking about with both implementing the sustainable web design model in co2.js, but also because there's a whole separate discussion about this, both at a regulatory level, but also in inside industry with actually the sustainable web design model specifically, there's a whole bunch of work going on there that I suspect you might have some reckons on or something you could share on there actually.<br /><br /><strong>Fershad Irani:</strong> Yeah. And I think it's also worth noting that there are other methodologies for estimating website carbon emissions or digital carbon emissions out there that don't use data transfer necessarily as their proxy. And they use other things like time on device or they try to measure the actual usage of a device, which is something that you can also do these days in.<br /><br />The Firefox web browser, which is super cool. And I'm with you. It's something that we're all learning as we go. And there's more research coming out about it for now. Data transfer is the best we have, but with what's in Firefox, hopefully other browsers can implement that type of technology as well. 
We can start to see some real world data that we can then base some of our estimates and assumptions off.<br /><br />And we can then work with that.<br /><br /><strong>Chris Adams:</strong> There is one thing that I would wish for, if we could see something like this for HotCarbon 2024, this whole paper here is based on the willingness of one organization to share some data about how a network is working so that it can form basically a public understanding of where the real impactful decisions and interventions are possible can be made when we think about greening software, right?<br /><br />We know that browser makers like Firefox and Chrome and Microsoft Edge, they have all this telemetry information about how their browsers are being used because they use it to improve the products, right? If there was a way to share a suitably safely prepared data set, which was a representative sample of how websites and things were used, it will be so useful for us to actually understand this.<br /><br />And now that we've done a bit of work with say Firefox, for example, we understand that these numbers, they can be collected and they can be used because. If you're using Firefox now, you can basically turn on the Firefox kind of profile and you can see right down to the process or thread level, what the energy impact of various parts of the page are.<br /><br />And we know that some of this stuff is essentially presented in telemetry to inform product decisions. If you had organizations sharing some open data around this, it would be such a help for understanding what the things are. What the most effective interventions would be for impacting website for carbon figures but right now we don't have that yet, but it's the thing we could hope for. And who knows, there's a year now for it. So fingers crossed, eh?<br /><br /><strong>Fershad Irani:</strong> HotCarbon 2024.<br /><br /><strong>Chris Adams:</strong> Yes. All right. Should we move to the next story fish?<br /><br /><strong>Fershad Irani:</strong> Let's go.<br /><br /><strong>Chris Adams:</strong> This one is a story from a character, someone called Assaad Razzouk, who is, I think he's actually based in Singapore.<br /><br />And he's one person who runs a podcast called the Angry Clean Energy Guy, but he's actually has a background working in this field. I basically wanted to share this cause I found this really interesting specifically because when you speak to people who are thought leaders in the kind of world of cloud and sustainability in cloud, one of the recommendations that you'll hear people say is, please don't run things in Southeast Asia right now.<br /><br />Because the energy is really dirty and it's really hard to do that and because it's so hot, it also means that even the computing that you do run, there's going to be a massive amount spent to keep the computers from glowing red and overheating rather than actually doing your computing. And this is the first time I've seen where someone saying, no, there's actually some changes taking place there's been massive investments, particularly from Singapore in some of the surrounding areas, to make some changes to this. So while we've seen the energy transition move quite quickly in China and to an extent, Europe and America specifically with the IEA, you're now seeing some signs of this in Southeast Asia as well, which hopefully means that computing will be getting greener over time.<br /><br />And Fish, I know that you initially came from Australia. 
So I figured I'd share this link here from Grok Ventures and Quinbrook, basically the story about connecting Australia to Singapore to provide a punch of clean energy through this actually.<br /><br /><strong>Fershad Irani:</strong> Yeah, I'll, I'll be a bit cynical, as any good Aussie should, and um, just say, this is something that I've heard mumblings of doing something like this for, I think, over 10 years? Since, yeah, before I moved here to Taiwan, and for the last 25, 30 years people have been talking about high speed rail along the east coast of Australia, and that's still not there.<br /><br />This is a really cool idea, and something definitely that, when you look at a place like Singapore,<br /><br /><strong>Chris Adams:</strong> Hmm.<br /><br /><strong>Fershad Irani:</strong> It's small, they've got land constraints, they can't just suddenly put up a whole bunch of solar, they can't really put up a whole bunch of wind because it's a major shipping channel and a lot of planes come through there as well.<br /><br />They need to be looking outside to import energy, and they've got Indonesia, Malaysia pretty close by. It's good to see that Singapore is doing some investment outside of their own borders in clean energy. As someone who lives there, Asia, has got a way to go in terms of being green. But the potential is there. We sit on this thing called the Ring of Fire, and it's an active geothermal hotbed. I've got hot springs 20 minutes by car from my place. There's potential there for, for geothermal beyond just using solar and, and wind. So Asia does have that possibility of being a, a green hub for digital sometime in the future.<br /><br /><strong>Chris Adams:</strong> Do you know what I actually totally forgot about the whole Pacific Rim Ring of Fire stuff, because there was an announcement, I think two weeks ago or last week from Microsoft, them basically breaking ground on a massive geothermal project for some of the data centers in New Zealand, specifically for this.<br /><br />So yeah, that's actually a useful, interesting perspective. I didn't think about that actually.<br /><br /><strong>Fershad Irani:</strong> Let's move on to the next story, which is from the Green Software Foundation and one of the brainchilds of Adrian Cockcroft. It's about introducing a specification for real time carbon intensity. Chris, I think you'll be able to speak a bit more to this, but from my understanding, what this is all about is aiming to set a common way for data centers to report on energy and emissions, preferably in real time.<br /><br />And I think that's something that would be useful for a tool like Cloud Carbon Footprint, wouldn't it?<br /><br /><strong>Chris Adams:</strong> Yeah, first of all, it's really cool to see this proposal go ahead because essentially one of the struggles you have is even if you're using, say, Microsoft, Amazon and Google, you're running, you're trying to run the same computing load between these three, it's almost impossible to have any kind of meaningful comparison between these things because they all measure carbon in slightly different ways and include different things, whereas other ones don't now, what It's basically been proposed here is there's actually two things.<br /><br />So first of all, there are different ways of measuring. And also the figures that you see are not particularly actionable a lot of the time. 
So in terms of resolution, the information will usually come a few months later rather than in real time or anything, or even the same half hour, basically. Now what's been proposed here is essentially a way to talk about minute by minute metrics that a cloud provider would make available so that you can actually make informed decisions about when and where, or what kind of computing jobs you choose to schedule, or even which providers you're going to choose to use compared to other ones. Now I've read through the proposal and it's really well thought through, and one of the reasons that people have said that they can't share this information before is that cloud providers will usually say, we can't share this data because there's a security issue related to this. And Fish, do you remember when we did some work with Firefox, we had something like this, because one thing we learned when we were trying to get some high resolution figures for the browser, one of the solutions was we could get these figures, but you would need to run Firefox as root, which might not be a good idea for people to be doing.<br /><br />And essentially what the thrust of this points to is that if you keep the resolution at a minute by minute level, then you're no longer disclosing any kind of dangerous information that might help an attacker, but it also provides sufficient resolution for you to make much more informed scheduling decisions as an operator.<br /><br />But also you actually get some consistent ways to make comparisons between different providers of these services. So this, in my view, is something that is really overdue, and it's good to see someone who's actually fleshed it out quite well, and actually thought about lots of the issues and how this relates to some of the weird aspects of how people count energy as green with certificates and so on.<br /><br />This is really good news, and it's also interesting to see that you've got groups like the Cloud Native Computing Foundation getting involved, or expressing interest, as well. I think this is long overdue and you're right. Tools like Cloud Carbon Footprint could presumably, in theory, consume this kind of information if it was exposed by the providers, because right now they have to use models and guesswork based on the billing data, which is much less useful than getting direct figures.<br /><br />It also means that any other cloud provider which is not the big three could also share this information. So you could finally have some meaningful ways to make meaningful comparisons between them.<br /><br /><strong>Fershad Irani:</strong> And that's something I didn't think about when I first read it, but it's actually really good. Like, pardon the pun, but this turns the heat up on those, or has the potential to turn the heat up on those, big cloud providers and gives people a chance to, like you say, compare them on their carbon footprint.<br /><br />They might even need to start competing on carbon footprint because that's going to be important in the future.<br /><br /><strong>Chris Adams:</strong> This is exactly it. This makes some of this possible. And it also means that new entrants can actually start sharing these numbers. So you could compete on transparency to provide these numbers as a way to help customers make the responsible decisions that are currently really difficult to do.
Or you could even plausibly build this into some of the tooling so that it's just part of how Kubernetes works or part of how maybe even Docker might work for example. This is actually, in my view, really exciting. And I'm really curious to see where it goes next, actually. All right. I think we've spoken about that quite a lot.<br /><br />Should we look at the next one. This is the IEA. So the IEA, Fish, I'll let you speak a bit about this one here. Cause this is the International Energy Agency.<br /><br />They've updated their data set, their, their information about data centers for 2023, this, this is the resource that is almost always cited as the authoritative figures on what the environmental impact of the tech sector is or how much energy it uses. And if you want to cite any numbers, these are peer reviewed and generally pretty reliable numbers you can refer to.<br /><br />They're safe ones to use. And yeah, they're pretty eyeopening. Aren't they Fish?<br /><br /><strong>Fershad Irani:</strong> Yeah. Firstly, it's good to see this data being updated. It's not so good to see some of the figures that are coming out of it. But like talking about data centers, the big three plus plus Meta. One of the things that struck me from this report was that from 2017 to 2021. So that encompasses some of the COVID years.<br /><br />The report says that there was a doubling in the amount of energy consumed by those four providers. It also then goes on to say that it expects there to be moderate growth for the next few years. I really hope that their definition of quote unquote moderate isn't another doubling because then we're going to be in serious trouble on the data center front because that's a lot of energy to be consuming.<br /><br />I think in the report it says somewhere around 1. 3% of total global like energy use or something. And that's without including cryptocurrencies, which is a whole other ball game. I think they've steered clear of it in this report.<br /><br /><strong>Chris Adams:</strong> Yeah. As I understand it was broken out separately because it's generally considered not part of the existing economy for this part. And also we're not going to talk about cryptocurrencies on this because the less said about them, the better. But generally speaking, this is one of the first times you've seen these figures broken out like that, because typically what you've had people talking about is the actual energy usage staying more or less about level for the last, say, 10 years or so, but what this really highlights is that this has stayed level because we've had a massive concentration of usage to a very small number of providers, as opposed to having a large number of maybe less efficient providers.<br /><br />There are some good signs of that in terms of in absolute terms, the figures are not growing as much as they could be, but it also means that we've got this massive concentration of, we've got all this consolidation, which has other impacts in terms of, okay, how easy is it to then pass all kinds of policy as a result for this, to move things away from being level to going down rather than going up.<br /><br />And this is the thing that we'll see coming forward, basically.<br /><br /><strong>Fershad Irani:</strong> And I think on that thing, just like one thing that I can't possibly see it going down in the future is like just the amount of volume, the amount of internet traffic that is there. 
There's a number in that report for 4.4 zettabytes of internet traffic in 2022, which is, I don't even know what that number is, man.<br /><br />Like it's just mind bogglingly big.<br /><br /><strong>Chris Adams:</strong> a zettabyte, right? I'm just, if I can find the figures for that, it's. Good Lord. So there's 21 zeros behind it. Yeah. If a million is like three, six, that's seven. So yeah, that's 20. That's a. A very large number. That's an incomprehensibly large number, but<br /><br /><strong>Fershad Irani:</strong> That's mental.<br /><br /><strong>Chris Adams:</strong> yeah, that's one of the issues that we struggle with.<br /><br />Okay. So this at least gives you an idea of where the most recent current data is that you might refer to. Okay.<br /><br />There's maybe one more story. Then we'll look at what else is going on in terms of jobs and things going out there. Fish, this is one I just want to point people to, because I've seen quite an interest in the Cloud Native Computing Foundation.<br /><br />There's a new thing called the Green Reviews Project, which has come up. And, uh, I'll just read the kind of blurb on this because this, in my view, looks cool. Basically, the Green Reviews Working Group helps CNCF projects assess and improve the cloud native sustainability footprint. So the idea of this, as I understand, is to start integrating sustainability reviews into how projects are maintained and run so that you get an idea of just bringing up the floor of competency on projects.<br /><br />So people have some way to talk about this and think about it. And essentially consider these as requirements in the same way that you might look at other things as requirements. This is interesting in my view. I was quite excited to see something like this. And there's a couple of links of what this looks like in practice with, I believe, the Falco project.<br /><br />And a couple of other ones there. So yeah, interesting to actually see something like this happening. This looks like it's going to be merged in the next week or two and a working group, the kind of technical architecture working group for this. And yeah, I was quite excited to see this actually land.<br /><br /><strong>Fershad Irani:</strong> And that's a really good way of making sustainability or sustainability considerations a regular part of a process and a way of doing things. That's rather than it being its own separate silo that might get looked at, might not get looked at. If it is part of the regular process that everyone has to go through, you're going to see more traction, more movement in the right direction, which is good to hear.<br /><br /><strong>Chris Adams:</strong> Yeah, I think it'd be really useful to actually have a chat with some of the CNCF folks on this, maybe they can come on the podcast and talk about a, how this happened and what this looks like, because we are now seeing various open source projects or groups starting to essentially start, create their own groups for this.<br /><br />So WordPress has one, Wagtail has one. This is one, which is, seems to be across some, a number of all the projects in the CNCF, the Cloud Native Computing Foundation. So there's a bunch of stuff going on there. So yeah, this, in my view, this is quite exciting, actually.<br /><br />All So the final thing, this is a little bit like we shared last week. If you are looking for work, the Green Software Foundation is actually hiring for a technical project manager and a content project manager. 
So these are funded positions that are available. You look at and it's, and you can apply with links that they have there.<br /><br />So there's, that's what's going on there. Okay, so we're just coming up to the hour for this show, and, uh, we normally have a kind of easy question to round this off. Now, Fish, I know that you've been doing a bit of travel away from Taipei and you've just come back, so I figured I'd ask, what's the first place you, you try to go to, to get some food you can't get anywhere else or as good as anywhere else when you are back in Taipei?<br /><br />What's your first place you're thinking of<br /><br /><strong>Fershad Irani:</strong> We got dumplings. We had dumplings the first time, first night we got back, which is quintessential Taiwan. I adid find myself that when I was on the road, I was traveling through Australia, mostly where I grew up. And I did find myself craving instant noodles, which is a bit weird, but there's just a dearth of choice.<br /><br />There's hardly any choice in Australia for instant noodles. And then you come back here and you've got mind blowing<br /><br /><strong>Chris Adams:</strong> cornucopia of ramen in packets?<br /><br /><strong>Fershad Irani:</strong> Oh yeah. Yeah, so it was dumplings first and instant noodles a very close second.<br /><br /><strong>Chris Adams:</strong> I was not expecting that second answer. I'll be I'll be real. Okay for me when I come back to Berlin It's all about falafel for me There's a really good place called Lausanne when you come back to Kreuzberg and it's probably the best falafel in at least five square kilometers if you're going in anywhere near Kreuzberg. So that's all for this episode.<br /><br />All the resources and links will be shared in this podcast episode, and you can visit podcast. greensoftwarefoundation to look at some of the previous episodes that we've actually referred to a few times. And finally, Fish, thanks for coming on. Really, I really enjoyed hanging out and chatting with you again.<br /><br />So everyone else, see you on the episode and Fish, bye for now, I suppose.<br /><br /><strong>Fershad Irani:</strong> See you folks.<br /><br /><strong>Chris Adams:</strong> Cheers, Fish. Hey everyone, thanks for listening. Just a reminder to follow Environment Variables on Apple Podcasts, Spotify, Google Podcasts, or wherever you get your podcasts. And please do leave a rating and review if you like what we're doing.<br /><br />It helps other people discover the show. And of course, we'd love to have more listeners. To find out more about the Green Software Foundation, please visit greensoftware. foundation. That's greensoftware. foundation in any browser. Thanks again and see you in the next episode. 
</p><p><br /><strong>News:</strong></p><ul><li><a href="https://app.electricitymaps.com/map?utm_medium=podcast&utm_source=bcast&utm_campaign=environment-variables">Electricity Maps Open Data</a> [4:02]</li><li><a href="https://hotcarbon.org/2023/index.html?utm_medium=podcast&utm_source=bcast&utm_campaign=environment-variables">HotCarbon 2023: 2nd Workshop on Sustainable Computer Systems</a> | HotCarbon [9:27] </li><li><a href="https://theangrycleanenergyguy.com/articles/?utm_medium=podcast&utm_source=bcast&utm_campaign=environment-variables">Articles by Assaad Razzouk | Thought Leader Renewable Energy</a> | Angry Clean Energy Guy [17:51] </li><li><a href="https://www.afr.com/street-talk/quinbrook-pops-up-in-grok-s-camp-at-sun-cable-deal-close-20230525-p5db92#:~:text=Quinbrook%20Infrastructure%20Partners%20is%20understood,a%20few%20parties%20including%20Quinbrook?utm_medium=podcast&utm_source=bcast&utm_campaign=environment-variables">Quinbrook pops up in Grok’s camp at Sun Cable, deal close</a> | Financial Review [19:05] </li><li><a href="https://github.com/Green-Software-Foundation/pr-faqs/pull/10/commits/887177bb388bde1d7b0eacd9735c35f1f90f6648?utm_medium=podcast&utm_source=bcast&utm_campaign=environment-variables">Adrian Cockcroft’s Proposal for a Specification for Real Time Carbon Intensity</a> | Green Software Foundation [20:55]</li><li><a href="https://www.iea.org/energy-system/buildings/data-centres-and-data-transmission-networks?utm_medium=podcast&utm_source=bcast&utm_campaign=environment-variables">Data centres & networks |</a> IEA [25:14]</li><li><a href="https://www.cncf.io/projects/?utm_medium=podcast&utm_source=bcast&utm_campaign=environment-variables">Graduated and Incubating Projects</a> | Green Reviews Project | Cloud Native Computing Foundation [28:36]</li></ul><p><br /><strong>Resources:</strong></p><ul><li><a href="https://observablehq.com/?utm_medium=podcast&utm_source=bcast&utm_campaign=environment-variables">Observable</a> [6:10] </li><li><a href="https://www.romainjacob.net/bibliography/jacob2022Internet.html?utm_medium=podcast&utm_source=bcast&utm_campaign=environment-variables">The Internet of Tomorrow Must Sleep More and Grow Old</a> | Romain Jacob [10:10]</li><li><a href="https://podcast.greensoftware.foundation/e/rnkw9p2n-green-networks?utm_medium=podcast&utm_source=bcast&utm_campaign=environment-variables">Green Networks</a> | Environment Variables episode with Romain Jacob [11:12] </li><li><a href="http://weathermap.ovh.net/?utm_medium=podcast&utm_source=bcast&utm_campaign=environment-variables">OVH weathermap</a> [12:21]</li></ul><p> </p></div>Sitespeed.io - Using and contributing to CO2.js2024-02-20T13:25:46Zhttps://fershad.com/writing/sitespeed-io-using-and-contributing-to-co2-js/<div><div class="callout"><p></p><p>This post was originally published on <a href="https://www.thegreenwebfoundation.org/news/sitespeed-io-using-and-contributing-to-co2-js/">The Green Web Foundation's blog</a>.</p><p></p></div><p>Visiting a fast website is always a pleasant experience. In fact, most of the time you won’t even notice that a website is fast. But you sure will notice when one isn’t! The field of website performance exists, in part, to aid website owners in creating these unnoticeable, pleasant experiences. 
And doing so can be <a href="https://www2.deloitte.com/content/dam/Deloitte/ie/Documents/Consulting/Milliseconds_Make_Millions_report.pdf">worth</a> <a href="https://www2.deloitte.com/content/dam/Deloitte/ie/Documents/Consulting/Milliseconds_Make_Millions_report.pdf"><em>a lot</em></a> to companies operating online.</p><p>There are several services that allow users to monitor the performance of an entire website. Most are paid, and most focus primarily on website performance metrics. That makes <a href="https://sitespeed.io/">sitespeed.io</a> a bit of an outlier in the web performance monitoring space. For one, sitespeed is entirely open source. Second, it is currently the only dedicated performance monitoring tool which also allows users to get carbon emissions estimates for every page of their site.</p><h2>A VERY BRIEF HISTORY OF SITESPEED.IO</h2><p>The sitespeed project started in 2012 (happy 10-year anniversary!) when <a href="https://github.com/soulgalore">Peter Hedenskog</a> committed the first lines of code. Peter, who now works in the performance team at the Wikimedia Foundation, initially built sitespeed for consulting work. His aim was to have something that could crawl an entire site, and run performance tests to identify troublesome pages.</p><p>Fast forward a decade, and sitespeed is now one of the most well-rounded performance monitoring tools available. A core tenet of the project is to give users the ability to own their own data, and run sitespeed on infrastructure of their choosing. It can be run in Node or using Docker. The resulting metrics can be used to create monitoring dashboards in Grafana and Graphite.</p><p>You can hear more about Peter & the sitespeed story on <a href="https://changelog.com/podcast/212">episode 212 of The Changelog</a>.</p><blockquote><em>I think we as developers have a lot of responsibility and power to make the world a better place. There are many decisions at everyday work, where if you choose to you can be a conscious developer that makes choices which make the world better. In practice, that could be to open source your work, make sure your workplace treat people equally independent of gender/background, choosing local companies over big tech giants for hosting or, if you have the luxury to choose, not working for companies that do evil. Or, it could be as easy as contributing some code to a library that does good.</em><br /><br />Peter Hedenskog</blockquote><h2>SUSTAINABILITY ENTERS THE PICTURE</h2><p>As web performance metrics evolved through the late 20-teens, so did sitespeed. Peter and his co-contributors have always kept the core goal of building the best possible web performance monitoring tool.</p><blockquote><em>My main focus is to build the best web performance tool with</em> <a href="http://sitespeed.io/"><em>sitespeed.io</em></a> <em>but since we already access a web page, there’s a lot of other things we can do quite easily like run accessibility tests or test co2.<br /><br /></em>Peter Hedenskog</blockquote><p>The idea of estimating website carbon emissions wasn’t something that sitespeed’s core team had on their roadmap initially. But all that changed when Peter reached out to Chris Adams at the Green Web Foundation on Twitter, asking about a collaboration:</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/f2b49abdc3643ea5af3085d951e7bd45e41b7d9f-1284x1382.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/f2b49abdc3643ea5af3085d951e7bd45e41b7d9f-1284x1382.png?auto=format" alt="Screen shot showing the following exchange: Peter asking on twitter: @mrchrisadams I would love to include a green plugin to http://sitespeed.io as GreenHouse (or better). There’s a plugin architecture so it should be doable. Also interesting in your calculations for CO2 vs bytes? If I start, could you help me verify that I get it right? Chris, replying on twitter: Oh dude, I've literally been planning that this month, and reading over the sitespeed docs to figure out how to make that plugin! I'd *love* to collaborate on this. Can you DM me your preferred email for correspondence?" loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center"></figcaption></figure><p>Working together, Chris and Peter were able to get a first version of the sitespeed plugin built and released within a few weeks. Along the way, Peter noticed a few things that could be improved in CO2.js itself, and so contributed back to the project. In doing so, he became one of the first contributors to CO2.js.</p><h2>WHAT DOES THE SITESPEED SUSTAINABILITY PLUGIN DO?</h2><p><strong>Sitespeed’s sustainability plugin returns an estimate of carbon emissions in grams for one page view.</strong> It provides breakdowns by page for:</p><ul><li>First-party resources</li><li>Third-party resources</li><li>Emissions by domain</li><li>Emissions by content type</li><li>Dirtiest assets</li></ul><p>You can read more about each of the breakdowns above, and how to use the plugin, in the <a href="https://www.sitespeed.io/documentation/sitespeed.io/sustainable/">sitespeed docs</a>.</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/0c121ea4bde5e9a0acb901f2aeaa438570da0a17-2048x855.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/0c121ea4bde5e9a0acb901f2aeaa438570da0a17-2048x855.png?auto=format" alt="undefined" loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">A screenshot from the SiteSpeed.io sustainability plugin.</figcaption></figure><p>Sitespeed’s code is entirely open source, so if you want to get a deeper look into the sustainability plugin then head over to the <a href="https://github.com/sitespeedio/sitespeed.io/tree/main/lib/plugins/sustainable">repository on Github</a>.</p><p>For privacy reasons, the sitespeed team don’t collect any metrics on who uses the tool or how it is used. However, Peter did reveal that Wikimedia uses it for monitoring the performance of their services, and that he actively uses the sustainability plugin internally in his role.</p><p>With an online presence as large as that of the <a href="https://wikimediafoundation.org/">Wikimedia Foundation</a>, the performance team is constantly running tests to improve and optimise pages across their various entities. As part of the tests he runs, Peter uses the sustainability plugin to baseline how pages compare to each other as well as to other companies/organisations.</p><p>As an aside, the Wikimedia Foundation also release a <a href="https://meta.wikimedia.org/wiki/Sustainability">yearly sustainability report</a>. While this report doesn’t include website carbon emissions, it does include some really insightful information about Wikimedia’s emissions and energy usage at the data center level.</p><h2>HOW DOES SITESPEED USE CO2.JS?</h2><p><strong>CO2.js is an open-source JavaScript library that enables developers to estimate the emissions related to use of their apps, websites, and software.</strong> It is built to reduce the barrier to entry for developers who want to build carbon estimates into the apps and sites they build.</p><blockquote><em>My main focus is to build the best web performance tool with</em> <em><a href="http://sitespeed.io/">sitespeed.io</a>. I could never build co2.js, because I lack the domain knowledge.</em><br /><br />Peter Hedenskog</blockquote><p>Pairing sitespeed’s web page data transfer and requests data with CO2.js allowed sitespeed’s sustainability plugin to estimate carbon emissions based on data transfer.</p><p>Sitespeed also uses CO2.js to perform green hosting checks on the domains it collects when scanning a page.</p><h3>Carbon emission estimations</h3><p>To turn page data into carbon emissions estimates, sitespeed takes a snapshot of a web page using its own <a href="https://github.com/sitespeedio/pagexray">PageXray library</a>. With the information about the page and all its requests available, CO2.js can then get to work.</p><p>Sitespeed’s sustainability plugin <a href="https://developers.thegreenwebfoundation.org/co2js/explainer/methodologies-for-calculating-website-carbon/#the-onebyte-model">uses the One Byte model</a> for carbon estimations by default. However, to align with <a href="https://www.thegreenwebfoundation.org/news/release-guide-co2-js-v0-10/">recent changes in CO2.js</a>, the team have also introduced the ability for users to manually change to the <a href="https://sustainablewebdesign.org/calculating-digital-emissions/">Sustainable Web Design model</a> instead.</p><p>Peter’s contributions helped extend CO2.js to cover some common requirements for running website performance tests.
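</p><p>To give a rough sense of the estimate CO2.js produces, here is a minimal sketch of calling the library directly with the two models mentioned above. This is illustrative only - it is not sitespeed’s plugin code - and it assumes the CO2.js API as described in its developer docs (the <code class="language-markup">co2</code> class, its <code class="language-markup">model</code> option, and the <code class="language-markup">perByte</code> method), so check the docs for the exact signatures in the version you use.</p><pre><code class="language-javascript">import { co2 } from "@tgwf/co2";

// The OneByte model (the plugin's default) and the Sustainable Web Design model.
const oneByte = new co2({ model: "1byte" });
const swd = new co2({ model: "swd" });

// Bytes transferred for a page view - in sitespeed's case this comes from PageXray.
const bytesTransferred = 1_000_000;

// Both calls return an estimate in grams of CO2e. The second argument flags
// whether the bytes were served from a verified green host.
const oneByteEstimate = oneByte.perByte(bytesTransferred, false);
const swdEstimate = swd.perByte(bytesTransferred, true);

console.log({ oneByteEstimate, swdEstimate });</code></pre><p>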
He contributed the <code class="language-markup">perParty</code>, <code class="language-markup">perDomain</code>, <code class="language-markup">perPage</code>, <code class="language-markup">perContentType</code> and <code class="language-markup">dirtiestResources</code> methods to use PageXray JSON data to generate results for the sustainability plugin.</p><h3><strong>Green host checks</strong></h3><p>CO2.js provides <a href="https://developers.thegreenwebfoundation.org/co2js/tutorials/check-hosting/">a nice wrapper</a> around The Green Web Foundation’s <a href="https://developers.thegreenwebfoundation.org/api/greencheck/v3/check-single-domain/">Greencheck API</a>. This allows developers to very quickly implement checks for one or multiple domains against the Foundation’s Green Domains dataset. Sitespeed uses this to check for green hosting on all the domains used by a given web page.</p><h2>MEASURING SUSTAINABILITY NOW AND INTO THE FUTURE</h2><p>So, what about the future? Peter hopes for a more proactive world, where browsers warn users of sites that might be consuming too many resources. We’re already seeing some movement in this direction. The Ecosia search engine <a href="https://blog.ecosia.org/green-search/">highlights planet-friendly organisations</a> in its search results. Meanwhile, Google has also started to <a href="https://blog.google/outreach-initiatives/sustainability/sustainability-2021/">surface sustainability information and nudges</a> across several of its products.</p><blockquote><em>I think it would be cool if web browsers themselves could warn users if a web page consumes a lot of energy, the same way Mac OS warns about applications that drain the battery. Next step would be that the browser warns the user before going to the page: do you really want to access this wasteful website?</em><br /><br />Peter Hedenskog</blockquote><p>Like Peter, we’d love to see sustainability built into the browser, and even web standards. We’ve been working with the Firefox team to <a href="https://github.com/firefox-devtools/profiler/pull/4243">add carbon emissions estimates to their DevTools profiler</a>. Open-source projects like <a href="http://sitespeed.io/">sitespeed.io</a>, open data, and active engagement by the web community all have an important role to play in getting us to a greener, more carbon-aware internet.</p></div>Release guide: CO2.js v0.132024-02-20T13:25:46Zhttps://fershad.com/writing/release-guide-co2-js-v0-13/<div><div class="callout"><p></p><p>This post was originally published on <a href="https://www.thegreenwebfoundation.org/news/release-guide-co2-js-v0-12/">The Green Web Foundation's blog</a>.</p><p></p></div><p>CO2.js v0.13.0 brings the latest average grid intensity data from <a href="https://ember-climate.org/">Ember</a> into the library. It also expands the number of countries for which average grid intensity data is available.</p><h2>UPDATED AVERAGE GRID INTENSITY DATA</h2><p>CO2.js v0.13.0 introduces average grid intensity data for the 2022 calendar year for some countries. In keeping with previous releases, data is sourced from <a href="https://ember-climate.org/">Ember</a>.</p><p>At the time of this release, 2022 calendar year data is available for 78 countries & 11 regions. 
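</p><p>The underlying figures also ship with the library itself, so they can be queried directly in code. The snippet below is a small sketch, assuming the <code class="language-markup">averageIntensity</code> export and its country-code-keyed <code class="language-markup">data</code> object described in the CO2.js developer docs; the exact shape may differ between versions, so treat it as a starting point rather than a definitive reference.</p><pre><code class="language-javascript">import { averageIntensity } from "@tgwf/co2";

// Average grid intensity figures bundled with CO2.js, sourced from Ember.
// Values are in grams of CO2e per kilowatt-hour, keyed by Alpha-3 country code.
const { data } = averageIntensity;

console.log(data.TWN); // annual average grid intensity for Taiwan
console.log(data.AUS); // ...and for Australia</code></pre><p>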
You can use the interactive Observable table below to filter the data available in CO2.js by year.</p><div><iframe width="100%" height="280" frameborder="0" src="https://observablehq.com/embed/9046c0934c27f72b@237?cells=viewof+form%2Cexplainer%2Ctable"></iframe></div><p><strong>Update 2023/07/29</strong> - As of v0.13.5, we are now fetching the latest data from Ember each month.</p><h2>AVERAGE GRID INTENSITY AVAILABLE FOR MORE COUNTRIES</h2><p>As part of the data update in v0.13.0, we have also made some changes under the hood which affect how we source average grid intensity data from Ember. As a result, it has become easier for us to provide the latest available average grid intensity data for a given country within CO2.js.</p><p>In previous versions of CO2.js, average grid intensity data was available for just 75 countries and 11 regions. From v0.13.0 forwards, developers will have access to data for 209 countries and 12 regions.</p><p>The changes we have made also make it easier to update data in the future, and paves the way for automating this process. In doing so, we hope to provide developers a consistent means of working with the latest, most reliable grid intensity data available.</p><p>You can find <a href="https://github.com/thegreenwebfoundation/co2.js/releases">details of every release</a> for CO2.js on GitHub, where you’ll also be able find the <a href="https://github.com/thegreenwebfoundation/co2.js/blob/main/CHANGELOG.md">changelog</a> for this project.</p><p>If you are using CO2.js in production then The Green Web Foundation would love to hear from you! Use the <a href="https://www.thegreenwebfoundation.org/support-form/">contact form</a> on the website to get in touch.</p></div>A first look at Carbon Control by WebPageTest2024-02-20T13:25:46Zhttps://fershad.com/writing/carbon-control-by-webpagetest-first-look/<div><p>Just yesterday (May 9th, 2023), the WebPageTest (WPT) team shipped a new feature to the tool. They called it <strong><strong><strong><strong><strong><strong><strong>Carbon Control</strong></strong></strong></strong></strong></strong></strong>, and boy oh boy was I excited to see it finally land.</p><h2>Fixes #1613</h2><p>The whole idea of having carbon emissions reporting in WebPageTest goes back to a <a href="https://twitter.com/TheRealNooshu/status/1457681398249267200">tweet from Matt Hobbs</a> in 2021. In response to that, Scott Jehl (who worked on this feature) created <a href="https://github.com/WPO-Foundation/webpagetest/issues/1613">issue #1613</a>. As tends to happen, there was a flurry of early comments and suggestions (I jumped the gun on a few things 😅) but then not much noticeable action thereafter.</p><p>Fast forward just over a year, and I found myself starting to have conversations with Tim Kadlec, then Director of Engineering at WPT, about what carbon emissions results might look like. Having known about this since last year, I was delighted when Scott Jehl merged <a href="https://github.com/WPO-Foundation/webpagetest/pull/2867">pull request #2867</a> into the WebPageTest <code class="language-markup">master</code> branch.</p><h2>So, Carbon Control …</h2><h3>What is it?</h3><p>Carbon Control is a set of optional tests that can be performed with a WebPageTest run. 
The results are presented on a really nicely designed page where you can see:</p><ul><li>If your site is served from a green web host</li><li>How many third-party requests are served from green web hosts</li><li>Your site’s estimated carbon footprint per new (uncached) visit</li><li>Contextual information about the estimated footprint</li><li>Suggested improvements</li><li>And a breakdown per resource type</li></ul><p>Carbon Control is a terrific <strong>first step</strong>, especially coming from such a widely used service as WebPageTest. It doesn’t try to present <strong>too much</strong> information, but the information it does surface is presented in such a way as not to be intimidating to the viewer.</p><h3>Running a test with Carbon Control</h3><p>Carbon Control tests are <strong>opt-in.</strong> To turn them on for a test run, you’ll need to select the <strong><em>Run Carbon Control</em></strong> checkbox for either a Simple or Advanced Configuration test.</p><p>Alternatively, the WPT team have created a special page for this feature which already has everything set up. Head to <a href="https://www.webpagetest.org/carbon-control/">https://www.webpagetest.org/carbon-control/</a>.</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/623e26ea9f6b78db02b7a8fc22b1ea4b9ba31d25-2190x930.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/623e26ea9f6b78db02b7a8fc22b1ea4b9ba31d25-2190x930.png?auto=format" alt=" " loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">A screenshot of the WebPageTest homepage, with an arrow pointing to the Run Carbon Control checkbox for a Simple Configuration Test.</figcaption></figure><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/ade3d70c1f57137e1a0a30470afbe16798bfca29-2190x882.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/ade3d70c1f57137e1a0a30470afbe16798bfca29-2190x882.png?auto=format" alt="undefined" loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">A screenshot of the WebPageTest homepage, with an arrow pointing to the Run Carbon Control checkbox for an Advanced Configuration Test.</figcaption></figure><h3>The Carbon Control results</h3><p>Let’s take a whirlwind tour through the results surfaced by Carbon Control. Once your test runs are complete, you’ll be able to navigate to the Carbon Control page from the test results navigation dropdown menu.</p><h4>The key stuff</h4><p>The first results you’ll see on the page are green hosting checks. WPT breaks this down into two categories - the primary domain, and third-parties. The results in the third-parties check will tell you that “x of n are green-hosted”. Clicking on this will reveal a list of all the third-party domains indicating which are green & which are not.</p><p>Next up, you’ll see the total size of the page and next to that a CO2e estimate. Carbon Control uses the Sustainable Web Design model to produce its CO2e estimate. It currently does this in a way that does not factor in caching or return visitors.</p><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">Data transfer as a proxy</p><p></p><div><p>I was going to write a short paragraph here about using data transfer as a proxy for website carbon emissions. That short paragraph became a longer ramble, so I’ve spun it out into its own blog post. </p><p>TL;DR: No, but it’s the best we have right now. </p><p>Read more at: <a href="https://fershad.com/writing/is-data-the-best-proxy-for-website-carbon-emissions/">Is data transfer the best proxy for website carbon emissions?</a> </p></div><p></p></div><h4>Important to note</h4><p>One really important implementation detail is that WPT calculates total website carbon emissions by <a href="https://github.com/WPO-Foundation/webpagetest/blob/master/www/assets/js/conditional_metrics/carbon-footprint.js#LL79C2-L98C6">summing the emissions</a> of each individual request. This is a very good step in the right direction, especially when using a generalised methodology like the Sustainable Web Design model. It’s something I’ve written about in <strong><a href="https://fershad.com/writing/improving-the-accuracy-of-website-carbon-emissions-estimates/#beyond-the-ballpark"><em>Improving the accuracy of website carbon emissions</em></a></strong>, and it’s great to see the WPT implementation taking this approach (a rough sketch of the general idea is shown below).</p><h4>Extras</h4><p>These two sections are the main results Carbon Control surfaces. Below them, you’ll see some contextual comparison of a website’s carbon footprint, as well as an option to test from a different location to see how that impacts the results.</p><p>One of the other great features about Carbon Control is that it works with WebPageTest’s Opportunities and Experiments. These allow users to make “no-code” changes to their sites & test out their performance (and now carbon) impact. Having Carbon Control results as part of the Opportunities and Experiments suite finally allows us to match website improvements to estimated carbon reductions. Game changer!</p>
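<p>Coming back to that implementation note: summing per-request estimates is easy to reason about with CO2.js. The sketch below is a simplified illustration of the general approach - it is not WPT’s actual code (that lives in the <code class="language-markup">carbon-footprint.js</code> file linked above) - and the request data in it is made up for the example; it assumes the CO2.js <code class="language-markup">perByte</code> method described in the library’s docs.</p><pre><code class="language-javascript">import { co2 } from "@tgwf/co2";

const swd = new co2({ model: "swd" });

// Hypothetical per-request data - in WPT's case this comes from the test run itself.
const requests = [
  { url: "https://example.com/", bytes: 45_000, greenHost: true },
  { url: "https://cdn.example.com/app.js", bytes: 120_000, greenHost: false },
  { url: "https://cdn.example.com/hero.avif", bytes: 210_000, greenHost: false },
];

// Estimate each request separately, so green-hosted responses are treated
// differently from grey-hosted ones, then sum the per-request figures.
const totalGrams = requests.reduce(
  (total, request) => total + swd.perByte(request.bytes, request.greenHost),
  0
);

console.log(`Estimated CO2e per page view: ${totalGrams.toFixed(3)} g`);</code></pre>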
<p>Below the Opportunities and Experiments section is a percentage breakdown of emissions by resource type.</p><p>The rest of the content on the page contains resources, and a badge for folks who (I assume) are using Catchpoint’s monitoring service which WebPageTest & Carbon Control are part of.</p><h2>Wish list for the future</h2><p>I feel that Carbon Control is a great start, especially being in a tool like WebPageTest that is so widely used. That being said, I really hope that Carbon Control in its current form is just that - a start.</p><p>Here are a few ideas and updates I’d hope to see in future iterations:</p><ul><li><strong>Show emissions for both new & repeat views</strong> - WPT is already capable of surfacing results for repeat views (cached results). It shouldn’t be too much of a stretch to present these on the results page.</li><li><strong>Present more third-party information</strong> - I say this having created <strong><a href="https://aremythirdpartiesgreen.com/"><em>Are my third parties green?</em></a></strong> for this purpose. It would be great to not just have the URL of the third-party resource, but also some additional context about it. At a minimum, what type of resource is it (analytics, advertising …?), and what company is providing it? Having a table further down the page with this information (or even with third-parties grouped in some way) would be a nice addition.</li><li><strong>Using location specific grid intensities for emissions estimates</strong> - Currently it looks like WPT uses the global average grid intensity for all segments when calculating emissions. CO2.js (which they use for this) allows for device, datacenter, and network grid intensities to be adjusted so that <a href="https://www.thegreenwebfoundation.org/news/release-guide-co2-js-v0-12/">more case specific calculations</a> can be performed.</li><li><strong>More data in the Cache Control by Resource Type section</strong> - Currently only percentages of each request are shown. 
I’d like to see a bit more information here, such as bytes, estimated emissions etc.</li><li><strong>A breakdown of information about how the CO2e estimate was reached</strong> - This is nerdy stuff, so tuck it away further down the page. It would be nice to have a table showing all the requests & the associated bytes/CO2e figures, plus the associated server, device, and network grid intensities that were used. This gives a way for us to really trace back the headline results we’re seeing on the Carbon Control page.</li><li><strong>Beginning to surface device energy use</strong> - I’m not sure if WPT have a way to get this information at the moment, since it’s not programmatically accessible in browsers. However, if it’s possible, it would be amazing to start seeing device level energy utilisation alongside carbon emissions. This would be the first step towards estimates that are not based on <em>just</em> data transfer.</li></ul></div>Is data transfer the best proxy for website carbon emissions?2024-02-20T13:25:46Zhttps://fershad.com/writing/is-data-the-best-proxy-for-website-carbon-emissions/<div><p>This topic has been on my mind recently, especially since Mike Gifford raised this very <a href="https://github.com/thegreenwebfoundation/co2.js/issues/138">issue in the CO2.js Github repo</a>. That thread is a very interesting read, and one worth following.</p><p>It is also a criticism that I see come up a lot aimed at tools which try to present website carbon estimates. With WebPageTest releasing their <a href="https://blog.webpagetest.org/posts/carbon-control/">Carbon Control</a> test feature recently, and that hopefully drawing more attention to the area of website sustainability, I wanted to capture some of my thoughts on this topic.</p><h2>So, is data transfer a good proxy?</h2><p>No, data transfer probably isn’t the best proxy for website carbon emissions. Right now, though, it’s <em>the best we’ve got</em> given the tools and knowledge available to us.</p><p><a href="https://fershad.com/writing/website-carbon-beyond-data-transfer/">I’ve written about this in the past</a>. Ideally, we’d be able to use actual energy consumption figures at both the server & device level to work out how carbon intensive a webpage is.</p><p>There seems to be a growing consensus that network energy utilisation doesn’t correlate 1:1 with data transfer over said network. So that’s another part of the system we need to reevaluate.</p><p>Additionally, there’s no easily programmable access to energy usage information (especially on the device) just yet. This makes it especially difficult for testing/measurement tools like WebPageTest to use anything other than data transfer for carbon estimates. Until there is, and in the absence of peer-reviewed research into the energy consumption of different media/file types, data transfer is a measurable metric that gives us a general sense of how carbon intensive a page <em>might be</em>.</p><h2>Why even bother with the frontend?</h2><p>With all that said, why should we even bother with estimating frontend website carbon emissions? 
Considering that current methodologies use data transfer as the key proxy for carbon emissions, shouldn’t we just focus our attention on the server level where we can get meaningful real energy consumption data?</p><p>Firstly, the impact of devices (i.e. frontend) is significant. One only needs to look at <a href="https://blog.mozilla.org/en/mozilla/release-mozillas-greenhouse-gas-emissions-baseline/"><strong>Mozilla’s own greenhouse gas emissions findings</strong></a>. While it’s not specific to websites, it is a pretty clear indicator at the business level that the emissions from usage of one’s products can be a large portion of total operational emissions.</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/d0fc5ea8b241ce725d4ed429fc92e553c2eea97a-1920x1080.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/d0fc5ea8b241ce725d4ed429fc92e553c2eea97a-1920x1080.png?auto=format" alt=" " loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">Chart showing that 98% of Mozilla’s emissions in 2019 came from the use of their products.</figcaption></figure><p>As I mentioned earlier, network data transfer is <strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><em>the best we’ve got for now</em></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong>. Our understanding, and access to different sources of information will evolve. As more people become aware (partly through tools like <a href="https://www.websitecarbon.com/">Website Carbon Calculator</a>, Carbon Control etc), we’ll have more minds thinking about this problem. And with all that, we’ll gradually move away from using data transfer towards more accurate indicators.</p><h3>Referencing accessibility and performance</h3><p>I wasn’t actively involved in the web industry when accessibility or performance started becoming a concern people wanted to test for. But, I can image that in those early days the tooling to test, measure and report on both were limited. Despite that, we didn’t stop testing accessibility or performance. Rather, folks in the field worked with what they had at the time and behind the scenes worked on ways to improve those tools/metrics. Website sustainability, I feel, will follow the same path.</p><h2>What could to replace data transfer in the future?</h2><p>Let’s end by talking about the future. What might come in to replace data transfer in our website carbon estimation models over the next one to five years?</p><h3>Carbon data in HTTP headers</h3><p>This would be <strong><strong><strong><strong><strong>the best!</strong></strong></strong></strong></strong> At the very least it would help provide real transparency about datacenter and network level emissions. Effectively, we’d have carbon emissions data sent along with every request on the internet. That would allow for significantly more detailed estimates to be produced.</p><p>There’s a <a href="https://www.ietf.org/archive/id/draft-martin-http-carbon-emissions-scope-2-00.html">proposal out</a> at the moment, which is very early doors. There’s also this paper by my Green Web Foundation colleague Chris Adam and others on <a href="https://www.thegreenwebfoundation.org/publications/extending-ipv6-to-support-carbon-aware-networking/">how the IPv6 protocol could be leveraged</a> in a similar way.</p><h3>Research into the frontend energy usage of specific file types</h3><p>There’s already <a href="https://websitesustainability.com/cache/files/research23.pdf">a paper by Alexander Dawson</a> exploring this. It would be great to see more work in this space, at it would further help refine how frontend carbon emissions estimates are produced.</p><p>There’s long been a belief that 100kb of JavaScript or video content has a much larger device-level impact than 100kb of HTML. Having research to back that up will allow for updated models to be created with this information baked in.</p><h3>Device-level energy/emissions reporting built into browser dev tools</h3><p>Currently, measuring device energy usage requires having dedicated infrastructure to perform measurements and tests. For this data to become more widely usable, it really needs to be built into browsers. 
We’ve seen <a href="https://fershad.com/writing/co2e-estimates-in-firefox-profiler/">Firefox take steps</a> in this direction, and Microsoft Edge have <a href="https://fershad.com/writing/microsoft-propose-sustainability-section-in-edge-devtools/">indicated they’d like to explore this</a> as well.</p><p>Another key piece of this will be to have this data be programatically accessible so that developers can build tools and tests around it. A combination of this, plus the HTTP headers stuff mentioned above would revolutionise how website carbon emissions are measured.</p><h2>Where to from here?</h2><p>Let’s keep chipping away at this, fam. While data transfer as a proxy might not provide the most accurate carbon emissions estimates, it’s what we’ve got to work with at the moment. Our understanding of this space will continue to evolve, and with that the tools, data, and metrics we use will change too.</p><p>Right now, though, raising awareness is almost as important as generating hyper-accurate CO2 figures. Digital sustainability needs to become part of the broader consciousness, and tools like Website Carbon, Carbon Control and <a href="https://ecograder.com/">Ecograder</a> all help serve this mission.</p><p>Folks in IT have a lot of leverage with their decisions. The more who know that they <strong><em>can</em></strong> <a href="https://www.thegreenwebfoundation.org/directory/">select green hosting providers</a>, or that they <strong><strong><strong>should</strong></strong></strong> <em>try to</em> follow <a href="https://sustainablewebdesign.org/">Sustainable Web Design</a> practices, the better off we’ll all be in our mission to lower the environmental impact of our sector.</p></div>Chasing efficiency rather than green energy2024-02-20T13:25:46Zhttps://fershad.com/writing/chasing-efficiency-rather-than-green-energy/<div><p>Last week, I read <a href="https://adrianco.medium.com/dont-follow-the-sun-scheduling-compute-workloads-to-chase-green-energy-can-be-counter-productive-b0cde6681763"><em>Don’t follow the sun: Scheduling compute workloads to chase green energy can be counter-productive</em></a> by Adrian Cockcroft on Medium. In it, Adrian makes some pretty sensible sounding arguments for why shifting workloads to use the greenest energy might not be the climate friendly solution we imagine it to be.</p><h2>Shifting workloads</h2><p>A simplified idea of shifting workloads to utilise green energy goes something like this:</p><ul><li>I have a compute task that I normally run on a server in Iowa.</li><li>However, Iowa doesn’t have the greenest energy grid. So when I want to run this task, I check other compute regions available to me & shift the task to run in the region which is using the most clean energy (i.e. has the lowest grid intensity).</li></ul><p>You might see this referred to as “chasing the sun” or “moving compute through space”. It’s a neat idea, and one that I feel does have merit. There are concepts like the <a href="http://solarprotocol.net/">Solar Protocol</a> which spec out a way to implement such a solution in practice.</p><p>But, when using hyperscale cloud providers, is chasing the sun actually a climate friendly practice? From <a href="https://wikitia.com/wiki/Adrian_Cockcroft">his bio</a>, Adrian definitely seems more well versed in how hyperscale data centers work than I am so I happily defer to him here.</p><h2>Key takeaways</h2><p>You can read Adrian’s article on Medium at the link above. 
A few key takeaways I took from it were:</p><ul><li>Even if you’re not using the resources in one datacenter, it doesn’t mean that someone else isn’t.</li></ul><blockquote>Just because the carbon emissions aren’t charged to your account, it doesn’t make them go away.</blockquote><ul><li>Shifting workloads to a less utilised region that runs on clean energy could see that data center provider having to provision more resources in that region. This means emissions from manufacturing are generated, and there’s an increase in the energy used by that data center (not to mention water - story for another time though).</li></ul><blockquote>Meanwhile your workload is generating demand in a different cloud region, and all regions do demand based capacity planning, so the cloud provider buys more computers, which increases carbon emissions both for manufacturing and shipping (scope 3) and the energy they use (scope 2).</blockquote><ul><li>And, currently, US & EU regions have more low-carbon options, while Asia has a high grid intensity (though this should drop over the coming years).</li></ul><h3>TL;DR</h3><p>The crux of Adrian’s post is captured in this one paragraph.</p><blockquote>I suggest that the best policy is to optimize your workloads so that they can run on fewer more highly utilized instances, minimize your total footprint in Asia where possible, and to use the spot market price as a guide for when to run workloads.</blockquote><h2>Does this mean clean regions don’t matter?</h2><p>No. I don’t speak for Adrian here, but I’d guess he’s not arguing that point either. By all means, if you <strong>can</strong> provision compute tasks to run in regions with lower grid intensity then do so. But perhaps think twice if you’re wanting to provision a task in one region, then shift it around based on the grid intensity at the time you want to run it.</p><h2>Efficient code regardless of region</h2><p>As Adrian points out, the best policy is to optimise workloads. This makes sense as a default. Shifting around inefficient tasks from one region to another just to use green energy feels kinda like a kid cleaning up their room by putting all their stuff on their bed and then covering it with a blanket. I might just be speaking from experience on that one.</p><p>Moving that inefficient task to a green region could mean that the data center operator there now needs to buy more servers to meet usage. There’s a carbon cost associated with that. Heck, running inefficient code in general probably comes with a financial overhead. So, even if you don’t care about the environment, give some thought to your bottom line.</p><h2>What might this look like for a website?</h2><p>For the web, we need to look at both the data center (hosting) and the client (device) sides of the picture.</p><h3>Hosting</h3><p>In some cases we can choose where we host our sites. Picking a green region to start with, or even better using a <a href="https://www.thegreenwebfoundation.org/directory/">verified green hosting provider</a>, are sensible places to start. But, if that’s not possible, how can we go about making our sites more efficient on the server?</p><ol><li>Cache as much as possible. This includes database queries, static pages, and other static assets. If the content doesn’t need to be dynamically generated or realtime, then look to cache it.</li><li>Think about whether your site can be a static site - built once & stored on a host as static HTML pages. 
Content sites are perfect candidates for this.<ol><li>Features like incremental site builds can further improve efficiency for static sites. Rather than rebuilding the entire website when a page is changed, incremental builds only rebuild those pages that had changes made to them. This reduces the resources & time needed to deploy website changes.</li></ol></li><li>For pages/sites that are served dynamically, reduce the number of processes that need to run for it to be built.</li><li>If you’re using a JavaScript framework that allows for server-side rendering, then look to see if you can make that process <a href="https://fershad.com/carbon-aware-site/">carbon aware</a>. It’s something I’ve got on my “ideas to toy with” list for later in the year.</li></ol><h3>Device</h3><p>While we might have some control over our hosting, we almost certainly cannot control where people access our website from. Not just that, but we can’t control the devices on which our sites are accessed either. This makes efficiency even more important, not just to reduce carbon but also to make our sites usable on low-spec devices.</p><ol><li>Consider making your <a href="https://fershad.com/writing/making-this-website-carbon-aware/">frontend carbon aware</a>.</li><li>Do less if the device is low-spec. Using the <code class="language-markup">navigator.deviceMemory</code> API is one way to check the kind of device your code is being run on. Here’s <a href="https://umaar.com/dev-tips/242-considerate-javascript/">a great guide</a> covering that and more.</li><li>Try sending down only the JavaScript that the client needs. Frameworks like Astro and Remix try to make this possible. There’s a growing movement towards shipping 0kb JS by default, and incrementally sending over just the stuff that’s needed to make more complex page functionality happen.</li></ol><p>In general, try to follow <a href="https://sustainablewebdesign.org/">sustainable web design</a> practices as much as possible.</p></div>Self-hosting a website on a solar powered Raspberry Pi2024-02-20T13:25:46Zhttps://fershad.com/writing/self-hosting-a-website-on-a-solar-powered-raspberry-pi/<div><p>This week has been all about getting back into the swing of a regular work routine after spending 10 days in Germany. It was nice to have a bit of a break, and spend time mixed between city and rural life in Germany. I came away from my time there with a load of other ideas to tinker with throughout the year, so expect a few more “how I built this” kinda posts in the future.</p><p>One thing I’d love to tinker with, but am not sure where I’ll find the time for, is self-hosting my website. So that I remember what to do whenever that time comes around, I’m sharing a conversation thread from the <a href="http://climateaction.tech/">ClimateAction.Tech</a> Slack community about that very topic. I feel like this thread is too good to be lost into the Slack ether, so am saving it here for future reference.</p><p>The conversation started when <strong><a href="https://sustainablewww.org/">Michael Andersen</a></strong> asked in the #green-webdev channel:</p><blockquote>Happy Friday everyone 🙂<br /><br />I was wondering if any of you are hosting your websites on a Raspberry Pi or similar?<br /><br />I am thinking about transferring my website to Sweden or Norway because of the greener energy, but VPS hosting is quite expensive. 
At DO I can get a VPS in Germany for 6 USD per month, and in Sweden it costs 40 USD for the same specs.<br /><br />So I am thinking about another option which is to use one of my Raspberry Pi's to host the website. But before making a decision it would be nice to know what experience others have with it.</blockquote><p><a href="https://ecoping.earth/"><strong>Dryden Williams</strong></a> was quick to reply with a pointer to the Solar Protocol.</p><blockquote>Definitely doable! You might find this interesting: <a href="http://solarprotocol.net/">http://solarprotocol.net/</a>; a network of hosting on Raspberry Pi or similar</blockquote><p><strong>I</strong> also chimed in with a link.</p><blockquote>Check out what @scottsweb has done on his site. <a href="https://scott.ee/project/solar-hosting-raspberry-pi/">https://scott.ee/project/solar-hosting-raspberry-pi/</a> </blockquote><p><strong>Michael</strong> replied thanking us for the links.</p><blockquote>Thanks @Dryden Williams and @fershad. Will definitely take a look at both. Maybe I can come up with my own little 100% solar driven project for Sustainable WWW.</blockquote><p><strong>Dryden</strong> was also interested in the work of @scottsweb.</p><blockquote>I must say I REALLY LOVE that post by @scottsweb and what a great site!</blockquote><p><a href="https://scott.ee/"><strong>Scott Evans</strong></a> (@scottsweb) joined the conversation.</p><blockquote>Thanks for the kind words and pings. Let me know if you have any questions @Michael Andersen. My site has been ticking along on battery after a couple of brighter days. Cloudy and raining again now though 😟<br /><br />Also, self hosting is great. I have lots of local network only services running now on a little home server, which is also a backup for when the solar pi goes offline.</blockquote><p>Indeed, <strong>Michael</strong> did have questions & asked Scott:</p><blockquote>it’s a really cool project, and yeah I know it’s hard to keep things running on solar. In Sweden winters get quite dark 😅<br /><br />I have one question. I used to run another website from a raspberry pi at home, but 1 out of 3 times someone tried to surf to it, it was unreachable through Cloudflare despite a script almost constantly updating the dynamic IP. Are you having problems with that also or how have you gotten around that? 😊</blockquote><p>This is when things start to get really interesting, with <strong>Scott</strong> providing some great answers and insights into self-hosting a site on a solar powered device.</p><blockquote>So I actually have it working a little differently. I don’t have to use dynamic DNS at all and my IP address could change all day and it wouldn’t matter. Typically you would set it up like this: Internet traffic -> Your router (with a port open) -> Your server (and something to update dynamic DNS on your network)<br /><br />With Cloudflare tunnels it works differently: Internet traffic -> Cloudflare <- Your server (a secure tunnel is made between your server and Cloudflare)<br /><br />In this setup, you don’t need to open any ports on your router, or worry about updating DNS records. The tunnel using cloudflare replaces all that and provides additional security too. 
You can read a bit more about their service here: <a href="https://www.cloudflare.com/en-gb/products/tunnel/">https://www.cloudflare.com/en-gb/products/tunnel/</a> - it’s free for small teams (perfect for home) and can do things like lock down your apps too (you must sign in with a email ending in mydomain.com or have a certain IP address). I should note that Cloudflare is not the only company to offer this service either.</blockquote><p><strong>I</strong> contributed a link to a recent <a href="http://syntax.fm/">Syntax.fm</a> episode talking about the very topic of tunneling:</p><blockquote>There was a recent Syntax episode about tunneling which might surface other options.<br /><a href="https://syntax.fm/show/590/https-tunnel-your-localhost-cloudflare-tunnels-ngrok-and-more">https://syntax.fm/show/590/https-tunnel-your-localhost-cloudflare-tunnels-ngrok-and-more</a> </blockquote><p>Once more, <strong>Michael</strong> hit us with the gratitudes.</p><blockquote>Thank you so much <a href="https://climate-tech.slack.com/team/UGHFLNT5H">@scottsweb</a> and <a href="https://climate-tech.slack.com/team/U01PKTBUYLD">@fershad</a> 🙂<br /><br />I will look more into Cloudflare tunnels. I was not a huge fan of the traditional way updating the dynamic IP, but this way might be much better.</blockquote><p>At this point,<strong> <a href="https://rollthecloud.com/">Todd Zmijewski</a></strong> chimed in with a question about battery storage capacity tied to solar panels.</p><blockquote>How many batteries can a single solar panel reliably support? With enough of these you could become a 100% sustainable file coin storage provider. How much storage capacity does each of these have? I also wonder if its possible to run wasm cloud on these reliably. Introduces a whole other dimension of being able to run dynamic apps and apis. <a href="https://wasmcloud.com/">https://wasmcloud.com/</a> </blockquote><p><strong>Scott</strong> came back with an answer from his own experience.</p><blockquote>With the solar panel I have, it would only support one battery… and the charging circuit is only designed for one battery too. The Pi Zero currently has 16GB of storage as an EMMC (<a href="https://www.uugear.com/product/raspikey-plug-and-play-emmc-module-for-raspberry-pi/">one of these</a>). More storage could be added via USB, but it would send my energy demands way over budget.<br /><br />As things stand I currently use about 110mA on average, plugging in just a USB ethernet dongle takes that up to nearly 300mA, so adding something like a USB hard drive will probably go way beyond that (into the 500mA range).<br /><br />I am also observing something else quite interesting. During the summer last year the panel seemed to struggle to charge the battery, which I couldn’t understand. Now the sun is starting to appear again, it seems to all be working better than before. I have a feeling that heat is a huge factor on how this small solar panel operates. During summer, it gets plenty of sun but it becomes too hot to touch at times. On sunny days now, it gets plenty of sun but the outside temperature is still only 1-2 degrees. I am going to experiment with this some more when the heat comes back, but I have no idea what a solution might be.</blockquote><p><strong>Michael</strong> had a follow up question for Scott around panels and storage.</p><blockquote>Do you think a 100w panel could run run and charge a battery big enough to run a raspberry pi 4 with 8gb ram continuously? 
I know for sure it’s enough in summer time, but I wonder if the panel is big enough to create enough energy even when it’s cloudy or in the dark months of winter 😊</blockquote><p><strong>Scott</strong>’s reply included a link to the OG solar powered self-hosting website - <a href="http://solar.lowtechmagazine.com/">solar.lowtechmagazine.com</a> - rightly noting in the process that location also plays a big role in how one might set up a solar powered server.</p><blockquote>That is a good question. At least where we are, I don’t think it would be easy to achieve. In the winter, the apartments opposite block our sun completely and the largest battery I have found to power my pi zero (that is compatible with the charging circuit I have) would last about 30 days. A pi 4 is going to be more <a href="https://www.pidramble.com/wiki/benchmarks/power-consumption">power hungry</a> by a factor of about 4x, so perhaps the same battery would last a week.<br /><br />I think you would probably want to upgrade the entire thing towards a 12v system like the one described in <a href="https://solar.lowtechmagazine.com/about.html">Low Tech Magazine</a> - they manage to achieve really good results with a smaller panel and bigger battery but they are based in Spain too (I think?) so get much more sun. They cover some of the technical aspects here:<br /><br /><a href="https://solar.lowtechmagazine.com/2020/01/how-sustainable-is-a-solar-powered-website.html#solarpanel">https://solar.lowtechmagazine.com/2020/01/how-sustainable-is-a-solar-powered-website.html#solarpanel</a> </blockquote><p>The battery talk between Michael and Scott continues, with <strong>Michael</strong> asking about Scott’s current setup.</p><blockquote>How big is the battery you are using? I am thinking that a week would probably be okay, and in worst case I would be able to recharge another way</blockquote><p>And <strong>Scott</strong> answering:</p><blockquote>It is this one: <a href="https://shop.gwl.eu/index.php?cl=details_disc&anid=4011">https://shop.gwl.eu/index.php?cl=details_disc&anid=4011</a> - so 20Ah</blockquote><p><strong>Michael</strong> then has a further question about how the whole setup is protected from the elements.</p><blockquote>Unfortunately they don't have that one any longer. How are you dealing with frost and heat? Do you have the whole project in a box outside or have you made arrangement to charge from the outside, but the project is located inside in constant heat?</blockquote><p><strong>Scott</strong> comes back with an alternate battery option, and talks about how he’s weatherised (or worked around weatherising) his current Pi and panel setup.</p><blockquote>This looks like a good potential alternative: <a href="https://shop.gwl.eu/LiFePO4-cells-3-2-V/ELERIX-Lithium-Cell-LiFePO4-3-2V-50Ah-1-1.html">https://shop.gwl.eu/LiFePO4-cells-3-2-V/ELERIX-Lithium-Cell-LiFePO4-3-2V-50Ah-1-1.html</a> - larger capacity is appealing too. For the Pi Zero it would probably last over 2.5 weeks without any sun.<br /><br />The Pi is inside on the windowsill and the panel is outside as much as it can be. I soldered quite a long cable between the Pi and the panel so I could keep one inside and one outside. To protect the Pi from heat, it is housed in <a href="https://i0.wp.com/scott.ee/images/solar-host-pi.jpg?w=1600&quality=80&strip=info">this foam case</a>, I keep the lid open and it casts a shadow over the circuitry. 
I don’t find heat a problem (except for the panel getting too hot outside).<br /><br />I don’t have a great setup in this apartment, if we ever move I think I will try and find somewhere to mount the panel outside all year and then weather seal all the cables.</blockquote><p><strong>Todd </strong>rejoined the conversation with</p><blockquote>We are able to run our entire enterprise for free using AWS serverless Lambdas and Azure function apps targeting regions with low grid intensities like Norway, Sweden, etc. Something that is not possible with VPS. Using serverless we are able to handle millions of requests for free across a multi-cloud with auto-scaling serverless climate aware functions.</blockquote><p><a href="https://heylow.world/"><strong>Nico</strong></a> shared a link to a very informative YouTube video:</p><blockquote>Probably a nice add-on to this conversation:<br /><a href="https://www.youtube.com/watch?v=t8tjDC6RIR8">https://www.youtube.com/watch?v=t8tjDC6RIR8</a><br /><br />I find the discussion about the website's uptime quite interesting. The site is live 98% of the time, which is totally fine for them. It's better than investing in additional infrastructure and resources to achieve a 100% uptime.<br /><br />Solar Protocol is mentioned (around 30min), as they feel that it misses the point since it involves creating a global infrastructure just to achieve a 2% increase in uptime to get to 100%</blockquote><div class="callout"><p></p><p>That’s where the conversation ended at the time of me writing this. If there are any further developments, I will update this post with the additional content.</p><p></p></div></div>Eleventy Plugin: Green Links2024-02-20T13:25:46Zhttps://fershad.com/writing/eleventy-plugin-green-links/<div><p>If you want the TL;DR version of this post, check out the readme for this project <a href="https://github.com/fershad/eleventy-plugin-green-links">on Github</a>.</p><p>Over a long weekend at the end of February I chipped away at a small Eleventy plugin. Eleventy is the static site generator that I use to build my website. It has a rich <a href="https://www.11ty.dev/docs/plugins/">plugin ecosystem</a>, covering everything from automating sitemap creation to unfurling links.</p><p>Building an Eleventy plugin is something I've wanted to do for a very long time, but I've never had a good enough idea to build out. That changed at the end of February 2023. I was poking around the site playing with ideas of a future redesign and had a thought - <em>what if I could highlight the pages I link to that are hosted on known green web hosts?</em></p><h2>Sketching out the idea</h2><p>Eleventy runs through all the pages of my site and builds them before they are chucked up on to Cloudflare. So, it should be possible to hook into the build process to achieve what I want to do. Here's a very rough idea of how it would work:</p><ol><li>Go through each page of my site at build time, and get all the valid <code class="language-markup">href</code> attributes in anchor (<code class="language-markup"><a></code>) tags.</li><li>Create an array of unique domains from those links.</li><li>Using The Green Web Foundation's Greencheck API, see if each domain is hosted green.</li><li>Add a custom attribute to any anchor tags where the domain is returned as being a green web host.</li></ol><h2>Turning the idea into code</h2><p>Transforms in Eleventy really help here. Transforms are a way to modify the output of a page template in Eleventy. 
With Transforms, I would be able to go through each page and run custom code against the HTML. So that was my starting point. I looked around for other Eleventy plugins that did something similar for a bit of inspiration on how to get this done.</p><p>I landed on <a href="https://github.com/sardinedev/eleventy-plugins/tree/main/packages/external-links"><code class="language-markup">eleventy-plugin-external-links</code></a> by sardinedev. The code repository gave me a perfect boilerplate for how my plugin would be shaped.</p><h3>Early on</h3><p><code class="language-markup">eleventy-plugin-external-links</code> allows users to pass different configuration options into the plugin. I ditched this to begin with, since I wanted to get something working first. Tinkering with the code I'd copied over from that plugin, I was able to successfully find all the links from a page, and then console log out an array of unique domains.</p><p>Next up, I needed to go through these domains and see which ones were green. For that, I reached for The Green Web Foundation's Greencheck API. The API checks a domain against the Green Web dataset, which is a curated database of verified green hosting providers.</p><p>I tested it out with one page, looping sequentially through the known hosts on that page and getting back results for each. Pretty good so far.</p><h3>Multiple domains</h3><p>Trying the same code out on my entire site, I quickly hit a snag. Querying domains sequentially, like I was, saw me quickly hitting rate limits on the API. A few pages would work, then bang I'd start getting errors back instead.</p><p>To get around this, I installed <a href="https://github.com/thegreenwebfoundation/co2.js">CO2.js</a>. I've got the privilege of being a maintainer of that library, and so am familiar with the different functionality it exposes. One is the ability to perform a lookup of the Green Web dataset using an array of domains.</p><p>Using the <code class="language-markup">check()</code> function in CO2.js, I was able to pass in an array of domains and get back an array of domains that were hosted on verified green providers. From there, I could go through the collection of links from the page and find the ones whose hostname matched one of those returned by the Greencheck API.</p><h3>What to do with green links?</h3><p>I was now able to find all the links on a page which were served from known green hosts. The next thing to do was to figure out how to separate them from the rest of the links on the page. I settled on adding a custom data attribute (<code class="language-markup">data-green-link="true"</code>) to any anchor tag that linked to a site that was hosted green. With this in place, anyone using the plugin could then target those tags using CSS or JS to make those links stand out.</p><p>On my own site, I've added an SVG to all external links that have the <code class="language-markup">data-green-link="true"</code> attribute. Regular external links look <a href="https://github.com/">like this</a>, while those which are green hosted <a href="https://thegreenwebfoundation.org/">look a little different</a>.</p><h2>Hitting publish</h2><p>With all that in place, I felt there was enough to put the plugin out into the wild. I've got a bit of experience publishing to NPM through my work on CO2.js, but I'd never put out a package of my own. I don't know why, but in my mind I had an idea that it would be some kind of complicated process of hoops I'd need to jump through. In reality, <code class="language-markup">npm publish</code> and a few more keystrokes was all that was needed.</p>
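<p>If you want to see what using the published plugin looks like, here's a rough sketch of wiring it into an Eleventy config. Treat this as illustrative rather than definitive - the README on Github is the source of truth for the exact import name and options (the <code class="language-markup">ignore</code> option shown here is covered in more detail below).</p><pre class="language-javascript"><code class="language-javascript">// .eleventy.js - a sketch of adding the plugin to an Eleventy project.
// Assumes the package has been installed with:
//   npm install eleventy-plugin-green-links --save-dev
const greenLinks = require("eleventy-plugin-green-links");

module.exports = function (eleventyConfig) {
  // Register the plugin. The (assumed) ignore option skips domains you already
  // know about, such as your own green-hosted domain.
  eleventyConfig.addPlugin(greenLinks, {
    ignore: ["fershad.com"],
  });

  // At build time, the plugin's transform adds data-green-link="true" to
  // anchor tags pointing at verified green hosts, ready to be styled with CSS.
};
</code></pre>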
In reality, <code class="language-markup">npm publish</code> and a few more keystrokes was all that was needed.</p><p>For future patch releases, I've been using <code class="language-markup">np</code> to manage the whole process.</p><h2>Adding configuration</h2><p>After dog fooding the plugin on my own site for a bit, I realised that I probably didn't need to be checking internal links. This is because I know that my site is hosted on Cloudflare, a verified green provider. With this in mind I went back to <code class="language-markup">eleventy-plugin-external-links</code> to look at allowing some configuration options to be passed into the plugin.</p><p>Eleventy transforms are just functions that take in a page's content and output path. Wrapping the transform function in a parent function allows for any number of additional variables to be passed in. For Green Links, I chose to allow a config object that contained a key of <code class="language-markup">ignore</code>. <code class="language-markup">ignore</code> could contain an array of domains that would not be checked when the transform runs.</p><p>The upside of having a configuration object is that it allows for <a href="https://fershad.com/writing/eleventy-plugin-green-links/#future-plans">more options to be added</a> to the plugin later on.</p><h2>Cool story, but why does this matter?</h2><p>Choosing a green web host for a website is one of the most impactful decisions any website owner can make. Based on peer-reviewed research, the Sustainable Web Design model says hosting accounts for <a href="https://sustainablewebdesign.org/calculating-digital-emissions/">15% of a website’s total energy usage</a>. Beyond making your own site more sustainable, it also sends a message to other hosting providers that their potential customers value services that are powered by renewable energy.</p><h2>Future plans</h2><p>There's no real roadmap for Green Links. I'm open to ideas from the community, and code contributions too. That said, I do have a few ideas which I might get around to working on when time permits.</p><ul><li>TypeScript - I've never written a project in TypeScript, so this could be a nice small playground to learn.</li><li>Caching results - This can help speed up build times slightly, but more importantly means the Green Web Foundation's API won't get slammed each time a site is updated.</li><li>Getting more info - Knowing a host is green is a great start, but there might be ways to surface even more information about a known provider (like what evidence backs up their claims of being a green host).</li></ul><h2>Use Eleventy Plugin: Green Links</h2><p>You can install Green Links into an Eleventy project by running <code class="language-markup">npm install eleventy-plugin-green-links --save-dev</code>. You can find more <a href="https://github.com/fershad/eleventy-plugin-green-links">installation and configuration instructions</a> on Github.</p></div>Curious about driving the transition to a fossil-free internet? Here’s how CO2.js can help.2024-02-20T13:25:46Zhttps://fershad.com/writing/curious-about-driving-the-transition-to-a-fossil-free-internet-heres-how-co2-js-can-help/<div><div class="callout"><p></p><p>This post was originally published on The Green Web Foundation's blog. 
You can <a href="https://www.thegreenwebfoundation.org/news/curious-about-driving-the-transition-to-a-fossil-free-internet-heres-how-co2-js-can-help/">read the original post here</a>.</p><p></p></div><p>Our mission at The Green Web Foundation is for a fossil-free internet by the year 2030. We know that getting there will take a collective effort on the part of technologists around the globe. That’s why we’re always looking for ways to leverage open source and open data. Our aim is to equip those in tech jobs with compelling, state of the art, practical and well documented tools and “patterns” for change. Tools and patterns that can be used <em>right now</em> in workflows and products.</p><p>CO2.js is just one of the tools we’ve created to help with this. This article explains the concepts behind CO2.js, it’s uses, and when other tools might be better options to consider.</p><h2>What is CO2.js?</h2><p>CO2.js is an open-source JavaScript library that enables developers to:</p><ul><li>estimate the carbon emissions produced by transferring bytes of data on the internet;</li><li>get different forms of grid intensity data, such as annual average and marginal data by country;</li><li>make automated queries against Green Web Foundation’s Green Domain’s dataset.</li></ul><p>At its core, CO2.js takes an input of measured activity, for example bytes sent over the internet, and returns an estimate of the carbon emissions produced doing so. It can be run in Node.js server environments, in the browser, as well as on some serverless and edge compute runtimes.</p><p>In addition to this, CO2.js champions the open data ethos which is a core part of our work at The Green Software Foundation. Since <a href="https://www.thegreenwebfoundation.org/news/release-guide-co2-js-v0-11/">v0.11</a> developers using the library have access to annual grid intensity data from <a href="https://ember-climate.org/">Ember</a> (open data non-profit) and the <a href="https://unfccc.int/">UNFCCC</a> (United Nations Framework Convention on Climate Change). CO2.js also provides a wrapper through which developers could make automated queries against The Green Web Foundation’s <a href="https://datasets.thegreenwebfoundation.org/">Green Web dataset</a>.</p><h3>Carbon estimates</h3><p>CO2.js brings together peer reviewed carbon estimation models, and packages them up into a versatile, open-source JavaScript library. Doing this allows developers, who may not have the domain knowledge required to calculate carbon estimates themselves, to build reliable and robust carbon measurements into their software, sites, or apps.</p><p>At present, CO2.js ships with two models for measuring digital carbon emissions - the Sustainable Web Design (SWD) model, and the OneByte model. The SWD model is the more recent of the two, and is the default model used when generating estimates using CO2.js. By making sensible defaults easily available, developers can quickly start measuring emissions right away, and see some results of changes you make.</p><p>The diagram below can help you visualise the trade-offs made when choosing versatile generalised models like SWD (on the left side of the illustration), as opposed to using more granular data and calculations (on the right) which we’ll talk about later in this post.</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/d7200a17ee2cec76f6fbd640ef5bba533cd13d52-2211x1465.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/d7200a17ee2cec76f6fbd640ef5bba533cd13d52-2211x1465.png?auto=format" alt=" " loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center"></figcaption></figure><p><strong>Further reading: <a href="https://developers.thegreenwebfoundation.org/co2js/explainer/methodologies-for-calculating-website-carbon/">Methodologies for calculating website carbon</a></strong> </p><h3>Free grid intensity data</h3><p>As of the CO2.js version 0.11 release in September, 2022 CO2.js, the library includes free average grid intensity data from <a href="https://ember-climate.org/data/data-explorer/"><strong>Ember</strong></a>, as well as marginal intensity data from the <a href="https://unfccc.int/"><strong>UNFCCC</strong></a> (United Nations Framework Convention on Climate Change). We believe that making this data open, and accessible to all developers can empower them to build carbon awareness and more specific carbon estimates into the tools they build.</p><p>Carbon intensity is a way of measuring how clean electricity is. Or, more precisely, how much CO2 is emitted with each unit of energy produced. The electricity that powers a grid comes from a variety of sources such as renewables, fossil fuel based or nuclear. We call this the fuel mix. And it's this fuel mix that influences the carbon intensity of a country’s electricity grid. Annual average emissions intensity reflects the fuel mix of the entire electricity grid over a year. You’ll see average intensity used in the majority of carbon reporting standards and tooling.</p><p><strong>Further reading: <span>Average and marginal intensity explained</span></strong> </p><h3>Green web hosting check</h3><p>CO2.js also comes with handy functions that allow developers to run automate checks of a website domain against the Green Web dataset.</p><p>The Green Web dataset is a list all the domains running on renewably powered infrastructure in our system, along with which organisation is hosting them, and the date of the last check. The dataset is actively maintained by The Green Web Foundation, and has been used in projects such as the <a href="https://almanac.httparchive.org/en/2022/sustainability#how-many-of-the-sites-listed-in-the-http-archive-run-on-green-hosting">HTTP Archive’s 2022 Web Almanac</a>.</p><h2>Why would you use CO2.js?</h2><p>The carbon emissions of the internet are something abstract and out of sight for most.</p><p>Changing that starts with being able to measure the carbon emissions associated with digital activities in a quick and reliable way. This is where CO2.js comes in.</p><p>The next part - surfacing, visualising and presenting these figures in innovative ways - is where, the curious developers, product teams and creators come in. We urgently need ways that make it easy for everyone to comprehend and act on the data.</p><p>Here’s a few awesome approaches already using CO2.js to do just this.</p><h3>Carbon budgets</h3><p>Having a <strong><strong><strong><strong><strong><strong><em>carbon budget</em></strong></strong></strong></strong></strong></strong> - a limit for carbon emissions that a website, app or specific page should not exceed - is a great way to ensure digital services are as low-carbon as possible.</p><p>Since most <strong>website analytics service</strong> run on the user’s browser, they have access to information about how much data is transferred with each page load, and can turn this into meaningful carbon estimates for website owners. 
Website analytics services <a href="https://withcabin.com/">Cabin Analytics</a> and <a href="https://statsy.com/">Statsy</a> both use CO2.js to present carbon emissions estimates for web pages based on user traffic.</p><p><a href="http://sitespeed.io/">Sitespeed.io</a>, a <strong>website performance monitoring tool</strong>, has been using CO2.js in its sustainability plugin since the very early days of the project. We spoke with creator Peter Hedenskog about it in more detail for a case study - <strong><strong><a href="https://www.thegreenwebfoundation.org/news/sitespeed-io-using-and-contributing-to-co2-js/">Sitespeed.io – Using and contributing to CO2.js</a></strong></strong>.</p><p>Since version 104, the Firefox <strong>browser</strong> includes <a href="https://www.mozilla.org/en-US/firefox/104.0/releasenotes/">device power usage data in their developer tools profiler results</a>. This is very useful in gaining insights into how much power a site or app is utilising both when it is in standby and also actively being used. With this granular level of data available, we authored the code required to take the power data that was already available, and generate carbon estimates from that suing CO2.js. With their help, we were able to get a <a href="https://github.com/firefox-devtools/profiler/pull/4372">pull request</a> merged in early December 2022.</p><p>The change we’ve made in the Firefox Profiler currently uses global annual average emissions figures to calculate a carbon estimate. However, it would certainly be possible to take this a step further and use a grid intensity value for the country/region in which the profile was captured. We’re hoping to continue collaborating with the Firefox team to make this change in 2023.</p><h3>Carbon estimations and dashboards</h3><p>CO2.js also lends itself nicely for use in carbon estimation tools and dashboards.</p><p><a href="https://ecograder.com/">Ecograder</a> is a web page testing tool built by MightBytes which uses CO2.js to calculate the carbon emissions, and check for green hosting, of a web page that is tested. They couple this information with data from other sources to present an overall score for the page.</p><p>Ecograder is an example of a public facing tool, but a similar idea could be used to create carbon calculator tools for internal company use. This can be a way for organisations, or teams, to evaluate and understand the carbon impact of their digital use. Office managers and sustainability teams could work with developers to use CO2.js to track the carbon intensity of data usage within an office environment. Plugging network data usage into CO2.js can allow for monitoring and reporting on the digital usage footprint of an organisation or business.</p><p><a href="https://www.thegreenwebfoundation.org/news/release-guide-co2-js-v0-12/">Version 0.12 of CO2.js</a> introduced the ability to customise grid intensity and other constants that are used in the Sustainable Web Design model calculations. This paves the way for more detailed, case specific carbon emissions estimates. It means that a company located in Argentina can use that country’s grid intensity values when calculating digital carbon emissions, rather than relying on a global average figure.</p><h3>Automated testing and workflows</h3><p>Developers and consultants may look to use CO2.js to create automated tasks that check multiple domains for green web hosting. 
<p><a href="https://stepci.com/">Step CI</a> is an open-source framework for testing APIs. It uses CO2.js to give developers data on the carbon emissions of API calls, as well as enabling them to set a <strong><em>carbon budget</em></strong> which their API should stay within.</p><h3>Creating carbon aware user experiences</h3><p>The idea of a carbon aware website or app is one where the content/user experience changes depending on the grid intensity of the electricity grid. When the grid is powered by more renewable energy, more complex interfaces and content might be shown. When the grid is powered by more fossil fuels, a simpler, less processor intensive experience is presented.</p><p>The grid intensity data present in CO2.js makes it possible for developers to start building carbon aware experiences. As a starter, the annual country level emissions could be used to trigger carbon aware UI or functionality based on a user’s location. This can be taken a step further by using a real-time grid intensity API service to get the current grid intensity of a user’s location. This can then be compared to the average annual emissions available in CO2.js to determine if a user should see a low-carbon or regular experience. It’s something that <a href="https://fershad.com/writing/making-this-website-carbon-aware/">I’ve experimented with</a> on my personal website.</p><p>App developers can also use CO2.js in user-facing applications to give visibility to the carbon impact of user activity in the application. Users uploading files, or downloading content, could be notified of the impacts of those tasks. Large, carbon-intensive, data transfers could also be blocked or limited. Users could also have the option to set a carbon budget for their browsing or use of an app, website, or online service.</p><h2>When not to use CO2.js</h2><p>The above examples of how CO2.js is being used to calculate carbon estimates are pretty cool (at least we think so!).</p><p>That said, CO2.js isn’t a catch all solution for digital carbon estimation.</p><p>The Sustainable Web Design and OneByte models are estimation models, which have their own definitions for <a href="https://www.wholegraindigital.com/blog/website-energy-consumption/">system boundaries</a>, as well as their own assumptions about energy usage and grid intensity. While they are based on the best available research at the time, it must be remembered that they are still <strong>estimation</strong> models.</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/4bf7fb32a94138108a78b518f133bee588bc42ed-950x534.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/4bf7fb32a94138108a78b518f133bee588bc42ed-950x534.png?auto=format" alt=" " loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">The breakdown of system segments used for calculations in the Sustainable Web Design model.</figcaption></figure><p>Ensuring you’re using the right tool for what you’re trying to measure can be tricky, so we’ve listed a few scenarios in which other tools are more appropriate than CO2.js.</p><h3>Measuring cloud or serverless carbon emission</h3><p>With more and more compute shifting to <strong><strong><strong><strong><em>the cloud</em></strong></strong></strong></strong>, it is becoming increasingly important for operations teams to be aware of the carbon emissions for their cloud-based workloads. <a href="https://www.cloudcarbonfootprint.org/">Cloud Carbon Footprint</a> is an open-source tool that provides visibility and tooling to measure, monitor and reduce your cloud carbon emissions. It works with major cloud providers (AWS, Google Cloud, and Microsoft Azure), and provides energy and carbon estimates that include embodied emissions from device manufacturing.</p><p>When comparing it to CO2.js, Cloud Carbon Footprint is a better tool for measuring the emissions of cloud-base workloads as it uses actual CPU utilisation for calculations whenever that data is available to it. This is far more accurate than the generalised figures used for data center emissions found in both the Sustainable Web Design and OneByte models.</p><h3>Generating extremely precise carbon emissions figures</h3><p>If you need to generate precise carbon emissions figures specific to your individual use case (for example, running multiple serverless functions that run in different regions), then you’re going to need to set up your own monitoring, measurement, and calculation system. Some open-source tools and methodologies that can help with this include:</p><ul><li><a href="https://github.com/hubblo-org/scaphandre">Scaphandre</a> - a metrology agent dedicated to electrical power consumption metrics.</li><li><a href="https://github.com/marmelab/greenframe-cli">Greenframe CLI</a> - a command line tool to estimate the carbon footprint of a user scenario on a web application.</li><li><a href="https://dimpact.org/methodology">The DIMPACT methodology</a> - a detailed methodology to estimate the carbon impacts of serving digital media and entertainment products.</li></ul><h3>Using more detailed grid intensity data</h3><p>Both the average and marginal grid intensity data in CO2.js are annual figures. These are a great start, and can be used as an indicator for how clean or dirty a country’s energy grid is. However, if you are looking to build an app, site, or service that responds to real-time grid intensity in some part of the world, then you’ll need to use a dedicated service to get that data.</p><p><a href="https://www.electricitymaps.com/">Electricity Maps</a> and <a href="https://www.watttime.org/">WattTime</a> are two providers who can give you close-to-real-time regional-level grid intensity data.</p><p>Additionally, the <a href="https://github.com/Green-Software-Foundation/carbon-aware-sdk">Green Software Foundation’s Carbon Aware SDK</a> provides a WebAPI and a CLI tool that can be used to fetch current grid intensity data. The Carbon Aware SDK can also be used to fetch current and forecast grid intensity data Electricity Maps and WattTime through a single set of API endpoints. 
With this you can build out carbon aware apps that respond to how green the grid is at any given time.</p><h2>Using and contributing to CO2.js</h2><p>Developers who want to start using CO2.js can check out our quickstart guide - <strong><strong><a href="https://www.thegreenwebfoundation.org/news/start-calculating-digital-carbon-emissions-in-5-minutes-with-co2-js/">Start calculating digital carbon emissions in 5 minutes with CO2.js</a></strong></strong>.</p><p>Our <a href="https://developers.thegreenwebfoundation.org/co2js/installation/"><strong>CO2.js developer documentation website</strong></a> goes into more detail on the different models, methods, and data available in the library.</p><p>CO2.js is an open-source project, and we welcome community contributions. Code is <a href="https://github.com/thegreenwebfoundation/co2.js">kept on GitHub</a>, where you’ll be able to track future releases, as well as contribute to the future direction of the library through issues and pull requests.</p><p>If you’ve made something using CO2.js, we’d love to know! Show us your handy work on Twitter (<a href="https://twitter.com/greenwebfound">@greenwebfound</a>), <a href="https://www.linkedin.com/company/the-green-web-foundation/">LinkedIn</a>, or by using <a href="https://www.thegreenwebfoundation.org/support-form/">our website</a>.</p><p> </p></div>Release guide: CO2.js v0.122024-02-20T13:25:46Zhttps://fershad.com/writing/release-guide-co2-js-v0-12/<div><div class="callout"><p></p><p>This post was originally published on <a href="https://www.thegreenwebfoundation.org/news/release-guide-co2-js-v0-12/">The Green Web Foundation's blog</a>.</p><p></p></div><p>CO2.js v0.12.0 introduces the ability to customise the figures used in carbon emissions calculations when using the Sustainable Web Design model, paving the way for more case specific carbon emissions estimates.</p><h2>Adjust constants used by Sustainable Web Design</h2><p>The Sustainable Web Design model applies a number of constants to its carbon emissions calculation. They are:</p><ul><li>What percentage of visits to a site are new visitors</li><li>What percentage of visits to a site are returning visitors</li><li>What percentage of data for return visitors is downloaded</li><li>The global average grid intensity (442 g/kWh) is used for all segments. Green hosted data centers use a grid intensity of 50 g/kWh.</li></ul><p>These constants allow for general carbon estimations to be made using the model. However, in order to return emissions estimates that are more situation specific emissions estimates, developers need to be able to easily make adjustments to these values. The ability to do this has been requested in multiple issues (<a href="https://github.com/thegreenwebfoundation/co2.js/issues/120">#120</a> & <a href="https://github.com/thegreenwebfoundation/co2.js/issues/109">#109</a>) and it is now possible through two new functions.</p><h3>Two new functions</h3><p>In v0.12.0, we have introduced two new functions which allow developers to pass an options object containing customised values for the constants mentioned above. 
These new functions are modifications that build on the <code class="language-markup">perByte</code> and <code class="language-markup">perVisit</code> functions:</p><ul><li><code class="language-markup">perByteTrace</code></li><li><code class="language-markup">perVisitTrace</code></li></ul><p>It should be noted that both the <code class="language-markup">perByte</code> and <code class="language-markup">perVisit</code> functions are still present in CO2.js. In the example below, the <code class="language-markup">perVisitTrace</code> function is used to estimate emissions for 1 million bytes.</p><pre class="language-javascript"><code class="language-javascript">import tgwf from '@tgwf/co2';
const co2 = new tgwf.co2({ model: 'swd' });
co2.perVisitTrace(1000000, false, {
dataReloadRatio: 0.6,
firstVisitPercentage: 0.9,
returnVisitPercentage: 0.1,
gridIntensity: {
device: 565.629,
dataCenter: { country: "TWN" },
networks: 442,
},
})</code></pre><p>Here you can see that we have passed the function an object as the third parameter. Inside of that, we can use the following keys to adjust the constants used by the Sustainable Web Design calculation:</p><ul><li><code class="language-markup">dataReloadRatio</code> - a number between 0 and 1 representing the percentage of data that is downloaded by return visitors.</li><li><code class="language-markup">firstVisitPercentage</code> - a number between 0 and 1 representing the percentage of new visitors.</li><li><code class="language-markup">returnVisitPercentage</code> - a number between 0 and 1 representing the percentage of returning visitors.</li><li><code class="language-markup">gridIntensity</code> - an object that can contain the following keys:<ul><li><code class="language-markup">device</code> - the grid intensity to use for the device segment.</li><li><code class="language-markup">dataCenter</code> - the grid intensity to use for the data center segment.</li><li><code class="language-markup">networks</code> - the grid intensity to use for the networks segment.</li></ul></li></ul><p>The values for <code class="language-markup">device</code>, <code class="language-markup">dataCenter</code>, and <code class="language-markup">networks</code> can be either:</p><ul><li>A number representing the carbon intensity for the given segment (in grams per kilowatt-hour). In the example above, we have set <code class="language-markup">device</code> and <code class="language-markup">network</code> grid intensity in this way.</li><li>An object, which contains a key of <code class="language-markup">country</code> and a value that is an <a href="https://www.iso.org/obp/ui/#search">Alpha-3 ISO country code</a>. In the example above, we have set <code class="language-markup">dataCenter</code> in this way, using the country code for Taiwan (TWN).<ul><li>When setting grid intensity with a country code, CO2.js will use the <a href="https://developers.thegreenwebfoundation.org/co2js/data/">average grid intensity data for that is included in the library</a>. If no data is available for the country, then the global grid intensity (442 g/kWh) is used.</li></ul></li></ul><h3>Traceable results</h3><p>Since we are allowing developers to divert from the baseline Sustainable Web Design calculation, we have decided to modify the way results are returned in the new <code class="language-markup">perByteTrace</code> and <code class="language-markup">perVisitTrace</code> functions. In this way, developers have a type of audit trail for their results. Let’s compare the results of measuring 1 million bytes using the <code class="language-markup">perVisit</code> and <code class="language-markup">perVisitTrace</code> functions.</p><pre class="language-javascript"><code class="language-javascript">import tgwf from "@tgwf/co2";
const co2 = new tgwf.co2({ model: "swd" });
const original = co2.perVisit(1000000)
/* Returns
0.2703051000000001
*/
const trace = co2.perVisitTrace(1000000, false, {
gridIntensity: {
device: 565.629,
dataCenter: { country: "TWN" },
// Notice here that we've not included the network key.
// CO2.js will use the default value if the key-value is not provided.
},
dataReloadRatio: 0.6,
firstVisitPercentage: 0.9,
returnVisitPercentage: 0.1,
});
/* Returns
{
"co2": 0.4081089199680001,
"green": false,
"variables": {
"description": "Below are the variables used to calculate this CO2 estimate.",
"bytes": 1000000,
"gridIntensity": {
"description": "The grid intensity (grams per kilowatt-hour) used to calculate this CO2 estimate.",
"network": 442,
"dataCenter": 565.629,
"production": 442,
"device": 565.629
},
"dataReloadRatio": 0.6,
"firstVisitPercentage": 0.9,
"returnVisitPercentage": 0.1
}
}
*/
</code></pre><p>You can see that while the original <code class="language-markup">perVisit</code> function returns a numeric value, the <code class="language-markup">perVisitTrace</code> function returns an object with the resulting co2 estimate, as well as details of all the variables that were used in the calculation. Let’s go through them one by one.</p><ul><li><code class="language-markup">co2</code> - this is the resulting carbon emissions estimate. It is the value you most likely want to show to users/use somewhere else.</li><li><code class="language-markup">green</code> - a boolean that is passed into the function indicating whether the host data center should be considered a green host.</li><li><code class="language-markup">variables</code> - an object containing details of all the constants used in the carbon emissions calculation. This object contains:<ul><li><code class="language-markup">description</code> - a string of text describing the object’s contents.</li><li><code class="language-markup">bytes</code> - the number if bytes for which the CO2 estimate is being calculated.</li><li><code class="language-markup">gridIntensity</code> - an object containing the grid intensities used for the <code class="language-markup">network</code>, <code class="language-markup">dataCenter</code>, <code class="language-markup">production</code>, and <code class="language-markup">device</code> segments. It also contains a <code class="language-markup">description</code> field for context.</li></ul></li><li><code class="language-markup">dataReloadRatio</code> - the value used to account for the data downloaded by returning visitors.</li><li><code class="language-markup">firstVisitPercentage</code> - the value used to account for the impact of new visitors.</li><li><code class="language-markup">returnVisitPercentage</code> - the value used to account for the impact of returning visitors.</li></ul><h3>Request for feedback</h3><p>At present, both these functions should be considered experimental. Their APIs could change in future versions for CO2.js. We hope that they are a useful addition to the library for developers, and eventually plan to make them the recommended way to generate carbon emissions estimates.</p><p>In the meantime, we encourage you to try them out and leave us your feedback in the <a href="https://github.com/thegreenwebfoundation/co2.js/issues">CO2.js Github repository</a>.</p><h2>Returning segment-level emissions estimates</h2><p>When working with the Sustainable Web Design model, developers now have the option to return a breakdown of carbon emissions estimates by system segment. This can be used to give a more detailed representation of a carbon emissions estimate.</p><p>Returning segment-level emissions estimates can be done by passing a <code class="language-markup">results</code> key with the value of <code class="language-markup">"segment"</code> when initialising CO2.js.</p><pre class="language-javascript"><code class="language-javascript">import tgwf from "@tgwf/co2";
const co2 = new tgwf.co2({ model: "swd", results: "segment" });
const byte = co2.perByte(1000000)
/* Returns
{
"consumerDeviceCO2": 0.1861704,
"networkCO2": 0.05012280000000001,
"productionCO2": 0.06802380000000001,
"dataCenterCO2": 0.053703,
"total": 0.35802000000000006
}
*/
const visit = co2.perVisit(1000000)
/* Returns
{
"consumerDeviceCO2 - first": 0.13962780000000002,
"consumerDeviceCO2 - subsequent": 0.0009308520000000001,
"networkCO2 - first": 0.0375921,
"networkCO2 - subsequent": 0.0002506140000000001,
"productionCO2 - first": 0.05101785000000001,
"productionCO2 - subsequent": 0.00034011900000000005,
"dataCenterCO2 - first": 0.04027725,
"dataCenterCO2 - subsequent": 0.000268515,
"total": 0.2703051000000001
}
*/
</code></pre><p>Rather than returning a numeric value, the functions now return an object. In the two results above, the <code class="language-markup">total</code> key represents the overall carbon emissions estimate.</p><p>In the <code class="language-markup">perByte</code> function, you can see results for device, network, data center, and production segments. The <code class="language-markup">perVisit</code> function breaks this down further, showing first and subsequent (return) visit emissions for each segment.</p><p>Note that this will also work for the new <code class="language-markup">perByteTrace</code> and <code class="language-markup">perVisitTrace</code> functions that have been introduced in v0.12.0.</p><p>You can find <a href="https://github.com/thegreenwebfoundation/co2.js/releases">details of every release</a> for CO2.js on GitHub, where you’ll also be able find the <a href="https://github.com/thegreenwebfoundation/co2.js/blob/main/CHANGELOG.md">changelog</a> for this project.</p><p>If you are using CO2.js in production then The Green Web Foundation would love to hear from you! Use the <a href="https://www.thegreenwebfoundation.org/support-form/">contact form</a> on the website to get in touch.</p></div>Power measurements and CO2e estimates in Firefox Profiler2024-02-20T13:25:46Zhttps://fershad.com/writing/co2e-estimates-in-firefox-profiler/<div><p>Earlier this year, an update to the Firefox Dev Tools Profiler was merged into production (Jan. 10, 2023 - <a href="https://github.com/firefox-devtools/profiler/pull/4414">here's the PR</a>). It includes a merge of some code I helped author which surfaced carbon emissions estimates for power measurements made using the profiler. You may remember me using a pre-release version of this when I <a href="https://fershad.com/writing/cop27-egypt-a-webpage-sustainability-review/">audited the COP27 website</a> last year.</p><p>Having code merged into a Mozilla project is wild. It's even cooler when it gets a shout out at <a href="https://fosdem.org/2023/schedule/event/energy_power_profiling_firefox/">FOSDEM</a>.</p><h2>What is the Firefox Profiler?</h2><p>In a nutshell, the Firefox Profiler is a Developer Tool in the Firefox browser that can be used to capture a performance profile of browser usage. This profile contains extremely low-level detail on the processes and functions that were being run at the time it was capture, and is extremely useful for debugging performance issues, memory leaks, and much more.</p><h3>Power measurements in Firefox Profiler</h3><p><a href="https://www.mozilla.org/en-US/firefox/104.0/releasenotes/">As of Firefox 104</a>, the profiler has been able to capture power usage. About a month ago I had the pleasure of talking with Florian Quèze, one of the developers responsible for getting this feature into the Profiler. As you can imagine, capturing the power profile of a browser is no easy feat, and there's a bunch of important caveats to keep in mind if you do go ahead and use it.</p><ol><li>The power profile that is capture on Intel CPU devices is that of the <em>entire system</em>. That means if you've got a whole bunch of background apps running, their power usage will be captured too.</li><li>On AMD CPUs, you can get power profiles broken down per core, but again they capture <em>entire system</em> power draw.</li><li>On Apple Silicon CPUs, you can get a <em>per process</em> breakdown. 
This should allow you to isolate the power track for the browser window that is being captured.</li><li>Capturing power profiles works well on Windows 11 and Apple Mac devices, but needs some manual steps to work on Linux. I've not yet managed to get it going on my Linux device.</li><li>It's currently not possible to capture the power profile of a mobile device.</li></ol><h3>CO2e estimates in the Firefox Profiler</h3><p>The CO2e (carbon dioxide equivalent) emissions estimates that I helped add to the Profiler are pretty straight forward. It works by:</p><ol><li>Taking the power figures recorded by the profiler and converting them to kilowatt-hours.</li><li>Multiplying this value by 442 g/kWh (the global average grid intensity in grams of one kilowatt-hour).</li></ol><p>Ideally, we’d like to use region-specific figures for even greater detail. There’s been some <a href="https://github.com/firefox-devtools/profiler/pull/4243#issuecomment-1266624528">conversation around this</a> as part of the PR for this feature. I expect we'll be able to get something like this implemented sometime this year.</p><h2>Recording a profile</h2><p>Another thing that came out of my chat with Florian, was that he recommended using the Profiler Toolbar Extension for Firefox when capturing power profiles. The reason, he said, was that they had put in a lot of work into making the extension consume as little power as possible so as not to impact the results of the recording. To add it to your Toolbar, go to <a href="https://profiler.firefox.com/">https://profiler.firefox.com</a> and click on the <em>Enable Firefox Profiler Menu Button</em>.</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/f93b05a2dfa1d52c97e209360eebb7443707ea12-1404x1073.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/f93b05a2dfa1d52c97e209360eebb7443707ea12-1404x1073.png?auto=format" alt=" " loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">Screen capture showing how to add the Firefox Profiler to the browser taskbar.</figcaption></figure><p>This gives you a one-click profiling option through the Firefox taskbar. Another way to capture a profile is through DevTools. In the browser, open up DevTools and navigate to the Performance tab. There, you’ll see a dropdown with different presets which you can use. There’ll be a Power option in that list. With that selected, you can start recording!</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/c002eb1d76f1294417f3f7f63ffef37af7edcd7c-989x443.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/c002eb1d76f1294417f3f7f63ffef37af7edcd7c-989x443.png?auto=format" alt=" " loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">Screenshot from Firefox 106 DevTools, showing the Power preset selected in the Performance tab.</figcaption></figure><h3>My recommendation</h3><p>Given what I mentioned above about how the profiler captures power use of the <strong><strong><strong><strong><strong><strong>entire system</strong></strong></strong></strong></strong></strong>, it might make sense to run a clean profile first (with just Firefox open, no tabs open) to give you a comparison point when trying to examine how power hungry a page is.</p><h2>Reading a profile</h2><p>Depending on the system you’ve run the profile on, you’ll be presented with different Power tracks. In my examples, I’ve captured a profile on a Windows 11 Surface Pro 6 Laptop. This is an Intel powered device, and so I’m presented the with below four tracks:</p><ul><li>Power: DRAM</li><li>Power: CPU package</li><li>Power: CPU cores</li><li>Power: iGPU</li></ul><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/b8b3f4fc314f3da222b1397eedca403b29ed92a9-873x196.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/b8b3f4fc314f3da222b1397eedca403b29ed92a9-873x196.png?auto=format" alt=" " loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">Screenshot showing Power tracks (rows) in the Firefox Profiler.</figcaption></figure><p>From what I understand now, to get the full power usage during the profile I’ve capture I would want to sum together the values for <strong><strong><strong><strong><strong><strong>Power: DRAM</strong></strong></strong></strong></strong></strong> and <strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong>Power: CPU Cores.</strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong> I can get these values by hovering over the respective tracks and noting the <strong><strong>Energy use in the visible range</strong></strong> value.</p><p>You’ll also notice the little CO2e value in parentheses after the power reading. You can use that for a ballpark carbon estimate, or you can get more precise if you want by converting the power measurement to kilowatts per hour, and then multiplying that by an appropriate grid carbon intensity figure.</p><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">Update - 19.09.2023</p><p></p><div><p>The team at Mozilla have been working a bit more on this feature recently, and now it's possible to change the grid intensity that is used to calculate the CO2e estimate in the profiler.</p><p>I've written about how to do that in this quick note - <a href="https://qt.fershad.com/writing/change-gco2kwh-firefox-profiler/">Change the value for CO2e calculations in Firefox Profiler</a></p></div><p></p></div><h2>Watt else could you do?</h2><p>Get it. Watt. Power measurements. It’s been a long day, fam.</p><p>I don’t want to be a bad influence, and nerdsnipe folks into spending their weekends looking at power profiles. That said, I’ll just put a couple of ideas out there.</p><p>Since the profiler is capturing system level power consumption, you could feasibly:</p><ul><li>Measure the power usage of desktop applications, or</li><li>Measure the power usage of the same web page across different browsers, or</li><li>Measure the power usage of a web page in dark & light mode</li></ul><p>With all the other above, you’d want to capture a clean baseline of your system’s power usage before capturing a profile for whatever you’re testing.</p><h2>Closing thoughts</h2><p>This is super cool stuff, and exposes a lot of potential avenues for research and auditing. The impact of device usage is significant as a whole, even if the impact of individual devices is measured in milligrams. One only needs to look at <a href="https://blog.mozilla.org/en/mozilla/release-mozillas-greenhouse-gas-emissions-baseline/">Mozilla’s own greenhouse gas emissions findings</a>.</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/d0fc5ea8b241ce725d4ed429fc92e553c2eea97a-1920x1080.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/d0fc5ea8b241ce725d4ed429fc92e553c2eea97a-1920x1080.png?auto=format" alt=" " loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">Chart showing that 98% of Mozilla’s emissions in 2019 came from the use of their products.</figcaption></figure></div>Making this website carbon aware2024-02-20T13:25:46Zhttps://fershad.com/writing/making-this-website-carbon-aware/<div><p>The idea of building carbon awareness into digital products, services, or apps is something that you’ll probably start to read a bit more about in the coming years. You might here some people talk about it as <em>“moving compute through time and space”</em>. Sounds very sci-fi, ey?</p><p>In fact, making digital products carbon aware is already starting to be a thing. Last year, Microsoft announced that <a href="https://support.microsoft.com/en-us/windows/windows-update-is-now-carbon-aware-a53f39bc-5531-4bb1-9e78-db38d7a6df20">Windows Update is now carbon aware</a>, while Apple have also taken <a href="https://support.apple.com/en-us/HT213323">tentative steps in this direction</a> with iOS version 16.1. Google’s been <a href="https://blog.google/outreach-initiatives/sustainability/carbon-aware-computing-location/">shifting workloads between data centers</a> based on the availability of carbon-free energy since 2021. The Green Software Foundation also held <a href="https://greensoftware.foundation/articles/carbonhack22-a-big-leap-in-carbon-aware-computing">CarbonHack22</a> last year, the first ever hackathon focused on carbon aware software and their applications.</p><h2>What makes something carbon aware?</h2><p>At its simplest, for something to be carbon aware means that it has an understanding of how clean/dirty the electricity grid it operates on is. This is called “grid intensity”, and it’s an important term that you’ll see mentioned a lot through this post. I’ll use it interchangeably with the term “carbon intensity”. These terms describe a way of measuring how much CO2 is emitted by producing a unit of electricity.</p><p>A clean grid is powered by more renewable/low-carbon fuel sources, while a dirtier grid gets more generation from fossil-fuels. Grid intensity isn’t a static metric. It can shift throughout the day and night, for example, as more solar is available when the sun’s up compared to when it’s not. Things that are carbon aware will have access to this information, and can be built to perform more resource intensive operations during periods when the grid is powered by more renewable energy (low grid intensity), while offloading less power-hungry tasks to period of higher fossil-fuel generation (high grid intensity).</p><p>Asim Hussain has <a href="https://devblogs.microsoft.com/sustainable-software/carbon-aware-vs-carbon-efficient-applications/">a terrific post about carbon-aware and carbon-efficient software</a>, which is definitely worth taking the time to read.</p><h2>So, a carbon-aware website?</h2><p>The idea of making a website carbon aware first came into my consciousness when <a href="https://branch.climateaction.tech/">ClimateAction.Tech’s Branch Magazine</a> launched in 2020. Depending on the carbon intensity of the UK’s electricity grid at the time you visit, you’ll be presented with a different version of the site.</p><p>Tom Jarrett has <a href="https://branch.climateaction.tech/issues/issue-1/designing-branch-sustainable-interaction-design-principles/">written about the considerations</a> that went into designing the Branch Magazine website to be responsive to grid intensity at a given time. The site is built on WordPress, and hosted in the United Kingdom. 
As far as I can tell from looking at the code, the Branch Magazine website uses data from <a href="https://www.carbonintensity.org.uk/">https://www.carbonintensity.org.uk/</a> to determine the current grid intensity and adjust the site’s appearance accordingly.</p><p>What this means, is that when I visit the site from Taiwan (which doesn’t have the cleanest power grid) I will see a carbon-aware version of the site that is based on where it is hosted rather than my local grid’s current intensity. I have no problem with this, and in some ways the decision makes sense to me. But since I first saw the carbon-aware design of Branch Magazine, I’ve always wondered if it would be possible to do the same thing but for it to be <em>based on the current grid intensity of the website visitor’s location</em> rather than the host.</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/3050520fa7f1e223db7537bac618dad01efeaefd-2376x1290.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/3050520fa7f1e223db7537bac618dad01efeaefd-2376x1290.png?auto=format" alt=" " loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">A side-by-side comparison of the carbon-aware (left) and regular (right) versions of my website. Visually, there’s not much noticeable difference, which is by design.</figcaption></figure><h2>The wait for data</h2><p>Branch was launched in late 2020. I’m now writing this in early 2023. Does that mean I’ve been subconsciously nerdsniped by this problem for the last two- and a-bit years? Yes. Yes, it does.</p><p>The problem stopping me from going further with my initial thought was having access to real-time grid intensity data. In 2020, the only provider I knew of was <a href="https://fershad.com/writing/making-this-website-carbon-aware/electricitymaps.com">Electricity Maps</a>. However, access to their API was paid and well out of reach for me in terms of affordability - especially for what effectively would be a prototyping exercise.</p><p>In recent times I’ve become aware of <a href="https://www.watttime.org/">WattTime</a>, another provider of real-time grid intensity data and forecasting. Again, especially at the time I found it, WattTime’s data was offered through a paid API which I couldn’t justify spending on.</p><p>From time to time, I’d have thoughts of sourcing grid intensity data myself. I’ve consider doing this for a handful of locations where most of my website’s visitors were coming from. But as a freelancer, just keeping my head above the water through a global pandemic, I wasn’t able to dedicate the time that would be required for that endeavour.</p><h3>Enter CO2signal</h3><p>The data problem was resolved when I stumbled across <a href="https://co2signal.com/">CO2signal</a>. It is a free, rate-limited API provided by Electricity Maps. The CO2signal API returns real-time, country-level grid intensity data based on either latitude & longitude coordinates, or a valid country code. This is pretty much what I needed to try out my crazy idea.</p><p>I cannot emphasise enough how important open, accessible data is in driving the ideation and innovation that will be critical in our transition to a cleaner, greener web/world. I’m not saying that folks should not be able to package up their hard work into paid services. It’s worth remembering, though, that developers especially have a lot of leverage. Allowing them to have access to quality data enables them to try new ideas, and build prototypes that can help shift thinking, inspire others, and help drive the change the planet needs.</p><h2>Building out an idea</h2><p>Okay, so I finally had some real-time grid intensity data I could use. Next, I needed time.</p><p>Living in Taiwan, the Lunar New Year holiday provides a solid 5-10 days (depending on the year) where things are closed, and people are either spending time with family or travelling. In our house, we normally head to central Taiwan for a few days, before returning north to Taipei.</p><p>I found out about CO2signal in December, 2022. Lunar New Year was early this year, in mid-January, 2023, so I sat on the idea until then. It gave me some time to think about what I might want to build, and some ways to go about it.</p><h3>Roughly specing it out in my head</h3><p>In the lead up to Lunar New Year, I started thinking more about this <em>thing</em> I wanted to do. I decided to try and apply carbon awareness to my own website, because it’s there to allow me to test out crazy ideas like this, right? 
Thinking through what I wanted to build revolved around a few key questions:</p><ul><li>What would a carbon-aware site look like for me?</li><li>What should the threshold be before “low-carbon mode” is activated?</li><li>Should/could this be implemented server-side, or did it need to run in the browser?</li><li>What, if any, control should users have over applying carbon-aware changes to the site?</li><li>How can I keep it working without blowing through CO2signal’s API limits?</li></ul><h4><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong>What would a carbon-aware site look like?</strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></h4><p>My website is already pretty lean, and I’ve tried to apply <a href="https://sustainablewebdesign.org/">Sustainable Web Design principles</a> where ever they’ve made sense. So, I had a think about what adjustments I could still make on a carbon-aware version of my site.</p><ul><li>Dark mode - nope, not yet. I’ll get a dark mode added one day. I could have done a quick and dirty colour swap, but I’d rather take time to make something more polished.</li><li>Lower image quality - definitely doable. I serve images from <a href="https://cloudinary.com/">Cloudinary</a>, and can easily change up a URL parameter or two to achieve this.</li><li>Remove/block JavaScript - again, pretty doable. There’s only a handful of JavaScript on this site anyway, none of which is critical to functionality.</li><li>Remove/block CSS - doable, but nah. I felt that might be a bit jarring for visitors to start with. It could be something I can allow users to turn on though. <a href="https://www.zachleat.com/">Zach Leatherman has a feature like that on his site (scroll to the footer).</a></li><li>Remove/hide images - Possible. But after trying it out, I felt the site would need a further redesign to make it look good when images were missing. Decided against it for now.</li></ul><h4><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong>When does “low-carbon mode” kick in?</strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></h4><p>As far as I’m aware, there’s no agreed upon standard that says <em>“at this carbon intensity, a grid should be considered high intensity”</em>. Without this, I had to make my own call. I settled on 221 g/kWh (grams per kilowatt-hour). This is half of the global average grid intensity, and so felt as good as any number I could pluck out of thin air.</p><h4><strong><strong><strong><strong><strong><strong>Server- or client-side?</strong></strong></strong></strong></strong></strong></h4><p>In Branch Magazine’s solution, the carbon intensity check takes place through a client-side script that runs when a user lands on a page. I wanted to avoid this if I could. 
First, there’s the “cost” of waiting for the script to parse, fetch data, and execute. I’d need to wait for this to complete before rendering the page, to ensure I’m not downloading content that’s not needed. On some pages, like my <a href="https://fershad.com/writing/">writing index page</a>, it would also require a large amount of DOM manipulation. This all could take time on slow connections or low-powered devices.</p><p>I also needed to know what country the website visitor was located in. I thought about trying out the Geolocation API, but it requests user permission to be activated. As someone visiting a blog website like mine, I’d be pretty freaked out if I saw the browser requesting my location, even if only for a <em>trying-out-an-idea-prototype</em>.</p><p>After a bit of digging, I found that the <code class="language-markup">request</code> object received by Cloudflare Workers <a href="https://developers.cloudflare.com/workers/runtime-apis/request/#incomingrequestcfproperties">includes <code class="language-markup">cf.country</code>, <code class="language-markup">cf.latitude</code>, and <code class="language-markup">cf.longitude</code> headers</a> with each request. This information comes from Cloudflare’s edge. I host my site on Cloudflare Pages, and so hooking up a Worker script would be absolutely possible. To boot, I’d be able to use <a href="https://developers.cloudflare.com/workers/runtime-apis/html-rewriter/">Cloudflare Workers HTMLRewriter</a> to modify a page on the server, before sending it back to the browser. I’ve used it before, but it still blows my mind every time I reach for it.</p><p>So that settled it. I’d be doing things server-side (or more specifically, on the edge).</p><h4><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong>How much control should the user have?</strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></h4><p>I’ve decided to make the carbon-aware experience on this site <em>opt-out</em>. Part of building out this idea is to expose people to the notion of carbon-aware digital. By making it opt-out, more people are likely to see it as a starter. If that can trigger the interest of a few more folks to start thinking about how they build for/use the web, then that’s a win.</p><p>I also want the experience of a carbon-aware website to be as seamless as possible for the visitor. In a way, it’s a challenge to the thinking that “low carbon” equals a poorer experience, or poorer performance.</p><p>In the end I decided that if a visitor was presented with the low carbon experience, they would be shown a message explaining what they are seeing. They’d have the option to visit the regular version of the site if they want to. If they opt for this, I set a cookie that expires after 1 day to indicate their preference. 
The Cloudflare Worker that does the heavy lifting to make the site carbon-aware looks for this cookie in the request, so the user can continue navigating the site without being bothered again.</p><h4><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong>Operating within the CO2signal rate limits</strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></h4><p>CO2signal allows for 30 requests per hour. Most of the time, my website doesn’t go anywhere near that number of visitors. Occasionally, though, it does. Also, grid intensity data doesn’t update <em>that frequently</em>, so I could probably get away with saving it somewhere for a while. Since I was using Cloudlfare Workers, an easy solution for this was to use <a href="https://developers.cloudflare.com/workers/runtime-apis/kv">Workers KV</a> to cache data from the API. I decided to cache results for 1-hour, after which time the grid intensity would be fetched again from the API. This prevents me having to repeatedly hit the CO2signal API, especially in the rare times when there’s a spike in traffic.</p><h2>Putting it all together</h2><h3>How it works</h3><p>When someone visits any page of this website, the following process is kicked off:</p><ol><li>A Cloudflare Worker checks the request object for the latitude, longitude, and country of the request.</li><li>If data is found, then a fetch request is made to the CO2signal API.</li><li>CO2signal sends back data about the <em>current</em> grid intensity at that location, if it has data for that country.</li><li>The Cloudflare Worker then checks if there is grid intensity in the returned data.</li><li>If the grid intensity is equal to or greater than 221 g/kWh, then the HTML response is modified before it is returned to the browser. If it’s less, then the original web page is returned.</li></ol><p>If at any point along in that process there is no data available or something goes wrong then the original web page will be shown.</p><h3><strong>What gets modified on a page?</strong></h3><p>When the grid intensity is equal to or greater than 221 g/kWh, the following modifications are made:</p><ul><li>Image quality is greatly reduced.<ul><li>Visitors can click to download better quality versions of any image if they need to.</li></ul></li><li>Remove AVIF images, since <a href="https://www.smashingmagazine.com/2021/09/modern-image-formats-avif-webp/#:~:text=Decoding%20AVIF%20images%20for%20display%20can%20also%20take%20up%20more%20CPU%20power%20than%20other%20codecs%2C%20though%20smaller%20file%20sizes%20may%20compensate%20for%20this.">decoding them can be more CPU intensive</a>. Although there is <a href="https://greenspector.com/en/which-image-format-to-choose-to-reduce-its-energy-consumption-and-its-environmental-impact/">research to suggest otherwise</a> (I have to follow up on this).</li><li>Non-critical JavaScript is removed from the site. This includes: <ul><li>Progress bars when reading blog posts.</li><li>Share links (using the Navigator.share API).</li><li>Instant.page script</li><li>JavaScript that creates CodePen embeds. The embeds are replaced with a link to the pen.</li></ul></li></ul><p>I also add a small snippet of HTML & JavaScript which shows a message at the bottom of the screen telling the user they are viewing a modified version of the site. 
The user can dismiss this, or choose to be shown the regular page. In both cases, a cookie is set to note the user’s preference. That cookie expires in 1 day.</p><h3>The outcome</h3><p>Overall, making the changes listed above sees about 50 kB to 100 kB reduction in the data downloaded on page load. For most pages on my site, that’s a reduction of 20% or greater.</p><p>I haven’t yet tested out the device level power savings of these changes. That could be something for a later post. It’s a pretty good reason to try out <a href="https://fershad.com/writing/co2e-estimates-in-firefox-profiler/">power profiling in the Firefox Profiler</a>.</p><h3><strong>Open source</strong></h3><p>If you want to get a sense for how all that looks in code then I've created <a href="https://github.com/fershad/carbon-aware-site-worker">a starter repository</a> on Github. It has some of the core code that powers the carbon-aware implementation, but doesn’t contain the image modifications or notification message.</p><p>I’ve kept this starter code minimal on purpose. In building this project I realised that there’s no cookie cutter solution for carbon awareness. It will look and feel different for each application and site. There’s a lot of planning, care, and design considerations that have to go into ensuring a low-carbon experience is still a useful, pleasant user experience.</p><h2>How could this be better/different?</h2><p>This whole project started as a way for me to test out an idea that’s been bugging me for ages. Are there ways that it could be better? Absolutely. Are they ways it could be implemented differently? It depends, but yeah probably. And are there ways that browsers could help promote low-carbon web design? Yes, I’ve got thoughts.</p><h3>Making it better</h3><p>There are a few things that could make the setup I’ve described earlier better.</p><h4><strong><strong>Regional grid intensity data</strong></strong></h4><p>In some countries (mostly large landmasses), electricity generation in different regions is taken care of by different providers. Sure, you can aggregate the grid intensity for all regions and provide a country-level average. But a more accurate carbon-aware implementation would get to the regional level if that data was available.</p><p>Electricity Map say that they plan to introduce regional level data into CO2signal in an upcoming version of the API. I hope they do. Since I’m using the latitude and longitude of the request, I’d be able to make full use of that regional data should it exist.</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/4ad050c8e8d00b8f63de2123e37a5a5f16a48776-3640x1929.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/4ad050c8e8d00b8f63de2123e37a5a5f16a48776-3640x1929.png?auto=format" alt=" " loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">Screenshot of Electricity Maps visual dashboard. Here you can see that some countries (like the US and Australia) have data available for regional grids.</figcaption></figure><h4><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong>Falling back to annual average grid intensity</strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></h4><p>This was raised by <a href="https://indieweb.social/@marcradziwill@social.tchncs.de/109778055451906037">Marc Radziwill on Mastodon</a>. In the cases where CO2signal has no data for a location, I could try falling back to the annual average grid intensity for that country. As Marc points out, <a href="https://developers.thegreenwebfoundation.org/co2js/data/">CO2.js already has this data available</a>. Heck, I wouldn’t even need to install the library, since the data is available in <a href="https://github.com/thegreenwebfoundation/co2.js/tree/main/data/output">both JS and JSON</a> files on Github.</p><h4><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong>Relative grid intensity</strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></h4><p>Right now, I’ve picked a number of out thin air and set that as the value which determines whether or not a user gets a carbon-aware experience. For me, living in Taiwan, I’m <em>always</em> going to see the carbon-aware experience, because our grid is mostly powered by coal & natural gas.</p><p>It would make more sense if I could somehow look back historically and say <em>okay, the average grid intensity in Taiwan is X, so if the current grid intensity is above that <strong><strong><strong><strong>then</strong></strong></strong></strong> show the low-carbon page</em>. Actually, as I typed that I realised it’s something I <em>could</em> do today using the annual average grid intensity data I mentioned above.</p><p>I feel a version 2.0 coming on.</p><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">Update - 12th Feb., 2023</p><p></p><p>Since writing this, I have gone ahead and implemented the idea above. This website is now carbon-aware <em>relative</em> to the average annual grid intensity of the country a visitor is located in.</p><p></p></div><h4><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong>I’ve noticed it sometimes fails</strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></h4><p>I’ve noticed that occasionally the Workers script will fail, and return the regular web page with a custom header <code class="language-markup">carbon-aware-site: Error fetching data</code>. I’ve noticed this only happens sometimes, which makes it harder to work out what’s going on. It’s something to debug later.</p><h4><strong><strong><strong>I kinda broke some links</strong></strong></strong></h4><p>Yeah. Not the worst, since I could find and replace them in my code. 
But, after I implemented the Worker all paths on my site now need a trailing <code class="language-markup">/</code> to work. Without the trailing slash, Cloudflare chucks a tantrum and errors out. Another one to debug later.</p><h3>Doing it differently</h3><p>My site is built with Eleventy. All the pages are built, and pushed up to Cloudflare’s CDN network. Whenever a page is requested, a HTML file gets sent back. That’s why I had to use a Cloudflare Worker and HTMLRewriter to apply any carbon-aware changes outside of the browser. So now, I’ve got to remember now that this Worker exists & needs a separate deploy step whenever I update its code.</p><p><em>If</em> this site was instead a server-side rendered (SSR) site, then I could probably have some middleware as part of the backend code which renders a page whenever someone visits it. That feels like a nicer solution to me, having all the code in one place when it’s published.</p><p>If this was a PHP site … I don’t know much about that, but PHP is SSR right? This sounds like something a WordPress plugin could do.</p><p>Another option could be to build two versions of a website (like what Organic Basics have done - <a href="https://us.organicbasics.com/">regular</a>, <a href="https://lowimpact.organicbasics.com/">low-impact</a>). Then, I could redirect visitors to the appropriate version of the site.</p><h3>How could browsers help?</h3><p>At the start of this post, I highlighted a few examples of carbon awareness beginning to show up at the operating system, device, and server/data center level. If I was to revisit this post at the end of the year, I’m sure there’d be a few more examples I could add. It would be cool if web browsers were one of them.</p><h4><strong>A <code class="language-markup">prefers-eco</code> header</strong></h4><p>Okay, I just made that up, but you can see where I’m going, I hope. I say this as someone with no idea of what goes into building features into a browser. It sounds doable for browsers to allow users to set a preference that would then pass a header along with each request to indicate that the user would prefer a <strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><em>low-carbon experience</em></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong>. With this in place, sustainable web designers could include these experiences in their designs, and developers could more easily account for these preferences in their code.</p><p>Actually, though, there’s already the <a href="https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Save-Data">save-data header</a> which could be used for just that. In CSS, there’s also <a href="https://www.w3.org/TR/mediaqueries-5/#prefers-reduced-data">a spec for <code class="language-markup">prefers-reduced-data</code></a>. Both sound promising, but are still experimental with sketchy browser support at best. As the <code class="language-markup">prefers-reduced-data</code> spec points out, there’s also the risk that these could be used to fingerprint users, which is something that will need to be considered if they get further down the implementation path.</p><p>If you’re interested in either, then Polypane have <a href="https://polypane.app/blog/creating-websites-with-prefers-reduced-data/">a solid post on how they can be used/tried out</a>. 
Jeremy Wagner also <a href="https://css-tricks.com/help-users-save-data/">wrote about using <code class="language-markup">save-data</code></a> on CSS-Tricks way back in 2017.</p><h4><strong>Make the browser carbon aware</strong></h4><p>There are a heap of things that could happen here. What if, periodically your browser checked your local grid intensity & switched into a mode that consumed less power if the grid was <em>too</em> polluting? Here are some ideas:</p><ul><li>Browser tabs could be suspended sooner.</li><li>A header like <code class="language-markup">save-data</code> could be automatically set.</li><li>The user could be prompted to close tabs which have been unused for a certain period of time.</li><li>Dark-mode could be turned on automatically.</li><li>Plugins/extensions could be disabled, with the user able to specify ones to leave on (similar to how it’s done for Incognito/Private mode).</li><li>Autoplaying videos would be disabled (actually, this should be a feature anyway).</li><li>Video quality could be reduced slightly or capped.</li></ul><p>Those are just a few quick ideas. In 2019, Michelle Thorne led <a href="https://discourse.mozilla.org/t/firefox-eco-mode-brainstorming-how-can-the-internet-tackle-the-climate-emergency/46582/2">a brainstorming session</a> at MozFest where they came up with <em>a load</em> more. It’s quite the list.</p><h2>Closing</h2><p>I’m stoked that I’ve finally been able to build out a carbon-aware website concept. This is a living proof of concept, and I’ve already got ideas for some updates to how it works. I’d love to see how others go about implementing carbon awareness on their own sites or apps, and what learnings they take from it.</p><p>If you do build our own carbon-aware website, share it with me on Mastodon (@fershad@indieweb.social) or <a href="mailto:itsfish@fershad.com">by email</a>. Likewise, if you want to chat about the ideas or concepts covered in this post then reach out as well.</p></div>Improving the accuracy of website carbon emissions estimates2024-02-20T13:25:46Zhttps://fershad.com/writing/improving-the-accuracy-of-website-carbon-emissions-estimates/<div><p>Existing models for website carbon emissions are good for reaching a ballpark figure of website CO2 emissions. I touched on that last year in <a href="https://fershad.com/writing/website-carbon-beyond-data-transfer/">Website carbon: Beyond data transfer</a>. Models like the Sustainable Web Design model are extremely useful for providing an estimation framework, especially for developers who don’t have domain knowledge in the digital emissions space.</p><p>The importance of this cannot be understated. As more companies start thinking about digital emissions, dev teams are being asked to come up with estimates of their work’s impact. These teams are experts in HTML, CSS, JavaScript or some other combination of programming languages. Having models like <a href="https://sustainablewebdesign.org/calculating-digital-emissions/">Sustainable Web Design</a> (SWD), which are fairly easy to understand & implement, provide a great starting off point for people looking at website carbon emissions for the first time.</p><h2>Beyond the ballpark</h2><p>Recently, I’ve had a few conversations with folks who’ve been working on different website carbon estimation pieces. In most cases, they have been regular developers who are looking at website emissions for the first time. 
In all cases, they have been using the Sustainable Web Design model and asked “<strong><strong><strong><strong><strong><em>how accurate is this estimate?</em></strong></strong></strong></strong></strong>”.</p><p>At the same time, I’ve been working on the <a href="https://github.com/thegreenwebfoundation/co2.js/milestone/5">next release of CO2.js</a>. In it, we’re looking to address a common question we get from developers who use the library - <strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><strong><em>how can I change X value in the SWD model?</em></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></strong></p><h3>Assumptions and generalisations</h3><p>The SWD model contains a number of assumptions for how website data is loaded. Namely, these are:</p><ul><li>What percentage of visits to a site are new visitors (empty cache)</li><li>What percentage of visits to a site are returning visitors (warm cache)</li><li>What percentage of data for return visitors is downloaded (not served from cache)</li></ul><p>The model also uses global average grid intensity (442 g/kWh) to generate carbon emissions estimates.</p><p>In reality, these values would differ from website to website, web page to web page, and user to user. It’s also worth noting that the underlying data that’s the foundation for the model is a few years old now, and <a href="https://indieweb.social/@mandrasch@social.tchncs.de/109693205203712150">should be revisited</a>.</p><h2>Producing more accurate estimates</h2><p>Okay, so here’s the part that you’re probably clicked into this post for. How is it possible to improve the accuracy/meaningfulness of the carbon emissions estimates from a generalised model like Sustainable Web Design.</p><p>Before going further, though, I want to first acknowledge that the SWD model isn’t the only one around. What I go into below could also be applied to <a href="https://view.officeapps.live.com/op/view.aspx?src=https%3A%2F%2Ftheshiftproject.org%2Fwp-content%2Fuploads%2F2018%2F10%2FLean-ICT-Materials-1byte-Model-2018.xlsx&wdOrigin=BROWSELINK">the 1byte model</a>. There are also other methodologies specific to certain digital sectors - like <a href="https://dimpact.org/methodology">DIMPACT</a>, which I really want to get my teeth into during this year. Here’s a good paper on <a href="https://onlinelibrary.wiley.com/doi/epdf/10.1002/leap.1506">how Cambridge University Press have used it</a>.</p><h3>Adjusting the assumptions</h3><p>To start with, let’s stick to what we’ve just been talking about. Most website owners should have access to data about new and returning visitors at least. Getting information about how much data is downloaded by returning visitors takes a bit of extra work, but can be estimated too.</p><p>The SWD model uses the following percentages for each:</p><ul><li>75% first time visitors</li><li>25% return visitors</li><li>2% of data downloaded by return visitors</li></ul><p>But for a real website, a blog post that’s gone viral on Hacker News might have 90% first time visitors, while a homepage might have more like 60% first time visits.</p><p>If you’re running analytics on a site, then you can use that data to work out the first/return visitor split for a given page. To figure out how much data return visitors download, you can run a page through <a href="https://www.webpagetest.org/">WebPageTest</a> & include repeat views.</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/31733081a2bb34c7dd593e0feeb3762f9813c4a3-1579x815.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/31733081a2bb34c7dd593e0feeb3762f9813c4a3-1579x815.png?auto=format" alt=" " loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">Screenshot of the WebPageTest homepage with a test selected, and “Include Repeat View” checked.</figcaption></figure><p>The results page for the test will show you how much data was downloaded for the first view & how much was redownloaded for the repeat view. You can use this to work out a percentage of data that return visitors download, and plug that into the SWD model.</p><p>For CO2.js, we’ve got a couple of <a href="https://github.com/thegreenwebfoundation/co2.js/issues/120">open</a> <a href="https://github.com/thegreenwebfoundation/co2.js/issues/109">issues</a> asking to make adjusting these figures easier. It’s something that will be possible once v0.12 comes out. That will include new functions which allow users to pass in their own values for these variables. You can <a href="https://github.com/thegreenwebfoundation/co2.js/pull/126">follow the pull request here</a>.</p><h3>Adjusting grid intensity</h3><p>In the same pull request, we’re also planning to give users the ability to adjust the grid intensity used for different system segments. Let’s unpack some of that terminology for a second.</p><p>The way a carbon emissions estimation model works is that it first calculates the energy used by an operation. It then multiplies this by a figure known as grid intensity. Grid intensity is a value that represents how many grams of CO2 are produced per kilowatt-hour of electricity generation on a given grid. This multiplication of energy usage by grid intensity gives us a carbon emissions estimate.</p><p>As I mentioned earlier, the SWD model uses a global average grid intensity figure for all its calculations. But let’s say that you <em>know</em> your server is in Germany and your users are in France. Then for those two parts of the emissions calculation, you’d ideally be using grid intensity figures specific to those locations.</p><p>Of the four system segments used in the SWD model - data centers, networks, devices, production - data centers and devices are the ones where location is most likely to be known and so can be adjusted for. Sometimes there could be a case where it is known that network traffic is limited to within one country, and so could also use a country-specific grid intensity figure. Given the nature of global supply chains these days, the global grid intensity figure is a good fit for the production segment.</p><p>There are a couple of ways to find grid intensity data. CO2.js includes <a href="https://developers.thegreenwebfoundation.org/co2js/data/">annual average grid intensity data</a> for over 60 countries, or you can source this data from a provider like <a href="https://ember-climate.org/data/">Ember</a>. If you want to go all in and track emissions in real-time, then you’d need to use an API like <a href="https://www.electricitymaps.com/">Electricity Maps</a> or <a href="https://www.watttime.org/">WattTime</a>.</p>
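<p>To make the maths above a little more concrete, below is a rough sketch of a SWD-style calculation with those assumptions pulled out as parameters. The constants are the figures published on the Sustainable Web Design site at the time of writing (0.81 kWh per gigabyte, split across the four segments), but the function and option names are mine rather than anything from CO2.js, so treat it as pseudocode for the kind of adjustments the pull request above will enable.</p><pre class="language-javascript"><code class="language-javascript">// Rough sketch of a SWD-style per-visit estimate with the assumptions exposed
// as parameters. Function and option names are illustrative, not the CO2.js API.
const KWH_PER_GB = 0.81; // published SWD energy intensity figure (kWh per gigabyte)
const SEGMENT_SHARE = { dataCenter: 0.15, network: 0.14, device: 0.52, production: 0.19 };

function estimateVisitGrams(bytes, options = {}) {
  const {
    firstVisitShare = 0.75,  // SWD default: 75% new visitors
    returnVisitShare = 0.25, // SWD default: 25% return visitors
    dataReloadRatio = 0.02,  // SWD default: return visitors re-download 2% of data
    gridIntensity = {},      // gCO2e/kWh per segment, falling back to the global average
  } = options;

  const gigabytes = bytes / 1e9;
  // Weight the energy for a visit by how much data new vs return visitors pull down
  const energyKwh =
    gigabytes * KWH_PER_GB * (firstVisitShare + returnVisitShare * dataReloadRatio);

  // Split that energy across the four SWD segments and apply a grid intensity to each
  return Object.entries(SEGMENT_SHARE).reduce((total, [segment, share]) => {
    const intensity = gridIntensity[segment] ?? 442; // global average gCO2e/kWh
    return total + energyKwh * share * intensity;
  }, 0);
}

// Example: a 2 MB page with a 90/10 visitor split, server in Germany, users in France.
// The intensity numbers are placeholders - pull real ones from Ember or CO2.js.
const grams = estimateVisitGrams(2_000_000, {
  firstVisitShare: 0.9,
  returnVisitShare: 0.1,
  gridIntensity: { dataCenter: 380, device: 85 },
});
console.log(`${grams.toFixed(3)} g CO2e per visit`);</code></pre><p>Swapping in a measured visitor split and country-specific intensities for the data center and device segments, as in the example call, is exactly the sort of tweak that moves an estimate from a global ballpark towards something representative of a specific site and its users.</p>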
<h3>Per request weight, rather than page weight</h3><p>Okay, this one’s a bit hardcore, but if you’re using a model like SWD to work out the emissions of an entire page, then it’s definitely worth considering.</p><p>From my experience with website carbon calculators, most seem to take the entire page weight & put that through the SWD calculation to return a CO2 estimate. That’s well and good, but if you’re using a testing tool like Google Lighthouse to get the page weight, then you’ll also have access to data about every request made when that page loads. You can see where I’m taking this, yeah?</p><p>If we calculate the emissions of each request individually, and then tally them together, we have the potential to get to an even more accurate estimate. This is especially the case when coupling this approach with the adjustments talked about above. There are varying degrees to how far you can take this, each returning more accurate results. I talk through them below, and each one builds on the ones before it. There’s also a rough code sketch of the first couple of steps after the list.</p><ol><li><strong>Calculate the emissions of each request using the regular SWD model.</strong> This will return a result that’s probably going to be within a rounding error of the result you would get from using the total page weight.</li><li><strong>Check each request for green hosting, and use the renewables grid intensity for the data center segment of that request.</strong> In this scenario, if you have even one green-hosted request you should get a number that’s lower than if you were to just use total page weight.<ol><li>You can check for green hosting using the <a href="https://developers.thegreenwebfoundation.org/api/greencheck/v3/check-single-domain/">Greencheck API</a>.</li><li>The SWD model uses a renewables grid intensity of 50 g/kWh.</li></ol></li><li><strong>Check the location of each request’s host, and use the grid intensity for that location in the data center segment calculations.</strong> Results here could be higher or lower than if you just used total page weight. If you have a lot of requests served from regions that have high grid intensity, then this will skew the result up. Likewise, if you have more requests coming from lower grid intensity regions, you’ll get a lower emissions estimate.<ol><li>If you know a request’s host IP address, then you can check its location using <a href="https://developers.thegreenwebfoundation.org/api/ip-to-co2/overview/">the IP to CO2 Intensity API</a>. This API also returns the annual average grid intensity for locations.</li><li>What we’ve done above is just for the data center segment. If you know the location of the device viewing the site, then you could also plug in those figures.</li></ol></li><li><strong>Check if the result is not cached, and adjust the data downloaded by returning visitors figure accordingly.</strong> A request might have <code class="language-markup">max-age=0</code> or <code class="language-markup">no-cache</code> set in <code class="language-markup">cache-control</code> headers. These requests will be downloaded each time any visitor loads the page. You could adjust the SWD calculation to reflect this. It will result in a higher emissions estimate.</li></ol>
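<p>Here’s a rough sketch of what steps 1 and 2 might look like in practice, using CO2.js for the per-request maths and the Greencheck API for the green hosting check. The request data is made up, and the endpoint path is the one documented at the time of writing, so double-check it against the current developer docs before leaning on it.</p><pre class="language-javascript"><code class="language-javascript">// Sketch of steps 1 and 2: estimate each request separately, and flag green-hosted
// requests so their data center segment uses the lower renewables intensity.
// Requires CO2.js (npm install @tgwf/co2) and a runtime with fetch (Node 18+, browsers).
import { co2 } from "@tgwf/co2";

const swd = new co2({ model: "swd" });

// In practice these would come from a Lighthouse report or a HAR file
const requests = [
  { url: "https://example.com/", bytes: 45_000 },
  { url: "https://cdn.example.net/app.js", bytes: 310_000 },
];

async function isGreenHost(url) {
  // Single-domain Greencheck lookup - endpoint path taken from the docs linked above
  const host = new URL(url).hostname;
  const res = await fetch(
    `https://api.thegreenwebfoundation.org/api/v3/greencheck/${host}`
  );
  const data = await res.json();
  return Boolean(data.green);
}

let totalGrams = 0;
for (const request of requests) {
  const green = await isGreenHost(request.url);
  totalGrams += swd.perByte(request.bytes, green); // grams of CO2e for this request
}

console.log(`${totalGrams.toFixed(3)} g CO2e across ${requests.length} requests`);</code></pre><p>Step 3 would extend the same loop to look up each host’s location and swap that country’s grid intensity into the data center segment, which is where the per-segment options discussed earlier come into play.</p>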
<h3>Swap in known emissions values</h3><p>Finally, there might be times when you <strong>actually</strong> know the emissions associated with one segment of your website. Perhaps your hosting provider is able to give you an annual breakdown of the emissions associated with hosting your site. In that case, you could <a href="https://sustainablewebdesign.org/calculating-digital-emissions/#:~:text=Emissions%20Calculation%20Formulas,AC%20x%200.19">calculate annual emissions for all other segments</a>, and then add in the known emissions you have.</p><p>Or, perhaps you know the energy associated with loading a page on a device (see: <a href="https://fershad.com/writing/co2e-estimates-in-firefox-profiler/">CO2e estimates in Firefox Profiler</a>). Here, you could calculate the emissions of other segments for a page view, and then add in the device emissions you know.</p><h2>Rounding off</h2><p>Measuring, and reporting on, carbon emissions is <a href="https://podcasts.bcast.fm/e/rnk5kq2n-the-week-in-green-software-green-software-legislation">going to become an ever more important part of doing business</a>. That said, it’s not practical to expect web developers or IT teams to drop everything and acquire the complete domain knowledge to build out carbon emissions estimation tooling.</p><p>Frameworks for calculating estimates, like the Sustainable Web Design model, are important tools in helping people get started down this path. As we’ve covered in this post, they can help you get an initial ballpark estimate. With a few tweaks they can be refined to produce results that are far more accurate and representative for a given website.</p></div>2023. A look ahead.2024-02-20T13:25:46Zhttps://fershad.com/writing/2023-a-look-ahead/<div><p>After taking <a href="https://fershad.com/writing/2022-in-review/">a look back</a> at some of the stuff I got through in 2022, I thought it would be a nice exercise to cast an eye ahead and write out some of the things I’m excited about for 2023. There’s a mix of professional and personal listed below, in no particular order.</p><h2>Extending CO2.js</h2><p>The back end of 2022 has seen CO2.js start to be used in more open source and commercial projects. I’ll be writing up some case studies on those for The Green Web Foundation later in the year.</p><p>In the meantime, I’m really excited for what some upcoming changes to the library can unlock. This week, I started working on a pull request that enables developers to use custom grid intensity, caching, and visitor figures in the Sustainable Web Design estimation model. This will allow for more accurate estimates to be produced using CO2.js, and opens the door for the library to be a viable option for use in website sustainability audits. You can <a href="https://github.com/thegreenwebfoundation/co2.js/pull/126">check out the PR here</a>; feedback on the API is welcome.</p><p>As the year goes on, I also expect that we’ll be adding a lot more open grid intensity data into the library.</p><h2>Nerding out on the Solar Protocol</h2><p>I’ve had this on my to-read list for a while.
The <a href="https://computingwithinlimits.org/2022/papers/limits22-final-Brain.pdf">Solar Protocol</a> explores the potential of solar-powered computing, and how different servers around the globe can be used as a distributed network running on the power of the sun.</p><p>I’d really, really like to explore the possibility of having a website hosted on the Solar Protocol that lives on a few solar-powered computers around the world. When someone visits the site, it would be served from the location with the highest solar energy generation at that time. This is something Chris Adams and I have touched on briefly during chats about Green Web Foundation stuff, so hopefully it’s something we can actually build together later this year.</p><h2>Just travelling again</h2><p>I haven’t left Taiwan for close to three years, and am very keen to travel again. There are a lot of folks I work and collaborate with online who I’ve never met in person. I’m hoping to get the chance to have an in-person beverage with some of them later in the year.</p><p>Of course, I have to get back to Australia at some stage this year too. Age is slowly catching up with my parents, and expecting them to come to Taiwan’s a bit of a stretch.</p><p>And, of course, there’s bound to be an obligatory trip or two to Japan.</p><h2>International Touch Footy!</h2><p>Chinese Taipei (say it with me now, TAIWAN) Touch Association is a very small member of the International Touch community. In 2019, we attended our first ever international event as a country when we sent one team to play in the 2019 Touch World Cup. The momentum we hoped to build off that was curtailed by COVID, so we’re starting from close to square one again this year.</p><p>That said, I am really excited at the prospect of our association sending multiple teams to multiple international tournaments this year. I’m especially eager to help organise a junior team which we hope can travel to Australia or England (undecided yet which one) to represent Taiwan at a Youth Touch tournament. As president of the association, I’m really hopeful that this could be a catalyst to have more kids take up Touch here in Taiwan.</p><p>I’m also hoping that I can pull on the boots once again to represent Taiwan as a player at the Asian Championships in September. We hope to send multiple teams to that tournament, which would be an awesome achievement for an association of our size.</p><p>We’ll also be holding our (used to be annual) Taipei Touch Tournament in March this year. We’re expecting to have club teams from around the region coming to Taipei for a weekend of good footy & even better food! Touch has allowed me to become friends with some amazing people throughout the Asian region, and I can’t wait to see some of them again this year.</p><h2>Writing more. Reading more.</h2><p>I think I’ve written more in the last year and a half than I did in the 5 years I spent in a marketing job here in Taiwan. I’ll be doing a lot more of it this year, both through this blog and for The Green Web Foundation.</p><p>Reading, however, is something I’m less regular with. In 2023, I want to be more consistent with reading blog posts, research papers, and even the odd book. Keeping notes on these things is something I’ve also struggled with. I’m currently trying out <a href="https://readwise.io/read">Readwise Reader</a>, which keeps highlights alongside the web pages, PDFs, and videos. 
This is something I’ve been searching for, and I really hope that this solves that problem for me.</p><p>I’m sure that through all the reading & writing, I’ll get nerd sniped into building a prototype or two over the year! Let’s see how the year unfolds, and take a look back at all this (and more) in a 2023 review post come December.</p></div>2022 in review2024-02-20T13:25:46Zhttps://fershad.com/writing/2022-in-review/<div><p>All in all, 2022 has been a pretty cool year for me. The back half of the year was especially fun, with the opportunity to work on some really cool projects! Here’s a collection of a few notable things that went down over the course of the year.</p><h2>Are my third parties green?</h2><p>At the start of the year, I released <a href="https://aremythirdpartiesgreen.com/"><strong><em>Are my third parties green?</em></strong></a>. I built it after reading the <a href="https://almanac.httparchive.org/en/2021/third-parties#prevalence">third-parties chapter of the 2021 Web Almanac</a> which found that over 45% of website requests were third-party requests. Reading that got me wondering how many of those requests were being served from green web hosting.</p><p>I built the tool over the course of January and February. In the process I got to play around with Google Cloud Functions, Cloudflare Workers and KV, as well as the pre-1.0 version of SvelteKit. You can read all about that process in <a href="https://fershad.com/writing/building-are-my-third-parties-green/">this blog post</a>. I originally launched <strong><em>Are my third parties green?</em></strong> as a website to scan sites. In the months after launch, I <a href="https://fershad.com/writing/adding-a-directory-and-api-to-are-my-third-parties-green/">added a directory & API</a> to the tool.</p><h3>What’s next for this project?</h3><p>Since launching, over 1,900 tests have been run through the tool, which is waaaaaaay more than I ever anticipated! All these results are stored in Cloudflare Workers KV, which was a decision I made to help facilitate caching & sharing. In early 2023 I want to sift through these results and write up my own Web Almanac-ish state of green third-parties review.</p><p>I’ve also got it on my to-do list to update the site to SvelteKit 1.0, and do an update of data in the API and directory. I’d also really like to contribute back to <a href="https://github.com/patrickhulce/third-party-web">Third Party Web</a>, the Github repo that powers part of this project.</p><h2>Flowty</h2><p>Around the same time as <strong><em>Are my third parties green?</em></strong> was launched, I got involved in a conversation in the <a href="http://climateaction.tech/">ClimateAction.tech</a> community which focused on making sites built in Webflow greener. That led to a conversation with designer <a href="https://www.suninthecorner.com/">Katy Jackson</a>, which in turn led to me exploring how to go about converting a Webflow site to static content that could be easily uploaded to green web hosting.</p><p><a href="https://www.suninthecorner.com/">That exploration gave birth to Flowty</a>.
With Flowty, designers could build and maintain sites in Webflow. The Flowty script would take care of converting & optimising the site before uploading it to a connected hosting service. This would allow climate-conscious designers to still use tools they are comfortable with to create low-carbon sites, and host them on green web hosting platforms.</p><p>However, just before launching Flowty I had someone raise that the tool might be in breach of Webflow’s Terms of Service. I asked around, and couldn’t get a straight answer through Webflow’s support channels. I finally got an answer when Webflow issued cease and desist notices to a couple of other services that operated in similar ways to Flowty.</p><h3>Sunsetted</h3><p>Webflow’s actions led to me deciding to open source <a href="https://github.com/fershad/flowty">Flowty’s source code</a> and sunsetting the project. By making the code available, I hope that those who do care about web sustainability, but want/need to use Webflow, can still have a means to build & host their site sustainably. Despite being discontinued, the project recently made an appearance <a href="https://thenewstack.io/is-low-code-development-better-for-the-environment/">in this article</a> on the sustainability of low-/no-code platforms.</p><h2>A few firsts - Conference talks. Podcasts. Web Almanac.</h2><p>In the middle of the year, I gave my first ever conference talk as part of <a href="https://webdirections.org/lazyload/">LazyLoad 2022</a>. I talked through an extended version of the <a href="https://youtu.be/LD8HiUGdsX0">Web Performance and the Planet presentation</a> I gave earlier in the year to the Toronto Web Perf. Meetup.</p><p>Both speaking events were virtual, and bloody heck it’s hard just talking into a camera! I’ve got to give a massive shoutout to WebPageTest’s own <a href="https://twitter.com/HenriHelvetica">Henri Helvetica</a> for encouraging me to take those first steps into the speaking scene.</p><p>2022 also saw me guest on a podcast for the first time. I was the interviewee on <a href="https://anchor.fm/greenio/episodes/Fershad-Irani---Using-website-performance-to-green-the-web-e1f6179">Green I/O’s first ever episode</a>! A big thanks to <a href="https://www.linkedin.com/in/gaelduez">Gaël Duez</a> for reaching out to me, and entrusting me with the honour of being his first guest.</p><p>Through the year, I also did some work as an analyst on the <a href="https://almanac.httparchive.org/en/2022/sustainability">2022 Web Almanac’s Sustainability chapter</a>. This is the first time there’s been a dedicated chapter on web sustainability in the Web Almanac. It’s a huge step for the community, and hopefully we can have enough contributors to have chapters in future editions of the Almanac.</p><h2>Working with The Green Web Foundation</h2><p>Definitely the highlight of the year was having the chance to work with The Green Web Foundation. I saw they had a technical writing position open, but wasn’t too sure if I’d be a fit for the role as I had no formal technical writing background. I applied anyway, and after a chat with <a href="https://www.linkedin.com/in/mrchrisadams">Chris Adams</a> I was stoked to find out that they were keen for me to get on board to help build out developer documentation for some of their core tools.</p><p>In the past six months we’ve created a <a href="http://developers.thegreenwebfoundation.org/">developers docs site</a> for some of the foundation’s repositories and APIs. But we didn’t just stop with that.
Working with Chris, I started contributing code updates to the <a href="https://github.com/thegreenwebfoundation/co2.js">CO2.js library</a>. We’ve done some cool things with that, including extending the library so that it can run in <a href="https://developers.thegreenwebfoundation.org/co2js/tutorials/getting-started-node/">Node on the server</a>, <a href="https://developers.thegreenwebfoundation.org/co2js/tutorials/getting-started-browser/">in the browser</a>, as well as in other environments like <a href="https://github.com/fershad/co2js-cloudflare-worker-api">Cloudflare Workers</a>, <a href="https://github.com/thegreenwebfoundation/co2.js/issues/115">Deno, and Bun</a>.</p><p>I’ve really enjoyed working with The Green Web Foundation team, and it seems like they’ve enjoyed having me around too. I’m super excited to continue working with them in 2023 on a more regular, part-time basis. I’m really looking forward to building more cool stuff, thinking through some interesting problems related to digital sustainability, and sharing those with all of you too.</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/afa60cc975b3e09181d9b970dd6049cd061454a4-862x1103.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/afa60cc975b3e09181d9b970dd6049cd061454a4-862x1103.png?auto=format" alt=" " loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">Screengrab of The Green Web Foundation team in October, 2022.</figcaption></figure></div>Checking where website requests come from with ReqCheck2024-02-20T13:25:46Zhttps://fershad.com/writing/checking-where-website-requests-come-from-with-reqcheck/<div><p><a href="https://reqcheck.fershad.com/">ReqCheck</a> is a tool I’ve built to help folks find out where all the different requests made by a web page are served from. The idea for it came while I was writing up a <a href="https://fershad.com/writing/cop27-egypt-a-webpage-sustainability-review/">website sustainability audit of the COP27 homepage</a>.</p><p>One of the things I got into as part of that audit was finding out where the COP27 homepage was being hosted. In that audit, I stopped at the homepage. However, trying to determine where the site was hosted got me wondering whether it would be possible to automatically check where <strong>all</strong> the requests on a given page were coming from. ReqCheck aims to do just that.</p><h2>Introducing ReqCheck</h2><p><a href="https://reqcheck.fershad.com/">ReqCheck</a> is a tool that takes the results of a <a href="https://webpagetest.org/">WebPageTest</a> run, and surfaces information about where each request is served from. It does this by using data from a couple of APIs provided by <a href="https://www.thegreenwebfoundation.org/">The Green Web Foundation</a>. The results show the countries from which web page resources are served, as well as information about the carbon intensity of that location. Requests made from green web hosts are also highlighted.</p><h3>Why use WebPageTest results?</h3><p>WebPageTest allows for tests to be run from a number of different locations around the world. Using WebPageTest results as the data source for ReqCheck opens up the possibility for folks to check how requests for a page are served depending on where a user might be visiting from. By doing this, the impact of different hosting providers and CDNs should become clearer. It should also help to make visible the carbon intensity differences in global energy grids, and how this can affect a page’s sustainability profile.</p><h2>Using ReqCheck</h2><p>To use ReqCheck, you’ll first need to run a test on WebPageTest. You will also need to ensure that the test results are public (this is on by default, but can be turned off if you’re logged in with a WebPageTest account).</p><h3>Running a test</h3><p>With your result URL in hand, you can then:</p><ol><li>Head over to <a href="https://reqcheck.fershad.com/">https://reqcheck.fershad.com</a>.</li><li>Enter a public WebPageTest result into the form, and click Submit.</li><li>The results are then scanned, and a set of unique IP addresses used for requests is compiled.</li><li>This is then passed into The Green Web Foundation's <a href="https://developers.thegreenwebfoundation.org/api/ip-to-co2/overview/"><strong>IP to Carbon Intensity API</strong></a> to check the location and grid intensity of each IP address.</li><li>The tool also uses The Green Web Foundation's <a href="https://developers.thegreenwebfoundation.org/api/greencheck/v3/check-single-domain/"><strong>Greencheck API</strong></a> to test if a request is served from a green host.</li></ol><p><em>It should be noted that ReqCheck uses the Median Run, First View results from a WebPageTest run. A future version might include Repeat View results to show the impact of caching, but don’t hold me to it!</em></p>
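<p>If you’d rather script these lookups yourself, the heart of ReqCheck is not much more than a couple of HTTP calls per IP address. Here’s a rough sketch - the endpoint path is based on The Green Web Foundation’s developer docs at the time of writing, so check the current docs before building on it.</p><pre class="language-javascript"><code class="language-javascript">// Rough sketch of the lookup ReqCheck performs for each unique IP address.
// Endpoint path based on The Green Web Foundation's IP to CO2 Intensity API docs.
async function lookUpIp(ip) {
  const res = await fetch(
    `https://api.thegreenwebfoundation.org/api/v3/ip-to-co2intensity/${ip}`
  );
  if (!res.ok) throw new Error(`Lookup failed for ${ip}: ${res.status}`);
  // Response includes fields like country_name, carbon_intensity,
  // generation_from_fossil and checked_ip
  return res.json();
}

// In ReqCheck these IPs come from the request data in a public WebPageTest result.
// The example below is the COP27 host IP from the audit linked above.
const uniqueIps = ["163.121.141.38"];
const lookups = await Promise.all(uniqueIps.map(lookUpIp));

for (const result of lookups) {
  console.log(
    `${result.checked_ip}: ${result.country_name}, ` +
      `${result.carbon_intensity} gCO2/kWh (${result.generation_from_fossil}% fossil generation)`
  );
}</code></pre>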
<h3>Reading the results</h3><p>Once ReqCheck has finished analysing the site, you’ll be taken to the results page (here’s a <a href="https://reqcheck.fershad.com/results/221027_AiDcFE_7H6">demo page</a> that you can refer to). Here, you’ll be presented with:</p><ul><li>Some brief details of the WebPageTest run that was analysed</li><li>A summary of the ReqCheck test analysis</li><li>A breakdown of IP addresses by country</li></ul><p>For the rest of this post, we’re going to focus on the IP address breakdown, because that’s where the detail lies.</p><h3>IP addresses by country</h3><p>The countries listed on the results page are sorted from the one with the most unique IP addresses to the least. Sometimes, the IP to Carbon Intensity API isn’t able to work out where an IP address is located. If this happens, those results are shown at the bottom of the page.</p><p>Under each country’s name and flag, you’ll get a short summary of:</p><ul><li>How many requests came from there</li><li>How many host IP addresses served those requests</li><li>The carbon intensity of that country’s electricity grid</li><li>The percentage of annual electricity generation from fossil fuels</li></ul><p>This can start to give a sense of how carbon intensive resources hosted in that country might be.</p><p>Below this summary are details of each individual IP address from the country in question. Each block represents a unique IP address, and presents information about:</p><ul><li>The host that is served from that IP address</li><li>Whether it uses a CDN (if so, which one)</li><li>How many requests are served from that address</li><li>And whether or not the host domain is a known green web host</li></ul><p>The green hosting information comes from <a href="https://developers.thegreenwebfoundation.org/api/greencheck/v3/check-single-domain/">Greencheck</a>, another API from The Green Web Foundation. If a green host is found, the IP address block is turned green and a leaf icon is shown in the top right. This additional information about each IP address allows developers to be more informed about how a web page’s resources are hosted.</p><h2>Using these results</h2><p>The main purpose of ReqCheck is to be informative. It aims to give extra insight into how a web page comes together. With this information, website owners can start to think about what they can do to reduce website carbon emissions on their domain and beyond.</p><p>By presenting information about the grid intensity, and fossil-fuel generation, developers can start thinking about possibly moving self-hosted resources to greener regions. Additional information about the CDN/hosting provider gives additional context to help with decision making. Perhaps requests are being served from a region that has a grid with high carbon intensity, but are hosted on a green web host. In this case, changing how these requests are served could be given a lower priority than ones which are not served from a green host.</p><p>Sometimes, you’re not in control of how the assets served on a web page are hosted. This is often the case with third-party requests. In those cases, the information from ReqCheck can help you start a conversation with your service provider about becoming a green web host.
The Green Web Foundation have <a href="https://www.thegreenwebfoundation.org/sample-emails/">some email templates</a> to help you get started.</p></div>Driven by defaults2024-02-20T13:25:46Zhttps://fershad.com/writing/driven-by-defaults/<div><p>When talking with folks about web sustainability, there are often two tracks of conversation that we regularly venture down. The first stems from people just being unaware of the physical impact of our digital lives. The other is often a variation on “but my [website/app/digital presence] is so small, changing that won’t make a difference anyway”.</p><p>Often, all I can say back is “yeah, you’re probably right”. Because, especially when it comes to web sustainability, our sites are not silos. A more sustainable web requires <em>all</em> website owners to be moving in the same direction. Thinking about this, and the scale of the web, it sounds like an impossible task.</p><p>That’s where, I believe, defaults have a vital role to play.</p><h2>It starts with requirements</h2><p>Developers have a lot of decisions to make, even when building what might look like a “simple” website. In most cases they’ll have a visual mockup/wireframe provided by a designer, as well as specifications provided by the person requesting the website. Then come the questions, one after another. What stack/framework/database should be used? Should this site be SSG, SSR, MPA, SPA, PWA, or whatever other flavour of the month acronym is going around? What will the development setup look like? Do I need a build tool or bundler? What packages or services do I need to achieve functionality? Somewhere along the line there’ll also be a deadline imposed. This might mean some decisions have to be rushed.</p><p>In a perfect world, the requirements given to the developer at the start of the project would include performance, accessibility, and sustainability considerations. This would help the developer to at least keep those in mind when making decisions later in the project. Most times, though, these are not included at all. With Google’s focus on performance and legal requirements around accessibility, you <strong>might</strong> find detailed requirements in some briefs. But requirements to ensure a sustainable website is delivered … probably not.</p><p>More regulation and reporting around sustainability and carbon emissions might see this change. I hope it does.</p><h2>npm install sustainable-web</h2><p><em>I don’t think that’s a real package (yet), but probably best not to run that in your console just to be safe.</em></p><p>Sustainability requirements are a good place to start, but that’s still primarily focused on a single website. To push faster change at a larger scale we need to start looking at some of the questions developers are faced with when building out projects. These focus on the tools and services being used, and ensuring that those come with sustainable defaults out of the box. Oftentimes, these defaults are what gets shipped to production.</p><p>What kind of levers can be pulled? What would sustainable defaults look like? You can find some ideas in this <a href="https://screenspan.net/blog/green-by-default/">post by Brian Louis Ramirez</a>. Below are just a few aspirational ideas I’ve got. They range in their degree of practicality and implementability.</p><h3>Green regions by default</h3><p>When you spin up a cloud service using AWS, GCP, Azure, or some other provider, you’re often asked to select a region in which that service should run.
Most times, this selection is presented as a list of regions with bugger all (translation: no) additional information. It’s up to developers to find out details about the region, and that includes information about sustainability. When spinning up a new project on a tight deadline, you’ll more often than not end up going with the default presented by these services.</p><p>Now I don’t know my <code class="language-markup">us-east-1</code> from my <code class="language-markup">us-east-2</code>, so I definitely wouldn’t be able to make a snap decision about which region runs on a cleaner energy grid. The only cloud provider I’ve seen with public information about this is Google, who’ve got <a href="https://cloud.google.com/sustainability/region-carbon">a page</a> dedicated to presenting the grid intensity of their data center locations. Their default <code class="language-markup">us-central1</code> location is also one of their most green (though using <code class="language-markup">northamerica-northeast1</code> Montreal as the default would be even better).</p><p>Having public information on the energy intensity of data center locations from each cloud provider would be terrific. But, even if they want to keep this information private for whatever reason, setting the default region to their most green region would be a massive first step. Not only would it see more developers running cloud operations on low-carbon hosting, but its impact would flow down to other providers who’ve built services and abstraction layers on top of the major cloud providers.</p><h3>Carbon-aware browsers?</h3><p>I think this one might be a bit controversial. To what extent can browsers help to automatically reduce the amount of energy being used to serve sites on a user’s device? Could browsers be made carbon-aware, and serve different content based on the grid intensity in the user’s location?</p><p>We’re already seeing carbon-awareness entering mainstream software. <a href="https://support.microsoft.com/en-us/windows/windows-update-is-now-carbon-aware-a53f39bc-5531-4bb1-9e78-db38d7a6df20">Microsoft recently announced</a> that Windows 11 will now schedule updates to run when more electricity is coming from lower-carbon sources on a user’s electric grid. Apple also <a href="https://support.apple.com/en-us/HT213323">announced Clean Energy Charging</a> for iOS 16.1 users in the US.</p><p>The idea of serving different content based on grid intensity is similar to what some websites, <a href="https://branch.climateaction.tech/issues/issue-1/designing-branch-sustainable-interaction-design-principles/">like Branch magazine</a>, already implement today. They’re doing it with <a href="https://github.com/climateaction-tech/branch-theme/blob/master/js/gridintensity.browser.min.js">a little bit of JavaScript</a> on the client, and using the UK grid as the basis for deciding what content to show. But while the UK grid might be green, I’m visiting the site from Taiwan where we are still very fossil-fuel reliant.</p><p>It would probably be a decent hit to Branch’s website performance if each visitor’s location was requested & then an API call was made to get grid intensity data. This would hurt the site’s user experience, and search performance too. But, if the browser was doing that for every site I visit, then everyone is on an equal footing.</p><p>This is controversial, in one sense, because a browser’s primary role is to render and serve whatever gets sent to it. If that’s <a href="https://almanac.httparchive.org/en/2022/page-weight#other-assets">a 110MB font file</a>, then so be it. Its implementation would also be fraught with gotchas, since every website is unique. There’d also be privacy concerns to be reckoned with here, since the browser would need to be aware of the user’s location.</p><h3>Lightweight embeds</h3><p>Sticking with the content theme, wouldn’t it be great if social media and video embeds were light by default? There’s no reason why a single YouTube video (which I might not even watch) on a page should result in my browser having to download and parse <a href="https://www.smashingmagazine.com/2022/02/reducing-web-carbon-footprint-optimizing-social-media-embeds/#youtube">600 kB of JavaScript</a>. These embeds can hurt a site’s performance as well.</p><p>There are already lightweight alternatives for many popular third-party embeds. In the past, I’ve written about how these could be <a href="https://fershad.com/writing/youtube-facades-with-cloudflare-workers/">used with Cloudflare Workers</a> to reduce the impact of YouTube & Vimeo content on a page. That’s all good if you’ve got the time to find and implement these packages.</p><p>Imagine how nice it would be, though, if the default embed code for a YouTube video was a lightweight snippet that anyone could copy into their code/CMS without having to think about how it might impact web page performance or sustainability.</p>
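<p>To make the idea of a “lightweight snippet” a bit more tangible, here’s a minimal sketch of the facade pattern those alternatives use: show a static thumbnail, and only load YouTube’s player once someone actually clicks play. It’s illustrative only - existing packages like lite-youtube-embed do the same job with proper accessibility, preconnects and styling.</p><pre class="language-javascript"><code class="language-javascript">// Minimal YouTube facade sketch: render a static thumbnail, and only create the
// (heavy) player iframe once the visitor actually clicks play.
function youTubeFacade(container, videoId) {
  const thumb = document.createElement("button");
  thumb.type = "button";
  thumb.setAttribute("aria-label", "Play video");
  thumb.style.cssText = "border:0;padding:0;cursor:pointer;";

  const img = document.createElement("img");
  img.src = `https://i.ytimg.com/vi/${videoId}/hqdefault.jpg`;
  img.alt = "";
  img.loading = "lazy";
  thumb.append(img);

  thumb.addEventListener("click", () => {
    const iframe = document.createElement("iframe");
    iframe.src = `https://www.youtube-nocookie.com/embed/${videoId}?autoplay=1`;
    iframe.width = "560";
    iframe.height = "315";
    iframe.allow = "autoplay; encrypted-media";
    iframe.allowFullscreen = true;
    thumb.replaceWith(iframe);
  }, { once: true });

  container.append(thumb);
}

// Usage: any element marked up like <div data-video-id="dQw4w9WgXcQ"></div>
document.querySelectorAll("[data-video-id]").forEach((el) => {
  youTubeFacade(el, el.dataset.videoId);
});</code></pre><p>If something like this were the default embed code handed out in the first place, nobody would need to go hunting for a facade package to avoid shipping the full player to every visitor.</p>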
<h3>Sustainable-first search results</h3><p>Just picture if Google turned around tomorrow and said “we’re going to give a search ranking boost to domains that are hosted on a green web host”. We saw a sudden surge in people taking an active interest in web performance when Google made a similar statement for Core Web Vitals. Core Web Vitals is a Google initiative, but having it as a ranking factor saw a lot more website owners suddenly focusing on their site’s performance.</p><p>The Ecosia search engine already <a href="https://blog.ecosia.org/green-search/">highlights planet-friendly organisations</a> in its search results. Meanwhile, Alphabet has also started to <a href="https://blog.google/outreach-initiatives/sustainability/sustainability-2021/">surface sustainability information and nudges</a> across several of its products. Having the default Google search experience return prioritised sustainable results, or presenting indicators next to results, would have a profound impact on the movement towards a more sustainable web.</p><h2>More defaults, less thinking</h2><p>The ideas I’ve presented above are just that, ideas. Some are pretty whack, while others are more realistic. When it’s all said and done, though, the less I need to actively think about the sustainability impacts of my decisions as a developer, consumer, or human, the better.</p></div>COP27 Egypt: A webpage sustainability review2024-02-20T13:25:46Zhttps://fershad.com/writing/cop27-egypt-a-webpage-sustainability-review/<div><p>Around this time last year there was a lot of focus on the upcoming COP26 summit in Glasgow. Rightly so, as it marked the five-year anniversary of the Paris Agreement. Under that agreement, countries agreed to revisit and strengthen their commitments towards limiting global temperature rise to 1.5 degrees Celsius at the 26th COP.</p><p>With so much attention on COP26, I got curious about how their website stacked up in terms of digital sustainability and performance.
So, I took a look under the hood and wrote up <strong><a href="https://fershad.com/writing/cop26-a-quick-sustainability-check/">COP26.org: A quick sustainability check</a></strong>.</p><p>The article got a fair bit of attention, definitely more than anything I’d written before. This attention helped catch the eye of some folks over at the UK’s Government Digital Service (GDS) team. With their help, we were able to address one of the factors contributing to the large size of the COP26 homepage.</p><p>While COP26 grabbed a lot of headlines, COPs (mercifully short for Conference of the Parties to the United Nations Framework Convention on Climate Change) actually happen annually. This year’s COP event is being hosted by Egypt, and starts on November 7th.</p><p>As it approaches, I thought it would be “fun” to take a look at this year’s COP homepage. How does it do in terms of website sustainability?</p><h2>What we’ll look at</h2><p>For this review, we’re going to focus on the desktop version of the COP27 homepage (<a href="https://cop27.eg/">https://cop27.eg/</a>). We will:</p><ul><li>Touch briefly on the page’s Core Web Vitals & performance story.</li><li>Evaluate its sustainability profile by concentrating on hosting, data transfer, and device energy consumption.</li></ul><p>Though that’s only two bullet points, we’ll actually be covering a lot in this post. That includes trying out some experimental diagnostic tooling for the first time.</p><h2>Performance - Core Web Vitals</h2><p>To start with, let’s take a look at the site’s performance through the lens of Core Web Vitals. This is a Google initiative which ties website performance geekery (aka metrics) with tangible user experience outcomes.</p><p>Though this post is concerned with the website’s sustainability profile, its performance history helps to set the scene for what we’ll cover later.</p><h3>Field data</h3><p>Google makes Web Vitals data for websites accessible to the public. We'll use that to see how the site is performing in the real world. I personally like to use <a href="https://treo.sh/sitespeed">Treo’s Site Speed Audit</a> tool for this, since it presents the data in an easy-to-digest visual format.</p><p>Looking at the COP27 website, and <a href="https://treo.sh/sitespeed/cop27.eg?formFactor=desktop">filtering for desktop sessions</a>, we can see that something happened between July and August which absolutely destroyed the website’s paint metrics (First & Largest Contentful Paint).</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/4506b7119a6c7599cf5dbdfbe6e47b2e84f41be2-1719x575.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/4506b7119a6c7599cf5dbdfbe6e47b2e84f41be2-1719x575.png?auto=format" alt=" " loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">Screenshot from Treo showing the COP27 website’s Time to First Byte (TTFB), First Contentful Paint (FCP), and Largest Contentful Paint (LCP) metrics. TTFB remains steady, but from August FCP & LCP both deteriorate significantly.</figcaption></figure><p>It’s worth bearing in mind that the results we see in Treo are for the entire origin (all pages visited on that website) rather than <strong><strong>just</strong></strong> the homepage. But it’s pretty clear that something changed. Given how it’s impacted the paint metrics, my initial guess was that perhaps a larger main image was swapped in. To find out, let’s use the <a href="http://web.archive.org/">Wayback Machine</a>.</p><h3>August, 2022. A redesign.</h3><p>Spoiler in the heading, ey.</p><p>Yep, there was a website redesign in August (or late July) which seems to have coincided with the deterioration in paint metrics that we see in Treo. Now I’ll hold off on commenting about the design until later in this post, but below are before and after screenshots.</p><p><strong>Before - July 2nd, 2022</strong></p><p><a href="http://web.archive.org/web/20220602135905/https://cop27.eg/">http://web.archive.org/web/20220602135905/https://cop27.eg/</a></p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/aa8096a1450d5a457e9fb110a6a174ab4b2541f9-2241x1399.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/aa8096a1450d5a457e9fb110a6a174ab4b2541f9-2241x1399.png?auto=format" alt=" " loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">Screenshot of the COP27 homepage from July 2nd, 2022 on Wayback Machine.</figcaption></figure><p><strong>After - August 2nd, 2022</strong></p><p><a href="http://web.archive.org/web/20220802092302/https://www.cop27.eg/">http://web.archive.org/web/20220802092302/https://www.cop27.eg/</a></p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/5e09749f013910f35e5ad996bfdf4fd5fe3acef0-2245x1404.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/5e09749f013910f35e5ad996bfdf4fd5fe3acef0-2245x1404.png?auto=format" alt=" " loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">Screenshot of the COP27 homepage on August 2nd, 2022 from Wayback Machine.</figcaption></figure><p>Quite a significant change above the fold for the homepage. One interesting thing to note here is the size of the main image. The image of the windmills on July 2nd was a 111 kB PNG file, while the audience photo from August comes in at 2.1 MB (still a PNG).</p><p>Another thing to note, from a performance perspective, is that the July image is discovered through an <code class="language-markup"><img></code> tag in the HTML. In August, that’s changed to an inline CSS <code class="language-markup">background-image</code> that’s loaded as part of what seems to be a JavaScript-powered slider. The image in August is not preloaded either. Harry Roberts <a href="https://csswizardry.com/2022/03/optimising-largest-contentful-paint/#use-the-best-candidate">covers these patterns and more</a> over on his blog. To grossly simplify it, the <code class="language-markup"><img></code> tag <strong><em>should</em></strong> always win out.</p><p>The two things above are probably enough to have pushed out the LCP, though the degree of change would suggest some other factors are at play too. You're welcome to dig around and see what you can find.</p><p>While we were looking at performance, though, we’ve also started to surface some website sustainability issues. Let’s shift focus to that.</p><h2>Sustainability</h2><p>In last year’s audit of the COP26 website, I started looking at sustainability by calculating the homepage’s carbon emissions using Beacon, one of many online website carbon estimation tools. We’re not going to be doing that this year, for <a href="https://fershad.com/writing/website-carbon-beyond-data-transfer/">reasons I’ve outlined earlier this month</a>.</p><p>Instead, we’ll look at each individual segment of a website’s sustainability profile. There are three key areas that developers and website owners can control:</p><ul><li>Servers (data centers and hosting)</li><li>Networks (data transferred over the wire to load site content)</li><li>Devices (how the website impacts the devices used to view the site)</li></ul><h3>Servers - hosting and CDNs</h3><p>Let’s start by building a sustainability profile of how the website is hosted.</p><h4><strong>Does it use green hosting?</strong></h4><p>The first thing we’ll do is check if the website uses a known green web host (or CDN). To find out, we’ll use The Green Web Foundation’s <a href="https://www.thegreenwebfoundation.org/green-web-check/">Green Web Checker</a>.
Running the domain (<code class="language-markup">https://cop27.eg</code>) through this tool reveals that the site is not hosted on a known green web host or CDN.</p><h4><strong>Where is it being hosted?</strong></h4><p>It looks as though the site itself is hosted in Egypt, and doesn’t use a Content Delivery Network (CDN) to serve cached versions to global users. To uncover this, I ran the page through WebPageTest from three different locations:</p><ul><li>Virginia, USA (<a href="https://www.webpagetest.org/result/221024_BiDcEW_8Y5/1/details/#waterfall_view_step1">test results</a>)</li><li>Frankfurt, Germany (<a href="https://www.webpagetest.org/result/221024_BiDcEW_8Y5/1/details/#waterfall_view_step1">test results</a>)</li><li>Sydney, Australia (<a href="https://www.webpagetest.org/result/221027_AiDc5R_756/1/details/#waterfall_view_step1">test results</a>)</li></ul><p>Looking at the request headers (found at the bottom of each test result’s details page), we can find the IP address of each website request. In each test, the homepage document itself, and resources from the <a href="http://cop27.eg/">COP27.eg</a> domain, come from the same address - <code class="language-markup">163.121.141.38</code>.</p><p>Knowing this, we can find out where the site is hosted. We’ll use another tool from The Green Web Foundation - their <a href="https://developers.thegreenwebfoundation.org/api/ip-to-co2/overview/">IP to CO2 Intensity API</a>. This API does two things:</p><ol><li>It reveals the country in which the IP address is located, and</li><li>It surfaces the latest annual average carbon intensity figures of that country’s electricity grid.</li></ol><p>Running the IP address above through the API returns:</p><pre class="language-json"><code class="language-json">{
"country_name": "Egypt",
"country_code_iso_2": "EG",
"country_code_iso_3": "EGY",
"carbon_intensity_type": "avg",
"carbon_intensity": 466.006,
"generation_from_fossil": 88.87,
"year": 2021,
"checked_ip": "163.121.141.38"
}</code></pre><p>So, the IP address is located in Egypt as I mentioned earlier. Last year (2021), Egypt’s grid had an average carbon intensity of 466 grams per kilowatt-hour, with close enough to 89% of electricity generation coming from fossil fuels.</p><p>That’s not great. Even though emissions-wise Egypt is pretty close to the global average (442 g/kWh), in terms of generation from fossil fuels it is way above the global average (61.56%). The data here comes from Ember. You can play around with their <a href="https://ember-climate.org/data/data-explorer/">Data Explorer</a> if you want to go deeper.</p><h4><strong>Yes, we’ve made some assumptions</strong></h4><p>The checks above don’t paint the rosiest picture of the site’s hosting. However, it is worth mentioning that we’re limited here to general, publicly available data. There is every possibility that the team behind the COP27 website has hosted it on a server located in Egypt that is connected to a clean energy source like solar. I really hope this is the case.</p><h3>Networks - Page weight & data transfer</h3><p>Even though network energy use is pretty much a constant (<a href="https://fershad.com/writing/website-carbon-beyond-data-transfer/#data-transfer-network-energy-usage">I go into that here</a>), it’s still important to be looking for ways to reduce the amount of data we send to those visiting our sites.</p><p>Sending less data can help with performance when a page loads (as we’ve seen above). While a page is <strong><em>loading</em></strong>, it might also help to reduce energy consumption on the user’s device. And, if anything, it can help ensure that the people consuming our sites who may have data caps, or pay-as-you-go plans, don’t chew through their budgets faster than they expect.</p><h4>Page weight is a problem</h4><p>The homepage initially loads over 19 MB of data. That’s over 9x the <a href="https://httparchive.org/reports/page-weight#bytesTotal">size of the median desktop page</a> as of September 2022. As I was checking in DevTools, though, I noticed the data being transferred kept ticking up. It eventually stopped at 31.9 MB.</p><p>That’s probably another reason why the performance paint metrics are as poor as they are. There’s a heap of data contending for bandwidth as the page loads. Where’s all this data coming from, and what can we do to reduce it?</p><h4><strong>Images</strong></h4><p>There are 155 image requests when the page first loads. They account for 16.8 MB of data transfer. Of these, only <strong><em>one</em></strong> was a modern image format - a WebP image served by a YouTube video embed near the bottom of the page. Most are PNG or JPEG format, including two images for the event’s mobile app that are 4.5 MB and 2.0 MB.</p><p>Reducing the impact of images would be a sensible first step. Applying the <code class="language-markup">loading="lazy"</code> attribute to off-screen images should help both sustainability and performance. Using modern formats (with PNG/JPEG fallbacks) would help take things that extra mile, and really get the overall transfer size down.</p><p>A lot of the image bloat seems to have come about since the website redesign that we saw earlier. The previous iteration of the site had a clean design that was light on images and fluff. It also demonstrated <a href="https://sustainablewebdesign.org/category/design/">some features of Sustainable Web Design</a>.
I’m not sure what was behind the redesign, but my personal opinion is that it was done to the detriment of the website.</p><h4><strong>A self-hosted video file</strong></h4><p>The homepage also features a self-hosted video file near the bottom of the page. Of the final 31.9 MB downloaded, this one video accounts for 12.9 MB. I’ve included a simplified snippet of the video tag from the site below:</p><pre class="language-html"><code class="language-html"><video preload="auto" controls poster="path/to/post/image.jpg"></code></pre><p>The video tag has a <code class="language-markup">poster</code> attribute, so there’s going to be something nice shown in place until the user starts watching it. For that reason, the <code class="language-markup">preload</code> attribute could be set to <code class="language-markup">"none"</code>, which would instruct the browser not to download any video content until the user requests it (by playing the video).</p><h4><strong>A questionable Twitter feed</strong></h4><p>Just above the video, there’s a section promoting the event’s app alongside a Twitter feed. On my screen, it looks like this:</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/c1fd039028ed29fc693b5b9ce268dcb763166d35-1838x570.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/c1fd039028ed29fc693b5b9ce268dcb763166d35-1838x570.png?auto=format" alt=" " loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">Screenshot from a section of the COP27 homepage.</figcaption></figure><p>Firstly, remember I mentioned a 2.0 MB image promoting the event’s app? This (the phone) is that image. A file which is 1443 × 2802 pixels, rendered at 227 × 440. We’ve talked about images already though. Let’s talk about Twitter.</p><p>Files served from Twitter (and related domains) account for 3.4 MB of the total page weight. That’s a fair bit of data to download, only to then display it in a section like the one in the picture above. I’m not a designer, but I’m sure there are better ways this part of the homepage could be presented. Michelle Barker’s covered some ways to reduce the impact of social embeds in <a href="https://www.smashingmagazine.com/2022/02/reducing-web-carbon-footprint-optimizing-social-media-embeds/">this post for Smashing Magazine</a>.</p><h4><strong>Let’s guesstimate the impact</strong></h4><p>If we could edit the website today to address the three areas highlighted above, what kind of impact might we have on overall page weight? Note that the figures below are guesses based on some knowledge, and a few assumptions.</p><ul><li>We could <strong>probably</strong> reduce image size by at least 75% (saving 12.6 MB)</li><li>We’d remove the 12 MB downloaded for the video</li><li>We could defer the loading of all Twitter stuff until the user actually requests it (saving 3.4 MB)</li></ul><p>So, we could potentially get the page down to a somewhat reasonable 3.9 MB without really having to dive deep into the weeds of website optimisation.</p><h3>Devices - Testing energy usage</h3><p>Still here? Thanks!</p><p>We’re now going to do something that I’ve never done before when reviewing a website. We’re going to see how much power it consumes on the client (the browser in this case).</p><p>We’ll be using the Firefox DevTools Profiler on a Windows 11 Surface Pro 6 device to record a capture of the site as it loads. We’ll then use a ⚠️ <strong><em>currently very much work in progress ⚠️</em></strong> fork of the Profiler to look at the power usage. We’ll see data presented in both units of watts, and in units of CO2e (carbon dioxide equivalent).</p><p>The CO2e estimates we’ll see should be considered experimental at this stage. I’ve been helping to <a href="https://github.com/firefox-devtools/profiler/pull/4243">build out this feature</a> as part of my work with <a href="https://www.thegreenwebfoundation.org/">The Green Web Foundation</a>. That said, I’ve never used it to review a website, so it’ll be my first time trying to analyse these figures too. Let’s learn together, friends.</p><h4>Recording the capture</h4><p>Rather than diving straight into the profile, let’s race through recording the capture first. I believe that the ability to record a power profile from devices was introduced in Firefox 104, so you’ll need to be on that version of the browser or newer. The option will show up on all platforms, but actually only works on Windows 11 & Mac OS at the time of writing.</p><p>In the browser, open up DevTools and navigate to the Performance tab. There, you’ll see a dropdown with different presets which you can use. There’ll be a Power option in that list.
With that selected, you can start recording!</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/c002eb1d76f1294417f3f7f63ffef37af7edcd7c-989x443.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/c002eb1d76f1294417f3f7f63ffef37af7edcd7c-989x443.png?auto=format" alt=" " loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">Screenshot from Firefox 106 DevTools, showing the Power preset selected in the Performance tab.</figcaption></figure><h4>A wild profile appeared!</h4><p>Once you’re done recording, the Profiler will automatically load your recording in a new tab. It’ll probably be overwhelming, just like the screenshot below. What we’re going to do in this post mostly focuses on the four Power rows (which you <em>might</em> be able to see highlighted in the screenshot below).</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/18a5f3eb8d60df755c455a086359ac3aac0c346c-3663x1958.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/18a5f3eb8d60df755c455a086359ac3aac0c346c-3663x1958.png?auto=format" alt=" " loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">Screenshot of a Firefox Profiler results tab. If you look carefully, you can see the four Power rows highlighted in green.</figcaption></figure><p>Before continuing, I’d like to reiterate that I’m using a local, forked version of the Profiler to uncover carbon emission data. While energy consumption data is currently available in the live version of the tool, the carbon emissions data you’ll see shortly is from a branch that is still being actively worked on.</p><h4>Device level energy and carbon estimates</h4><p>Let’s zoom in on the four Power rows. Here’s what they show us.</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/b8b3f4fc314f3da222b1397eedca403b29ed92a9-873x196.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/b8b3f4fc314f3da222b1397eedca403b29ed92a9-873x196.png?auto=format" alt=" " loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">Screenshot showing Power tracks (rows) in the Firefox Profiler.</figcaption></figure><p>In the screenshot above, I’ve hovered over a point in time on the <strong>Power: CPU package</strong> row. This brings up a tooltip showing:</p><ul><li>Power (in watts) - The power consumption by the CPU package at that point in time on the timeline.</li><li>Energy used in the visible range (in milliwatt-hour) - the energy used by the CPU package over the entire profile.<ul><li>Next to the milliwatt-hour figure there’s also a carbon estimate shown in milligrams of CO2e. This is the bit that I’ve been working on with The Green Web Foundation.</li></ul></li></ul><p>Okay, so there’s a likelihood that you’re like me and don’t know your milliwatts from your milligrams. All we really need to know is that we’re dealing with some pretty small figures here.</p><h4>What does this mean?</h4><p>Here’s how I’m reading it:</p><blockquote>Each time someone on a Surface Pro 6 loads the COP27 homepage, 45 milliwatt-hours of energy are used by the device, equating to an estimated 20 milligrams of CO2e.</blockquote><p>It might help to work with larger numbers here. If 100,000 people visited the homepage on a Surface Pro 6, then a collective 2000 grams (2 kilograms) of CO2e would be generated by the devices.</p><h4>A few things to note</h4><p>While this is super handy data to have, there are a few things to keep in mind - especially with the experimental CO2e figures I’ve shown.</p><ul><li>The profile I’m reviewing is loading the page with an empty cache. Would caching change power consumption? That’s something to look into in another post.</li><li>The CO2e figures are generated using <strong><em>global average grid intensity</em></strong> figures for 2021, which are <a href="https://developers.thegreenwebfoundation.org/co2js/data/">imported into the project from CO2.js</a>. Ideally, we’d like to use more region-specific figures for even greater detail. There’s been <a href="https://github.com/firefox-devtools/profiler/pull/4243#issuecomment-1266624528">some conversation</a> <a href="https://github.com/firefox-devtools/profiler/pull/4243#issuecomment-1270018399">around this</a> as part of the PR for this feature.</li><li>I honestly have no idea whether looking at CPU package, CPU cores, iGPU, or DRAM would be the right approach here. There are some details documented in the <a href="https://firefox-source-docs.mozilla.org/performance/power_profiling_overview.html">Firefox Source Docs</a>, but I’ve yet to go into detail on that.</li><li>This just captures the page load. The only interactions I had with the page were to close the cookie banner & app promotion overlay.</li></ul><p>I’d love to explore these further when I’ve got a bit more time, especially measuring interactions.</p><h2>Overall sustainability profile</h2><p>We’ve unpacked a lot in this post, so thank you for making it this far.</p><p>To finish, let’s get back to answering the question we began with - how does the COP27 homepage stack up from a website sustainability perspective?</p><p>On the whole, there’s a lot that could be done better.
Let’s break it down by looping back over the server, network, and device information we’ve found.</p><h3>Servers</h3><p>Summary: Based on what we can uncover, not great.</p><ul><li>It looks like all content for the domain is served from Egypt, without the use of a CDN.</li><li>The carbon intensity of Egypt’s grid is a bit above the global average, with most of that generation coming from fossil fuels.</li><li>We can only hope that the site has been hosted on servers that get all/most of their power directly from clean energy sources.</li></ul><h3>Networks</h3><p>Summary: Poor, but can be easily improved with a few quick fixes.</p><ul><li>The page loads almost 32 MB of data when first visited.</li><li>Most comes from images. Using lazy loading & modern formats could reduce their size significantly.</li><li>The next culprit is a video near the bottom of the page. Changing to <code class="language-markup">preload="none"</code> here would save 12 MB of data download.</li><li>Finally, a design rethink of the Twitter feed on the site could help shave off more bytes.</li></ul><h3>Devices</h3><p>Summary: “Yeah nah”. Or maybe “nah yeah”.</p><p>I’m not really sure to be honest. Since we’ve used some experimental tooling to surface device level CO2e estimates, I don’t really have a reliable baseline to go off.</p><p>Let’s end on a positive, and give them a passing grade here 🙂.</p></div>Website carbon: Beyond data transfer2024-02-20T13:25:46Zhttps://fershad.com/writing/website-carbon-beyond-data-transfer/<div><p>When I first started getting curious about digital sustainability, especially the carbon impact of my website, one of the first places I was pointed to was <a href="https://www.websitecarbon.com/">Website Carbon Calculator</a>. Being presented with a carbon impact for my small personal homepage was an eye opener.</p><p>That was a bit over two years ago. Since then, I’ve learnt (and am still learning) a heck of a lot about what goes into the carbon footprint of a website or digital service. Would I use Website Carbon Calculator when auditing a site now? Probably not. That said, I would still 100% recommend it, or <a href="https://digitalbeacon.co/">Beacon</a>, to anyone that’s looking at figuring out website carbon emissions for the first time.</p><h2>Data transfer is just the start</h2><p>Most website carbon calculators look at the size of data transfer for a site or app and base their calculations on that. At a high level, they’ll take the amount of data transferred and use constants to work out an estimated amount of energy consumed to transmit that data over the internet. This energy figure is then multiplied by a carbon intensity constant to return a carbon estimate.</p><p>Depending on the calculation model being used by the tool, there’ll be different assumptions made about the energy consumption of data, <a href="https://www.wholegraindigital.com/blog/website-energy-consumption/">system boundaries</a>, and even carbon intensity. Some might discount energy intensity for sites using green web hosting, while others will not. Tools using the <a href="https://sustainablewebdesign.org/calculating-digital-emissions/">Sustainable Web Design</a> model will also likely be making fixed assumptions about a website’s returning visitors & caching.</p><p>There are a fair few assumptions and generalisations being made there. Again, these tools make a great jumping off point for folks looking at digital carbon emissions for the first time.
To get more accurate outputs, though, we need to start having tools that look beyond data transfer, and we need to start replacing some assumptions with more concrete inputs.</p><h2>Data transfer ≠ network energy usage</h2><p>Wait, what?</p><p>Since I started looking into digital sustainability, I have always had the notion in my mind that the more data we send as part of a site or app, the more energy we consume on the network to transfer those bytes. Sticking with the theme of assumptions, I think it’s fair to say most people learning about digital sustainability would have similar thinking.</p><p>It is only recently that I became aware that this couldn’t be further from the truth. In fact, network devices are <em>always on</em>, and their energy consumption is generally independent of their traffic load. There’s a spike in energy consumption when the device is first turned on, and then its power usage stays relatively constant even as utilisation increases. The below graph illustrates this. It is taken from a paper titled <a href="https://hotcarbon.org/pdf/hotcarbon22-jacob.pdf">The Internet of tomorrow must sleep more and grow old</a> by Romain Jacob and Laurent Vanbever. Romain gives a really nice summary of it in this <a href="https://www.youtube.com/watch?v=EUprOJTvQ84">YouTube video</a>.</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/f475f393bc56f10723cf961ce40334a70e57ed26-1034x617.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/f475f393bc56f10723cf961ce40334a70e57ed26-1034x617.png?auto=format" alt="A graph showing network utilisation as a steady line moving up at a 45 degree angle, compared to network power draw which rises sharply at the start (at 0 on the x-axis) before immediately leveling off to stay constant at almost full power consumption." loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">Power draw of networks is decoupled from data transfer.</figcaption></figure><h3>A moment to ponder</h3><p>So, reducing the amount of data being transferred to load a website doesn’t really change the amount of energy required by networks to deliver it to users. Learning this left me questioning whether I’d been spending my time optimising for the wrong thing in the pursuit of more sustainable websites. I’d often explain to people that “by reducing the amount of data we move over the web we reduce the amount of energy the internet consumes”.</p><p>After thinking about this for a while, I’ve come to the conclusion that the data size still matters. Okay, it might not change how much energy the network consumes. But, data in 1s and 0s still needs to be processed either by a server or a client, or maybe both. And it can still have a significant impact on performance as well. So, yes, I’ll still be optimising for fewer bytes whenever possible.</p><p>What’s required, however, is a shift in how we think about the impact of data transfer, and where we focus. To do this, we need flexibility in our carbon estimation models.</p><h2>So, where to from here?</h2><p>With network energy usage being fixed, that’s <a href="https://sustainablewebdesign.org/calculating-digital-emissions/#:~:text=Network%20use%3A%20data%20transferred%20across%20the%20network.%20This%20accounts%20for%20an%20estimated%2014%25%20of%20the%20system.">around 14% of the total energy usage</a> for a digital product that we can’t change. Instead, where should we focus our efforts? What variables can we measure that can help deliver more detailed digital carbon results?</p><h3>Measure at the server</h3><p>Variability of server load would be the first place to start. This isn’t <em>as easy</em> as it might seem. For those of us who rely on services like Netlify or Vercel to get our sites online, getting server resource utilisation isn’t possible (as far as I know). There may be WordPress hosts, or other hosting providers, out there that do provide this level of information in some kind of dashboard. If you know of any, please do let me know!</p><p>If you’re using cloud functions or serverless, most of the large providers now have dashboards or reports that surface some level of energy/carbon related data. There can be a degree of ambiguity to some of the data, though, with providers claiming “commercial secrets” could be revealed if certain information is made public. If you’d like to get the data yourself, you can look into using <a href="https://www.cloudcarbonfootprint.org/">Cloud Carbon Footprint</a> instead.</p><p>Finally, if you run your own servers on premises then you should have all the data on hand by comparing your energy bill with server utilisation. You can then use <a href="https://github.com/thegreenwebfoundation/co2.js/tree/main/data/output">grid intensity data for your country</a> to work out the carbon emissions of your on-premise operations. Alternatively, you could try asking your energy provider for your grid intensity data.</p><h3>Measure on the device</h3><p>Loading and consuming anything on a device (desktop, laptop or mobile) draws power. So, the next place to look is how the data required to load our website, app, or service is impacting a user’s device itself. Now, there are a bunch of gotchas and edge cases here which I won’t go into in detail for this post (e.g.
I am using my phone unplugged, but it was charged when the grid was using mostly renewable electricity).</p><p>Firefox browsers now include <a href="https://www.mozilla.org/en-US/firefox/104.0/releasenotes/">device power usage data in their developer tools profiler results</a> (Windows & Mac only). This is very useful in gaining insights into how much power a site or app is using, both when it is in standby and when it is actively being used. The data is displayed on a timeline track, and you can zoom in on parts of it. This means it could be possible to estimate the impact of a particular action on a site or app too!</p><p>This level of granular data is priceless in working out more accurate carbon estimates. In fact, there’s even <a href="https://github.com/firefox-devtools/profiler/pull/4243">a pull request open</a> to display carbon estimates alongside these power usage results. The PR uses the global annual average grid intensity for now. Ideally, though, you’d want to take power consumption and multiply it by a grid intensity value for the country/region in which that device is being used.</p><p>As for other browsers, Safari’s power meter provides guidance without returning any detailed figures. Meanwhile, the team at Microsoft Edge are looking to get some <a href="https://fershad.com/writing/microsoft-propose-sustainability-section-in-edge-devtools/">sustainability metrics into their DevTools</a> (which should hopefully mean all Chromium browsers get it too).</p><h3>Adjust usage assumptions</h3><p>I mentioned earlier that the Sustainable Web Design model includes assumptions about returning visitors to a site, and data caching on the device. In most cases, we actually know (or can find) this information for our own websites or apps. If you’re using the Sustainable Web Design model as the basis for a carbon audit, then you’d want to replace their assumptions with your own figures at the very least.</p><p>This has been <a href="https://github.com/thegreenwebfoundation/co2.js/issues/109">raised as an issue</a> for the Sustainable Web Design implementation in CO2.js, and it’s something I’m very keen to get working on.</p><h3>Better production data</h3><p>Finally, looking at the full lifecycle of our products can be a big undertaking but is critical to generating the most accurate overall results. It would take a whole post to go into working out the lifecycle carbon footprint of a device. All I’ll say for now is that it would be great to see manufacturers make carbon/environmental impact data about a device readily available to consumers. This <a href="https://www.hpe.com/psnow/doc/a50002430enw">example from HP Enterprise</a> (page 2 of the PDF) presents the information in a really easy-to-digest form. If this kind of information were available in store/online when buying a device, as well as on a manufacturer’s website, it would help us include more detailed device lifecycle impact data in our estimates.</p><h2>What does all this mean?</h2><p>My personal take is that we’ll see a lot of movement in how digital carbon estimates are calculated in the next year. This will come as a result of access to more granular data, coupled with a better understanding of how different parts of the system operate & come together.
We’ll probably see a shift away from generic, average-based figures towards more case-specific results that provide a truer reflection of the impact of digital.</p></div>Release guide: CO2.js v0.112024-02-20T13:25:46Zhttps://fershad.com/writing/release-guide-co2-js-v0-11/<div><div class="callout"><p></p><p>🚧 <strong>Note:</strong> The changes referred to in this update are <strong><em>scheduled to go live on October 3rd, 2022</em></strong>.</p><p></p></div><p>The v0.11.0 release of CO2.js marks a major milestone for the library. It sees the Sustainable Web Design (SWD) model become the default estimation model in the library. In addition, we have introduced scripts to generate global average and marginal grid intensity data.</p><h2>Defaulting to SWD</h2><p>The Sustainable Web Design model is now the default carbon estimation model for CO2.js.</p><p>CO2.js includes two models for calculating digital carbon emissions – the OneByte model, and the Sustainable Web Design model. The OneByte model has been the default in CO2.js since the library was first created. The Sustainable Web Design (SWD) model was added to CO2.js in April, 2022.</p><p>The OneByte model is a few years old now, and while the original model used wider system boundaries, a judgement call was made to reduce its scope when it was originally implemented in CO2.js. You can read more about why in <a href="https://github.com/thegreenwebfoundation/co2.js/issues/68">this GitHub issue</a>.</p><p>With the updates in <a href="https://github.com/thegreenwebfoundation/co2.js/releases/tag/v0.10.2">v0.10.2</a> making it easier for developers to switch between models, we decided the time was right to change the default estimation model to the more recent SWD version.</p><h3>What does this change mean for estimates?</h3><p>Let’s take the very simple code below as a starter:</p><pre class="language-javascript"><code class="language-javascript">import { co2 } from '@tgwf/co2'
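// No options passed: from v0.11.0 this defaults to the Sustainable Web Design model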
const carbon = new co2()</code></pre><p>This code imports the main CO2.js library and initialises a new <code class="language-markup">carbon</code> object. Prior to v0.11.0, this would use the OneByte model for carbon estimates. Now, it will use the SWD model, and expose the <a href="https://developers.thegreenwebfoundation.org/co2js/methods/">methods available</a> in that model.</p><p>What does this mean for carbon estimates using the default initialisation above? In short, they will be higher than before: for the same amount of data transfer, the OneByte model produces lower estimates than Sustainable Web Design. You can <a href="https://developers.thegreenwebfoundation.org/co2js/explainer/methodologies-for-calculating-website-carbon/#how-the-models-differ">read more about why</a> in the CO2.js docs.</p><h3>What if you still want to use the OneByte model?</h3><p>If you want to continue using the OneByte model for carbon estimates, then you can do so by passing in the <code class="language-markup">{ model: '1byte' }</code> parameter when initialising CO2.js.</p><pre class="language-javascript"><code class="language-javascript">import { co2 } from '@tgwf/co2'
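// Passing the model option keeps the older OneByte model in use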
const oneByte = new co2({ model: '1byte' })</code></pre><h2>Including average and marginal intensity data</h2><p>Sourcing carbon intensity data shouldn’t be the remit of developers. For that reason, CO2.js now includes yearly average grid intensity data from <a href="https://ember-climate.org/data/data-explorer/">Ember</a>, as well as marginal intensity data from the <a href="https://unfccc.int/">UNFCCC</a> (United Nations Framework Convention on Climate Change).</p><p>Both sets of data are yearly figures, but they can be useful when making carbon estimates based on energy usage in different scenarios. While average figures are more commonly used in reporting and standards, some specifications like The Green Software Foundation’s <a href="https://github.com/Green-Software-Foundation/software_carbon_intensity/blob/dev/Software_Carbon_Intensity/Software_Carbon_Intensity_Specification.md">Software Carbon Intensity specification</a> use marginal data instead.</p><p>You can find the data in two formats – JSON and CommonJS. These can be found in the <code class="language-markup">data/output</code> folder of the <a href="https://github.com/thegreenwebfoundation/co2.js">CO2.js repository</a>.</p><h3>Importing grid intensity data</h3><p>With average and marginal grid intensity data now available in CO2.js, it would only be right to make this useful to anyone wanting to use such information in their projects. As a result, we have exposed both average and marginal grid intensity data through the CO2.js library.</p><p>For example, if we wanted to use the average grid intensity for Australia in our project, we could use the code below:</p><pre class="language-javascript"><code class="language-javascript">import { averageIntensity } from '@tgwf/co2';
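// The data object holds annual average grid intensity figures, keyed by Alpha-3 ISO country code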
const { data } = averageIntensity;
const { AUS } = data;
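// Log the average grid intensity value for Australia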
console.log({ AUS })</code></pre><p>All countries are represented by their respective <a href="https://www.iso.org/obp/ui/#search">Alpha-3 ISO country code</a>.</p><p>You can find <a href="https://github.com/thegreenwebfoundation/co2.js/releases">details of every release</a> for CO2.js on GitHub, where you’ll also be able to find the <a href="https://github.com/thegreenwebfoundation/co2.js/blob/main/CHANGELOG.md">changelog</a> for this project.</p><p>If you are using CO2.js in production then The Green Web Foundation would love to hear from you! Use the <a href="https://www.thegreenwebfoundation.org/support-form/">contact form</a> on the website to get in touch.</p></div>CO2.js: An Open Library for Digital Carbon Reporting2024-02-20T13:25:46Zhttps://fershad.com/writing/co2-js-an-open-library-for-digital-carbon-reporting/<div><p>This article was originally published in <a href="https://branch.climateaction.tech/issues/issue-4/co2js/">ClimateAction.tech's Branch Magazine</a>.</p><p>Uploading and downloading the bits and bytes that make up the internet uses <em>a lot</em> of electricity. Breaking the internet down to a systems level, data transfer over networks accounts for an <a href="https://sustainablewebdesign.org/calculating-digital-emissions/">estimated 14% of the web’s total electricity</a> consumption. Networks are also globally distributed, meaning that the bytes you downloaded to render this web page in your browser were probably passed through several different electricity grids. Those grids are made up of different mixes of green and fossil fuel energy.</p><p>Being able to measure and account for the emissions of digital services is increasingly important, especially as carbon emissions reporting becomes a mandated part of business operations. Outside of the corporate space, consumers are also increasingly demanding greater visibility of CO2 information for the goods and services they use.</p><p>In order to meet the growing demands for reporting and transparency, developers need a way to measure the carbon emissions associated with the apps, sites, and software they build. On the server side, we’re seeing more providers build carbon reporting into their platforms. However, on the application side, it’s largely up to developers themselves to implement solutions. That’s where libraries like CO2.js come in handy, providing a set of research-based, standard calculations that enable developers to quickly add carbon awareness to their products and projects.</p><h2>What is CO2.js?</h2><p><a href="https://github.com/thegreenwebfoundation/co2.js">CO2.js</a> is a JavaScript library that allows developers to estimate the emissions associated with their apps, websites and software. At its core, CO2.js takes an input of data, in bytes, and returns an estimate of the carbon emissions produced to move that data over the internet. It can be run in Node.js server environments, in the browser, and on some serverless and edge compute runtimes.</p><h2>Why use it?</h2><p>Being able to estimate the carbon emissions associated with digital activities can be of benefit to both development teams and end users. The carbon emissions of the internet are abstract and out of sight. Using CO2.js allows these emissions to be surfaced, visualised, and presented in ways that make it easier for people to comprehend and act on.</p><p>The possible uses for CO2.js are wide and varied.
It can be used in user-facing applications to give visibility to the carbon impact of user activity in the application. Users uploading files, or downloading content, could be notified of the impacts of those tasks. Large, carbon-intensive, data transfers could also be blocked or limited. Users could also have the option to set a carbon budget for their browsing or use of an app, website, or online service.</p><p>Behind the scenes, developers could look to use CO2.js as part of their deployment workflow. In the same way that web developers might set a <em>performance budget</em> for their site, <a href="https://css-tricks.com/reduce-your-websites-environmental-impact-with-a-carbon-budget/">a carbon budget could also be used</a>. If a website or app exceeds a threshold for carbon intensity, then an alert can be raised or a new deployment can be blocked. The data from CO2.js can also be used as part of internal monitoring tools and dashboards.</p><p>Office managers and sustainability teams could also use CO2.js to track the carbon intensity of data usage within an office environment. Plugging network data usage into CO2.js can allow for monitoring and reporting on the digital usage footprint of an organisation or business.</p><h2>CO2.js today</h2><p>CO2.js can do more than <em>just</em> estimate the carbon impact of data transfer on the internet. It also includes functions that use The Green Web Foundation’s <em>greencheck</em> API to determine if a website or domain is hosted on a green web host.</p><p>In the wild, the CO2.js package already receives close to 2000 weekly downloads on NPM. It is actively used by website testing tools like <a href="https://ecograder.com/">Ecograder</a>, and performance tools like <a href="http://sitespeed.io/">SiteSpeed.io</a> to track and display website carbon emissions for users. Website analytics service <a href="https://withcabin.com/">Cabin</a> also use CO2.js to calculate emissions for page views and historical analysis of CO2 for each page.</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/91a0730008025f36b7cdce53d19d5c1a9620fd87-1536x641.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/91a0730008025f36b7cdce53d19d5c1a9620fd87-1536x641.png?auto=format" alt=" " loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">Screenshot for the SiteSpeed.io sustainability plugin.</figcaption></figure><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/7b881f3e8d0b6f25294e91f14f057d49310a1bf1-1497x799.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/7b881f3e8d0b6f25294e91f14f057d49310a1bf1-1497x799.png?auto=format" alt=" " loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">Screenshot of a SiteSpeed.io dashboard in Grafana.</figcaption></figure><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/0ce0cca8b8dda5200946a313331c0be1f3ff1e65-1536x1156.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/0ce0cca8b8dda5200946a313331c0be1f3ff1e65-1536x1156.png?auto=format" alt=" " loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">Screenshot of the Cabin website analytics dashboard showing page view carbon emissions.</figcaption></figure><p>CO2.js is covered by an Apache 2.0 license, which allows the library to be used in digital tools and services. The <a href="https://github.com/thegreenwebfoundation/co2.js">code is kept on GitHub</a>, and is open for anyone to review, fork, modify, and contribute to. As we’ll touch on a little later, any datasets and figures used in CO2.js are also open, in keeping with the themes of open-source, open data, and Open Climate.</p><h3>Models available for estimating digital carbon</h3><p>There are a few different models that can be used to measure digital carbon emissions. CO2.js includes two of these – the OneByte model, and the Sustainable Web Design model.</p><p><strong>Sustainable Web Design</strong></p><p>By default, CO2.js uses the <a href="https://sustainablewebdesign.org/calculating-digital-emissions">Sustainable Web Design model</a> developed by a collaboration of <a href="https://www.wholegraindigital.com/">Wholegrain Digital</a>, <a href="https://www.mightybytes.com/">Mightybytes</a>, <a href="https://www.medina-works.com/">Medina Works</a>, <a href="https://ecoping.earth/">EcoPing</a>, and the <a href="https://www.thegreenwebfoundation.org/">Green Web Foundation</a>. It is designed to help understand the environmental impact of websites, as well as digital products and services.</p><p>This model segments the system (the internet) into four parts – data centres, networks, end-user devices, and device production. Based on the bytes passed to it, the Sustainable Web Design model calculates the energy used by each part of the system. These figures are then converted to carbon estimates using the global carbon intensity of electricity from the <a href="https://ember-climate.org/insights/research/european-electricity-review-2022/"><strong>Ember annual global electricity review</strong></a>.</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/b52fd54fa4c751699523d542c99bca497ca4e5bf-960x540.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/b52fd54fa4c751699523d542c99bca497ca4e5bf-960x540.png?auto=format" alt=" " loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">The breakdown of system segments used for calculations in the Sustainable Web Design model.</figcaption></figure><p>The carbon intensity of electricity figures used by the Sustainable Web Design model include full lifecycle emissions. These cover upstream methane, supply-chain and manufacturing emissions, with all gases converted into CO2 equivalent over a 100-year timescale.</p><p><strong>OneByte Model</strong></p><p>CO2.js also allows developers to use the OneByte model introduced by The Shift Project in their report on CO2 emissions from digital infrastructure, <a href="https://theshiftproject.org/en/lean-ict-2/"><strong>Lean ICT: for a sober digital</strong></a>. This model returns a number for the estimated carbon emissions given the number of bytes sent over the wire. It has been used for estimating the impact of video streaming, file downloads and websites.</p><p><strong>How the models differ</strong></p><p>These models return slightly different results since they apply different system boundaries as part of their calculations. Tom Greenwood has written <a href="https://www.wholegraindigital.com/blog/website-energy-consumption/">a terrific blog post</a> explaining system boundaries and how they impact carbon estimates.</p><p>The OneByte model, as it has been implemented in CO2.js, applies narrow system boundaries – datacenter and network only. It takes a top-down approach to calculations, returning a single carbon emissions result based on a given input. It should be noted that the original model used in the Lean ICT report did have broader system boundaries. However, when the model was included in CO2.js a judgement call was made to reduce its scope. You can read more about why in <a href="https://github.com/thegreenwebfoundation/co2.js/issues/68">this GitHub issue</a>.</p><p>On the other hand, the Sustainable Web Design model has a broader system boundary (explained above). It takes a more complex, but detailed, bottom-up approach. By using a wider system boundary, the Sustainable Web Design model provides a more comprehensive carbon estimate. This also means that segmented estimates can be produced for each part of the system, allowing for greater granularity and flexibility.</p><p>As a result, the carbon estimates returned when using the OneByte model will be lower than those from Sustainable Web Design for the same amount of data transfer.</p><h2>What’s planned for CO2.js?</h2><p>CO2.js will evolve alongside the continued research into the environmental impact of the digital sector. As new data and methodologies emerge, the library will be updated to provide the best possible source of digital carbon estimates for developers. Being an <a href="https://github.com/thegreenwebfoundation/co2.js/issues">open-source library</a>, contributions from the community are always welcome.</p><p>Alongside this, making it easier for developers to start using CO2.js is a key priority. Building flexibility into the library, so that it can be used across JavaScript environments and frameworks, will hopefully empower more developers to build carbon intelligence into their tools, platforms, and services.
Work on this has already begun, with the <a href="https://github.com/thegreenwebfoundation/co2.js/blob/main/CHANGELOG.md#0100-2022-06-27">recent v0.10.0 release of CO2.js</a> making it much easier to use the library in both node and the browser with ESM, CJS and IIFE build.</p><p>In the near term, there are a few more updates planned:</p><h3>Sensible and extensible defaults</h3><p>Each carbon estimation model comes with a set of constants that calculate electricity use and carbon emissions. Allowing users to be able to adjust these constants will enable more contextually accurate results to be returned. For example, a developer whose app runs on a server based in Norway should be able to update the CO2 per kilowatt-hour constant to reflect the carbon intensity of the Norwegian electric grid.</p><p>Taking it further, it may also be possible for developers to use the Sustainable Web Design model to update the constants for website caching and return visitors. This would allow them to generate website carbon estimates that are more accurate and appropriate for the site they are analysing.</p><h3>Bring in average and marginal carbon intensity data</h3><p>Sourcing carbon intensity data shouldn’t be the remit of developers. For that reason, CO2.js now includes yearly average grid intensity data from <a href="https://ember-climate.org/data/data-explorer/">Ember</a>, as well as marginal intensity data from the <a href="https://unfccc.int/">UNFCCC</a> (United Nations Framework Convention on Climate Change).</p><p>Average emissions intensity uses the fuel mix of the entire electricity grid and can be used to derive estimates for the carbon footprint of a digital product or service. You’ll see average intensity used in the majority of carbon reporting standards and tooling. This makes it useful if you were to use CO2.js to feed in data to other carbon reporting tools.</p><p>Marginal intensity, on the other hand, looks at where the additional electricity to power a device, product or service would come from. In almost all cases it would be from a fossil-fuel power source, and so marginal intensity figures tend to be higher than average intensity figures. The <a href="https://greensoftware.foundation/">Green Software Foundation</a> is one group that uses marginal intensity as part of its specification.</p><p>The team over at Electricity Maps have two great blog posts <a href="https://electricitymaps.com/blog/marginal-emissions-what-they-are-and-when-to-use-them/">explaining the concepts</a> and why you might <a href="https://electricitymaps.com/blog/marginal-vs-average-real-time-decision-making/">use one over the other</a>.</p><p>Enabling developers to access this data, and use it with the models available in CO2.js, is a key step toward generating more accurate carbon estimates.</p><h3>Reducing the barrier to use</h3><p>To accompany these changes, <a href="https://developers.thegreenwebfoundation.org/">a new developer documentation website</a> has been created for CO2.js and some of the other open-source libraries maintained by The Green Web Foundation. Providing an easy-to-digest, accessible set of tutorials and guides will hopefully enable more developers to start using CO2.js in their work. 
The documentation site will grow with time and as new capabilities are added to the library.</p><h2>Making carbon reporting accessible</h2><p>The possible uses for CO2.js are wide and varied, and the data produced can play an important role in educating stakeholders and enabling more climate-conscious decisions to be made when creating digital products and services.</p><p>If you are using (or do end up using) CO2.js in production then The Green Web Foundation would love to hear from you! Use the <a href="https://www.thegreenwebfoundation.org/support-form/">contact form</a> on their website to get in touch.</p></div>Microsoft propose sustainability section in Edge DevTools2024-02-20T13:25:46Zhttps://fershad.com/writing/microsoft-propose-sustainability-section-in-edge-devtools/<div><p>Getting accurate measurements of a website or app’s carbon impact can be extremely difficult. Ismael Velasco talks through the nuance of digital carbon calculations <a href="https://ismaelvelasco.dev/emissions-in-1gb">on his blog</a>. Even if you just want to <a href="https://greensoftware.foundation/articles/how-to-measure-the-energy-consumption-of-your-frontend-application?utm_source=pocket_mylist">measure how much energy your site or app consumes</a>, you’d probably need to get yourself a watt-meter, and work out a way to run your app in isolation.</p><p>The growing number of online website carbon calculators serve as a great entry point for those interested in figuring out their site’s environmental impact. However, they are far from perfect. There are assumptions made about visitors, caching, and even different research and energy intensity figures used in calculations. David Mytton gave a well explained take on this in a recent episode of the <a href="https://greenio.gaelduez.com/e/5nzm9wk8-ep6-david-mytton-bringing-reliable-and-transparent-information-to-green-it">Green.io podcast</a>.</p><p>That’s not to throw any shade on website carbon calculators. They are a fantastic starting point, enabling more people to get a first insight into the carbon impacts of their website. With that in mind though, getting more detailed data into the hands of developers could really help progress the push towards more accurate digital carbon calculations. It may also help unlock the door to making digital carbon part of the development and business conversation, much in the same way performance and accessibility are now.</p><h2>Exposing more granular data</h2><p>In Ismael’s article, he covers the complexity of calculating digital carbon emissions since no two devices or circumstances are the same. That’s one of the reasons why a general-purpose website/digital carbon calculator can only ever provide indicative results. But with these calculators, and the methodologies behind them, as a starting point we can start looking for better ways to get more accurate results.</p><h3>Using more specific energy intensity figures</h3><p>To calculate the carbon emissions of an app or website, you’ll need to figure out the energy used for each part of the process and then multiply that by the grid intensity (CO2 per kilowatt hour). What you’ll normally see is that most digital carbon calculations use a global average grid intensity figure when returning results. 
Again, this is a great start and is probably the right figure to be using for calculating emissions from network traffic.</p><p>However, for location specific segments like data centers or end user devices it should be possible to use grid intensity figures for the country or area in which those segments are located. That’s something we’ve got in the works for CO2.js, as we look to make use of open data from <a href="https://www.notion.so/Microsoft-propose-sustainability-section-in-Edge-DevTools-4ca5cdd814ee42db8b5fdb0b8192e5dc">Ember</a> and the <a href="https://github.com/thegreenwebfoundation/co2.js/issues/97">UNFCCC</a>.</p><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">Aside</p><p></p><p>There’s plenty of research around the energy consumption of data centers, networks, and devices. There’s an equal, if not greater, amount of conjecture around the accuracy and validity of some research findings and figures. We’re not going to get into that here.</p><p></p></div><p>Most cloud hosting providers aren’t publishing detailed energy consumption figures for their data centers. Industry secrets, competitive advantages, stuff like that I suppose. Some do have average grid intensity values published (see <a href="https://cloud.google.com/sustainability/region-carbon#data">this example from Google</a>), but we’re still going to be working with a few generalisations and assumptions when working out the emissions attributed to data centers.</p><h3>Getting better energy usage data at the device level</h3><p>For the longest time we’ve also had to work on assumptions when determining how much energy our apps and websites consume on a user’s device. Safari does have some energy usage information surfaced in its DevTools, but I’m not an Apple person so can’t speak to this with any authority. For the main, though, to get detailed energy consumption data you’d have to calculate it yourself which is no mean feat.</p><p>That’s why I’m pretty excited by <a href="https://github.com/MicrosoftEdge/DevTools/blob/main/explainers/Sustainability/explainer.md">this proposal</a> from the Microsoft Edge DevTools team (we finally got to the topic of this post ey!). In it, they outline a new Sustainability tab in DevTools that would surface website sustainability issues, suggestions, and metrics. <a href="https://github.com/MicrosoftEdge/DevTools/issues/92#issuecomment-1226711956">I’ve got my questions</a> about the issues and suggestions. There are other comments <a href="https://github.com/MicrosoftEdge/DevTools/issues/92">on the feedback page</a> questioning if publishing a CO2 figure would really be all that helpful.</p><p>What interests me the most about this proposal, however, is the possibility that detailed energy usage data could be available at the device level. The current proposal suggests an energy consumption “score”, but I’d really like to see the actual consumption figures also provided. Being able to get this data could:</p><ul><li>Enable developers to see the energy consumption of their apps/sites.</li><li>Allow developers to assess the energy impacts of code changes, and block changes that increase energy usage.</li><li>Pave the way for more detailed device level emissions estimates.</li></ul><h3>What <em>might</em> be</h3><p>Those first two points have some tangible flow on effects as well. The less power an app or site is consuming, the less of a drain it will be on the battery of a mobile device or laptop. 
Energy consumption information can allow us to identify the sites or apps that might be draining our battery, and look for more effective alternatives. Once marketing teams grab a hold of this, we might even start to see energy consumption become part of an app’s messaging. Heck, Google and Apple might even put up “energy scores” as part of App Store listings.</p><p>The final point, about having more detailed device level emissions estimates, is meaningful to me as someone who does the odd website sustainability audit. If we’re able to somehow export this data to analytics and reporting tools, then we’d be able to greatly improve the accuracy of website carbon estimates. I wonder if we might even be able to see the impact of different file types (looking at you JavaScript) on energy consumption. Even without that, being able to get this information on a few different devices that I have on hand would be a great starting point.</p><p>The Edge team are at the start of what might be a long journey to get more sustainability data into DevTools. I’m sure that over time the data they can and plan to include will evolve with community feedback. You can read the <a href="https://github.com/MicrosoftEdge/DevTools/blob/main/explainers/Sustainability/explainer.md">full proposal</a> on GitHub, and submit your own feedback in <a href="https://github.com/MicrosoftEdge/DevTools/issues/92">this issue</a>.</p><p>With Edge now running on Chromium, I hope that the work the Microsoft team do here can also find its way into the DevTools of Chrome and other Chromium-based browsers. Because, while some data and assumptions about digital’s carbon impact <em>may</em> be imperfect, raising awareness of digital sustainability among more developers can help drive action for the better. Which is what our industry and our planet needs.</p></div>Building a digital carbon API with Cloudflare Workers and CO2.js2024-02-20T13:25:46Zhttps://fershad.com/writing/digital-carbon-api-cloudflare-workers-co2-js/<div><p>CO2.js is a library that allows developers to estimate the emissions related to use of their apps, websites, and software. At its core, CO2.js takes an input of data, in bytes, and returns an estimate of the carbon emissions produced to move that data over the internet.</p><p>You may look to use CO2.js directly inside of an app, website or digital service to calculate carbon emissions from user activities. Alternately, you could use it to generate, or feed data into, company reports. You could even hook it up to data from your home network to calculate your own digital carbon footprint.</p><p>Today, however, we’ll take CO2.js to the edge and use it inside of a Cloudflare Worker. We’ll use the Cloudflare Worker to build a simple API that will allow us to return a CO2 emissions result for a given number of bytes. This could come in handy if you are working on multiple apps that require digital carbon emission estimates. Rather than having to install and maintain multiple instances of CO2.js, you could point your apps to the API endpoint and fetch data from a single source.</p><h2>Getting started</h2><p>If you’d like to skip to the end then you can find the <a href="https://github.com/fershad/co2js-cloudflare-worker-api">source code for this tutorial</a> on GitHub.</p><h3>Installing Wrangler</h3><p>To start working with Cloudflare Workers we should have <a href="https://developers.cloudflare.com/workers/wrangler/get-started/">Wrangler</a> installed locally on our development machine. 
Wrangler is a command-line tool that helps with setting up, testing, and configuring Cloudflare Workers projects. Follow the steps in the <a href="https://developers.cloudflare.com/workers/wrangler/get-started/">Cloudflare docs</a> to install Wrangler and log in to your Cloudflare account.</p><h3>Spin up a project</h3><p>With Wrangler installed, we can get started on our project. We’re going to use Cloudflare’s <a href="https://github.com/cloudflare/worker-template-router">worker-template-router</a> to give us a jump start. This starter project comes set up with <a href="https://github.com/kwhitley/itty-router">itty-router</a>, allowing us to route the incoming requests to our Cloudflare Worker. To get started, create a new project by running the commands below.</p><pre class="language-sh"><code class="language-sh">git clone https://github.com/cloudflare/worker-template-router co2js-api
cd co2js-api
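# Install the template's dependencies, including itty-router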
npm install</code></pre><p>With this done, you can open the project folder in your code editor. We’ve still got one more project setup step to complete before we can start writing code. Find the <code class="language-markup">wrangler.toml</code> file in the root of the project, and open it. Replace the entire contents of the file with the bare bones snippet below.</p><pre class="language-text"><code class="language-text"># wrangler.toml
name = "co2js-api"
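# Serve the Worker from a workers.dev subdomain while developing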
workers_dev = true
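# Entry point file for the Worker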
main = "./index.js"</code></pre><p>This is a really simple configuration, and in practice you’d set this up more thoroughly so as to deploy the Worker to the right account and routes.</p><h2>Setting up the router</h2><p>With our project now configured, we can start writing some code! Well, deleting some code to begin with.</p><p>Head over to the <code class="language-markup">index.js</code> file that’s in the root of the project. Here you’ll find some sample router code already written for us. Take some time to have a read of it if you’d like to better understand it.</p><p>When you’re ready to proceed, delete the entire content of this file, and replace it with the code below:</p><pre class="language-javascript"><code class="language-javascript">// index.js
import { Router } from 'itty-router'
// Create a new router
const router = Router()
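// Respond to GET requests on the root path with a short message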
router.get("/", () => {
  return new Response("Hello, world! This is the root page of your Worker template.")
})
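// Catch-all route: anything else returns a 404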
router.all("*", () => new Response("404, not found!", { status: 404 }))
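// Hand every incoming fetch event over to the router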
addEventListener('fetch', (e) => {
  e.respondWith(router.handle(e.request))
})</code></pre><p>You can now run <code class="language-markup">wrangler dev</code> in your terminal, and select the Cloudflare account you want this Worker to be deployed to. Wrangler will then proceed to spin up a local version of the Worker on <code class="language-markup">localhost:8787</code> which we can use for testing. If you navigate to <code class="language-markup">localhost:8787</code> in your browser then you should see the message “<em>Hello, world! This is the root page of your Worker template.”</em> displayed.</p><h2>Create a custom route</h2><p>Since we want to be able to send this API a number of bytes and have it return us a carbon emissions estimate, we’ll need to create our own route. We’ll be using path parameters for this.</p><p>Let’s add a new route to the <code class="language-markup">index.js</code> file, just before the <code class="language-markup">router.all</code> statement. We’ll leave it empty for now.</p><pre class="language-javascript"><code class="language-javascript">// index.js
router.get("/bytes/:value", ({ params }) => {
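  // Left empty for now; we'll return a CO2.js estimate here once the library is installed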
})</code></pre><p>We now have a <code class="language-markup">/bytes/</code> endpoint which expects to receive a value passed to it. When hitting this endpoint in real life, we’ll replace the path parameter (<code class="language-markup">:value</code>) with a number of bytes we’d like to get a result for.</p><h2>Installing CO2.js</h2><p>To install CO2.js from NPM, run the <code class="language-markup">npm install @tgwf/co2</code> command. This will add CO2.js to the project dependencies.</p><h3>Using CO2.js in the router</h3><p>To use CO2.js with our router, we’ll need to import it into the <code class="language-markup">index.js</code> file. Add the below import statement to the top of the file.</p><pre class="language-javascript"><code class="language-javascript">import { co2 } from '@tgwf/co2'</code></pre><p>Now we can use CO2.js in the new route we created.</p><pre class="language-javascript"><code class="language-javascript">router.get("/bytes/:value", ({ params }) => {
  const { value } = params;
  const emissions = new co2();
  const result = emissions.perByte(value);
  return new Response(JSON.stringify({ result }), {
    headers: {
      "Content-Type": "application/json"
    }
  });
})</code></pre><p>Let’s step through what this block of code does:</p><ul><li>Firstly, we can get the value parameter from the request <code class="language-markup">params</code> object.</li><li>Next, we initialise a new instance of CO2.js. At the time of writing, this uses the <a href="https://developers.thegreenwebfoundation.org/co2js/explainer/methodologies-for-calculating-website-carbon/">default OneByte model</a> for carbon emissions estimates.</li><li>Once initialised, we can use the <code class="language-markup">perByte</code> function to return the emissions (in grams) for the value we pass into it.</li><li>Finally, we wrap this up in a JSON object, stringify it, and send it back to the user.</li></ul><p>Now, we’re able to test our route by running the <code class="language-markup">wrangler dev</code> command. Once your Worker is running locally, you can go to <code class="language-markup">localhost:8787/bytes/1000000</code> to test it out. You should see a value of <code class="language-markup">0.29081299999999993</code> returned.</p><h2>Next steps</h2><p>We’ve just created a very simple API for calculating digital carbon emissions using CO2.js and Cloudflare Workers. If you’d like to keep building on this, here are a few ideas you can try:</p><ul><li>Add some checks to ensure the value parameter is a number.</li><li>Try creating more routes for different file sizes (kilobytes, megabytes etc.). Remember, though, that CO2.js performs calculations based on bytes. So you’ll need to do some conversions first.</li><li>Take a look at the <a href="https://developers.thegreenwebfoundation.org/co2js/models/">CO2.js docs</a>, and try using the Sustainable Web Design model instead.</li></ul><p>You can find the <a href="https://github.com/fershad/co2js-cloudflare-worker-api">source code for this tutorial</a> on GitHub.</p></div>Release guide: CO2.js v0.102024-02-20T13:25:46Zhttps://fershad.com/writing/release-guide-co2-js-v0-10/<div><p>The v0.10.0 release of CO2.js saw several large developer experience and usability improvements added to the library. Although, at the time of writing, the v0.10.x releases have already been out for over a month, we feel that it is important to document and cover some of the key updates delivered in this version of the library.</p><h2>CO2.js goes hybrid</h2><p>The <a href="https://github.com/thegreenwebfoundation/co2.js/releases/tag/v0.10.1">v0.10.1 release</a> introduced an updated build process for the CO2.js library, using <a href="https://esbuild.github.io/">esbuild</a>. This allowed for three different versions of the library to be generated, to target three kinds of platforms:</p><ol><li>A browser build using an IIFE</li><li>A CommonJS compatible build for pre-ESM versions of NodeJS</li><li>An ES Modules compatible build for modern, ESM compatible JavaScript runtimes</li></ol><p>By making this change, we enable developers to easily use CO2.js in more places. Examples of this new capability include running <a href="https://github.com/fershad/co2js-cloudflare-worker-api">CO2.js in a Cloudflare Worker</a>, and <a href="https://gitpod.io/#https://github.com/thegreenwebfoundation/co2.js">calculating emissions in the browser</a>.</p><h2>Streamline changing between estimation models</h2><p><a href="https://github.com/thegreenwebfoundation/co2.js/releases/tag/v0.10.2">In v0.10.2</a> of CO2.js we simplified the steps required to switch between the OneByte and Sustainable Web Design carbon estimation models.
Previously, developers wishing to switch from the default OneByte model to the newer Sustainable Web Design model had to manually change the CO2.js source code.</p><p>v0.10.2 introduced a new <code class="language-markup">model</code> parameter which can be passed to CO2.js as an option. By passing in <code class="language-markup">{ model: "swd" }</code>, developers can quickly switch to the Sustainable Web Design model after importing the library into their project.</p><p>The addition of this parameter also makes it possible for us to change the default model while still providing a frictionless way for developers to continue using the OneByte model if they wish to.</p><p>You can find more details, and code examples, for changing models in the <a href="https://developers.thegreenwebfoundation.org/co2js/models/">CO2.js docs</a>.</p><h2>Adding a perVisit method</h2><p>CO2.js has always had a <code class="language-markup">perByte</code> method, which returns a CO2 value (in grams) for raw data transfer. In <a href="https://github.com/thegreenwebfoundation/co2.js/releases/tag/v0.10.3">v0.10.3</a>, we introduced a new <code class="language-markup">perVisit</code> method which can be used specifically for calculating the carbon emissions of websites.</p><p>The <code class="language-markup">perVisit</code> method is only available when using the Sustainable Web Design model. It adopts the assumptions the model authors have made <a href="https://sustainablewebdesign.org/calculating-digital-emissions/#:~:text=Returning%20visitors%20are%20assumed%20to%20be%2025%25%2C%20loading%202%25%20of%20data.">about website visitors and caching</a> as part of its calculation.</p><p>You can find more details about both models, and code samples using both, in the <a href="https://developers.thegreenwebfoundation.org/co2js/methods/">CO2.js docs</a>.</p><p>You can find <a href="https://github.com/thegreenwebfoundation/co2.js/releases">details of every release</a> for CO2.js on GitHub, where you’ll also be able to find the <a href="https://github.com/thegreenwebfoundation/co2.js/blob/main/CHANGELOG.md">changelog</a> for this project.</p><p>If you are using CO2.js in production then The Green Web Foundation would love to hear from you! Use the <a href="https://www.thegreenwebfoundation.org/support-form/">contact form</a> on the website to get in touch.</p></div>A carbon aware internet2024-02-20T13:25:46Zhttps://fershad.com/writing/a-carbon-aware-internet/<div><p>Recently I’ve been doing a bit of writing for The Green Web Foundation. Most of that has involved building <a href="https://www.notion.so/A-carbon-aware-Internet-d6c350d466804249aedf98571e76a67f">better, more beginner-friendly documentation</a> for some of the foundation’s open-source code libraries.</p><p>One of those libraries, the Grid Intensity CLI, aims to provide developers with global grid intensity data from a range of providers. In doing so, it can enable developers to surface, monitor, and understand the carbon intensity of the code they write. This, in turn, allows them to make decisions on when/where to run their code so that it uses as much green energy as possible.</p><h2>The Grid Intensity CLI</h2><p>The <a href="https://github.com/thegreenwebfoundation/grid-intensity-go">Grid Intensity CLI</a> is a library written in Go. At its most basic, the CLI can be used to return the carbon intensity data of electricity grids around the world.
It does this by leveraging APIs of several providers including <a href="https://ember-climate.org/">Ember</a>, <a href="https://electricitymaps.com/">Electricity Maps</a>, and <a href="https://www.watttime.org/">WattTime</a>.</p><p>At a more practical level, the CLI comes with a data exporter. This can be used to spin up a server that exposes data to graphing, scraping, and monitoring tools. Setting this up on the servers or clusters that are being used for a site or app can allow operations teams to gain insights into the carbon intensity of the code they run. This makes it possible for them to then influence the carbon intensity of that code by moving it through time and space.</p><h3><strong>Moving through time</strong></h3><p>Sometimes code <em>needs</em> to be run in a particular region. In these instances, developers can use grid information to identify periods of low carbon intensity. These are times when more green or renewable electricity is in the fuel mix. Scheduled jobs or heavy computational tasks can then be run during periods of low grid intensity, making them less carbon intensive.</p><h3><strong>Moving through space</strong></h3><p>Increasingly, code is deployed to multiple regions around the world. In this scenario, developers can use the Grid Intensity CLI to easily consolidate carbon intensity data for different regions into a single dashboard or dataset.</p><p>With this information at hand, they can look for ways to run code in locations with a greener fuel mix. This can be extended to smarter, carbon-aware routing so that more requests are directed to code running in regions with a lower carbon intensity.</p><h2>Using the Grid Intensity CLI</h2><h3>Installing</h3><p>You can install the Grid Intensity CLI locally to try it out. If you’re familiar with using the terminal, then running the curl command below is all you need to get going.</p><pre class="language-sh"><code class="language-sh">curl -fsSL https://raw.githubusercontent.com/thegreenwebfoundation/grid-intensity-go/install-script/install.sh | sudo sh</code></pre><h3>Getting data</h3><p>Once you’ve installed the CLI, you can run <code class="language-markup">grid-intensity --region=TW</code> to get the last calendar year’s grid intensity data for Taiwan.</p><p>By default, the Grid Intensity CLI uses data from <a href="https://ember-climate.org/">Ember Climate</a>. You can change the value passed to the <code class="language-markup">--region</code> flag to return data for different parts of the world. With Ember, you’ll need to use an <a href="https://www.iso.org/obp/ui/#search">Alpha-2 or Alpha-3 ISO country code</a>.</p><h3>Changing provider</h3><p>Ember provides historical data. However, if we want to get more recent data for a particular region then we can use one of the other provider integrations.</p><p>Let’s say we have a server in the UK, and want to know the latest grid intensity. We can take the <a href="https://carbonintensity.org.uk/">UK Carbon Intensity API</a> for a spin.
Running <code class="language-markup">grid-intensity --provider=carbonintensity.org.uk --region=UK</code> in the terminal will bring back the latest intensity data (which is updated every half hour).</p><p>The docs have more details on the <a href="https://developers.thegreenwebfoundation.org/grid-intensity-cli/explainer/providers/">other providers available</a> with the CLI.</p><h3>Exporting data</h3><p>Running these commands in the terminal is all well and good, but in the real world you probably want to expose this data to some monitoring or decision-making systems. The Grid Intensity CLI has an <code class="language-markup">exporter</code> command for doing just this.</p><p>Running the <code class="language-markup">grid-intensity exporter --provider=carbonintensity.org.uk --region=UK</code> command will start a <a href="https://prometheus.io/">Prometheus</a> server on localhost port 8000. Once the server has started, you can go to <code class="language-markup">localhost:8000/metrics</code> where you’ll be presented with a page full of data and stats. Doing a search for <code class="language-markup">grid_intensity_carbon_average</code> will bring up the grid intensity information for the UK. Since the UK Carbon Intensity API is updated every 30 minutes, you can leave the server running & refresh it later to see updated results.</p><p>To set this up in production, you can use Docker to <a href="https://github.com/thegreenwebfoundation/grid-intensity-go#docker-image">deploy the exporter</a> to a <a href="https://github.com/thegreenwebfoundation/grid-intensity-go#kubernetes">Kubernetes</a> or <a href="https://github.com/thegreenwebfoundation/grid-intensity-go#nomad">Nomad</a> cluster. Prometheus can also be used as a data source for <a href="https://prometheus.io/docs/visualization/grafana/">Grafana visualisations</a>. None of these are things I’m overly familiar with, so I can’t go into more detail.</p><h2>Learning more</h2><p>Documentation for the Grid Intensity CLI can be found on <a href="https://developers.thegreenwebfoundation.org/grid-intensity-cli">Developer Docs @ The Green Web Foundation</a>. You can also check out the <a href="https://github.com/thegreenwebfoundation/grid-intensity-go">repository on GitHub</a>.</p><p>It’s relatively early days, and what I’ve covered above is really just a taste of what’s possible with the Grid Intensity CLI. As more provider integrations are added, and the tool matures, it’ll be interesting to see what real world use cases it serves.</p></div>Testing a page with Performance Insights2024-02-20T13:25:46Zhttps://fershad.com/writing/testing-a-page-with-performance-insights/<div><p><a href="https://www.builder.io/c/performance-insights">Performance Insights</a> is a relatively new entrant in the website performance tooling space. Created by the team at <a href="http://builder.io/">Builder.io</a>, it takes a slightly different approach to website performance auditing. Rather than presenting a whole swath of data on your site’s current performance, this tool focuses on what improvements you can make and the impact they could have.</p><h2>Getting started</h2><p>Head over to <a href="https://www.builder.io/c/performance-insights">https://www.builder.io/c/performance-insights</a> and enter a web page URL.
Hit ‘Analyze’ and the tool will start running the page through a series of emulated mobile Google Lighthouse tests.</p><h2>The results</h2><p>Once the tests have been run, you’ll be presented with a set of five cards showing you the results of the tests.</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/3b899f4252d0ed7eec31bf6195cfe42317559257-1610x703.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/3b899f4252d0ed7eec31bf6195cfe42317559257-1610x703.png?auto=format" alt="Results showing current performance and potential optimisations." loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">Results shown on Performance Insights</figcaption></figure><p>Only the first card, titled “<strong>Current</strong>”, shows you information about how your website is performing at this point in time. The other four cards show you the <em>potential</em> impact of optimising different parts of your website’s content.</p><p>All the scores shown are the Lighthouse Performance Score. While this score isn’t the most representative web performance metric, in this case it does help to provide a nice and neat guide.</p><p>Personally, I find the way these results are presented to be very useful when starting out an audit on a website. They allow me to see where I can start looking to potentially find some quick wins. More and more I’ve found myself checking Performance Insights first, before diving into more detailed tools like WebPageTest.</p><h2>Suggested improvements</h2><p>Below the results, you’ll find a section with suggested improvements. To be honest, I don’t really scroll down this far when using Performance Insights.</p><p>In this section you’ll find a brief explanation about the suggested improvement, and some related libraries and tools you can use. Suggestions don’t appear to be customised based on your results, other than highlighting any high impact areas to look at.</p><p>Since Performance Insights is still in beta, I’m hopeful there’ll be a bit more work done to present suggested improvements that are relevant to your site’s results, and/or the tooling you use to build your site (improvements for a WordPress site <em>could</em> differ from those for a Gatsby site for example).</p></div>YouTube facades with Cloudflare Workers2024-02-20T13:25:46Zhttps://fershad.com/writing/youtube-facades-with-cloudflare-workers/<div><p><a href="https://web.dev/third-party-facades/">Lazy load third-party resources with facades</a>. That’s one of the recommendations you might have come across when running your site through Google Lighthouse audits. Using facades can greatly reduce the amount of data downloaded, and (at times) computation required, when a page first loads.</p><p>YouTube embeds are a very good use case for facades. Out of the box, a regular YouTube embed downloads around 1MB of data as a page loads. Not just that, but a lot of this data is in the form of JavaScript which the browser also has to parse and execute.</p><p>In most cases, a facade will load a placeholder image & pull in the rest of the content only if the user clicks to view the video. In this way, both data download and execution are deferred until it’s actually needed.</p><h2>Using facades without modifying the source</h2><p>Ideally, you’d be able to modify the source code for a web page to replace YouTube embeds with facades. But what if you can’t? That’s a question that I was having a think about last week.</p><p>Often, in audits I run, reducing the impact of YouTube videos is one of the key sustainability findings that emerges. Sometimes, the website owners I work with are in a position to make changes to address that.</p><p>But what about when they’re not able to? I was wrestling with that last week, and realised this is a perfect use case for edge functions. So, I wrote up a quick Cloudflare Worker as a proof of concept.</p><h2>YouTube Lite Worker</h2><p>You can find the code for <a href="https://github.com/fershad/yt-lite-worker">this Cloudflare Worker on GitHub</a>. It relies on Cloudflare Workers’ HTML Rewriter API and Cheerio.js to do most of the heavy lifting.
It replaces standard YouTube iframes with the <a href="https://github.com/justinribeiro/lite-youtube">justinribeiro/lite-youtube</a> implementation. Let’s take a quick walk through the code.</p><pre class="language-javascript"><code class="language-javascript">import * as cheerio from 'cheerio';
const ytIdRegex = /^.*((youtu.be\/)|(v\/)|(\/u\/\w\/)|(embed\/)|(watch\?))\??v?=?([^#&?]*).*/;

// Extract the 11-character YouTube video ID from an embed URL
function getID(url) {
  const match = url.match(ytIdRegex);
  return (match && match[7].length == 11) ? match[7] : false;
}

// Parse the fetched HTML and swap YouTube iframes for <lite-youtube> elements
async function findIframes(res) {
  const html = await res.text()
  try {
    const $ = cheerio.load(html)
    const iframes = $('iframe[src*="youtube"]')
    for (const iframe of $(iframes)) {
      const src = $(iframe).attr('src')
      const id = getID(src)
      const className = $(iframe).attr('class')
      const params = new URL(src).searchParams.toString()
      if (id) {
        const lite = `<lite-youtube class="${className || ''}" videoid="${id}" nocookie params='${params}'> </lite-youtube>`
        $(iframe).replaceWith(lite)
      }
    }
    return $.html()
  } catch {
    // If parsing fails, fall back to the original HTML
    console.log('Error parsing html')
    return html
  }
}

// HTMLRewriter handler that appends the lite-youtube script to the <body>
class addJS {
  async element(element) {
    element.append(`<script type="module" src="https://cdn.jsdelivr.net/npm/@justinribeiro/lite-youtube@1.3.1/lite-youtube.js"></script>`, {
      html: true,
    })
  }
}

async function handleRequest(req) {
  const acceptHeader = req.headers.get('accept');
  // Only rewrite requests for HTML documents
  if (acceptHeader && acceptHeader.indexOf('text/html') >= 0) {
    const url = new URL(req.url);
    const res = await fetch(url)
    const html = await findIframes(res);
    const newRes = new Response(html, {
      headers: {
        'Content-Type': 'text/html',
      }
    });
    const rewriter = new HTMLRewriter().on('body', new addJS())
    return rewriter.transform(newRes)
  }
  // Everything else passes through untouched
  return fetch(req.url, req)
}

addEventListener('fetch', event => {
  event.respondWith(handleRequest(event.request))
})</code></pre><p>Starting from the bottom, this Worker would sit on a route waiting for <code class="language-markup">fetch</code> requests to be made. When a request is made, the Worker sends it to the <code class="language-markup">handleRequest()</code> function to perform some action on it.</p><p>The <code class="language-markup">handleRequest()</code> function is where all the work happens. First, we check to see if the request being made is for an HTML page. If it’s not, we just pass it through without any modifications.</p><p>If we are dealing with an HTML request, then we fetch that page’s content. We then pass that to the <code class="language-markup">findIframes()</code> function. In there, Cheerio.js is used to parse the HTML of the page, looking for <code class="language-markup">iframe</code> tags with a <code class="language-markup">src</code> attribute that contains the string <code class="language-markup">youtube</code>. When matching iframes exist, we loop through them, parsing the <code class="language-markup">src</code> to extract some parameters like the YouTube video ID. Then, we build the <code class="language-markup"><lite-youtube></code> web component that replaces the YouTube iframe in the HTML.</p><p>Once all this is done (within a few tens of milliseconds), the HTML then gets returned to the user. From the user’s perspective, they’ll see the same web page as they would have originally. But under the hood, the browser has been able to avoid the overhead of downloading and parsing extra content to show the YouTube videos.</p><h2>The same can be done for Vimeo</h2><p>I’ve also created a <a href="https://github.com/fershad/vimeo-lite-worker">similar repository for Vimeo</a> embeds. It works much in the same way as described above. That said, Vimeo doesn’t seem to send down <em>as much</em> data initially when compared to YouTube, so the sustainability and performance upsides are less pronounced.</p></div>Hidden in plain sight2024-02-20T13:25:46Zhttps://fershad.com/writing/hidden-in-plain-sight/<div><p>This post looks at a small code snippet that I was shown during a conversation about eliminating flash of unstyled text (FOUT) from a page. It does that, but actually exposes the site to some pretty nasty performance issues.</p><p>On a recent call I was shown the below code snippet by a designer.</p><pre class="language-html"><code class="language-html"><script>
var Webflow = Webflow || [];
Webflow.push(function () {
$('html').addClass('webflow-loaded')
});
</script>
<style>
.wf-loading * {
opacity: 0;
}
</style></code></pre><p>Upon seeing it, I broke out into a cold sweat. The designer went on to explain how they were looking for a way to prevent text visually changing as the page was loading (FOUT), and had found this code snippet recommended on a forum. It did the trick. Happy days.</p><p>But this code also sets their site up for the possibility of not showing any content at all. Let’s take a look at what it’s doing, and why it’s not so great from a performance perspective.</p><h2>An invisibility cloak</h2><p>To understand what’s going on we’ll start from the bottom.</p><pre class="language-html"><code class="language-html"><style>
.wf-loading * {
opacity: 0;
}
</style></code></pre><p>You might be able to take a guess at what this small style block is doing. It tells the browser that anything (<code class="language-markup">*</code>) that lives inside a parent element with a class of <code class="language-markup">.wf-loading</code> should be hidden (<code class="language-markup">opacity: 0</code>).</p><p>On the page we were looking at, the <code class="language-markup">.wf-loading</code> class was applied to the <code class="language-markup"><html></code> tag. On a web page, <em>everything</em> is a child of the <code class="language-markup"><html></code> tag. The code above was effectively telling the browser <em>“just hide everything on the page”.</em></p><h2>The great reveal. Maybe.</h2><p>The script block above the style tag is where the magic happens.</p><pre class="language-html"><code class="language-html"><script>
var Webflow = Webflow || [];
Webflow.push(function () {
$('html').addClass('webflow-loaded')
});
</script></code></pre><p>This block declares a Webflow variable, and then adds a function to it that adds the <code class="language-markup">webflow-loaded</code> class to the <code class="language-markup"><html></code> tag on the page. That class will reveal the page by setting <code class="language-markup">opacity: 1</code> for all children of the <code class="language-markup"><html></code> tag.</p><p>As an aside, this is a very similar technique to what most A-B testing services use. That’s a topic for another time, but it’s worth being aware of how they might be impacting your site’s performance as well.</p><h2>What could possibly go wrong?</h2><p>When the page is loaded on a fast, desktop internet connection, things go pretty smoothly. Largest Contentful Paint fires well within the threshold for a “Good” Core Web Vitals score.</p><p>Things start getting a little hairy when moving over to mobile though. On a 4G network, testing on a Motorola G4, the Largest Contentful Paint time jumps out to over 5 seconds. On a 3G connection that goes up to almost 9 seconds.</p><p>That’s 9 seconds during which the user is presented with no content, despite the fact that most of the page’s CSS, fonts, and images have already been downloaded. This is because the browser is waiting for the Webflow variable to be declared. This declaration relies on the Webflow JavaScript file being downloaded and parsed.</p><h3>“All your users are non-JS while they’re downloading your <strong>JS.”</strong></h3><p>So said Jake Archibald. And how true that is. As I touched on just above, while the user is waiting for the JavaScript package to finish downloading and executing, they’re looking at a plain white screen for close to 10 seconds.</p><p>Research suggests 53% of visits are likely to be abandoned if pages take longer than 3 seconds to load (<a href="https://blog.google/products/admanager/the-need-for-mobile-speed/">Google, 2016</a>). That 10 second window could be seeing hundreds, or thousands, of visitors abandoning the site.</p><h2>What could be done to fix this?</h2><p>First up, I’d remove the script entirely. Browsers do an amazing job of loading websites, and we should embrace that (plus give them a bit of help occasionally). You’d be surprised how often performance issues are a result of us trying to be too clever for our own good.</p><p>But okay, there’s still the desire on the part of the client to reduce/remove FOUT on the page. That could be tackled in a few different ways. We touched on a few during the <a href="https://optimised.email/series/optimising-web-fonts/">series on web font optimisation</a> last year.</p><ol><li>Use system fonts for most (or all) of the text content.</li><li>Limit the number of custom web fonts that are used.</li><li>Subset the web fonts you’re using. This removes characters that are surplus to requirements, and brings down the file size.</li><li>Use the <code class="language-markup">font-display: optional</code> declaration. This instructs the browser to hide text for 100ms and then load the web font only if it's available. If it's not ready, then the browser will use a fallback font instead for that page view.
The custom font will be saved in cache, ready for the next time it’s needed.</li><li>Set an appropriate fallback font that most closely matches the custom font you’re using.</li></ol><p>Using one, or a combination of, the techniques above we can <em>minimise</em> FOUT on the page while also ensuring users are presented with content as soon as possible during page load.</p></div>Take it easy with transitions2024-02-20T13:25:46Zhttps://fershad.com/writing/take-it-easy-with-transitions/<div><p>Transitions and animations can make for wonderful website experiences. <a href="https://www.cassie.codes/">Cassie Evans’s personal website</a> is a playful, whimsical corner of the internet that is delightfully crafted. But as with a lot of things, transitions and animations can be overused on websites. And, at times, that can come to the detriment of web performance.</p><p>This post will look at how transitioning content above the fold can impact a page’s performance metrics.</p><h2>Triggered</h2><p>Okay, so a quick admission. I’m not a huge fan of pages that fade in entire sections of content as you scroll. So, even though we’ll be concentrating on content ‘above the fold’, I’ve got some biases when it comes to what we’ll talk about for the rest of this article.</p><h2>Full page fade</h2><p>This is something I’ve seen on a few sites recently, and it’s a very, very easy way to absolutely trash your site’s Core Web Vitals scores.</p><p>Fading in a page when a user lands on it might look “cool”, but it relies heavily on JavaScript to kick off the fade in transition. This is because you probably want to wait until the page’s content has been loaded before fading it in. You’ll therefore need a script that’s listening for some load event, and then executing some code (normally adding/removing a class from the <code class="language-markup">body</code>).</p><p>I was recently looking at a site that was pulling in all their assets from multiple external domains. They were using jQuery UI as part of their theme to fade in the page once all the content had been loaded. Connections to the external origins were setting their page load back by about 500 milliseconds. Meanwhile, waiting for jQuery and its dependencies to load (also from external origins) was tacking on another 1.5 seconds. The jQuery code was then waiting for the rest of the page’s content to download before revealing the page.</p><p>It’s little wonder that LCP timings on mobile were upwards of 5 seconds in this case.</p><h2>Partial fades</h2><p>This is another one that I see regularly. Rather than transitioning in the full page, effects are added to individual elements above the fold. While this is better than waiting for the entire page to load, poor implementation can still lead to LCP or CLS blowouts.</p><h3>Minimise your reliance on JS</h3><p>If you do want to partially fade in a particular above-the-fold element on your site, then try to avoid a situation like the one above. If possible, try to target the element specifically using as little JavaScript as possible. You can use the <code class="language-markup">onload</code> attribute to trigger a small bit of code that adds a CSS class to fade in the element once it’s ready. This is far more efficient than having to download an entire JavaScript library.</p><h3>Know your LCP</h3><p>It could be the case that the element you’re fading in is also the LCP candidate for the page.
In this case, you’ll want to be sure to check how the transition effect impacts your LCP timings.</p><h3>Watch out for layout shifts</h3><p>This could happen in the event that you’re using a JavaScript library to control how content on your page is positioned. One site I recently audited experienced this. Their page would load, with content in one position, then once their theme’s JavaScript kicked in that content would reposition itself and have some transition effects applied. This presents a rather janky experience for the user, and is something you’ll want to try and avoid.</p><h2>Sprinkles are good</h2><p>As with a lot of things, when used in moderation transitions can add a remarkable amount of character to a website. Being selective in how you apply transitions above the fold can go a long way to ensuring they’re not negatively impacting on your site’s performance too.</p></div>Introducing Flowty - Build low carbon, self-hosted Webflow sites2024-02-20T13:25:46Zhttps://fershad.com/writing/introducing-flowty-build-low-carbon-webflow-sites/<div><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">Update 24 March 2023</p><p></p><p>Due to a legal notice from Webflow, the Flowty project can no longer be publicly accessed. Please note that some website links, and Github links in this post may not work as expected.</p><p></p></div><p><a href="https://webflow.com/">Webflow</a> is a really nice service for designers on the web. It couples a drag and drop builder with options to add more granular configurations on elements. It really allows web designers to create beautiful websites without getting too much into the technical weeds.</p><p>That said, this ease can sometimes come at the cost of web sustainability. Webflow sites are hosted on Amazon Web Services (AWS). Although AWS has taken steps to improve the sustainability of their services, they still have a way to go. Webflow itself is also currently lacking a few optimisations that would allow designers to build truly sustainable sites.</p><h2>Nerd sniped</h2><p>The idea for creating optimised, low carbon Webflow sites goes back a couple of months to a conversation in the <a href="https://climateaction.tech/">Climate Action Tech community</a>. A post from <a href="https://www.suninthecorner.com/">designer Katy Jackson</a> triggered my curiosity. Katy is focused on delivering low-carbon, sustainable websites for her clients. As part of this, she was looking for a sustainable option to host the sites she builds with Webflow.</p><p>After some back and forth, and with Katy trying other services that allow Webflow sites to be self-hosted, we jumped on a call. Chatting with her, and getting a sense of how these other services worked, I began to think “heck, I could probably spin up something that could solve this”. I had officially nerd sniped myself.</p><h2>The idea behind Flowty</h2><p>With Flowty, you still use Webflow’s design and editing tools to build, publish, and maintain sites. Flowty’s code then takes your Webflow site, runs the pages through the <a href="https://www.11ty.dev/">Eleventy static site generator</a>, and applies a series of optimisations to the page content and assets. Flowty outputs sites as good old fashioned HTML files. This allows sites to be hosted almost anywhere. 
Through this flexibility, plus the optimisations applied, Flowty can help designers deliver more sustainable, low carbon Webflow sites for their clients.</p><h2>What does Flowty optimise?</h2><p>In deciding how to allow Flowty to be configured, I wanted to keep things as simple as possible. Designers should be allowed to focus their efforts on creating amazing sites for their clients, not having to hand-roll website code and manually optimise images. For that reason, I have chosen to build an online dashboard that will provide a no-code way for designers to select the optimisations to apply on their sites.</p><p>Via the dashboard, designers can turn on/off the following optimisations:</p><h3>Image optimisations</h3><ul><li><strong>Download:</strong> Images can be downloaded, and served from the same domain as the website itself. This helps with performance, and ensures they are hosted sustainably if a green web host is used.</li><li><strong>Modern formats:</strong> At the time of writing, Webflow does not serve WebP or AVIF versions of images. Flowty runs all images through an optimisation step and serves modern formats with a fallback for older browsers.</li><li><strong>Optimise background images:</strong> Background images that are requested via CSS also get downloaded locally, and run through an optimisation step to reduce their size.</li></ul><h3>CSS optimisations</h3><ul><li><strong>Download:</strong> Like images, Flowty also downloads Webflow CSS files and serves them locally.</li><li><strong>Inline critical CSS:</strong> An optional step to help with performance. Flowty extracts the CSS required for the initial rendering of a page, and inlines it into the HTML.</li><li><strong>Remove unused CSS:</strong> An optional step that should be used with care. Flowty checks each page, and generates a CSS file that includes only the declarations required by that page. This can greatly reduce the CSS file size for some pages.</li></ul><h3>JavaScript optimisations</h3><ul><li><strong>Download:</strong> The same idea as CSS and images. Flowty downloads both the Webflow JS and jQuery files used by a site.</li><li><strong>Remove:</strong> Some sites don’t need much interactivity. In these cases, designers can save kilobytes by removing the Webflow JS or jQuery files from the site.</li><li><strong>Minify:</strong> Sites on Webflow’s free plan come with unminified JS. Flowty runs a minification step on the JavaScript files it downloads, to reduce their transfer size.</li></ul><h3>Font optimisations</h3><ul><li><strong>Download:</strong> Webflow allows designers to use Google Fonts or upload their own custom fonts.
Flowty downloads these font files and hosts them on the same origin as the site itself.</li></ul><h3><strong>Video optimisations</strong></h3><ul><li><strong>Download:</strong> Background videos uploaded to Webflow are downloaded by Flowty and served locally.</li><li><strong>Optimise Youtube embeds:</strong> Flowty uses the <a href="https://github.com/paulirish/lite-youtube-embed">Lite Youtube Embed package</a> to significantly reduce the amount of data consumed by embedded Youtube videos when a page is first loaded.</li></ul><h2>A few added extras</h2><p>On top of having the option to host a Webflow site on a sustainable hosting provider, plus the optimisations mentioned above, Flowty has a few extra bonuses that designers can leverage.</p><ul><li>Download website metadata (icons, open graph images etc) and host them locally.</li><li>Use <a href="http://instant.page/">instant.page</a> to improve site navigation.</li><li>Add custom code to the <code class="language-markup">head</code> and <code class="language-markup">body</code> of a site (currently available in Webflow only through a paid plan).</li><li>Remove the “Made in Webflow” branding on pages.</li><li>Generate sitemap and robots.txt.</li></ul><p>As the web platform continues to evolve, I’ll be looking at how to bring more sustainability and performance optimisations into Flowty.</p><h2>Show me some results</h2><p>Glad you asked! In testing with some sites, I’ve seen multiple megabyte reductions in page size. On others the gains are less significant. To test things out for myself, I built a simple landing page for Flowty using Webflow. You can visit it at <a href="https://flowty.site/">https://flowty.site</a>.</p><p>Saying it’s just a simple page is a bit of a disservice. It does include a background image, and a Youtube embed for good measure. The Webflow hosted version (which <a href="https://flowty-landing-page.webflow.io/">you can find here</a>) comes in at 1.2mb. Using Flowty, and hosting on Cloudflare Pages, the size of the page came down to just 125kb. It’s worth noting that the Flowty version of the site also includes a <a href="https://usefathom.com/ref/CEHKLY">Fathom Analytics</a> (affiliate link) script that is added to the site via the custom code configuration option in Flowty.</p><p>Even with that additional script we’re able to get more than 1MB off the size of the page, and host it on Cloudflare Pages, one of the more sustainable options for static site CI/CD hosting. Pretty neat.</p><h2><del>Want to find out more?</del></h2><p><del>Flowty will be open to a very early access set of alpha users in May. If you’re building sites with Webflow and would like to see a demo, then head over to the <a href="https://flowty.site/">Flowty landing page</a> and use the link at the bottom (or drop me an email)!</del></p><p><del><strong>Update Sept 25, 2022: </strong>Flowty is now an open source project. It is also no longer maintained, updated, or supported. Recent actions by Webflow towards similar paid services have made me decide to stop work on this project.
By making the code available, I hope that those who do care about web sustainability, but want/need to use Webflow can still have a means to build & host their site sustainably.</del></p></div>Adding a Directory and API to ‘Are my third parties green?’2024-02-20T13:25:46Zhttps://fershad.com/writing/adding-a-directory-and-api-to-are-my-third-parties-green/<div><p>Since it launched a few months ago, ‘<a href="http://aremythirdpartiesgreen.com/">Are my third parties green?</a>’ has grown beyond the original idea of a small tool to scan a single web page.</p><p>In recent times I’ve added a <a href="https://aremythirdpartiesgreen.com/directory">Directory section to the site</a>, allowing people to find sustainable third-party providers across a range of categories. While building out the Directory, I realised it would be helpful to have some way for people to search third-party providers as well. This subsequently led me to develop an API to surface different bits of information.</p><h2>Creating the Directory</h2><p>The original aim of ‘Are my third parties green?’ was to provide a way for people to find out if the third-party services being used on their websites are being served from green web hosts. After launching it, though, I realised that it would probably be even more helpful if there was a way for people to select greener third-party services straight off the bat.</p><h3>A lot of data to display</h3><p>I use the <a href="https://github.com/patrickhulce/third-party-web/">Third Party Web dataset</a> to categorise third-party requests. This dataset already contained all the information I would need for the Directory page. To work out the green status of a third-party service, I would query the <a href="https://www.thegreenwebfoundation.org/"><strong>Green Web Foundation's</strong></a> (GWF) API. My original plan was to group third-party services by category, and show these groupings on the page in a single view.</p><p>So, I set about writing a script to massage the data from Third Party Web and GWF into the structure I needed. By scripting this step, I could rerun the process whenever new data was available.</p><p>My initial approach quickly came up against a couple of problems. First up, the JSON file that was being generated to categorise third-party services ended up at over 750kb. That’s a very large file to download and parse when loading the page.</p><p>The other issue that arose was the sheer size of the page itself. Showing all categories in a single view resulted in a very, very, very long page. This also resulted in the HTML document being very large, which meant the browser really had to work hard to render the page. I tried adding filters for each category to work around this, but they didn’t help.</p><p>I had to rethink how data would be presented.</p><h3>Loading only what’s needed</h3><p>Having the filters in place gave me an idea of how to reduce the size of both the page, and the data behind it. Rather than loading all the data at once, I created individual data files for each category. Using the filter as a trigger, the site downloads the required data for a given category only when the user selects that category.</p><p>This meant that the initial render of the page would be very small, and the user wouldn’t have to download any data they weren’t interested in.</p><h2>The need arises for an API</h2><p>Despite the Directory having filters, there was still a lot of content being presented for some categories.
I wanted to give people an easier way to quickly find details about a particular third-party service they were interested in. I needed something like a search functionality.</p><p>In order to provide this for the Directory, I needed a means of filtering through over 2000 third-party entities based on a user’s input. Ideally, this processing would be performed off the user’s device.</p><p>As I thought through some of the ways I could tackle this, I began to realise that building my own API and dogfooding that would be a good approach. Not only would an API be useful for this use case, but it could come in handy for other projects later down the line.</p><h3>Building the API with Cloudflare Workers</h3><p>I’ve never built an API before. I have some basic idea of how they work: you have some endpoint that returns data based on the kind of request it receives. So, to start with I needed a router, and I assumed the rest would just be building out some functions to parse and return results.</p><p>Using Cloudflare for my DNS allowed me to use Cloudflare Workers to serve my API endpoints. Rather than building a Worker for each individual endpoint, I used the <a href="https://github.com/cloudflare/worker-template-router">starter Workers router repository</a> that the Cloudflare team have created. Using this I could keep all my code in one place while still being able to handle requests to different endpoints.</p><h3>Endpoints for now, and later on</h3><p>With the Workers router set up, it was easier for me to build out multiple endpoints that I could use. Starting out I needed an endpoint that allowed me to search for third-party services whose name matched the entered search string. But now I started to wonder if there were some other endpoints that I could build, which might come in handy for other things later on. In the end, I decided to create three endpoints:</p><ul><li>One for searching by third-party service name</li><li>One to search by company</li><li>And, one to search by the URL of a third-party request</li></ul><p>For the Directory page, I would only need the first one. But I can already picture uses for the other two in future projects. To provide data for the three routes above, I also created two new JSON data files. Both files stripped out any data that was surplus to requirements.</p><h3>A free, public API is born</h3><p>Because the API was fetching data from static files, and thanks to Cloudflare’s generous free tier, I decided to make the API free for public use. I hope this allows others to use the information as part of other cool projects they might be working on.</p><p>You can find <a href="https://aremythirdpartiesgreen.com/api-docs">docs for the API</a> on the ‘Are my third parties green?’ website. If you do end up using the API as part of a project, I’d love to know!</p><h2>Search for the Directory</h2><p>Once the API was built, I could start dogfooding it to build search functionality for the Directory. With over 2000 services available to search, I wanted to ensure that people could find what they were looking for without stumbling across “no data found” kind of situations.</p><p>With that in mind, I used the <a href="https://github.com/pstanoev/simple-svelte-autocomplete">Simple Svelte Autocomplete</a> component to check the API for a user’s query and return any matching service names.
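</p><p>Under the hood, that lookup is just a fetch against the search endpoint. The snippet below is only indicative; the endpoint path and response shape are placeholders rather than the documented routes, which live in the API docs linked above:</p><pre class="language-javascript"><code class="language-javascript">// Indicative only: the endpoint path and response shape here are placeholders,
// not the documented API routes.
async function searchServices(query) {
  const response = await fetch(
    `https://aremythirdpartiesgreen.com/api/services?name=${encodeURIComponent(query)}`
  );
  if (!response.ok) return [];

  // Assume the endpoint returns an array of matching service entries
  const results = await response.json();
  return results.map((service) => service.name);
}</code></pre><p>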
This allows you to type ‘Google’, for example, and see all the known Google services you can select.</p><h2>An end in sight?</h2><p>I originally thought ‘Are my third parties green?’ would be something I’d put out there into the world, and maybe touch once or twice a year with small data updates. So far, that’s been far from the case. But with the Directory and API now in place there’s only one last thing I want to add to the site - the ability to set cookies when a page is scanned. Once that’s done, I think that ‘Are my third parties green?’ will be set, and I can shift my focus to another side project.</p></div>Creating user personas for website performance testing2024-02-20T13:25:46Zhttps://fershad.com/writing/user-personas-for-website-performance-testing/<div><p>In a past life, I worked in product marketing. One of the things our management would regularly ask us to do was to research and create user personas. These personas would form the basis around which we would write and craft the marketing materials for our products. In the same way, it helps to have some baseline personas on hand when looking at doing performance testing on a website.</p><p>The global nature of the web allows companies to target international markets that would otherwise be inaccessible to them. That’s where these personas come in really handy. They allow us to have different baselines for users in different parts of the world. This, in turn, allows us to conduct more realistic performance tests.</p><h2>The web isn’t the same for everyone</h2><p>Not everyone’s experience of the internet is the same. Hardware, software, and network conditions differ greatly across the world, and even within countries. Some of your website’s visitors will be on high-speed connections and new laptops. Others experience it on less powerful Android phones with mobile internet. Some of your users may be using the internet on limited or capped data plans.</p><p>So, with all this variance, how can we use personas to set hardware, software and network baselines to use when running website performance tests?</p><h2>Approaches to creating user personas</h2><p>There are a few approaches we can take to get an understanding of the conditions under which users in different parts of the world visit our sites. At the heart of each is a need to understand:</p><ul><li>What device are they using?</li><li>What’s their internet connectivity like?</li><li>What browser are they using?</li></ul><h3>User interviews</h3><p>If you have access to customers, or users, from different parts of the world then you can ask them these questions directly. This can be either in the form of surveys, or direct interviews.</p><h3>Website analytics</h3><p>Your website’s own analytics might allow you to filter data to answer the questions above. Most services allow you to filter visitors by country, and get general information about the browsers and device types (desktop, mobile, tablet) they are using. Ideally, we’d like to get more detailed device information like brand, model, and/or operating system.</p><h3>Conduct your own research</h3><p>There’s enough data out there that should allow you to create an “average user” persona for visitors from almost any country. This does take a bit of extra time and effort, but it is definitely worth it.</p><h4>Find out the average device</h4><p>Looking at mobile operating system market share is one way to get a feel for the types of devices that are more common in a given country or region.
StatCounter Global Stats’ <a href="https://gs.statcounter.com/os-market-share/mobile/worldwide">Worldwide Mobile Operating System Market Share</a> is a great way to find this information. Data is available globally, by region, and by country.</p><h4>Understand network conditions</h4><p>There’s a bit more data available when we’re looking to get a feel for network speeds in another part of the world. <a href="http://cable.co.uk/">Cable.co.uk</a>’s <a href="https://www.cable.co.uk/broadband/speed/worldwide-speed-league">Worldwide Broadband Speed League</a> presents median broadband internet speeds at both the country and regional level.</p><p>If you’re focused on mobile experience, <a href="https://www.gsma.com/mobileeconomy/#trends">GSMA’s ‘The Mobile Economy’ reports</a> are available for different regions, and present a breakdown of mobile technology adoption per country. These reports also include projections for technology adoption for the next few years, making them handy for future planning.</p><h4>What browser are people using?</h4><p>Once again, StatCounter Global Stats has our back here. Their <a href="https://gs.statcounter.com/browser-market-share">Browser Market Share Worldwide</a> dataset contains browser market share data by country, and even platform.</p><h2>Using personas for a performance test</h2><p>Now that we’ve created a few personas for different users around the globe, we can start using them in website performance tests. Probably the best tool for this type of testing is <a href="http://webpagetest.org/">WebPageTest</a>. Using WebPageTest’s Advanced Configurations panel we can set:</p><ul><li><strong>Test location</strong>: Pick one in or close to the country you want to target</li><li><strong>Browser</strong>: Choose the desktop browser or mobile device you want to test on</li><li><strong>Connection</strong>: Set the network conditions for your test</li></ul><p>Remember, the profile we’ve created is a baseline. You can adjust the preferences to test for slightly better or worse conditions.</p></div>New Web Vitals Responsive Metric Appears in the Wild2024-02-20T13:25:46Zhttps://fershad.com/writing/new-web-vitals-responsive-metric-appears-in-the-wild/<div><p>In <a href="https://web.dev/better-responsiveness-metric/">a blog post last June</a>, Google tabled some of the ways it was looking to improve capturing and reporting on website responsiveness. <a href="https://web.dev/responsiveness/">Last November</a> they put forward details of a new responsiveness metric that would go beyond what is currently measured by First Input Delay (FID).</p><p>Now, in March 2022, experimental reporting of this new responsiveness metric has become available in the Chrome User Experience (CrUX) dataset. Data is available starting from February, 2022.</p><h2>What is the new responsiveness metric?</h2><p>To summarise it down to a few short points, the new responsiveness metric aims to:</p><ul><li>Capture the full duration of interaction events.</li><li>Group events into interaction types. For example, a keypress would consist of a <code class="language-markup">keydown</code> and <code class="language-markup">keyup</code> event.</li><li>Aggregate all events for a page visit.</li></ul><p>The blog posts linked to in the previous paragraph go into much more depth on this.</p><h2>How does it differ from FID?</h2><p>FID measures the time the browser takes to begin responding to a user interaction event. It’s very much an “under the hood” metric, as it deals with main-thread responsiveness. 
As the Google team acknowledge:</p><blockquote>FID does not include the time spent running those event handlers, nor any work done by the browser afterwards to update the screen. <a href="https://web.dev/better-responsiveness-metric/#what-improvements-are-we-considering"><strong>Towards a better responsiveness metric</strong></a> </blockquote><p>Good FID does not guarantee good responsiveness scores, as is highlighted by the data for <a href="http://edition.cnn.com/">edition.cnn.com</a> below.</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/1effae10a96410d58ed183b2c7d37324465e07b3-1079x887.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/1effae10a96410d58ed183b2c7d37324465e07b3-1079x887.png?auto=format" alt="Treo web vitals report for the CNN website, showing the new responsiveness metric." loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">Treo web vitals report for the CNN website, showing the new responsiveness metric.</figcaption></figure><h2>How can I start checking my site’s responsiveness?</h2><p>If you want to query the <a href="https://developers.google.com/web/tools/chrome-user-experience-report/bigquery/getting-started">CrUX BigQuery</a> dataset, you can surface responsiveness data with <code class="language-markup">experimental.responsiveness</code> (<a href="https://groups.google.com/a/chromium.org/g/chrome-ux-report-announce/c/F7S4_emZkcw?pli=1">release notes</a>). The team at Treo have also updated their free <a href="https://treo.sh/sitespeed/">Site Speed report</a> to show the new metric.</p></div>A conversation with Gaël Duez on Green I/O2024-02-20T13:25:46Zhttps://fershad.com/writing/a-conversation-with-gael-duez-on-green-io/<div><p>A couple of weeks ago, on a rainy afternoon here in Taipei, I sat down for a virtual chat with Gaël Duez. We spoke about web sustainability, how companies can start approaching the topic, and a bit about some of the other projects I’m working on too.</p><p>That chat has become the first episode of Gaël’s new podcast, Green I/O - the podcast for doers building a greener digital world, one byte at a time! It’s my first time as a podcast guest, and the cat also managed to make a guest meow appearance at around the 18-minute mark 😅.</p><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">Links</p><p></p><div><p><strong>Listen</strong> </p><ul><li><a href="https://anchor.fm/greenio/episodes/Fershad-Irani---Using-website-performance-to-green-the-web-e1f6179">Listen to Green I/O - Episode 1 - Fershad Irani - Using website performance to green the web</a> </li></ul><p><strong>Extras</strong></p><ul><li><a href="https://gaelduez.com/blog/2-greenio-1-Fershad-Irani-web-performance-sustainability">Episode show notes & links</a> </li><li><a href="https://gaelduez.com/">Gaël’s website</a> </li></ul></div><p></p></div></div>Checking the sustainability of third-party requests with “Are my third parties green?”2024-02-20T13:25:46Zhttps://fershad.com/writing/building-are-my-third-parties-green/<div><p>Over 94% of sites use at least one third-party resource, accounting for over 45% of website requests. This finding <a href="https://almanac.httparchive.org/en/2021/third-parties#prevalence">from the 2021 Web Almanac</a> absolutely blew my mind. It also got me thinking. “How many of these third-party requests are served from green web hosts?”, I wondered. It wasn’t something I’d seen talked about much or surfaced in other website sustainability tools. So, I decided to build something that would allow me to answer the question - “<a href="https://aremythirdpartiesgreen.com/">Are my third parties green?</a>”</p><h2>A few early decisions</h2><h3>Building a website</h3><p>From the outset I wanted to build something that was accessible and usable beyond just developers. A lot of third-party services are added to websites for marketing, sales, and tracking purposes. So, building a tool that folks from non-dev teams could use was a large reason why I chose to build a website, rather than a command line tool or node script.</p><h3>Using Google Lighthouse</h3><p>Although the main aim of the project was to check third-party requests for green hosting, I did also want the ability to surface other details later down the line.
I knew Google’s PageSpeed Insights (PSI) API, which uses Google Lighthouse under the hood, was one way to get this kind of information when given a web page URL. For the bulk of the time developing the site I was working with PSI to scan sites.</p><p>However, as I got further into building the site, I realised that it would be very handy to have my own instance of Lighthouse. Doing so would allow me to manipulate data before it’s returned, as well as allow me to explore other scenarios like how pages load third-parties for different geographies (think EU, GDPR etc). Now, I needed to work out where to host it.</p><h3>Google Cloud Functions</h3><p>In the end I settled on Google Cloud Functions (GCF) for hosting my Lighthouse instance. I went with GCF for a few reasons.</p><ul><li>I wanted to host my code sustainably, and know that Google Cloud Platform has published information about the <a href="https://cloud.google.com/sustainability/region-carbon">sustainability of their locations globally</a>.</li><li>I considered services like <a href="https://render.com/">Render</a>, but the limited choice of regions wouldn’t allow me to easily expand to more locations.</li><li>I found a <a href="https://github.com/matthoffner/lighthouse-cloudfunction">code repository</a> that I could use to get up and running quickly.</li></ul><p>I wanted to avoid making too many function requests, however. I’d never used something like GCF before, but have heard plenty of stories about cloud functions running up huge usage bills in the past. In a bid to prevent this, I wanted to cache test results for a reasonable period.</p><h3>Cloudflare Workers KV</h3><p>For caching results, I went with Cloudflare Workers KV (key-value) storage, a simple storage solution that would allow me to easily save results. Workers KV also allows for keys to be set to automatically expire after a set duration. With this I was able to cache results for one week, to limit the number of new test runs that were made to the GCF.</p><p>KV also provides a low-complexity way to persist results. This made it possible to enable results to be shared. Being able to persist results also opens up the possibility to analyse results later. To be able to do more with historical results would require a slightly more refined storage solution. But for simply persisting results with a unique ID key, KV works perfectly.</p><h3>Made with Svelte</h3><p>I’ve been wanting to use <a href="https://kit.svelte.dev/">SvelteKit</a> on a project for a very long time. “Are my third parties green?” provided the perfect case for me to do so.</p><h3>Determining green hosting</h3><p><a href="https://thegreenwebfoundation.org/">The Green Web Foundation</a> maintains a dataset of known, verified green hosting providers. They provide an API which also allows for URLs to be checked for green hosting. I leverage this to check the third-party domains found in each test run.</p><h2>More than green hosting</h2><p>When setting out to build this project, the aim was to uncover the hosting of third-party services. However, when looking at the results returned from Lighthouse I realised that there would be a chance to show a bit more useful information.</p><h3>Carbon impact</h3><p>Because Lighthouse returns the downloaded size of requests, I decided to also show a carbon estimate for each third-party resource.
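</p><p>In rough terms, the idea is to turn each request’s transfer size into grams of CO2e, and to check its domain against the Green Web Foundation dataset. The sketch below leans on CO2.js’s <code class="language-markup">perByte</code> and <code class="language-markup">hosting</code> helpers for brevity, whereas the site itself follows the Sustainable Web Design calculations described next, so treat it as an approximation rather than the production code:</p><pre class="language-javascript"><code class="language-javascript">// A rough sketch, not the production implementation. Assumes the
// perByte() and hosting.check() helpers exported by @tgwf/co2.
import { co2, hosting } from "@tgwf/co2";

const estimator = new co2({ model: "swd" });

async function summariseThirdParties(requests) {
  // requests: [{ url, transferSize }] taken from a Lighthouse run
  return Promise.all(
    requests.map(async ({ url, transferSize }) => {
      const domain = new URL(url).hostname;
      return {
        domain,
        // perByte() makes no assumptions about returning visitors or caching
        gramsCO2e: estimator.perByte(transferSize),
        greenHost: await hosting.check(domain),
      };
    })
  );
}</code></pre><p>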
To achieve this, I used the <a href="https://sustainablewebdesign.org/calculating-digital-emissions/">calculations presented by Sustainable Web Design</a> - <a href="https://www.wholegraindigital.com/">Wholegrain Digital</a>, <a href="https://www.mightybytes.com/">Mightybytes</a>, <a href="https://www.medina-works.com/">Medina Works</a>, and <a href="https://ecoping.earth/">Ecoping</a>.</p><p>Since I was looking at the first load of the page, I refined the calculation to remove the assumptions made about returning visitors. With more time I may look at building in a sliding scale that can reflect the effect caching has on data transfer.</p><h3>Effective Caching</h3><p>Lighthouse also tries to determine if requests are effectively cached. When scanning a page it calculates a <code class="language-markup">cacheHitProbability</code> figure for each request. It presents this alongside other information like the cache duration.</p><p>For the purpose of “Are my third parties green?” I decided that anything with a <code class="language-markup">cacheHitProbability</code> less than 50% (0.5) would be considered to be ineffectively cached. This is an arbitrary determination on my part. Later on, I hope to use the cache duration details to present a more accurate reflection of third-party data transfer over time.</p><h3>Categorising third-party requests</h3><p>During my research for this project, I came across the <a href="https://github.com/patrickhulce/third-party-web/">third-party-web repository</a>. It contains over 2000 third-party entities, attempts to categorise them, and includes other information like sample domains and performance impact. I’ve leaned heavily on third-party-web for this project, and hope to contribute to it later this year.</p><h2>One last thing ... where to host it</h2><p>Originally, I wanted to host the site on Cloudflare Pages. Cloudflare is a green web host, according to The Green Web Foundation, and I’ve been using it for some time now on my own website. However, after initially setting it up there, I just couldn’t get the site to work. After some digging, I learnt that server-side rendering (SSR) was not yet supported by Pages. Bummer.</p><p>Instead, I moved to hosting the site on Netlify while still fronting it with Cloudflare’s CDN. Matt Hobbs has a very useful post if you’re <a href="https://nooshu.com/blog/2021/09/06/migrating-from-github-pages-to-cloudflare-and-netlify/">looking to set up the same</a>. With this in place, I was able to get the site up and running. It also provided the added benefit of being able to use Cloudflare’s Page Rules to cache the test result pages on the edge.</p><h2>Launching to the world</h2><p>This is the first time I’ve ever built something like this, so I was a bit apprehensive about sharing it publicly. Were there any bugs I’d not picked up? Did I miss something in the design? How will it go if there’s a few people hitting the API at once?</p><p>The reception the site got from the community at large was terrific. It meant a lot that folks in the ClimateAction.Tech community were positive about it (and helped give a few feature requests!). It was also very encouraging to see the response from web performance folks on Twitter. I hope that it got a few more people thinking about the overall impact of the internet.</p><h2>A few more ideas</h2><p>I had achieved the original aim of the project, to build a way to determine the sustainability of third-party resources on a web page. 
But, after launching I had a few more ideas to add extra value to the tool. There were also some requests for features that came up from folks in the ClimateAction.Tech community and on Twitter.</p><h3>European testing</h3><p>Originally tests were run out of servers in the US. However, with GDPR in place, some websites apply different third-party services based on the region of their visitors. A few people asked for the ability to test from a European location.</p><p>I’d thought about this as well, and it was part of the reason I went with using my own instance of Lighthouse on GCF. A few weeks after launching the site, I spun up a new instance hosted in Belgium and included the ability to change regions via an “Advanced settings” menu.</p><h3>Green third-party directory</h3><p>While building out the test results page I was trying to figure out if there was any way to suggest greener alternatives for services that were not on green web hosts. While I couldn’t think of anything at the time, shortly after launching the project I realised that the information from the third-party-web repository could be used for this very purpose.</p><p>Rather than being limited to “suggestions” on the results page, I decided to create a “<a href="https://aremythirdpartiesgreen.com/directory">Green third-party directory</a>”. In this way, people could find sustainable third-party services before implementing them on their website.</p><p>The building of this page was a challenge in itself, with over 2000 services. It’s probably worthy of a whole post on its own.</p><h3>More for the future</h3><p>There are still a few more things I want to do with the website before I feel it’s done:</p><ul><li>Add the ability to set cookies before running tests. This was requested by a few people, since a lot of sites load third-party resources based on user interaction (normally accepting/rejecting a cookie consent form).</li><li>Linking to the directory from results pages. This might be in place by the time you read this post.</li><li>Showing past test results for a page. This would require some more work around how results are persisted, perhaps needing to save them to a database so they can be easily accessed and filtered.</li><li>Showing file size and emissions data in the directory. This would be taken from the test results, so there might not be data for a lot of services to begin with. It would also probably need data to be stored in something other than KV to make it easier to retrieve and parse.</li><li>A search filter for the directory. Some categories have a lot of services, so being able to search for a specific service would make things easier for users.</li></ul></div>Render-blocking resources2024-02-20T13:25:46Zhttps://fershad.com/writing/render-blocking-resources/<div><p>When looking to optimise for paint metrics (First and Largest Contentful Paint for example), we’re almost certainly going to encounter render-blocking resources. In this post I’ll cover what they are, and some general tips on how to mitigate their impact on page performance.</p><h2>What is a render-blocking resource?</h2><p>The name says it all, to be honest. Render-blocking resources are like a roadblock for the browser as it goes about painting web page content to the screen. 
When encountering render-blocking resources, the browser waits until the code has been completely parsed and executed before it can continue rendering content in the viewport.</p><h3>Common render-blocking resources</h3><p>The most common forms of render-blocking resources are JavaScript and CSS requests that are present in the <code class="language-markup">HEAD</code> of a page. This includes inline code (found in <code class="language-markup"><script></code> and <code class="language-markup"><style></code> tags) as well as linked resources (using the <code class="language-markup"><link></code> tag). Third-party requests and tag managers can also be a source of render-blocking requests.</p><h3>CSS is <em>always</em> render-blocking</h3><p>Regardless of what it’s doing at the time, the moment a browser encounters a CSS resource it will stop to download and parse the CSS. Once that’s out of the way the browser will continue working on rendering content on the screen.</p><p>CSS <a href="https://www.filamentgroup.com/lab/load-css-simpler/">can be loaded asynchronously</a> to avoid render-blocking, though if not carefully managed this can result in unstyled content being shown before the stylesheet has been parsed.</p><h3>JavaScript is <em>occasionally</em> render-blocking</h3><p>When the browser finds a synchronous script on the page it will pause and fetch the file (if it’s an external file referenced via a <code class="language-markup"><script src></code> tag), then parse and finally execute the code.</p><p>Asynchronous JavaScript won’t be render-blocking; however, the browser will execute asynchronous scripts as soon as they finish downloading. So, there is a chance that asynchronous script execution gets in the way of other processes that are important to your page load.</p><h3>CSS will block synchronous JavaScript</h3><p>Because JavaScript can be used to manipulate CSS, the browser will first download and parse any CSS it has already encountered before executing a synchronous script it comes across afterwards.</p><pre class="language-html"><code class="language-html"><!-- CSS will download and parse first -->
<link rel="stylesheet" href="/css/my-styles.css" />
<script src="/js/important-file.js"></script>
<!-- JS will download, parse and execute first -->
<script src="/js/important-file.js"></script>
<link rel="stylesheet" href="/css/my-styles.css" />
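<!-- Added illustration: a deferred script is also fetched without blocking, but it only executes once the document has been parsed, so it won't block rendering -->
<script defer src="/js/important-file.js"></script>
<link rel="stylesheet" href="/css/my-styles.css" />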
<!-- JS will start downloading in parallel with the CSS file -->
<script async src="/js/important-file.js"></script>
<link rel="stylesheet" href="/css/my-styles.css" /></code></pre><h2>How can you find render-blocking resources?</h2><h3>PageSpeed Insights</h3><p>If you’ve <a href="https://fershad.com/writing/testing-a-web-page-with-pagespeed-insights/">tested a page with Google’s PageSpeed Insights tool</a>, then you’ll be able to identify any render-blocking resources in the <strong>Opportunities</strong> section of the test results.</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/1088f5900374c44e97a59d7fc48a2b452ee171ae-1080x567.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/1088f5900374c44e97a59d7fc48a2b452ee171ae-1080x567.png?auto=format" alt="PageSpeed Insights highlighting render-blocking resources" loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">PageSpeed Insights highlighting render-blocking resources</figcaption></figure><h3>WebPageTest</h3><p>The WebPageTest waterfall chart surfaces render-blocking resources, and marks the request row with an orange circle containing a white cross. You can see that in rows 2 and 4 of the truncated waterfall chart below.</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/6b4e520760ec82859cea3f260372b46e3409df29-826x433.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/6b4e520760ec82859cea3f260372b46e3409df29-826x433.png?auto=format" alt="WebPageTest waterfall chart showing render-blocking requests." loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">WebPageTest waterfall chart showing render-blocking requests.</figcaption></figure><h2>Reducing the impact of render-blocking resources</h2><h3>Reduce the size of CSS</h3><p>Since CSS will always be render-blocking, the best way to reduce its impact on performance is to reduce the amount of CSS you’re using.</p><ul><li>Are you able to inline critical CSS on a page, and load the rest later?</li><li>Avoid using <code class="language-markup">@import</code> statements in your CSS.</li><li>Can you self-host Google Fonts? This eliminates the outbound request for the font stylesheet.</li><li>If you’re using an icon font, try replacing that with SVGs.</li></ul><h3>Reduce the size of JS</h3><p>Reducing the impact of JavaScript is a bit trickier. As with CSS, reducing the amount you’re using is a very good start.</p><ul><li>Can you use <a href="https://developer.mozilla.org/en-US/docs/Glossary/Code_splitting">code-splitting</a> and <a href="https://developer.mozilla.org/en-US/docs/Glossary/Tree_shaking">tree-shaking</a> in your build process?</li><li>Can you use the <a href="https://www.patterns.dev/posts/import-on-interaction/">import on interaction pattern</a>?</li><li>Defer any code you don’t need right away, and load the rest asynchronously.</li></ul><h3>Execute code off the main thread</h3><p>Since JavaScript is single-threaded, execution of scripts gets queued up one after another. To get around this, we can look to move non-essential tasks off the main thread, freeing it up to handle code that is critical to rendering initial page content. There are a few ways to move code execution off the main thread. Some use other processors on the device, while others don’t even send down the code in the first place.</p><h4>Move A/B testing to the server</h4><p>Client-side A/B testing is a recipe for poor rendering performance. This is because the script to run the A/B test must first download, before it is then parsed and executed. As a result, content is hidden from the user until code execution is completed.</p><p>Moving A/B testing to the server prevents this entirely. If you are server-side rendering your site then you can perform all the heavy lifting before sending the page back to the browser. Another approach is to use edge compute like <a href="https://philipwalton.com/articles/performant-a-b-testing-with-cloudflare-workers/">Cloudflare Workers to perform the A/B tests</a>.</p><h4>Use a web worker</h4><p>Third-party code like analytics, ads, and other tracking scripts probably aren’t critical to rendering content on your page. Sure, they might be important for your business, but you can defer their loading until after the rest of the page’s content has been loaded.</p><p>Alternately, you can move the execution of these scripts off the main thread using web workers. This is an area I’ve got to explore more, so there could be a post about it later in the year. If you want to learn more, Surma has a <a href="https://web.dev/off-main-thread/">great post & talk</a> about the topic. 
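As a rough sketch of the general pattern (the file names and message shapes here are purely illustrative, and real third-party tags usually need tooling to proxy them into a worker):</p><pre class="language-js"><code class="language-js">// main.js - create a worker so the heavy lifting happens off the main thread
const worker = new Worker('/js/offload-worker.js');

// hand it whatever data the task needs
worker.postMessage({ event: 'pageview', url: window.location.href });

// listen for anything it reports back
worker.addEventListener('message', (event) => {
  console.log('Worker finished:', event.data);
});

// offload-worker.js - runs in its own thread, with no access to the DOM
self.addEventListener('message', (event) => {
  // do the expensive work here, then report back to the page
  self.postMessage({ done: true, received: event.data });
});</code></pre><p>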
There are also tools like <a href="https://github.com/BuilderIO/partytown">Partytown</a> & <a href="https://developers.cloudflare.com/zaraz/">Cloudflare’s Zaraz</a> which are worth checking out.</p></div>Start performance tests from your browser's address bar2024-02-20T13:25:46Zhttps://fershad.com/writing/start-performance-tests-your-browser-s-address-bar/<div><h1>Start performance tests from the address bar</h1><p>Sometimes you land on a web page and just feel compelled to fire up a quick performance test. Well, okay maybe that’s just me. Anyway, in recent months I’ve been using the custom search engine feature in my browser to make this process just a bit quicker.</p><p>In the past, I would open up another browser tab, go to the URL for the testing tool I want to use, and then start the process of running a test on the page I want to examine. The first step hasn’t changed. But now I can simply type a keyword into my address bar, hit the <code class="language-markup">TAB</code> key and paste in the URL of the page I want to test. It’s a small thing that just helps to make the process of kicking off multiple performance tests on different tools just that little bit quicker.</p><p>Here’s how to set it up for yourself, and a few of the search queries that I regularly use.</p><h2>Setting up</h2><p>The steps I’ll be going through below are for Microsoft Edge (Chromium), but you can do the same thing in <a href="https://support.google.com/chrome/answer/95426">Google Chrome</a> & <a href="https://support.mozilla.org/en-US/kb/add-or-remove-search-engine-firefox">Firefox</a> browsers.</p><ol><li>In Settings, navigate to the “<em>Manage search engines</em>” section.<ol><li>You can either search for it, or find it in <em>Privacy, search and services > Address bar and search > Manage search engines</em></li></ol></li><li>Once there, you can click the <code class="language-markup">Add</code> button to set up your own search engine. You’ll see the following fields: <ol><li><strong>Search engine</strong>: Any name you want to use to keep it memorable.</li><li><strong>Keyword:</strong> What you’ll type in the address bar to trigger this search engine.</li><li><strong>URL:</strong> The URL that will be requested when a search is made. The <code class="language-markup">%s</code> placeholder is used in place of your search query.</li></ol></li></ol><h2>Web performance testing search engines</h2><p>Here are some of the web performance search engines that I have set up. You can set the search engine & keyword fields to whatever you want.</p><p>If you use a search engine that isn’t listed below, feel free to <a href="mailto:itsfish@fershad.com">get in touch with me</a>. I’d be happy to check it out and add it to the collection.</p><h3>Treo Site Speed Report</h3><p>Get CrUX data for a domain using Treo’s free site speed report.</p><ol><li>Search engine: <code class="language-markup">Treo</code></li><li>Keyword: <code class="language-markup">@treo</code></li><li>URL: <code class="language-markup">https://treo.sh/sitespeed/%s</code> </li></ol><p>To use this search engine, type <code class="language-markup">@treo</code> into the address bar and press the TAB key. Enter the <strong>domain</strong> (e.g. 
www.fershad.com) you want to query, and press enter.</p><h3>PageSpeed Insights</h3><p>Start a test on a URL with Google PageSpeed Insights.</p><ol><li>Search engine: <code class="language-markup">PageSpeed Insights</code></li><li>Keyword: <code class="language-markup">@psi</code></li><li>URL: <code class="language-markup">https://developers.google.com/speed/pagespeed/insights/?url=%s</code> </li></ol><p>To use this search engine, type <code class="language-markup">@psi</code> into the address bar and press the TAB key. Enter the <strong>URL</strong> (e.g. https://www.fershad.com/writing) you want to query, and press enter.</p><h3>Calibre Core Web Vitals Checker</h3><p>Calibre’s Core Web Vitals Checker allows you to surface CrUX data for a domain.</p><ol><li>Search engine: <code class="language-markup">Calibre (Domain)</code></li><li>Keyword: <code class="language-markup">@calibre</code></li><li>URL: <code class="language-markup">https://calibreapp.com/tools/core-web-vitals-checker/%s?context=origin</code> </li></ol><p>To use this search engine, type <code class="language-markup">@calibre</code> into the address bar and press the TAB key. Enter the <strong>domain</strong> (e.g. www.fershad.com) you want to query, and press enter.</p><h3>The Green Web Foundation</h3><p>Check if your site is hosted on a green web host using the Green Web Foundation’s online checker.</p><ol><li>Search engine: <code class="language-markup">Green Web</code></li><li>Keyword: <code class="language-markup">@greenweb</code></li><li>URL: <code class="language-markup">https://www.thegreenwebfoundation.org/green-web-check?url=%s</code> </li></ol><p>To use this search engine, type <code class="language-markup">@greenweb</code> into the address bar and press the TAB key. Enter the <strong>URL</strong> (e.g. https://www.fershad.com/writing) you want to query, and press enter.</p><h3>WebPageTest</h3><p>Start a performance test on a URL using WebPageTest.</p><ol><li>Search engine: <code class="language-markup">Web Page Test</code></li><li>Keyword: <code class="language-markup">@wpt</code></li><li>URL: <code class="language-markup">https://webpagetest.org/?url=%s</code> </li></ol><p>To use this search engine, type <code class="language-markup">@wpt</code> into the address bar and press the TAB key. Enter the <strong>URL</strong> (e.g. https://www.fershad.com/writing) you want to query, and press enter.</p><p>The URL above will set up a simple WebPageTest configuration. If you want to start with the advanced configuration open, use this URL: <code class="language-markup">https://webpagetest.org/?url=%s&advanced</code></p><p></p></div>Testing a web page with PageSpeed Insights2024-02-20T13:25:46Zhttps://fershad.com/writing/testing-a-web-page-with-pagespeed-insights/<div><p><a href="https://pagespeed.web.dev/">PageSpeed Insights (PSI)</a> is a free tool from Google that allows anyone to gain performance insights for a web page across both desktop and mobile devices. In recent years real-user data from Google’s Chrome User Experience Report (CrUX) has also been added to test results when available.</p><p>The PSI test report also presents suggestions on how a page can be improved. Recent updates have also introduced framework-specific suggestions. 
For example, if you test a page of a WordPress website you could be presented with some suggestions related to specific WordPress plugins or settings.</p><h2>Getting started</h2><p>Head over to <a href="https://pagespeed.web.dev/">https://pagespeed.web.dev/</a> and enter in a web page URL. Hit ‘Analyze’ and the tool will start running the page through emulated desktop and mobile Google Lighthouse tests.</p><h2>The results</h2><p>After running a test, you’ll be taken to the results page. All results are separated into two tabs - Mobile and Desktop. By default, you’ll be shown the mobile results. This is because mobile results matter to Google, especially since we’re in the time of ‘mobile-first indexing’.</p><h3>Redirects</h3><p>If the web page you entered gets redirected to another destination, PSI will show a small notice at the top of the test results. Since redirects can have a negative impact on performance, you’re given the chance to ‘Reanalyze’ the results.</p><h3>Real-user data</h3><p>The first section of results shown is for real-user experiences on the page. The data presented here is from the CrUX report, for the past 28-days at the 75th percentile. Just a reminder that:</p><ul><li>The data is from sessions of Google Chrome users that have <em>opted in</em> to sharing this data with Google.</li><li>The data excludes those using Chrome on iOS.</li><li>If a page doesn't get a lot of traffic (enough for meaningful, anonymised data to be provided) then you won't see any results.</li></ul><p>You’ll notice that there are two tabs within this section - This URL and Origin. If you test a page that has data for both, then you’ll be able to get a sense of how the page you’re testing compares with the rest of the website.</p><p>At the top of these results is a pass/fail rating, based on CrUX performance for Core Web Vitals metrics. To pass the Core Web Vitals Assessment, a page (or origin) must rank as “good” for all three Core Web Vital metrics.</p><p>Below this, you’ll be presented with details for how the page performs at the 75th percentile for each Core Web Vital metric. If you want to drill down further, there’s an “Expand view” toggle at the top right of this section. Clicking that will expand each Core Web Vital, and show a breakdown of page loads that fall into the “Good”, “Needs improvement”, and “Poor” buckets.</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/586e55eecf2952bf238444324c1c5ddcd02e0a9c-1080x567.jpg?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/586e55eecf2952bf238444324c1c5ddcd02e0a9c-1080x567.jpg?auto=format" alt="Screenshot showing Core Web Vitals Assessment: Passed. FCP: 2.1s, LCP: 2.5s, FID: 12ms, CLS: 0.00" loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">An example of a page that ranks “Good” for all three Core Web Vitals metrics.</figcaption></figure><h4>No data found</h4><p>In the case that there’s not enough CrUX data available for the page you’re testing, you might still be able to get results in the Origin tab. While this is more generalised data, covering multiple pages across a domain, it can still be somewhat insightful.</p><p>If there’s no data in the Origin tab either, well, move along down to the Lab results.</p><h3>Lab results</h3><p>The next section (titled “Diagnose performance issues”) shows the results of Lighthouse performance audits run on the web page in simulated desktop and mobile environments.</p><p>The results here start with a headline performance score. You’ll be familiar with this if you’ve ever run a Lighthouse Audit on a web page. This score is a weighted calculation based on the key performance metrics measured in the simulated test.</p><p>Under this headline score you’ll see the results for each of the key performance metrics. Again, there’s an “Expand view” toggle to the right of this section. This time clicking it will reveal some more information about each metric.</p><h4>Test conditions</h4><p>Below this first set of results, you’ll see the conditions under which the test was run. Some details are underlined. Clicking on these will give some more information about the test environment.</p><p>It’s worth noting that PSI runs tests across four global datacenters based out of North America, Europe, and Asia. Where your test is run can impact performance results, especially if the page being tested isn’t served from a global CDN, so it’s worth paying attention to this. Whenever I run tests from here in Taiwan, I find that I’m almost always getting results run from the Asia datacenter.</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/180900fcd9785228baf138e62464725877fcc2e7-1080x567.jpg?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/180900fcd9785228baf138e62464725877fcc2e7-1080x567.jpg?auto=format" alt="A snapshot of CNN.com test results." loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">A snapshot of the metrics for CNN.com, showing details about the network conditions set for the simulated mobile page test.</figcaption></figure><h3>Treemap</h3><p>Under the test environment details, you’ll see a little button titled “View Treemap”. Clicking this will open a new tab, filled with squares of all different colours and sizes. Treemaps allow you to identify all the JavaScript that is downloaded and used/unused by a page. Understanding and exploring treemaps is a whole topic on its own. If you want to get started, check out <a href="https://sia.codes/posts/lighthouse-treemap/">Explore JavaScript Dependencies With Lighthouse Treemap</a> by Sia Karamalegos.</p><h3>Opportunities & diagnostics</h3><p>Now onto the really handy part of the PSI test results. As part of the tests that PSI runs on a web page, it surfaces some recommendations to help make the page load faster. It’s worth noting the distinction between the opportunities and diagnostics sections.</p><ul><li><strong>Opportunities</strong> - are suggestions to help the page load faster, and hopefully improve key performance metrics.</li><li><strong>Diagnostics</strong> - are additional information, gathered during the test run, that shows how the page stacks up to industry best practice.</li></ul><p>For both, you may see framework-specific suggestions to improve a page’s performance. You’ll see these if Lighthouse has been able to identify the framework used to build a page.</p><h4>Opportunities</h4><p>Each opportunity is presented alongside an estimate of how much faster a page might load if the optimisations are applied. Clicking on an opportunity will expand it, giving more details of the elements or code that is causing the test to fail. You’ll also get details on how to fix it, and links to Google articles so you can get started.</p><h4>Diagnostics</h4><p>Each diagnostic item can also be expanded. Doing so gives details on the elements that are causing the test to fail, as well as details on why/how to fix these.</p><h4>Filtering results</h4><p>A new addition to PSI is the ability to filter opportunities and diagnostics for individual Core Web Vitals. Doing so allows you to see the results that impact a particular metric, and be more detailed and methodical about applying fixes based on PSI results.</p><p>You’ll find the filter at the top right side of the Opportunities section, underneath the filmstrip that shows how your site loads. Clicking through the different metrics will surface only the results that are likely to have an impact on the chosen metric.</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/e162d6c8a447a2c7edea69c80ec18818a420edf2-1080x567.jpg?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/e162d6c8a447a2c7edea69c80ec18818a420edf2-1080x567.jpg?auto=format" alt="Opportunities for CNN.com filtered for LCP" loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">Filtering opportunities for the CNN.com website by those that should improve LCP.</figcaption></figure><h3>Passing Audits</h3><p>The final section of the test results is collapsed by default. Expanding it will show you all the audits which the page has passed. It’s always nice to see the number of passing audits rise as you work through addressing the opportunities and diagnostics for a given page.</p></div>Website performance and the planet2024-02-20T13:25:46Zhttps://fershad.com/writing/website-performance-and-the-planet/<div><p>Solid website performance can have a lot of positive effects. For businesses, it can have direct <a href="https://wpostats.com/">impacts on revenue and engagement</a>. For end users, it can make time on the web <a href="https://simonhearne.com/2021/web-stress/">less stressful</a>. But how about for the planet? Yep, that's right, making the websites we build and manage more performant can also be good for planet Earth.</p><h2>The web's got an emissions problem</h2><p>The environmental impact of the digital world is often one of those <em>'out of sight, out of mind'</em> kind of things. Whether it's something tangible like a smartphone or virtual like a website, the environmental and carbon impacts of our digital lives are often invisible to us as end users.</p><p>It's estimated that global information and communication technology (ICT) accounts for <a href="https://theshiftproject.org/wp-content/uploads/2019/03/Lean-ICT-Report_The-Shift-Project_2019.pdf">around 4% of global CO2 emissions</a>. Just over half of those emissions come from the usage of ICT products and services (that's through data centres, networks, and terminals/devices).</p><p>For a bit of perspective, in a year the web as a whole uses more electricity than the UK. The internet is annually responsible for emissions equivalent to Germany (the world's 7th largest polluter). That's more polluting than the civil aviation sector.</p><p>All this is before we even get into the water required to power, cool, and produce the hardware that powers much of the web.</p><h3>What contributes to the web's carbon footprint?</h3><p>There are four components that form the emissions profile for the web.</p><ul><li><strong>Data centers</strong> - The energy required to power the servers and facilities that host sites, APIs, and databases.</li><li><strong>Networks</strong> - The power required to push data around the planet.</li><li><strong>Consumer devices</strong> - The energy required to power consumer devices (including Wi-Fi modems).</li><li><strong>Production</strong> - The embodied emissions from the manufacturing of the hardware involved in the three areas above.</li></ul><p>When it comes to web performance, our impact can be most felt in items 1, 2 and 3 in the list above. There's a more detailed explanation of how website emissions are currently calculated in the <a href="https://sustainablewebdesign.org/calculating-digital-emissions/"><em>Calculating Digital Emissions</em></a> blog post on Sustainable Web Design.</p><h2>Where web performance can help</h2><p>So, where should you start looking if you want to make your little (or not so little) corner of the web greener? What can you, as a performance-minded developer, do to reduce the carbon emissions of the sites you work on?</p><p>The great news is that a lot of the steps we take to improve website performance also help reduce a site's carbon impact. Better Core Web Vitals plus a low carbon website? 
Win, win!</p><h3><strong>Start by measuring</strong></h3><p>If you want to improve an existing website, start off by getting a sense of its current impact. Using tools like <a href="https://www.websitecarbon.com/">Website Carbon Calculator</a>, <a href="https://digitalbeacon.co/">Beacon</a>, or <a href="https://ecoping.earth/">Ecoping</a> you can get an estimate of carbon emissions for a given web page. By using these figures alongside your site's analytics you can start to estimate your site's total carbon emissions.</p><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">Note</p><p></p><p>You should consider whatever figures you get from these tools as the <strong><em>minimum estimated</em></strong> carbon emissions for a web page. They are all based on data transferred for the initial page load, and so do not take into account lazy-loaded images, JavaScript, or other data that might be fetched through user interaction. They also make some assumptions about repeat visits and caching.</p><p></p></div><h3>Optimise all the things!</h3><p>On the frontend, making sure the assets we send along with our sites are as optimised as possible can go a long way to reducing the carbon produced per pageview. Ensuring GZIP or Brotli compression is enabled for your site helps here too! Every kilobyte matters.</p><h4>Images</h4><p>Effective image optimisation can instantly take megabytes off the total size of your page. All the usual suspects have an impact here:</p><ul><li>Use modern image formats like AVIF or WebP. Where that's not possible, compress JPEG and PNG images before uploading them.</li><li>Think about using MP4 video rather than a GIF for animated content.</li><li>Lazy-load any images that are not visible when the page initially loads & provide responsive versions for smaller viewports.</li></ul><h4>Fonts</h4><p>If you're using web fonts, find out if you can subset them to remove characters you probably won't use. While you're at it, check to make sure that fonts are being served in an optimised format like WOFF2.</p><p>Take things a step further by totally removing custom fonts from your site. Using system fonts is the most sustainable approach you can take. Iain Bean's written <a href="https://iainbean.com/posts/2021/system-fonts-dont-have-to-be-ugly/">a great post</a> highlighting some of the more attractive system font options you can consider.</p><h4>CSS</h4><p>Beyond minifying CSS, check to see if you're sending down more than you need. Can you <em>tree-shake</em> unused classes from the CSS library you use? Are you using an icon font that could be replaced with SVGs?</p><h4>JavaScript</h4><p>Tree-shaking helps here too. If you have the chance to review your code periodically then look for any libraries or polyfills that could be replaced with native implementations.</p><p>Also make a point to regularly check any third-party code that your site is pulling in. Is it minified? Is it served over a CDN? Most importantly, is it still used or can it be removed?</p><h3>Caching, caching, caching</h3><p>The most carbon-friendly data request is the one that doesn't need to be made. Caching static resources on the browser can dramatically speed up website navigations and return visits. It also reduces the network energy consumption required to serve a site.</p><p>Set the most aggressive caching rules you can for things like images, CSS and JavaScript files. 
Harry Roberts' <a href="https://csswizardry.com/2019/03/cache-control-for-civilians/">Cache-Control for Civilians</a> is a great resource to help understand just what's possible.</p><h3><strong>CDNs & edge caching</strong></h3><p>Caching static content closer to end users via a content delivery network (CDN) definitely helps with performance. In the process you also reduce the amount of electricity required for data transmission, which is better for the planet. This is especially the case if your site is serving a global audience.</p><p>CDNs transfer huge amounts of data around the world through a network of distributed data centers. The energy used by these data centers is still a large contributor to digital carbon emissions, so take a moment to look into a provider's sustainability policy and plans. Some large CDN providers like <a href="https://blog.cloudflare.com/cloudflare-committed-to-building-a-greener-internet/"><strong>Cloudflare</strong></a> and <a href="https://www.akamai.com/company/corporate-responsibility/sustainability"><strong>Akamai</strong></a> have sustainability commitments outlined on their websites. If you're already using a CDN provider, or are looking for a new one, check with their sales team about their sustainability commitment and the steps they're taking to reduce the environmental impact of their operations.</p><p>If your site is 'dynamic', like a traditional WordPress website, then see if it's possible to cache some/all of the pages. Some hosting providers offer this service, or you may also look at using a CDN here. This can help reduce the load on your databases, and as a result reduces energy used by your servers to generate each pageview.</p><h3>Green hosting</h3><p>Hosting accounts for about 15% of a website's carbon footprint. So, hosting your site on a green provider can go a long way to reducing your site's footprint. The more people who move to green web hosts, the stronger the message will be to the rest of the industry that green options should become the norm.</p><p>The Green Web Foundation maintain a <a href="https://www.thegreenwebfoundation.org/directory/"><strong>directory of verified green web hosts</strong></a>. You can reference this list to find a provider in your region, or one that's located close to your users.</p><h2>Website design also plays a part</h2><p>The design decisions made early on in the life of a website can also have a big impact on its longer-term sustainability. If you're able to contribute to design discussions, then here are a couple of impactful things to consider.</p><h3><strong>Jumbotrons, heroes and carousels</strong></h3><p>Large videos at the top of web pages force an incredible amount of data to be transferred over the network. Often these videos are purely aesthetic. Ask yourself if it's really needed, or if you can instead play the video only if the user interacts with it.</p><p>The same applies for large hero images or carousels. Carousels, in particular, can result in multiple images being downloaded, some of which may never be seen by the user. Plus, there's evidence that they're <a href="https://thegood.com/insights/ecommerce-image-carousels/"><strong>not as effective</strong></a> as your marketing team might think. 
If you've got no option but to use a carousel/hero image then ensure it's optimised, and that any images not required for the initial page load are lazy loaded.</p><h3><strong>Believe it or not, colours have an impact too</strong></h3><p>The choice of colours used on your site can have a small impact on the energy consumed by a user's device when using your site. Sometimes it's hard to change this because of branding and the like. Where possible, consider a darker colour palette or offering a dark mode option. Interestingly, blues are about <a href="https://www.youtube.com/watch?v=N_6sPd0Jd3g"><strong>25% more energy intensive than reds or greens</strong></a>.</p><h2><strong>Systemic change is needed</strong></h2><p>A 100ms faster largest contentful paint for one pageview might be insignificant. Heck, the person visiting the site probably won't even notice. But at scale, that 100ms <a href="https://www2.deloitte.com/ie/en/pages/consulting/articles/milliseconds-make-millions.html">can be worth millions</a> for a website, and for a business.</p><p>Reducing the carbon impact of the web is much the same. Reducing the carbon emissions of my personal website, with its 800-odd pageviews each month, isn't going to make much of a difference. However, if we as a web community can bring awareness to the impacts that our sites, apps, and platforms are having then we'll be in a better place to drive broader change.</p><p>A sustainable web is also a faster web. By standardising sustainable web development practices we can, as an industry, do our part to provide a cleaner, more sustainable future for the planet.</p><p>PS. If you're interested in this topic, there's <a href="https://github.com/WPO-Foundation/webpagetest/issues/1613">an open discussion</a> about adding website sustainability into WebPageTest over on Github.</p><h2>Further reading</h2><h3>Online</h3><ul><li><a href="https://sustainablewebdesign.org/calculating-digital-emissions/">Calculating Digital Emissions</a> - Sustainable Web Design</li><li><a href="https://www.the-public-good.com/web-development/measuring-the-web">Measuring the web</a> - Daniel Hartley (The Public Good)</li><li><a href="https://dannyvankooten.com/website-carbon-emissions/">CO2 emissions on the web</a> - Danny van Kooten</li><li><a href="https://marmelab.com/blog/2020/09/21/web-developer-climate-change.html">Developers can save the planet</a> - François Zaninotto (Marmelab)</li></ul><h3><strong>Books</strong></h3><ul><li><a href="https://abookapart.com/products/sustainable-web-design">Sustainable Web Design</a> - Tom Greenwood, A Book Apart</li><li><a href="https://gerrymcgovern.com/books/world-wide-waste">World Wide Waste</a> - Gerry McGovern</li></ul></div>“Use less. Use green. Buy green.”2024-02-20T13:25:46Zhttps://fershad.com/writing/use-less-use-green-buy-green/<div><p>This post is inspired by a quote from a podcast I listened to just before Christmas. The podcast is My Climate Journey hosted by Jason Jacobs. In the episode <a href="https://www.myclimatejourney.co/episodes/kentaro-kawamori">(Ep. 189) Jason speaks with Kentaro Kawamori</a>, Co-founder and CEO of Persefoni. Persefoni is a Climate Management & Accounting Platform. 
You can listen to the podcast to learn more about what they do.</p><p>Around two-thirds of the way through the episode (about 32 minutes in) Kentaro drops this amazingly simplistic three step path for decarbonising a company.</p><blockquote>I'll let you in on a little secret, you don't need to pay McKinsey a million dollars for your climate strategy. I can tell you in three simple ways how to reduce your footprint, the first is use less power. The second is use greener power, and the third is buy greener services and products.</blockquote><p>For some reason this has just stuck with me since I first heard it. It sounds so simple, and can be applied to many situations beyond corporate decarbonisation. In this post I want to take this idea and look at how it can be applied to website sustainability, and also website performance.</p><h2>Website Sustainability</h2><p>If you’re familiar with website sustainability then you’re probably already doing a lot of this, but let’s go through it for everyone that’s new to the topic.</p><p>It's estimated that global information and communication technology (ICT) accounts for <a href="https://theshiftproject.org/wp-content/uploads/2019/03/Lean-ICT-Report_The-Shift-Project_2019.pdf">around 4% of global CO2 emissions</a>. For a bit of perspective, in a year the web as whole uses more electricity than the UK. The internet is annually responsible for emissions equivalent to Germany (the world's 7th largest polluter). That's more polluting than the civil aviation sector.</p><h3>Use less power</h3><p>In the case of web sustainability, the main way we can use less power is by reducing the amount of data we’re pushing down the wire to our users. I’ve written previously about some of the technical and design considerations that can be made when <a href="https://fershad.com/writing/reducing-website-carbon-emissions/">trying to make a low carbon website</a>.</p><p>Another way we can use less power is by looking at the code our sites execute on the end-user’s device. Especially on mobile devices, large JavaScript runtimes can drain the battery which means more recharging and faster degradation.</p><h3>Use greener power</h3><p>Switch your sites, APIs, and backends to certified green hosting providers. The Green Web Foundation maintain a <a href="https://www.thegreenwebfoundation.org/directory/">directory of verified green web hosts</a>. If you’re using a CDN, then look for sustainability commitments on your provider’s website.</p><p>For those who need to run operations in the cloud, Google, AWS and Azure are all making very strong sustainability pushes as part of their offerings. Look to see if you can run functions in regions powered by 100% renewable energy, or at times of peak low carbon power supply.</p><h3>Buy greener services and products</h3><p>The services and products our websites use come in the form of third-party tools and scripts. Since these scripts come from external vendors, we have little control over the quality or size of their code. What we can control, however, is which providers we use.</p><p>When evaluating vendors to use for parts of your site’s functionality, check to see the size of the script they’ll be sending down to your users. Run the URL they provide you through The Green Web Foundation’s tool to <a href="https://www.thegreenwebfoundation.org/">see if they use a green host</a>.</p><p>If you’re currently using third-party resources on your site, then do the above checks as well. 
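If you want to script that check, The Green Web Foundation’s Greencheck API returns the same verdict for any domain. A quick, illustrative sketch (the domain below is just a placeholder, and it’s worth checking their API docs for the current endpoint and response shape):</p><pre class="language-js"><code class="language-js">// Ask The Green Web Foundation whether a third-party domain is served from a verified green host
const domain = 'cdn.example-widget.com'; // placeholder - swap in the vendor you're evaluating

const response = await fetch(
  `https://api.thegreenwebfoundation.org/api/v3/greencheck/${domain}`
);
const result = await response.json();

console.log(result.green ? 'Verified green host' : 'No green hosting found');</code></pre><p>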
Contact the provider and ask them what they are doing to bring down their file size, or if they have plans to move to green hosting. If you decide to change providers for these reasons, let your current provider know as such (be nice about it though, fam).</p><h2>Website Performance</h2><p>Applying Kentaro’s advice to website performance takes a bit more mental gymnastics. I’ve been thinking about this a bit as 2021’s rolled into 2022, and have come up with something that at least makes sense to me. I’d be interested to hear what others think.</p><h3>Use less power</h3><p>This is very much along the same lines of the website sustainability point above. Sending less data over the wire makes websites faster, and more accessible to users across a range of devices. As <a href="https://infrequently.org/2021/03/the-performance-inequality-gap/">Alex Russell notes</a>:</p><blockquote><em>We can now afford <strong>~100KiB of HTML/CSS/fonts and ~300-350KiB of JS (gzipped)</strong>.</em></blockquote><p>That’s just under 500KiB to deliver a site that performs reasonably well on the average mobile device.</p><p>It can also <a href="https://ecoping.earth/blog/core-web-vitals-and-sustainability">help with things like Core Web Vitals</a>. Optimising images should improve your Largest Contentful Paint. Sending down only the JavaScript you need will keep First Input Delay to a minimum.</p><h3><del>Use greener power</del> Use the platform</h3><p>This one’s a bit of a stretch, but hear me out. In the context of website performance, I’ve translated <em>“use greener power”</em> to mean <em>“use the platform”.</em></p><p>For those not familiar with that phrase, it refers to the practice of using the capabilities provided to us by the browser (through native HTML, CSS and JS) before reaching for external libraries.</p><p>Using the platform will help get you closer to that 500KiB webpage budget mentioned by Alex. It will help you deliver an experience that should work for most users on modern browsers, while relying on a handful of polyfills for those on older versions. And since you’re using features that come built into the browsers your website visitors are using, the computational overhead on their end device should be minimal too.</p><h3>Buy greener services and products</h3><p>This again ties back to third-party services used on our sites. They can have a significant impact on website performance, and degrade the overall user experience in the process.</p><p>Does your chat widget need to load in its entirety when a user first lands on a page? Or, can a facade be loaded in its place, with the rest of the third-party content being requested only once the user interacts with the feature?</p><p>I’d strongly recommend running regular audits on the third-party resources in use on your site. Are they still needed? If they are, then what impact are they having on your site’s performance? Could they be lazy-loaded, or replaced with a smaller/native alternative?</p></div>Approaches to video on the web2024-02-20T13:25:46Zhttps://fershad.com/writing/approaches-to-video-on-the-web/<div><p>I’ve recently been working on a website build for a client that wanted to self-host videos in their Content Management System (CMS). To make things a bit more challenging from a performance perspective they also wanted to be able to use hero videos on parts of their site. 
This sent me down a bit of a rabbit hole looking at some of the best ways to handle video on the web, from both a performance and sustainability perspective.</p><h2>The best video is no video</h2><p>This is easier said than done. Every client has their own vision for their website, and video is playing an ever increasing role in that. Education around the potential impacts of video on website performance and sustainability is always a good place to start.</p><p>Loading video can consume a lot of bandwidth, which can get in the way of other resources. This is especially important to keep in mind when loading autoplay video elements. Videos can also take some time to load, especially over slower connections. This has the potential to hurt your site’s Largest Contentful Paint (LCP) scores, especially if you’re using hero (or jumbotron) videos at the top of a page.</p><h2>The site <em>needs</em> video. Now what?</h2><p>So, you’ve talked things through and reached the conclusion that videos are still an important part of the overall website design. There are a few things you can do now to minimise the performance impacts when video is used. These will also reduce data consumption when loading the page.</p><h3>Compress it</h3><p>First up, have someone run the video files through a video transcoding tool like <a href="https://ffmpeg.org/">FFMPEG</a> or <a href="https://handbrake.fr/">Handbrake</a>. If you’re using the video file directly on your site (i.e. not using a service like Cloudinary or YouTube), then you’ll probably want to export it in MP4 (H.264 codec) format for the best cross-browser support.</p><h3>Use a dedicated video streaming service</h3><p>If possible, look at using a dedicated video streaming service to host video for your site. <a href="https://cloudinary.com/invites/lpov9zyyucivvxsnalc5/dyg8fkjzrzhfeiqce9nl">Cloudinary</a>, <a href="https://mux.com/">Mux</a> and <a href="https://developers.cloudflare.com/stream/">Cloudflare</a> all provide video streaming options. These services send down appropriately sized video to a device and adjust bitrate to account for network conditions as well. In doing so, you go a long way towards ensuring that the video content served to a website visitor is as optimised as it can possibly be.</p><h3>If you’re using YouTube ...</h3><p>It might be the case that videos are being posted to YouTube, and there’s a need to embed them on the website. YouTube’s regular embed code pulls in a whole bunch of additional data that isn’t needed for the video. Out of the box it does everything on page load too, even if the video is further down the page. Here are some ways to optimise YouTube content for your site:</p><ul><li><strong>Using YouTube’s embed code</strong> - If you’re using the embed code from YouTube’s website, then add the <code class="language-markup">loading="lazy"</code> attribute to the <code class="language-markup">iframe</code> , especially if the content is not visible when the page first loads.</li><li><strong>Use a facade</strong> - Alternately, you can use an image facade which loads in place of the video. Only when the user clicks on the facade will the YouTube content start downloading. 
I’ve written about this with some sample code <a href="https://fershad.com/writing/optimising-embedded-content#youtube">in a previous post</a>.</li><li><strong>Use a custom element</strong> - For better performance, you can use a custom element like Paul Irish’s <a href="https://github.com/paulirish/lite-youtube-embed">Lite YouTube Embed</a>.</li></ul><h3>And then a hero comes along 🎶</h3><p>Handling hero video elements gets a bit trickier. Since they’re almost always at the top of the page we can’t defer loading content. If you’re using a video hosting service, then be sure to have a <code class="language-markup">preconnect</code> link tag in the head of your page to get the connection started sooner.</p><p>If you’re self-hosting video, then there are a few key things you can do:</p><ul><li>Have a <code class="language-markup">poster</code> attribute on your <code class="language-markup"><video></code> tag. The poster will be shown while the rest of the video content loads.</li><li>Be sure to set an <code class="language-markup">aspect-ratio</code> for the video element to avoid layout shift once it loads.</li></ul><p>Simon Hearne has a great post with these and other more detailed tips for <a href="https://simonhearne.com/2021/fast-responsive-videos/">delivering performant hero videos</a>.</p><h3>A couple more tips</h3><p>To finish off, here are a couple of extra tips that can come in handy, especially if you are self-hosting video on your website.</p><ul><li>Read this post by Doug Sillars on <a href="https://dougsillars.com/2020/03/03/video-playback-on-mobile-devices/">serving video on mobile devices</a>. You won’t regret it.</li><li>If you <strong><em>don’t</em></strong> need an autoplay video, then use a <code class="language-markup">poster</code> attribute on the <code class="language-markup"><video></code> tag. Also, you can add the <code class="language-markup">preload="none"</code> attribute to prevent too much video content being downloaded. There’s a <a href="https://web.dev/fast-playback-with-preload/#video-preload-attribute">useful post on web.dev</a> about this.</li><li>You can also use a facade in place of the video element on the page. This can be a chance for web designers to get really creative! Once the user interacts with the facade, then the video can start loading.</li><li>For short, decorative videos that you need to autoplay only when they’re shown in the viewport, <a href="https://www.youtube.com/watch?v=mV4tnQkqhmI">check out this video</a> from Chris Coyier & Dave Rupert.</li></ul></div>Using Treo's free Site Speed Test2024-02-20T13:25:46Zhttps://fershad.com/writing/using-treos-free-site-speed-test/<div><p>Treo's free Site Speed Test presents data from Google's Chrome User Experience Report (CrUX) in a very easy-to-digest visual interface. It's a great tool for businesses to start understanding how their website performs for users around the world.</p><div class="callout"><p></p><p>This is not a sponsored post. All opinions presented here are my own.</p><p></p></div><h2>Getting started</h2><p>Head over to <a href="https://treo.sh/sitespeed">https://treo.sh/sitespeed</a> and enter in a website domain. Note that the tool only takes domains (e.g. <code class="language-markup">www.bbc.co.uk</code>), not whole page URLs. Hitting enter will bring up the test report page.</p><h3>About CrUX data & this report</h3><p>The data surfaced by Treo's Site Speed Test comes from the CrUX dataset. 
There are a few things to remember when looking at CrUX data:</p><ul><li>The data is from sessions of Google Chrome users that have <em>opted in</em> to sharing this data with Google.</li><li>The data excludes those using Chrome on iOS.</li><li>If a site doesn't get a lot of traffic (enough for meaningful, anonymised data to be provided) then you won't see any results.</li><li>The CrUX dataset is updated monthly.</li></ul><p>Another thing to note is that the data you'll see in the Treo report is for <strong>the entire origin</strong> (in our case <code class="language-markup">www.bbc.co.uk</code>), rather than just the homepage or a single page.</p><h2>Report sections</h2><h3>Past year</h3><p>The first section of data you'll see in the report shows how the site you're testing has performed for Core Web Vital metrics over the past calendar year. You can use the <strong>Configure metrics</strong> button to add other metrics if you want.</p><p>The headline figure for each metric is the performance at the 75th percentile for that month. For example, 75% of recorded sessions on the site experienced an LCP of 1.7 seconds or faster. Hovering over each month will show you a more detailed distribution.</p><p>This section allows you to see changes in your site's performance over time. Since it's updated monthly the feedback isn't instant, but it's helpful for spotting larger trends.</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/f0412155fd314d895e1726dcc0b533d269b38a87-869x613.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/f0412155fd314d895e1726dcc0b533d269b38a87-869x613.png?auto=format" alt="BBC.co.uk Core Web Vitals - FCP 1s, LCP 1.7s, FID 0ms, CLS 0" loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">Treo Site Speed Test - Past Year results for the BBC website</figcaption></figure><h3>Past 28-days</h3><p>The next part of the report shows data for each Core Web Vital metric at the 75th percentile for the past 28 days. This gives a slightly faster feedback loop than the monthly graphs above. If you test your site regularly and notice these numbers going in the wrong direction, then it's a sign that you might want to review any changes you've recently made.</p><h3>Geography</h3><p>The Geography section of the Site Speed Test report gives a visual representation of how users around the world are experiencing your website. This can be impactful when talking about performance with managers and decision makers within a company.</p><p>Say you're an online store wanting to expand to, or increase sales in, a specific country. This section of the report will allow you to see if users in that country are experiencing a slow, unresponsive, or janky website. There's also a filter at the top of the report that allows you to show results for a specific country.</p><p>Since better Core Web Vitals <a href="https://wpostats.com/">does lead to increased business revenue</a>, seeing how your site is experienced around the world allows you to be better placed to grow your business across more markets.</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/5144e28778409724a14aa0509025c34f448f265c-814x526.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/5144e28778409724a14aa0509025c34f448f265c-814x526.png?auto=format" alt="World map showing BBC.co.uk LCP performance globally." loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">Treo Site Speed Test - Geography results for the BBC website</figcaption></figure><h3>Form factors & Connections</h3><p>The report also gives a breakdown of the type of devices being used to access your website. To be fair, you can probably use your own site analytics to get this data.</p><p>It is useful, however, in understanding what device types are contributing to your site's CrUX data. If you have more visitors on Desktop than Phone, then you can use the filters at the top of the report to show <em>only Desktop specific</em> data.</p><p>The connections data is also not all that handy. Connections are often split into two categories:</p><ul><li>4G and faster</li><li>3G (occasionally you might see 2G as well)</li></ul><p>In most results you'll see a disproportionate percentage of sessions recorded as 4G or faster. This is because the CrUX definition of a 4G connection is actually <em>'an effective 4G connection'</em>. This covers anything from 720kbps through to infinity. That's a massive range, and it does reflect in the results. On the other hand, if you do see a > 20% share of 3G connections, then it's a sign you should consider optimising your site for those users.</p><h2>Other things</h2><p>The Treo Site Speed Test dataset is updated monthly with new data from CrUX. If you want to start using it to track the performance of your site, then there's an option at the top of the report to sign up for email updates.</p><p>The report also allows you to compare your site's results to other websites. This can be great when trying to make the case for website performance within a company. Decision makers might not fully digest the numbers, but if they see that competitor X has a 20% better LCP than your site, that might just kick them into action.</p></div>Core Web Vitals meets sustainability2024-02-20T13:25:46Zhttps://fershad.com/writing/core-web-vitals-meets-sustainability/<div><p>This article was <a href="https://ecoping.earth/blog/core-web-vitals-and-sustainability">originally published on the EcoPing blog</a>.</p><p>There's been a lot of talk about Core Web Vitals recently. Rightly so too. Early data is showing that <a href="https://www.sistrix.com/blog/core-web-vitals-is-a-measurable-ranking-factor">Core Web Vitals does seem to be playing a factor in Google Search rankings</a>. With all the talk has come a fair bit of action too. Website owners are looking for ways to improve their base Core Web Vitals scores in a bid to capture some vital SEO juice.</p><p>But, besides the SEO upside (and the commercial benefits too - see <a href="https://wpostats.com/">https://wpostats.com/</a> to learn what I mean), there are also digital sustainability benefits to having better Core Web Vitals. Read on to learn where Core Web Vitals and web sustainability intersect, and what you can start doing on your website.</p><h2><strong>A Core Web Vitals refresher</strong></h2><p>If you're reading the term Core Web Vitals for the first time or need a quick reminder of what they are, this section is for you.</p><p>Put simply, Core Web Vitals are a set of metrics that aim to quantify real-world user experiences across the web. They measure page interactivity, content loading, and content stability during page load. The three metrics that form Core Web Vitals are:</p><ul><li><strong>Largest Contentful Paint (LCP)</strong>: A timing of how long it takes for the largest above-the-fold element to be painted on screen. 
This is usually a hero image/video or large text block.</li><li><strong>First Input Delay (FID):</strong> Measures the time it takes before the browser can react to a user input (like a click or tap).</li><li><strong>Cumulative Layout Shift (CLS):</strong> Indicates the movement of visible elements as the user loads and navigates a page. You know when you start reading an article, then an ad loads above it & all the content gets pushed down? CLS measures things like that.</li></ul><p>Each Core Web Vital metric has thresholds upon which a user's experience can be measured. The <a href="https://web.dev/vitals/">below graphic from Google</a> illustrates this. In short, LCP should be under 2.5 seconds, FID less than 100ms, and CLS less than 0.1 for the entire life of the pageview. To be considered as passing Core Web Vitals, a web page (or website) should meet the "Good" target for all three metrics at the 75th percentile.</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/7b4561af6756caa19c168f2915644488bec09912-1165x321.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/7b4561af6756caa19c168f2915644488bec09912-1165x321.png?auto=format" alt="Core Web Vitals guidelines. Image by Google." loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center"></figcaption></figure><h2>Better Core Web Vitals, better for the planet</h2><p>So just what are the intersections of these Core Web Vitals metrics and web sustainability? Let's go through some of the ways to ensure your site meets the thresholds above and examine how they can help your site's digital sustainability too.</p><h3>Improving LCP</h3><p>We all want our websites to be fast. So how do we go about ensuring our site's LCP is less than 2.5 seconds? It's all about ensuring the content which is visible when the page first loads (also known as "above-the-fold") gets rendered on the screen as soon as possible.</p><p>Using a Content Delivery Network (CDN) is one way you can help content reach your website's visitors faster. CDNs keep copies of static content (like images, videos etc.) in worldwide data centers. Say your site is hosted in Iceland, and someone from Melbourne, Australia visits it. With a CDN in place, the visitor would receive a copy of your file from a nearby data center (probably in Australia or Singapore), rather than having to wait for data to travel all the way from Iceland.</p><p>Besides a CDN, there are a few things you can do on your website itself to improve LCP and make it more sustainable at the same time.</p><h4><strong>Think twice about hero images/videos</strong></h4><p>Hero images or videos are the large (often full height and width) elements found at the top of many web page designs. They're eye-catching, but they can also be very large in terms of file size. This is both bad for a page's LCP and its sustainability.</p><p>What can you do:</p><ul><li>Look to avoid using large hero image/video elements on your site. Make this a factor when you assess designs and templates.</li></ul><div class="callout"><p></p><p>💚 This is the most sustainable approach.</p><p></p></div><ul><li>If you do still use hero elements, consider the following:<ul><li>Optimise the image to within an inch of its life.</li><li>Avoid auto-playing any hero video content. Be creative with how you design around hero video elements - could it start playing only after the user has interacted with the page? (<a href="https://justdiggit.org/">Justdiggit</a> have a great example on their homepage).</li></ul></li></ul><div class="callout"><p></p><p>💙 These are the next best options for a sustainable website.</p><p></p></div><ul><li>Text renders faster than anything else on the web. So, if you've got only text content above-the-fold then you're off to a good start to have better LCP scores (there are plenty of ways to muck this up though).</li><li>Use the <code class="language-markup">loading="lazy"</code> attribute on any images that are not visible when the page first loads. This will defer their download, freeing up bandwidth for more important resources.</li></ul><h4><strong>You can optimise fonts</strong></h4><p>Believe it or not, you can actually optimise the fonts used on your website too! Most content on the web is text, and there's a strong chance a visitor to your site has come there looking for some information. The sooner you can render this information on the user's screen, the happier they will be.</p><p>What can you do:</p><ul><li>The most impactful thing you can do is to not use web fonts at all on your site, and instead rely on the fonts that come built into every OS (system fonts).</li><li>If you are using web fonts:<ul><li>Use modern font formats like WOFF2 or WOFF. 
They are a fraction of the size of other formats.</li><li>If your font license allows, you can also try removing unused characters from your font files. This is known as subsetting.</li></ul></li></ul><div class="callout"><p></p><p>💚 Either, ideally both, of these options will help greatly reduce the weight of font files on your site.</p><p></p></div><h4><strong>Remove any render blocking CSS and JavaScript</strong></h4><p>Web browsers try to parse and render web pages as quickly as possible. However, if during this process the browser encounters CSS or JavaScript (JS) then it will stop rendering the page until after it has finished working on the CSS/JS. If this occurs, we say that the responsible CSS or JS code is "render blocking". Render blocking code is <em>the worst</em> for LCP.</p><p>What can you do:</p><ul><li>Regularly audit the JavaScript code on your site - especially third-party code (like analytics, tag managers etc.). Routinely remove scripts you find which are no longer needed.</li><li>Try to reduce the CSS you use on your page.<ul><li>If your site has a build step, can you create unique CSS files for each page which contain only the classes and selectors used for a given page?</li><li>Break CSS up into separate files for different sections of a page. For example, you might separate the CSS for your header section, footer, blog posts, and image gallery into four separate files. This will allow you to load those files only when they are needed (if a page doesn't have an image gallery, it doesn't need that CSS). It will also allow you to benefit from caching for files that are used across multiple pages on your site.</li></ul></li></ul><div class="callout"><p></p><div><p>💡 There are other techniques you can use as well, like <a href="https://www.filamentgroup.com/lab/async-css.html">asynchronously loading CSS</a> and <a href="https://flaviocopes.com/javascript-async-defer/">using defer/async attributes appropriately</a> for JavaScript files. However, the strategies above will reduce the amount of CSS/JS you end up shipping to the user. 💚 This is better for the planet.</p><p></p></div><p></p></div><h3>Improving CLS</h3><p>Layout Shifts are annoying. We've all visited a website where content jumps around as we're trying to read it. Or a site with a slow loading ad at the top of the page which pushes everything down when it finally loads seconds after the rest of the page's content.</p><p>The Cumulative Layout Shift (CLS) metric provides <a href="https://web.dev/cls/">a measure of a web page's visual stability</a>. The lower the CLS score for a page, the less content is jumping around while the user navigates the page.</p><p>There's not a lot of overlap between CLS optimisations and website sustainability, but here are a couple of impactful things to keep in mind:</p><h4><strong>Reduce the impact of font loading</strong></h4><p>The <code class="language-markup">font-display: swap</code> CSS property lets developers instruct the browser to show a fallback font while a web font is downloaded. The browser then swaps in the web font once it is ready.</p>
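<p>For illustration, a minimal <code class="language-markup">@font-face</code> rule using it might look like this (a sketch only - the font family name and file path are placeholders):</p><pre class="language-html"><code class="language-html"><style>
  @font-face {
    font-family: "Example Sans";
    src: url("/fonts/example-sans.woff2") format("woff2");
    /* Show a fallback font immediately, then swap in the web font when it arrives */
    font-display: swap;
  }
</style></code></pre>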
This "swap" can result in some shift in content on the page as the new font's style takes effect.</p><p>The most sustainable way to reduce this CLS impact is by using system fonts for your site's content.</p><div class="callout"><p></p><p>💚 As above, this is the most sustainable approach you can take with fonts.</p><p></p></div><h4><strong>Reserve space for lazy-loaded images</strong></h4><p>Lazy-loading images is super easy in modern browsers. Just by adding the <code class="language-markup">loading="lazy"</code> attribute to an image tag it is possible to defer the loading of that image until just before it enters the viewport. This is great for sustainability, as it means we're not downloading content the user might never see.</p><p>However, this can cause content to get pushed around as the lazy-loaded image gets downloaded and painted on the screen. Thankfully, accounting for this is really easy too. Along with the <code class="language-markup">loading="lazy"</code> attribute, setting the <code class="language-markup">height</code> and <code class="language-markup">width</code> attributes for each image allows the browser to keep space for it when it first renders a page.</p><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">Notes</p><p></p><ol><li>If you're not sure what height or width to set on an image, just go with dimension of the original image file you uploaded.</li><li>The <code class="language-markup">aspect-ratio</code> CSS property can be used for the same purpose, but it's more efficient to use HTML attributes.</li></ol><p></p></div><div class="callout"><p></p><p>💚 Remember to also compress, optimise, and resize your images so that you're serving more sustainable versions to website visitors.</p><p></p></div><h3>Improving FID</h3><p>First Input Delay (FID) is a metric that <a href="https://web.dev/fid/">measures the responsiveness of a web page</a> when a user initially interacts with it. Without getting into the weeds too much, FID measures any delay the browser experiences upon a user's first interaction on the page (click, tap, key press etc.). A poor FID measurement indicates that a page is either unresponsive or could feel laggy to a user.</p><p>Browsers have a "main thread" on which all processing and execution is performed. This main thread also handles user interactions, rendering, and layout. JavaScript is also processed on the main thread. If the main thread is busy, user interactions cannot be processed until the main thread is free. This is more pronounced on mobile devices which have less processing power than laptop/desktop computers.</p><p>JavaScript is one of the most process intensive things browsers have to deal with. Too much of it often blocks the main thread and as a result can cause poor FID timings. Using the <code class="language-markup">defer</code> attribute on non-critical scripts can go some way to improving things, but when looking at FID from a sustainability perspective there are three main things you can do:</p><p><strong>1. Reduce the amount of JavaScript you ship</strong></p><p>Sounds obvious right? If too much JavaScript is a cause of FID issues, then reducing the amount of JS your page loads will help improve things. This is easier said than done though, especially in modern web development. 
Here are some things to consider:</p><ul><li>If your site's content is mostly static (infrequent changes, same for all visitors) then consider using a Static Site Generator, or a framework that allows you to output static pages as plain HTML.</li><li>For sites that are more dynamic, look into using smaller JS frameworks (like Preact instead of React), or one that ships compiled code (like Svelte).</li><li>Periodically review your own code. Are you using a JavaScript library, plugin or polyfill that is no longer required? Could you replace a library with a native (vanilla) JavaScript implementation instead? Could you use CSS to solve a problem you're currently using JavaScript for?</li></ul><p><strong>2. Remove/reduce third-party JavaScript</strong></p><p>This ties in with the point above, but I've separated it to give a bit more emphasis. Regularly review any external scripts that are loaded by your site. Do you still use or get value from using the services you are loading scripts for? If not, then remove the script entirely.</p><p><strong>3. Break JavaScript into smaller files</strong></p><p>With the JavaScript that remains on your page, try to split functions into smaller, asynchronous tasks. Not only does this help shorten tasks that run on the main thread, but it can have the added benefit of giving you more control over what JavaScript you load on individual pages.</p><div class="callout"><p></p><p>💚 There's an added sustainability benefit to using less JavaScript on your website, and it's one that's felt by your users. Processing JavaScript can be a heavy task, especially on mobile devices resulting in faster battery drain. Having less JavaScript on your website can help extend the battery life of a user's device, meaning they have to charge it less frequently.</p><p></p></div><h2>Wrapping up</h2><p>The tips above can help you deliver better performance and a smoother user experience on your website. They'll also help you get started down the path of having a more sustainable website frontend. Combine with hosting your website on <a href="https://www.thegreenwebfoundation.org/">a green web host</a> and using a CDN to serve static assets to really start reducing your site's overall impact.</p></div>Improving Webflow Core Web Vitals2024-02-20T13:25:46Zhttps://fershad.com/writing/improving-webflow-core-web-vitals/<div><p>I've recently spent some time diving into a few sites made with <a href="https://webflow.com/">Webflow</a>. If you've never heard of Webflow it is an online service for building and hosting websites. At its heart is a visual editing interface that allows anyone to design, build, and launch a website. Webflow also has CMS and e-commerce features.</p><p>Webflow makes it remarkably easy to quickly spin up snappy landing pages, websites, and blogs. But with the ease of use also comes the potential for things to get out of hand, and for Core Web Vitals to suffer.</p><p>In this post, I'll go over a few of the things to keep in mind to help with Core Web Vitals when building a site on Webflow.</p><h2>First, an observation</h2><p>Webflow has good caching out of the box. It uses hashes to create unique filenames for stylesheets, scripts, and other content that is created in the editor. Caching these files means that they'll be available locally on the next page a user navigates to (or when they next return to the site).</p><p>This is important, since Webflow uses global CSS and JavaScript files. These files contain all the styles and scripts required for the entire site. 
It's a bummer that they are hosted on Webflow's own assets CDN, meaning the first time they're requested we do have to go through DNS lookup and TLS negotiation (see rows 2 & 4 of the waterfall chart below). But, the caching helps speed up subsequent navigation & return visits.</p><p>It's worth noting that Webflow's caching rules appear to set a cache <code class="language-markup">max-age</code> of 1 day for CSS & JS files, and 1 year for image files. I wasn't able to find <code class="language-markup">cache-control</code> headers for video.</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/5cc0e68d1f229bab8a42557efbfb8303c4822a94-1080x567.jpg?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/5cc0e68d1f229bab8a42557efbfb8303c4822a94-1080x567.jpg?auto=format" alt="Truncated waterfall showing a Webflow site loading global JS and CSS files from the assets-global CDN." loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">Truncated waterfall showing a Webflow site loading global JS and CSS files from the assets-global CDN.</figcaption></figure><h2>Bring your own fonts</h2><p><strong>Helps with:</strong> Cumulative Layout Shift (CLS)</p><p>Webflow allows you to pull in fonts from Google Fonts or Adobe Fonts with just a couple of clicks. A couple of sites I've looked into do just that. You can see the initial request on row 3 of the waterfall above. That JS file then goes off and requests a CSS file, which then loads the fonts needed for the page. Below is a truncated waterfall chart of the process.</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/df5eb4badcbdb938e2fd18732ef86d0fdaa2b801-938x306.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/df5eb4badcbdb938e2fd18732ef86d0fdaa2b801-938x306.png?auto=format" alt="Truncated waterfall chart showing Google fonts loading on Webflow." loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">Truncated waterfall chart showing Google fonts loading on Webflow.</figcaption></figure><p>That's three third-party domains we're having to hit to get the first font file downloaded. There are a couple of ways we can tackle this:</p><h3>Option 1: Preconnect to font domains</h3><p>We <em>could</em> use <code class="language-markup">preconnect</code> to warm up the <code class="language-markup">fonts.googleapis</code> and <code class="language-markup">fonts.gstatic</code> connections (rows 18 & 29 respectively). This would bring forward DNS lookup and TLS negotiation and allow us to start requesting the CSS & font files a bit sooner.</p><p>You'd do this by adding the code snippets below as a custom code block in the Head of your page. Here's <a href="https://university.webflow.com/lesson/custom-code-in-the-head-and-body-tags#head-code">a Webflow tutorial</a> on how to do that.</p><pre class="language-html"><code class="language-html"><!-- preconnect to google font apis -->
<link rel="preconnect" href="https://fonts.googleapis.com/" crossorigin>
<link rel="preconnect" href="https://fonts.gstatic.com/" crossorigin></code></pre><h3>Option 2: Upload your own fonts</h3><p>This is the option I'd personally recommend. In part because it gives you more control over the fonts you use. And, in part because the extra effort involved might make you think twice about whether you really need that custom font in the first place.</p><p>Let's say you want to use the Merriweather font family from Google Fonts. Here is how you'd go about uploading that to your site.</p><ul><li><strong>Download the font files </strong>- Head over to Google Fonts, select the font weights & styles you need, and then download them. Google will give you TTF formats for each.</li><li><strong>Optimise the fonts</strong> - There's many websites & tools out there for font optimisation. We'll use <span>Fontie</span> and keep things very simple - removing unused characters from the font files and changing them to modern formats. Below is a sample of the settings to set in Fontie.<ul><li><strong>Formats:</strong> Check only <code class="language-markup">Web Open Font Format</code> (WOFF) and <code class="language-markup">Web Open Font Format 2</code> (WOFF2).</li><li><strong>Subsetting:</strong> If your site is in English only, check <code class="language-markup">Include Latin characters</code> and <code class="language-markup">Include HTML entities</code>. You can choose any other you might feel apply for multi-lingual sites.</li><li><strong>Hinting:</strong> <code class="language-markup">Keep existing hinting</code></li><li><strong>Styleheets:</strong> Uncheck <code class="language-markup">Generate CSS @font-face</code></li><li>Click <strong>Generate & download your @font-face package</strong>. After a short time, a .zip file will be downloaded containing the new fonts.</li></ul></li></ul><p>Next, we have to upload the new fonts we've created to Webflow. You can find out how to do so in <a href="https://university.webflow.com/lesson/custom-fonts%E2%80%8B">this tutorial</a>. </p><p>When uploading the custom fonts, be sure to upload both the <code class="language-markup">WOFF</code> and <code class="language-markup">WOFF2</code> versions of the font files, and make sure they have the same Font Family name.</p><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">Fallback font</p><p></p><p>Setting a fallback font tells the browser what font to use while a) your custom font downloads, or b) if the custom font isn't available. Select a fallback that closely matches the style of your custom font. Here's <a href="https://university.webflow.com/lesson/custom-fonts#defining-fallback-fonts">a Webflow tutorial</a> with more information.</p><p></p></div><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">Font display</p><p></p><div><p>When using custom web fonts, we need to tell the browser what to do while the fonts are downloading. That is where the Font Display property comes into play. We have a few choices when setting font display, but the most common are:</p><ul><li><strong>Swap:</strong> The browser will show a fallback font while the custom font is downloaded. It will then swap in the custom font when it is ready. This can lead to some jank on the page as fonts are swapped.</li><li><strong>Optional:</strong> The browser will give the custom font a 100 millisecond window to download. If it is not ready in that time, the browser will show a fallback font instead. 
The custom font is stored in cache, ready to use on the next page view.</li></ul><p>I suggest you use the Optional property, since it will ensure a faster page load (LCP) the first time a visitor comes to your site. Subsequent page loads will use your custom font.</p></div><p></p></div><h2>Take a moment for images</h2><p><strong>Helps with:</strong> Cumulative Layout Shift (CLS) & Largest Contentful Paint (LCP)</p><h3>When (and when not) to lazy-load</h3><p>Lazy-loading helps us tell the browser to defer the downloading of certain images until just before they're going to be seen. It helps ensure that our page load (and LCP critical path) is not clogged up by the downloading of images that are waaaaay down the page.</p><p>Webflow lets you set one of three options for image loading. Here's how I recommend deciding which one to select for each image:</p><ul><li><strong>Lazy:</strong> Select this for any image that's <em>definitely</em> off-screen when a page loads (on desktop or mobile).</li><li><strong>Eager:</strong> Set this for any large images that are going to be visible when the page loads. These will most likely be your LCP element.</li><li><strong>Auto:</strong> Set this for smaller images that are visible when the page loads (company logos for example).</li></ul>
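<p>Under the hood, these settings map onto the browser's native <code class="language-markup">loading</code> attribute. As a rough sketch of the markup involved (not Webflow's exact output - the file names, alt text, and dimensions here are placeholders):</p><pre class="language-html"><code class="language-html"><!-- Off-screen image: defer the download until it nears the viewport -->
<img src="gallery-photo.jpg" alt="A gallery photo" loading="lazy" width="800" height="600" />

<!-- Large, above-the-fold image (likely the LCP element): fetch it straight away -->
<img src="hero.jpg" alt="Page hero" loading="eager" width="1600" height="900" /></code></pre><figure>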
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/2273f16e924347d6a7e055cf6df0ea71de235e63-747x416.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/2273f16e924347d6a7e055cf6df0ea71de235e63-747x416.png?auto=format" alt="Webflow's image settings interface." loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">Webflow's image settings interface.</figcaption></figure><h3>Set width and height</h3><p><strong><em>I cannot stress how important this is.</em></strong> Setting the width and height attributes for images helps the browser guestimate how much space it should reserve for images in the layout. This prevents content from shifting around, especially as lazy-loaded images are downloaded and displayed.</p><p>You can set the height and width based on your design if you want (e.g. if you know an image will always be 300 x 300px then you can set that). Otherwise, use the dimensions of the original image you uploaded as a guide.</p><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">Make images accessible</p><p></p><p>While we're on image settings, please also take a moment to explicitly set Alt Text for images you upload. This helps make your images accessible to people browsing the web with assistive technologies. <a href="https://www.youtube.com/watch?v=flf2vS0IoRs">This video from HTTP 203</a> is a good guide to setting image alt text.</p><p></p></div><h3>Image optimisation before uploading</h3><p>At the time of writing, Webflow doesn't have support for modern image formats like WebP or AVIF. I'm sure they're working on it, and it will be a huge win for Webflow sites when this is available.</p><p>In the meantime, it's advisable to spend some time optimising images before uploading them to Webflow. This helps to ensure the images on your site are as small as they could be, before Webflow applies further optimisations such as image resizing. Here's a few tools (of the many available) that you can use:</p><ul><li><a href="http://compressor.io/"><strong>Compressor.io</strong></a> </li><li><a href="https://imageoptim.com/mac"><strong>ImageOptim</strong></a> - App for Mac</li><li><a href="https://imagecompressor.com/"><strong>Optimizilla</strong></a> </li><li><a href="https://www.imgbot.ai/compress-image"><strong>Imgbot.ai</strong></a> </li><li><a href="https://squoosh.app/"><strong>Squoosh</strong></a> - Use for single images, offers fine-grain control.</li></ul><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">WebP with JavaScript trickery</p><p></p><div><p>It should be possible to use custom JavaScript code to go through a page & apply the <a href="https://cloudinary.com/documentation/fetch_remote_images">Cloudinary Fetch API</a> (or similar) to image URLs as a means to deliver WebP images to browsers that support it. Since this would run on page load, you'd want to only apply to images that are lazy-loaded (or that are definitely off-screen).</p><p>Another alternative would be to use an edge worker (Cloudflare Worker for example) to rewrite image URLs before sending the page back to the client.</p><p>⚠️ In theory these should work, but I have yet to try it on a real site. Happy if anyone wants to volunteer their site as a test subject 😉.</p></div><p></p></div><h3>Background video instead of GIFs</h3><p>If your site is using GIFs for animated content, then swap them out for a background video component. One site I looked at had a 1.9MB animated GIF on their homepage. 
Converting this to an MP4 video and uploading it as a background video component saw that size come down to 20kB!</p><p>Here's a guide from Webflow on <a href="https://university.webflow.com/lesson/background-styles-overview#background-video">how to use background video</a>.</p><h2>Go easy on animations & interactions</h2><p><strong>Helps with:</strong> Largest Contentful Paint (LCP) & First Input Delay (FID)</p><p>The animations and interactions on Webflow sites are mostly JavaScript driven. So the more you use, the larger your site's JavaScript file will be. This has a cascading impact, since a larger file takes the browser longer to download and parse. There's the potential here to block the rendering of LCP elements on a page. Equally as important is the chance that JavaScript execution might prevent user interactions, impacting your site's FID metric.</p><p>It's worth regularly checking your site for unused interactions which can be removed. Webflow makes it <a href="https://webflow.com/feature/clean-up-unused-interactions">really easy to do this</a>. You can <a href="https://university.webflow.com/lesson/style-manager#deleting-all-unused-styles-in-the-style-manager">do the same for styles</a> as well.</p><h2>Speed up key site navigation</h2><p><strong>Helps with:</strong> Largest Contentful Paint (LCP)</p><p>Slow loading pages can confuse, frustrate, and deter users. So, it is even more important that pages on our site which have key user actions are optimised to render and be usable fast.</p><p>One way of doing this is by hinting to the browser where we expect the user to navigate next. The browser can then start preparing resources for that page once it's done with the page the visitor is currently on.</p><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">Use with care</p><p></p><p>This should be used very selectively since it consumes additional bandwidth which could come at a cost to the end user.</p><p></p></div><p>Within the <a href="https://university.webflow.com/lesson/link-settings">Link Settings menu in Webflow</a> there are three link loading options available to us:</p><ul><li><strong>Default</strong>: Does nothing until the link is clicked.</li><li><strong>Prefetch</strong>: The browser will start downloading resources for the linked page once the current one has finished rendering.</li><li><strong>Prerender</strong>: The browser will start downloading resources for the linked page immediately.</li></ul><p>For most links you should leave this property set as <strong>Default</strong>. For links to important action pages you can use <strong>Prefetch</strong>.</p><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">Resist the temptation</p><p></p><p>I would almost always advise against using Prerender. With Webflow's caching in place, Prefetch should be enough to get you fast page navigations to critical routes.</p><p></p></div><p>These tips should get you a long way to having a Webflow site that meets Google's Core Web Vitals best practice thresholds. </p><p></p></div>COP26.org: A quick sustainability check2024-02-20T13:25:46Zhttps://fershad.com/writing/cop26-a-quick-sustainability-check/<div><p>COP26 is taking place in the first two weeks of November 2021. For two weeks, global leaders (or their representatives) will discuss their climate commitments and how countries can collaborate to ensure emissions reductions. 
COP26 is important, especially off the back of <a href="https://news.un.org/en/story/2021/08/1097362">the latest IPCC report</a> which paints a stark outlook for the planet unless huge steps are taken to cut greenhouse gas emissions within the decade. </p><p>This isn't a climate blog, though. If you're looking for one, <a href="https://heated.world/">Heated by Emily Atkin</a> is a good read. Rather, in this post I thought we'd take a look at the COP26 website. Instead of the usual performance and Core Web Vitals focus though, I want to see how the site stacks up in terms of sustainability. What does it do well, and what can it do better? Let's dive in.</p><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">A quick update</p><p></p><div><p><strong><span style="text-decoration:underline;">November 9, 2021</span></strong></p><p><strong>Spoiler:</strong> With the help of Tim Paul from UK Government Digital Services, the large footer image identified in this article has been replaced with an optimised version.</p><p></p></div><p></p></div><h2>What we'll look at</h2><p>When looking at the sustainability of a website, there are three key areas of focus:</p><ul><li>Servers (data centres and hosting)</li><li>Networks (data transferred over the wire to load site content)</li><li>Devices (the devices users use to view the site)</li></ul><p>Daniel Hartley has written <a href="https://www.the-public-good.com/web-development/measuring-the-web">a really good explainer</a> if you want to understand things in more detail.</p><p>We'll touch on all three for this quick website sustainability check, but the bulk of our time will be spent looking at sustainability over the network. To keep things concise as well, I will just look at the homepage (<a href="https://ukcop26.org/">https://ukcop26.org/</a>) on desktop.</p><h2>Making a start</h2><p>To start with, let's collect some data about the page. We'll use <a href="https://digitalbeacon.co/">Beacon</a> to get an initial estimate of its sustainability in CO2e terms, and The Green Web Foundation's API to quickly check how the site is hosted. We'll also run the homepage through WebPageTest to collect some site diagnostics. You can play along at home using the links below if you like.</p><ul><li><strong>Beacon:</strong> <a href="https://digitalbeacon.co/report/ukcop26-org">https://digitalbeacon.co/report/ukcop26-org</a> (tested on <em>28.Oct.2021</em>)</li><li><strong>Green Web Foundation:</strong> <a href="https://www.thegreenwebfoundation.org/green-web-check/?url=https%3A%2F%2Fukcop26.org%2F">https://www.thegreenwebfoundation.org/green-web-check/?url=https%3A%2F%2Fukcop26.org%2F</a> </li><li><strong>WebPageTest</strong>: <a href="https://webpagetest.org/result/211028_BiDcJJ_6e1700968d79e1eb40c40f0bde65d074/">https://webpagetest.org/result/211028_BiDcJJ_6e1700968d79e1eb40c40f0bde65d074/</a> </li></ul><h2>An initial assessment</h2><p>A few things stand out from the tests above. They are:</p><h3><strong>Initial visits are large, but caching is very good.</strong></h3><p>The results from Beacon don't paint the rosiest of pictures. The page downloads 6.13 MB of data for users on a cold cache (i.e. someone who's never visited the page/site before). That's something we'll look to address later in this article. This equates to an estimated 5.136g of CO2e produced for each pageview (or <code class="language-markup">5.136 / 6.13 = 0.837g</code> per MB).</p><p>There is some good news though. 
Our results on WebPageTest show us what things are like for returning visitors to the site. Caching is set up very effectively, with only 717 KB downloaded on a repeat view.</p><p>The caching across the site does mean that as a user navigates the site, much less data is downloaded compared to their first pageview.</p><h3><strong>Website hosting could be greener</strong></h3><p>Our test from the Green Web Foundation comes back "grey". That serves to hint that the site might not be on a host that is <em>known</em> to use 100% green energy. There is every chance that the site is still hosted in a country that generates a higher amount of clean energy. The UK is decent in this aspect - <a href="https://app.electricitymap.org/zone/GB">https://app.electricitymap.org/zone/GB</a>.</p><p>The site does use Amazon Cloudfront as a CDN, which is a good start. Using a CDN means that static assets can be stored closer to end users. This reduces the distance it has to travel when requested, and in turn can reduce the amount of electricity required to transmit the data. Since COP26 is a global event, this is a very effective measure.</p><h2>A word on design</h2><p>Overall, the design of the site is simple. There's a focus on three colours - white, green, and a purple-ish blue which the internet tells me is called <a href="https://www.color-name.com/hex/38318c">Cosmic Cobalt</a> 🤷🏾. The use of duotone images in parts is a nice touch and can go some way to reducing the size of image files. It would be great to see the sponsors logos also show in duotone or black & white.</p><p>Some design modifications around the Twitter & Instagram feeds would go a long way towards improving the sustainability of this webpage. Currently, the scripts and content for these sections are all downloaded when the page first loads. With some redesigning of each section, these could be requested on user interaction instead. In that way, only those website visitors interested in the social content would download the additional data. This has the potential to reduce page size by about 1.4 MB.</p><h2>Tackling page size</h2><p>Okay, design changes aside, there are still a few things we can do to get the webpage size down to something more reasonable. For some perspective, the average weight of a webpage in October 2021 is just over 2 MB (<a href="https://httparchive.org/reports/page-weight?start=2017_04_15&end=latest&view=list#bytesTotal">HTTP Archive</a>). Let's set that as our target. So, we're looking to reduce at least 4 MB overall.</p><h3>One change gets us 75% of the way there</h3><p>In the footer of the site there is a half-globe image, similar to the one that is used in hero section at the top of the page. <em>Similar</em>, but not the same. The image in the hero seems designed specifically for that section and comes in at just over 237 kB.</p><p>The image in the footer, however, comes in at 3 MB in size. As a bonus, the image is set as a <code class="language-markup">background-image</code> in the footer's CSS declaration. This means we can't defer loading it using the browser's native lazy-loading capabilities.</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/824e84eb25ed38f36e502d60421454bd5a6be37e-1180x224.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/824e84eb25ed38f36e502d60421454bd5a6be37e-1180x224.png?auto=format" alt="Footer of the COP26 website featuring a large globe cut off at the bottom of the page." loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">The footer image in question.</figcaption></figure><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">Update</p><p></p><p>This image has now been replaced by a 462 kB version. A big thank you to Tim Paul from UK Government Digital Services and the Cabinet Office for making the update.</p><p></p></div><p>There are a handful of ways to go about making the situation better with this image. Here are two that I'd go with:</p><ol><li><strong>Move it into an <code class="language-markup"><img></code> tag</strong> Removing the image from the CSS, and instead requesting it using an <code class="language-markup"><img></code> tag within the page's HTML lets us instantly apply the <code class="language-markup">loading="lazy"</code> attribute to the image. Since it's in the footer, it most certainly won't be requested when the page first loads. There'd be a tiny bit of CSS added to position the image appropriately.</li><li><strong>Re-use the hero image in the footer</strong> The image in the hero could be used in the footer with a little CSS to rotate the image to match the current design.</li></ol><p>If done in combination, the two steps above will remove pretty much an entire 3 MB from the total size of the page.</p><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">An impactful change</p><p></p><div><p>Since this image is in the footer, it will be downloaded by every single website visitor the first time they visit any part of the COP26 site. Removing this download would have the single biggest impact on the entire site's sustainability profile.</p><p>Let's assume a million unique visitors visit some part of the COP26 site over the next two weeks. We're looking at a potential CO2e reduction of around 2,500 kg. with this one change alone (<code class="language-markup">3 MB * 0.837g per MB = 2.511g, then 2.511g * 1000000 / 1000 = 2511 kg.</code>).</p></div><p></p></div><h3>Almost there with modern image formats</h3><p>Of the remaining 1.5 MB worth of images on the homepage, most are lazy loaded. However, every image on the page could benefit from some further optimisation. By using newer image formats like WebP or even AVIF we can:</p><ul><li>Take over 300 kB off the initial page load.</li><li>Reduce about 600 kB from the entire page.</li></ul><p>This assumes there's nothing we can do about the images that are loaded by the Twitter and Instagram plugins.</p><h3>Replacing icon fonts with SVG</h3><p>Looking through the network requests, there appear to be at least 8 requests for CSS files that relate to icon fonts. Looking around the homepage, I'm not able to see where these icons are used. Running a Coverage test in Edge DevTools confirms just that. Of the 143 kB of icon CSS loaded, only 1.5 kB is used on the homepage.</p><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">There's caching, but ...</p><p></p><p>It might be the case that these icons are used in other parts of the website. The site's caching does help here, but users are still being made to download a chunk of data that they might never actually need.</p><p></p></div><p>As an alternative, all these icon files could be replaced with SVGs.</p>
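<p>As a rough sketch (the file name and dimensions here are placeholders), each icon could then be requested as a small, cacheable image that is only downloaded on pages where it appears:</p><pre class="language-html"><code class="language-html"><!-- One small SVG file per icon, fetched only where it is actually used -->
<img src="/icons/arrow-right.svg" alt="" width="24" height="24" loading="lazy" /></code></pre>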
<p>Making the switch would have two benefits:</p><ol><li>The icons would only be downloaded on the pages where they are actually used.</li><li>Requesting the SVG icons using either <code class="language-markup"><img></code> tags or through CSS will allow the site's caching policy to still be effective.</li></ol><p>Not only would there be a saving on the CSS loaded by the homepage; moving off icon fonts would also eliminate 630 kB worth of font files.</p><h3>Using WOFF or WOFF2 for fonts</h3><p>The main fonts used for text content on the site are Tungsten (Bold) and Rubik (Bold, Medium). These fonts are hosted locally, which is great for performance. That performance gain gets negated somewhat by the fact that the fonts are being transferred in TTF format, totalling 320 kB.</p><p>Providing WOFF or WOFF2 alternatives (which both have <a href="https://caniuse.com/?search=woff">very wide browser coverage</a>) would deliver <em>at least</em> a 50% reduction in file size. That's without going further and subsetting the font files.</p><h2>How have we ended up?</h2><p>Let's recap. Starting out with a page that loads 6.13 MB, we can make the following changes to reduce page weight:</p><ul><li>Remove/replace the 3 MB image in the footer. <strong>Saving 3 MB.</strong></li><li>Use modern image formats for most images. <strong>Saving 300 kB on page load</strong>.</li><li>Replace icon fonts with SVG. <strong>Saving about 770 kB.</strong></li><li>Using WOFF/WOFF2 font formats. <strong>Saving <em>at least</em> 160 kB.</strong></li></ul><p>That gets us down to <strong>about 1.9 MB</strong> page weight on first load. 🎉 Hooray! Mission accomplished.</p><p>If a design change can be made to load the contents of Twitter and Instagram feeds only after user interaction, then the page weight could go as low as <strong>500 kB.</strong></p><h3>What does this mean in terms of CO2e?</h3><p>In terms of CO2e, let's do some back of the envelope math. Beacon had one pageview (6.13 MB) generating 5.136g of CO2e. Breaking that down: <code class="language-markup">5.136 / 6.13 = 0.837g per MB</code>.</p><p>Reducing the page size to 1.9 MB would instead generate <code class="language-markup">0.837 * 1.9 = 1.590g CO2e</code> per pageview (<strong>a reduction of 3.546g</strong>). Going all the way with the social media design changes, CO2e could be reduced to as little as <code class="language-markup">0.837 * 0.5 = 0.419g</code> per pageview (<strong>a reduction of 4.717g</strong>).</p><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">Incremental gains</p><p></p><div><p>Let's bear in mind that the numbers above relate only to visitors who first enter the UKCOP26 website via the homepage. Every 10,000 visits like this would result in <em>at least</em> 35 kg. (up to 47 kg.) less CO2e being released into the atmosphere.</p><p>As we saw with the image in the footer, applying the changes mentioned in this post across the entire site would have an even greater impact.</p></div><p></p></div></div>Tracking real Core Web Vitals scores2024-02-20T13:25:46Zhttps://fershad.com/writing/tracking-real-core-web-vitals-scores/<div><p>There's been a lot of talk about Core Web Vitals. Rightly so too. Early data is showing that <a href="https://www.sistrix.com/blog/core-web-vitals-is-a-measurable-ranking-factor/?utm_source=Perf.email&utm_campaign=1c5a924166-Perf+Email+%2384&utm_medium=email&utm_term=0_7cba5dc7bd-1c5a924166-1288835902">Core Web Vitals does seem to be playing a factor in Google Search rankings</a>. 
So, this week we're going to look at a few different ways you can keep track of your site's own real-world Core Web Vitals.</p><h2>A Core Web Vitals refresher</h2><p>If you're reading the term Core Web Vitals for the first time or need a quick reminder of what they are, this section is for you.</p><p>Put simply, Core Web Vitals are a set of metrics that aim to quantify real-world user experiences across the web. They measure page interactivity, content loading, and content stability during page load. The three metrics that form Core Web Vitals are:</p><ul><li><strong>Largest Contentful Paint (LCP)</strong>: A timing of how long it takes for the largest above-the-fold element to be painted on screen. This is usually a hero image/video or large text block.</li><li><strong>First Input Delay (FID):</strong> Measures the time it takes before the browser can react to a user input (like a click or tap).</li><li><strong>Cumulative Layout Shift (CLS):</strong> Indicates the movement of visible elements as the user loads and interacts with the page. You know when you start reading an article, then an ad loads above it & all the content gets pushed down? CLS measures things like that.</li></ul><p>Core Web Vitals are "a Google thing". They are metrics that were developed by teams at Google a few years ago. Now, these metrics have become part of Google's search ranking algorithm.</p><h2>Real user monitoring</h2><p>As the name implies, real user monitoring (RUM) collects data from actual user sessions on a website. Since it captures real-world usage, RUM is considered the gold standard when it comes to performance monitoring.</p><p>RUM data can help expose issues that might not be picked up in synthetic (also called lab or simulated) testing. RUM data can also help serve as a guide when setting up automated performance testing as part of a workflow.</p><p>Most importantly when looking through the lens of Core Web Vitals, RUM data is what Google feeds into their search ranking algorithm. So how can your site use RUM?</p><h3>Google Search Console</h3><p>In the Experience section of Google Search Console, you'll find a section entirely focused on your site's Core Web Vitals. This section uses data from the Chrome User Experience (CrUX) Report to present a breakdown of how the indexed URLs of a website perform for the Core Web Vitals metrics.</p><p>CrUX data is updated monthly. So, if you want more immediate feedback on any website changes you've made then you'll need to wait (or use one of the other sources mentioned below). Also, if your site is new, or doesn't get enough traffic to have a meaningful dataset, then you won't see anything in this tab.</p><p>That said, using Google Search Console to keep track of your site's Core Web Vitals is a free, easy way to get started.</p><h3>Treo site speed report</h3><p>Another really great way to surface CrUX data is the Treo Site Speed report (<a href="https://treo.sh/sitespeed">https://treo.sh/sitespeed</a>). This free report presents a nice visual breakdown of a site's Core Web Vitals. It's worth noting that the data here is shown at the origin level (e.g. your domain - <a href="http://www.fershad.com/">www.fershad.com</a> in my case). It lets you quickly get a view of the overall health of your website.</p><p>At the very top of the report, you'll see Core Web Vitals data for your site aggregated over the past 28 days. This is a great way to get more timely feedback on changes made on your site. 
This is the same data you'll see if you run a test on <a href="https://developers.google.com/speed/pagespeed/insights/">Google's PageSpeed Insights tool</a>.</p><p>Treo's site speed report is free, and you can sign up to get updated when new CrUX data is available for your website. They also have paid services for more regular and detailed monitoring.</p><h3>Paid RUM services</h3><p>There are a host of services that allow you to capture RUM data beyond just Core Web Vitals. They don't come cheap. But when you consider the return on investment that can be gained from web performance improvements, they are worth considering. If you don't know the business gains web performance improvements can bring, I urge you to check out <a href="https://wpostats.com/">https://wpostats.com/</a>.</p><p>Here are a few options you can consider and compare:</p><ul><li>Speedcurve: <a href="https://www.speedcurve.com/">https://www.speedcurve.com/</a> </li><li>Pingdom: <a href="https://www.pingdom.com/">https://www.pingdom.com/</a> </li><li>Sentry: <a href="https://sentry.io/">https://sentry.io/</a> </li><li>Datadog: <a href="https://www.datadoghq.com/">https://www.datadoghq.com/</a> </li></ul><h3>Roll your own script</h3><p>If you want to keep things in-house, you can write your own script to send custom performance-related events to your website analytics platform. Using the browser's own <a href="https://developer.mozilla.org/en-US/docs/Web/API/Performance">Performance API</a>, or libraries like <a href="https://zizzamia.github.io/perfume/">Perfume.js</a> you can capture a host of data on how your website (and individual pages) perform in the wild.</p><p>If you want to focus on Core Web Vitals alone, the Chrome team have released a <a href="https://github.com/GoogleChrome/web-vitals">Web Vitals library</a> which can be integrated into a website. There are also instructions on how to send this data to analytics endpoints.</p></div>Using Cloudflare Workers to inline external CSS2024-02-20T13:25:46Zhttps://fershad.com/writing/cloudflare-workers-inline-external-css/<div><p>CSS is render blocking. What I mean by that is that when the browser comes across a section of CSS on a page, it will stop what it's doing and start working on parsing the CSS. This happens with both CSS in external files, as well as CSS in <code class="language-markup"><style></code> tags.</p><p>That makes sense, right? CSS is what makes the web beautiful. It provides layout, typography, colour schemes and a lot more. So naturally the browser wants to make sure it gets all the layout and style information from the CSS before it goes on with parsing the HTML.</p><p>So why might we want to inline CSS in a <code class="language-markup"><style></code> tag versus fetching it via a <code class="language-markup"><link></code> tag? The biggest benefit comes from not having to wait for the CSS to download before it can be parsed. Since the <code class="language-markup"><style></code> tag is inside the HTML document itself, it'll come along with the initial page request. This can really help improve First & Largest Contentful Paint times (FCP & LCP). </p><p>In an ideal world, you'd have a build step that generates critical CSS for a page and inlines that in the <code class="language-markup"><head></code> of the document. All other CSS can be loaded asynchronously using a <code class="language-markup"><link></code> tag. The world often isn't ideal though.</p><h2>Enter Cloudflare Workers</h2><p>Cloudflare Workers allow you to intercept page requests, and using the HTMLRewriter API you can modify the content of a page before sending it to the browser. 
Using Workers we can find any synchronous external stylesheets on a page, and replace their `<link>` tags with the CSS inlined in a `<style>` tag.</p><p>The script below is a simplified example of how to do this. One thing to note first:</p><ul><li>The script below finds any `<link rel="stylesheet">` tags and replaces them (in the same position within the HTML) with a `<style>` tag. </li></ul><pre class="language-javascript"><code class="language-javascript">// worker.js
addEventListener('fetch', (event) => {
event.respondWith(handleRequest(event.request));
});
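// handleRequest fetches the original page and streams it through HTMLRewriter,
// handing each external stylesheet <link> element to the cssInline handler below.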
async function handleRequest(request) {
const response = await fetch(request);
return new HTMLRewriter().on('link[rel="stylesheet"]', new cssInline('href')).transform(response);
}
async function fetchCSS(url) {
const response = await fetch(url);
return response.text();
}
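// Note (an illustrative addition, not part of the original post): stylesheet `href` values
// are often relative (e.g. "/css/main.css"), so a more robust version might resolve them
// against the page URL first, e.g. `new URL(attribute, pageUrl).toString()`, with
// `pageUrl` passed into the cssInline constructor.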
class cssInline {
constructor(attributeName) {
this.attributeName = attributeName;
}
async element(element) {
const attribute = element.getAttribute(this.attributeName);
if (attribute) {
const styles = await fetchCSS(attribute);
element.replace(`<style>${styles}</style>`, {
html: true,
});
}
}
}</code></pre></div>Reducing website carbon emissions2024-02-20T13:25:46Zhttps://fershad.com/writing/reducing-website-carbon-emissions/<div><p>I've <a href="https://fershad.com/writing/the-environmental-case-for-website-performance/">written about the carbon impact of the internet before</a>. Here's a quick recap:</p><ul><li>Global information and communication technology (ICT) accounts for around <a href="https://theshiftproject.org/wp-content/uploads/2019/03/Lean-ICT-Report_The-Shift-Project_2019.pdf">4% of global CO2 emissions</a>.</li><li>Just over one-third of that is attributable to data centers and networks (i.e. the internet).</li><li>Some forecasts put ICT as high as 8% of global carbon emissions by 2025 if left unchecked.</li><li>For a bit of perspective, if the internet were a country it would be the 6th largest polluter.</li></ul><p>A lot of the carbon impact from websites and apps comes from data centers and networks. So how do we, as frontend developers, go about reducing the carbon emissions associated with what we build?</p><h2>Start by measuring</h2><p>If you want to improve an existing website, then start off by getting a sense of its current impact. Using tools like <a href="https://www.websitecarbon.com/">Website Carbon Calculator</a>, <a href="https://digitalbeacon.co/">Beacon</a>, or <a href="https://ecoping.earth/">EcoPing</a>, you can get an estimate of carbon emissions for a given web page. By using these figures alongside your site's analytics, you can estimate your total carbon emissions.</p><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">Note</p><p></p><p>You should consider whatever figures you get from these tools as the <strong><em>minimum estimated</em></strong> carbon emissions for a web page. They are all based on data transferred for the initial page load, and so do not take into account lazy-loaded images, JavaScript, or other data that might be fetched through user interaction.</p><p></p></div><h2>Emissions reduction starts with design</h2><p>There are a lot of things you can do when designing a new website or app that can ensure you start from a base that's already optimised for lower emissions.</p><h3>Use energy-efficient colours</h3><p>This helps reduce the energy a user's device consumes when using your site or app. Consider a darker colour palette or offering a dark mode option (you can switch to this automatically based on user preferences too!). Interestingly, blues are about <a href="https://www.youtube.com/watch?v=N_6sPd0Jd3g">25% more energy-intensive than reds or greens</a>.</p><h3>Seriously question jumbotron videos, hero images, and image carousels</h3><p>Large autoplay videos at the top of web pages force an incredible amount of data to be transferred over the network. Often these videos are purely aesthetic. Ask yourself if the video is really needed, or if you can instead play it only when the user interacts with it. <a href="https://justdiggit.org/">Justdiggit</a> have a really creative example of this practice on their homepage.</p><p>The same applies to large hero images or carousels. Carousels, in particular, can result in multiple images being downloaded, some of which may never be seen by the user. Plus, there's evidence that they're <a href="https://thegood.com/insights/ecommerce-image-carousels/">not as effective</a> as your marketing manager might think. 
If you've got no option but to use a carousel/hero image then ensure it's optimised (more on this later), and that any images not required for the initial page load are lazy-loaded.</p><h3>Reduce the number of fonts used</h3><p>In an ideal world you'd be using only system fonts for your site's content. That would mean no additional data download for end users. A lot of the time that's not possible, so ensure the fonts you are using are kept to a minimum, and are optimised. This will improve your site's Core Web Vitals too. I've <a href="https://fershad.com/writing/introduction-to-optimising-web-fonts/">written about font optimisation</a> in an earlier issue.</p><h2>Green hosting plus a CDN</h2><p>Hosting your site on a green provider can go a long way to reducing your site's footprint. The more people who move to green web hosts, the stronger the message will be to the rest of the industry that green options should become the norm.</p><p>The Green Web Foundation maintain a <a href="https://www.thegreenwebfoundation.org/directory/">directory of verified green web hosts</a>. You can reference this list to find a provider in your region, or one that's located close to your customers.</p><p>Back up your hosting strategy with a CDN that can cache static content closer to your users. By doing so you reduce the amount of electricity required for data transmission. Some providers like <a href="https://blog.cloudflare.com/cloudflare-committed-to-building-a-greener-internet/">Cloudflare</a> and <a href="https://www.akamai.com/company/corporate-responsibility/sustainability">Akamai</a> have varying degrees of sustainability commitments outlined on their websites.</p><h2>Optimise images</h2><p>Images often make up the bulk of data transferred for a web page. Effective image optimisation can instantly take megabytes off the total size of your page. This topic is really deep, and I've written more about it <a href="https://optimised.email/issues/issue-5-optimising-images-reducing-image-size">previously in my newsletter, Optimised</a>. A quick checklist of impactful optimisations:</p><ul><li>Use modern image formats like AVIF or WebP.</li><li>Compress JPEG and PNG images.</li><li>Refrain from using GIFs.</li><li>Lazy-load any images that are not visible when the page initially loads.</li><li>Properly size images based on how they appear in the viewport.</li><li>Use an image CDN to take care of most of the above for you 😉.</li></ul><p>Taking this a step further, you can even use lightweight images as placeholders and load a higher-quality image on user interaction. <a href="https://lowimpact.organicbasics.com/usd/products/accessories-recycled-wool-starter-pack">Organic Basics' low impact website</a> is a great example of this practice in action.</p><h2>Other significant steps you can take</h2><p>The above few items can go a long way towards ensuring your website or app has a small carbon impact. Below is a list of a few extra steps that will move the needle even further.</p><ul><li>Use facades for heavy content like videos or JavaScript-dependent widgets. Load the actual content only when the user interacts with the element.</li><li>Minify your CSS & JS. 
Check that any third-party code you use is also minified.</li><li>Ensure you have GZIP or Brotli compression enabled on your host or CDN.</li><li>Set <a href="https://csswizardry.com/2019/03/cache-control-for-civilians/">effective cache headers</a> for static assets.</li><li>If you're running a WordPress website, check if your host or CDN can help you cache HTML pages. This will also improve your performance, and reduce the load on your server too!</li></ul><h2>Systemic change is needed</h2><p>Let's be real. Reducing the carbon emissions of my personal website, with its 700-odd pageviews each month, isn't going to make a difference. However, if we as a web community can increase awareness of the impacts that our sites, apps, and platforms are having then we'll be in a better place to drive system-level change.</p><p>A sustainable web is also a faster web. By standardising sustainable web development practices, we can as an industry do our part to provide a cleaner, more sustainable future for the planet.</p></div>Stress testing site performance2024-02-20T13:25:46Zhttps://fershad.com/writing/stress-testing-site-performance/<div><p>As developers, we're normally building and testing websites on devices with decent specs. That's fine for productivity, but it does leave a lot of room for performance issues to be missed.</p><p>The truth is that most people who use the web do not have access to the latest iPhone, an ultra-fast laptop, or even high-speed internet. Heck, they <a href="https://shkspr.mobi/blog/2021/01/the-unreasonable-effectiveness-of-simple-html/">might not even be using a phone, laptop, or desktop at all</a>. Browsers come installed on a lot of low-powered devices - from portable games consoles, to TVs, to fridges. These devices don't necessarily have powerful CPUs to handle heavy processing tasks, and portable devices might be further restricted by weak network connectivity.</p><p>Of course, it's next to impossible to test your website for every possible usage scenario. But effective stress testing will greatly increase the likelihood that your site is available and usable to visitors, even in less-than-ideal situations.</p><h2>Testing on your device</h2><p>Regularly testing a web page as you build it is a great way to prevent performance issues from creeping in. However, there's one important thing to keep in mind when running performance tests on your own device.</p><p>Local testing tools will often use your device's CPU, GPU and network to run tests. This means that if you're on a souped-up laptop and connected to high-speed internet, your test will be reflective of that environment.</p><h3>Have extra testing devices handy</h3><p>It's always worth keeping some older devices handy for testing. Not only is this good for performance testing, but it also <a href="https://www.siliconrepublic.com/enterprise/gerry-mcgovern-digital-pollution-e-waste">helps reduce e-waste</a> by extending the useful life of those devices.</p><p>Personally, I've taken over my girlfriend's old 2012 MacBook and iPhone 8 to use as testing devices for iOS & Safari. Despite also recently upgrading to a faster laptop, I've held onto my Surface Pro 6 to serve as a lower-spec laptop testing device.</p><p>Of course, this solution isn't possible for everyone. Space, budget, and access to devices can all get in the way.</p><h3>Apply throttling to your tests</h3><p>Another way to test site performance during development is to use your browser's DevTools. 
Once again, though, these tests will all be run using your device's own CPU, GPU and network. Throttling is one way you can turn down the speed of your processors and network to simulate how your site performs on lower-spec devices.</p><p>If you're checking requests in the Network tab, you can apply Network throttling to get a feel for how requests come in over slower networks. You can also disable caching, allowing you to experience the site as a first-time visitor.</p><p>The same applies in the Performance tab. Here you can also apply CPU throttling, to simulate how processor-intensive tasks may perform on low-powered devices.</p><p>Finally, if you're using DevTools to run Lighthouse audits then be sure that you've enabled the <em>Simulated throttling</em> option, which will apply network and CPU throttling to the tests being run.</p><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">Note</p><p></p><div><p>It's worth noting that network throttling here will slow down the download time for assets, but TCP/SSL connections will not be slowed down. This is important to keep in mind if you've got a lot of third-party resources coming in. </p><p>TCP/SSL connections can add about a second to requests, and in the real world they will be further impacted by slow network conditions.</p></div><p></p></div><h2>Going further with WebPageTest</h2><p><a href="https://webpagetest.org/">WebPageTest</a> is an insanely powerful tool for understanding website performance. WebPageTest allows you to run tests from different locations, on a wide range of devices (some real, others simulated), alongside applying network throttling and a host of other settings.</p><p>When working with WebPageTest, I strongly recommend using network settings that are a few notches down from what you might expect your average user to be on. For example, if you're testing the desktop version of your site, then run a test using a 4G (or even Fast 3G) network connection just to simulate how it might feel for someone that doesn't have fast cable internet.</p><p>Another huge benefit of using WebPageTest is that it allows you to script clicks and navigations into your tests. This means that you can test the performance of page navigation, or even simulate a user journey while collecting performance metrics at the same time. It's a bit too much to get into right now, but if you're interested here's <a href="https://docs.webpagetest.org/scripting/">some documentation</a> to get you started.</p></div>Quick Performance Audit - Taiwan COVID Vaccination Website2024-02-20T13:25:46Zhttps://fershad.com/writing/quick-performance-audit-taiwan-covid-vaccination-website/<div><p>With COVID-19 still spreading around the globe, governments are striving to vaccinate as many of their citizens as possible. Here in Taiwan, vaccine registration and bookings are made through a government website (<a href="https://1922.gov.tw/">https://1922.gov.tw</a>). Members of the public can use that site to 1) register/change their vaccine preference, and 2) book their vaccination when they are eligible.</p><p>The website, therefore, gets a lot of traffic. I used it last weekend to update my registration details since I had moved back to the capital, Taipei. While going through the process, I wondered (as you do ...) "what's going on behind the scenes here?". 
In this post, I'll take a look at how the site stacks up in terms of performance, what it does well, and where improvements can be made.</p><h2>Background</h2><p>Taiwan's Covid-19 vaccination registration website is mostly text-based, with a few forms for users to input their details. There's likely a bit of client-side validation which requires JavaScript too. Besides that, it's a pretty simple site on the surface. Yet the homepage still transfers over 450 kB when first visited (caching drops this by a bit more than 150 kB for subsequent visits). For a web page of this nature, a page weight of nearly 500 kB seems a bit strange. So, what's going on?</p><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">Scope</p><p></p><p>For the purpose of this quick audit we're only going to look at the homepage (<a href="https://1922.gov.tw/">https://1922.gov.tw</a>).</p><p></p></div><h3>Real user performance</h3><p>A page like this, which gets a lot of traffic, will almost certainly have Chrome User Experience (CrUX) data available. This is data that's collected from Google Chrome users that have usage statistics reporting enabled on their browser. It's not a complete representation of <em>all</em> users by any means, but it's a helpful reference.</p><p>There are a few places we can check CrUX data for a site, the easiest being to run the page URL through Google's <a href="https://developers.google.com/speed/pagespeed/insights/">PageSpeed Insights</a> (PSI) tool. When we run the <a href="https://1922.gov.tw/">https://1922.gov.tw</a> URL through PSI, we're presented with a "Field Data" section at the top of the report. This shows us the CrUX data for the page's Core Web Vitals metrics. At the time of writing, here's how the page performed on mobile:</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/aba69c430d0a6143a2f883c3251130258277410b-739x247.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/aba69c430d0a6143a2f883c3251130258277410b-739x247.png?auto=format" alt="Screenshot of PageSpeed Insights results for 1922.gov.tw" loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center"></figcaption></figure><ul><li><strong>First Contentful Paint (FCP) 1.6s</strong> <br />Good: 80%, Needs improvement: 11%, Poor: 9%</li><li><strong>Largest Contentful Paint (LCP) 1.5s</strong> <br />Good: 88%, Needs improvement: 5%, Poor: 6%</li><li><strong>First Input Delay (FID) 11ms</strong> <br />Good: 98%, Needs improvement: 2%, Poor: 0%</li><li><strong>Cumulative Layout Shift (CLS)</strong><br />Good: 100%, Needs improvement: 0%, Poor: 0%</li></ul><p>Overall, that's pretty good. It's worth remembering, though, that Taiwan is a very well-connected country. High-speed internet access and 4G mobile connectivity are commonplace. Running a (generous) <a href="https://www.notion.so/4af017ea23beb960afc0793f87d3de2b">simulated test on WebPageTest</a> using an emulated iPhone X & Fast 3G network sees FCP & LCP times get pushed to 2.7s.</p><p>The stats above show us that there's some room to improve the Paint related metrics on the web page. We'll get to what can be done to address these shortly. First, let's take a quick look at some of the things that developers behind Taiwan's vaccination registration site have done well.</p><h2>A few good notes</h2><h3>Self-hosting assets</h3><p>The first thing that struck me when I saw the WebPageTest waterfall chart for the Taiwan COVID Vaccine Website was that every single request was made to the same domain. Every JavaScript, CSS, image, and font file is self-hosted on the <a href="http://1922.gov.tw/">1922.gov.tw</a> domain.</p><p>This small step is one of the most impactful web performance measures you can take early in the life of a project. Now granted, it might be that because this is a government website there are some conditions that forced the developers down this path, but it's the right path to be on.</p><h3>Caching with Cloudflare</h3><p>Almost all the requests on the site are served with cache headers. It looks as though Cloudflare is being used to provide a caching layer for the site. This can help ease the load off the website host during times of high traffic. That said, the cache max-age is set to 4 hours (<code class="language-markup">max-age=14400</code>). This could be extended, and we'll touch on how later.</p><h2>Improving FCP and LCP</h2><p>What can be done to improve the First & Largest Contentful Paint metrics for <a href="https://1922.gov.tw/">https://1922.gov.tw</a>? Looking at results from PageSpeed Insights as well my browser's developer tools (DevTools) the main area to address is reducing the amount of unused code that's being shipped with the page.</p><h3>CSS accounts for 1/4 of total bytes transferred</h3><p>For a fairly simple looking web page, CSS manages to account for around 25% of the total page weight (122 kB in total). More than 90% of this comes from two CSS files - <code class="language-markup">icons.css</code> and <code class="language-markup">bootstrap.css</code> (111 kB).</p><p>That's a fair chunk of CSS. Looking at the Coverage tab in DevTools shows us that a lot of the code in these two files is unused. 99.9% of the <code class="language-markup">icon.css</code> file, and 96.6% of the <code class="language-markup">bootstrap.css</code> file are redundant bytes. Not only are these bytes transferred over the network but, since CSS is render blocking, the browser must parse all the code in each file before it can continue rendering the page.</p><p>Ideally, we'd like to remove the unused CSS declarations from these files. 
For a small site like this, they could be added to <code class="language-markup"><style></code> tags in the HTML document itself. In this way, the browser doesn't even need to spend time downloading external CSS files.</p><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">Savings</p><p></p><p>The PageSpeed Insights report tells us that removing the unused bytes from these two files would take their size down from 111 kB to just 1.2 kB. This would bring the overall weight of CSS to a more reasonable 11 kB.</p><p></p></div><h3>Bootstrap & jQuery</h3><h4>Doubling up on Bootstrap</h4><p>The web page loads two different Bootstrap JavaScript files - <code class="language-markup">bootstrap.min.js</code> and <code class="language-markup">bootstrap.bundle.min.js</code>. I'm not a CSS frameworks guy, and I haven't touched Bootstrap in a very, <em>very</em> long time, so I'm not sure what the difference between these two files is. What I do know is that the DevTools coverage report highlights 82.1% and 77.8% of unused code in these files respectively.</p><p>Ideally, when working with a framework, you'd like to be able to remove any unused JavaScript code before shipping a site. This is a practice known as <em>tree shaking</em>.</p><p>It might be that these two Bootstrap files include code that's used in other parts of the website. In those cases, you'd instead look to break your code into modules which are loaded as needed.</p><p>The developers have also used Bootbox, a Bootstrap plugin for building alerts and dialogs. Bootbox comes in at around 15 kB with only 3 kB being used on the page. Again, tree-shaking might help to remove the unused functions from this file.</p><h4>Replace jQuery with JavaScript</h4><p>The homepage also loads about 32 kB of jQuery (compressed). This isn't a lot when transferred over the network, but it more than doubles in size after it's downloaded and parsed.</p><p>With the advancements in web APIs in recent years, it's worth considering if this jQuery library can be replaced with plain JavaScript code. <a href="http://youmightnotneedjquery.com/">You Might Not Need jQuery</a> is a handy resource for anyone looking into this.</p><p>If moving away from jQuery isn't an option, then using a tool like <a href="http://projects.jga.me/jquery-builder/">jQuery Builder</a> to create a minified bundle of only the required modules is a good step. A builder is <a href="https://jqueryui.com/download/">also available for jQuery UI</a>.</p><h3>Some other small things</h3><ul><li>There are four icons/logos that are loaded as PNG images. These could be replaced with inline SVGs instead. They'd then be included in the main HTML file itself and save a few hundred milliseconds in download time.</li><li>Likewise, the one icon that is used from the <code class="language-markup">icon.css</code> file could be replaced with an inline SVG.</li><li>Cache max-age could be extended to a much longer duration - like a week or month even. If any files are changed, they can be cache-busted using a hash or version number.</li><li>The favicon for the site is close to 70 kB! It's an <code class="language-markup">.ico</code> file, so looking to provide an SVG alternative for modern browsers to use could bring down the size.</li></ul></div>Proxying AWS S3 content with Cloudflare Workers2024-02-20T13:25:46Zhttps://fershad.com/writing/proxy-aws-s3-content-cloudflare-workers/<div><p>A common use case for proxying requests would be when hosting content in Amazon Web Services S3 buckets. 
Not only is this handy for performance, but it can also reduce your AWS bill if you cache assets that are delivered by the proxy. Another benefit is that it allows you to use international S3 buckets to serve content to users that might be located in China - which has special conditions in place for storing & delivering AWS content.</p><p>From a performance perspective, proxying requests through your own domain can remove the requirement for the browser to resolve the DNS for a new domain, open a TCP connection, and perform SSL negotiation before it can make the request. This can save up to (sometimes over) a second on a request.</p><p>In the example below, I use a Cloudflare Worker to proxy requests from a website to AWS S3. The worker is set up so that any traffic to the <code class="language-markup">[website-domain]/file</code> route gets intercepted, and the request is redirected to the S3 bucket to fetch the data. You can set up something similar on most CDNs that provide edge-compute, or even on your web host.</p><pre class="language-javascript"><code class="language-javascript">addEventListener('fetch', event => {
event.respondWith(handleRequest(event))
})
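// handleRequest maps the incoming /files path onto the bucket's /uploads path,
// fetches the object from S3, and returns it as if served from the site's own domain.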
async function handleRequest(event) {
let requestFolder = '/files'
let s3Folder = '/uploads'
let url = new URL(event.request.url)
const cache = caches.default
let origPathname = url.pathname
const filename = url.toString().split('/').pop()
url.hostname = '[YOUR_S3_BUCKET_URL_HERE]'
url.pathname = origPathname.replace(new RegExp('^'+escapeRegExp(requestFolder)), s3Folder)
let response = await fetch(url)
response = new Response(response.body, { ...response})
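// Illustrative sketch (not part of the original script): the unused `cache` and `filename`
// bindings above hint at the optional caching step described after the code. On Cloudflare
// Workers that step could look roughly like:
//   const cached = await cache.match(event.request)
//   if (cached) return cached
//   // ...otherwise fetch from S3 as above, then:
//   event.waitUntil(cache.put(event.request, response.clone()))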
return response;
}
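// Escape regex special characters so the request folder prefix is matched literally.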
function escapeRegExp(string) {
return string.replace(/[.*+?^${}()|[\]\\\/]/g, '\\$&');
}</code></pre><p>As an additional step, you can set up caching on the returned files, so that subsequent requests for the same file are served using the cache. You can read more about that in this <a href="https://developers.cloudflare.com/workers/tutorials/configure-your-cdn">guide from Cloudflare</a>.</p></div>Proxying Cloudinary image requests with Cloudflare Workers2024-02-20T13:25:46Zhttps://fershad.com/writing/proxy-cloudinary-with-cloudflare-workers/<div><p>I've written about the perils of relying on third-party resources previously in my newsletter, Optimised (see <a href="https://optimised.email/issues/issue-2-third-party-resources-a-cautionary-tale">Issue 2 - Third-party resources - A cautionary tale</a>). That said, most of the time you'll be unable to avoid using at least some third-party hosted assets on your website. Whether it's an analytics provider, image hosting, advertising, or even a cookie consent manager.</p><p>There are a few ways third-party resources can impact site performance. But there is one that is common to any and all third-party resources. That is, for each third-party domain you use, the browser must resolve the DNS, open a TCP connection, and perform SSL negotiation before it can make an HTTP request. This can add up to (sometimes over) a second to a request. In a world where <a href="https://www2.deloitte.com/ie/en/pages/consulting/articles/milliseconds-make-millions.html">milliseconds make millions</a>, that's a hefty delay.</p><p>One way to get around this is to host resources on your own domain. Barring that, another workaround is to shift the work off the browser, and onto your host/CDN. Since hosts/CDNs process millions of requests a minute, they're better optimised to resolve DNS fast, which can shave several hundred milliseconds off request times. By setting up a proxy, you can trick the browser into believing a resource is being requested from your own domain, when in fact it is coming from a third party.</p><p>On this website, I use Cloudinary to host and serve my images. Normally, this would mean that the first image requested on a page would incur the DNS-TCP-SSL penalty I mentioned earlier. Instead, I can use a Cloudflare Page Rule that routes all traffic from the <code class="language-markup">fershad.com/image</code> route to a Cloudflare Worker. The Worker intercepts the request, and fetches the requested asset from Cloudinary. As far as the browser is concerned, it's requesting and receiving a file from my domain.</p><p>You can set up something similar on most CDNs that provide edge-compute, or even on your web host.</p><pre class="language-javascript"><code class="language-javascript">addEventListener('fetch', event => {
event.respondWith(handleRequest(event));
});
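// CLOUD_NAME is assumed to be made available to the Worker (e.g. as an environment
// variable/global set in the Worker's configuration) and holds the Cloudinary cloud name.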
const CLOUD_URL = `https://res.cloudinary.com/${CLOUD_NAME}`;
async function serveAsset(event) {
const url = new URL(event.request.url);
const cloudinaryURL = `${CLOUD_URL}${url.pathname}`;
let response = await fetch(cloudinaryURL);
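// The timing-allow-origin header lets browser performance tooling expose full
// resource timing details for the proxied asset.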
const headers = {
'timing-allow-origin': '*'
};
response = new Response(response.body, { ...response, headers });
return response;
}
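// Only proxy GET requests; upstream error responses are collapsed to their status text,
// and any other method gets a 405.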
async function handleRequest(event) {
if (event.request.method === 'GET') {
let response = await serveAsset(event);
if (response.status > 399) {
response = new Response(response.statusText, { status: response.status });
}
return response;
}
return new Response('Method not allowed', { status: 405 });
}</code></pre><p>The script I use in production includes a couple of extra steps to cache the returned images, so that subsequent requests for the same file are served using the cache. You can read more about that in this <a href="https://developers.cloudflare.com/workers/tutorials/configure-your-cdn">guide from Cloudflare</a>. If you're using Netlify to host your site, Tim Kadlec has a tutorial showing how to <a href="https://timkadlec.com/remembers/2020-11-17-netlify-proxy-requests/">do the same thing with Netlify redirects</a>.</p></div>Building a fast, sustainable personal website2024-02-20T13:25:46Zhttps://fershad.com/writing/building-fast-sustainable-personal-website/<div><p>This website is my home on the internet. It's a place where people can get to know me as a person, and as a professional. This was front of mind when I decided to give the site a redesign in May 2021. Besides being a place to highlight the things I've worked on, I also wanted the site to embody something that I truly care about - sustainability.</p><p>The environmental impact of our digital lives is growing rapidly. I've <a href="https://fershad.com/writing/the-environmental-case-for-website-performance/">written about it</a> previously on my blog. To be honest, I wasn't fully aware just how much of a negative impact digital was having on our planet until early in 2020. My moment of enlightenment came when reading Gerry McGovern's book <a href="https://gerrymcgovern.com/books/world-wide-waste">World Wide Waste</a>. Since then, I've sought to continue understanding more about digital sustainability, particularly in relation to the web.</p><p>When I started planning out the redesign of this site I wanted to ensure I applied as many <a href="https://sustainablewebdesign.org/">sustainable design and development</a> principles as possible. I also wanted to ensure the site was as fast and performant as I could make it. This case study looks at some of the decisions I made during the planning and development of the redesigned website.</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/2fe48fec288606f7bb2cb60ef1f4cf8ca6c8bc31-2690x1531.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/2fe48fec288606f7bb2cb60ef1f4cf8ca6c8bc31-2690x1531.png?auto=format" alt=" " loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">A screenshot of my old homepage. Much more cluttered and "wall of text"-ish than the current version.</figcaption></figure><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">Legend</p><p></p><div><p>Through this article you will see these two icons next to some of the headings/bullet points. </p><ul><li>💚 signifies something that helps with website sustainability </li><li>🚀 indicates something that helps with website performance</li></ul></div><p></p></div><h2>Jamstack FTW!</h2><p>This website has used Jamstack architecture for a long time. It is a "static site" if you really want to categorise it. Each page you visit is pre-rendered in a build process before the whole site goes live. This build process runs each time site content is updated and committed to GitHub. Since I'm not frequently updating content, this approach works just fine. I can live with waiting a couple of minutes for a new blog post or code change to go live. This approach doesn't work for every website. It does, however, allow me to implement many website performance and sustainability optimisations directly to the site before it ships. I'll get onto those shortly.</p><h3>Built with Eleventy</h3><p>Like the previous iteration, I've stuck with <a href="https://www.11ty.dev/">Eleventy (11ty)</a> as the static-site generator (SSG) that builds this site. One of the Eleventy features I lean on most heavily to ensure this site is fast and sustainable are <em>Transforms</em>. These allow developers to modify the output of an Eleventy template. I've used transforms for several of the performance and sustainability optimisations I talk about later such as finding and handling critical CSS, applying image placeholders, and creating a simple table of content for blog posts.</p><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">Learn Eleventy</p><p></p><div><p>If you're interested in learning more about Eleventy then I strongly recommend this (now free!!) course by Andy Bell - <a href="https://piccalil.li/course/learn-eleventy-from-scratch/">Learn Eleventy From Scratch</a>. </p><p><em>This is not a paid ad or affiliate link. I took this course when it first launched and am recommending it from my own experience.</em></p></div><p></p></div><h3>Hosting on Cloudflare Pages</h3><p>Being a static site, each page of my website is a simple HTML file. This means that I can host my site virtually anywhere. However, one of the benefits of a Jamstack approach is integration with a CI/CD (Continuous Integration/Continuous Deployment) pipeline. This allows a new site build to be automatically triggered whenever code or content changes. I could try building that process myself, but several hosting providers offer it out-of-the-box for static sites like mine.</p><p>Cloudflare Pages is one of the newest entrants into the Jamstack hosting space. Leveraging Cloudflare's global CDN, it is the first set of key performance and sustainability wins for this site.</p><ul><li>💚🚀 Cloudflare's global CDN means reduced latency when serving data to site visitors. It also means data travels a shorter distance to reach the user. This minimises energy usage on the network.</li><li>💚🚀 Enabling Brotli compression reduces the size of static assets, further minimising energy usage when serving the site. 
The smaller sizes also mean downloading the assets takes less time.</li><li>💚 <a href="https://blog.cloudflare.com/the-climate-and-cloudflare/">Cloudflare neutralises the carbon footprint</a> of its global operations through Renewable Energy Certificates (RECs). While running on 100% renewable energy is the best possible approach, Cloudflare is still taking a step in the right direction here.</li></ul><h2>Image optimisations</h2><p>I tend not to use video in my content, so images are the heaviest assets on my site. Optimising them goes a long way towards ensuring each page on this site loads fast and has as small an environmental footprint as possible.</p><h3>💚🚀 Serving modern formats</h3><p>Using modern image formats like WebP or AVIF is arguably the easiest way to go about reducing image size. For my site, I use <a href="https://cloudinary.com/invites/lpov9zyyucivvxsnalc5/dyg8fkjzrzhfeiqce9nl">Cloudinary</a> (<em>affiliate link</em>) to do most of the heavy lifting. With very little effort, I'm able to serve AVIF images to browsers that support it, WebP to those that don't, and JPEG as a fallback. Most of the traffic to my site comes from Google Chrome users, so they're being served AVIF. Since these images are much smaller than JPEGs, they are downloaded faster. Once again, their small size also reduces the energy required to transfer them over the network.</p><p>As a further step, I upload and process images using Cloudinary only as and when they are needed. Using Cloudinary's <a href="https://cloudinary.com/documentation/fetch_remote_images#remote_image_fetch_url"><em>remote image fetch URL</em></a> I'm able to keep a local version of each image, which is only uploaded & processed by Cloudinary when it is first requested. Subsequent requests for that image are then served from Cloudinary's cache. I've also added an <a href="https://github.com/wesbos/cloudflare-cloudinary-proxy">extra layer of caching using Cloudflare</a>. This helps keep my usage well within Cloudinary's free tier.</p><h3>💚🚀 Delivering appropriately sized images</h3><p>Presenting users with images that are appropriately sized for their viewport/device/screen is another performance and sustainability gain. Rather than making a mobile user download a 1600px wide hero image, they're served a 320px wide image instead (640px for retina displays). Again, this reduces the size of the files downloaded by the user. And yep, you guessed it, this approach also requires less electricity to transfer over the network.</p><p>Delivering responsive images can be rather tedious to do by hand. Thankfully the <a href="https://www.11ty.dev/docs/plugins/image/">Eleventy Image Plugin</a> handles all of this for me. By just passing in the original image alongside some sizing parameters, the plugin creates local, resized copies of each image. It also injects the required HTML directly into each page.</p><h3>Other image optimisations</h3><ul><li>💚🚀 I lazy-load images as much as possible using the native <code class="language-markup">loading="lazy"</code> attribute. This means users aren't forced to download images further down the page until they need to see them. For some images that I know will always appear "above-the-fold", I use <code class="language-markup">loading="eager"</code> to help the browser prioritise them.</li><li>🚀 Using the <code class="language-markup">decoding="async"</code> attribute on all images means they're processed off the main thread by the browser. 
This means they won't block JavaScript from executing, and shouldn't impact page usability while images load.</li><li>💚🚀 The Eleventy Image Plugin generates resized images with <em>immutable</em> file names. This means that any time the image is changed, the name will change too. With this, I'm able to set very aggressive caching for images.</li><li>🚀 On each image, I include <code class="language-markup">height</code> and <code class="language-markup">width</code> attributes. These allow modern browsers to work out the space that should be reserved for the image before it loads. This helps to prevent Cumulative Layout Shift.</li></ul><h2>💚🚀 Minify all the things!</h2><p>I include a couple of code minification steps as part of the build process for this site. As the name suggests, minification of code makes it smaller and results in less data needing to be transferred. Minification steps on this site include:</p><ul><li>Minify JS</li><li>Minify CSS (both inline CSS & external stylesheets)</li><li>Minify HTML</li></ul><h2>🚀 Critical & non-critical CSS</h2><p>During the build process of my site, all the CSS is dumped into one temporary file. I then have a build step that looks at the HTML output of each page individually and uses the <a href="https://github.com/addyosmani/critical">Critical</a> package to inline any critical CSS directly into the HEAD of that page. This process allows the browser to quickly find and parse the minimum CSS required to render the content that a visitor will first see on any given page.</p><p>This also allows me to take the remaining (non-critical) CSS and perform a further optimisation step. Once the critical CSS for a page has been inlined into the HTML, I then take the remaining non-critical CSS and pass it through <a href="https://github.com/FullHuman/purgecss">PurgeCSS</a>. PurgeCSS looks at the HTML output of the page and finds what non-critical CSS rules will be required. Those rules are then saved in a hashed file, and a <code class="language-markup"><link></code> tag referencing that file is added to the page.</p><p>This entire process has several benefits:</p><ul><li>The page renders sooner since the critical CSS comes along with the initial HTML document.</li><li>Purging the remaining CSS of rules that are not required for a given page significantly reduces the amount of CSS shipped. The smaller files are faster to download and parse.</li><li>Since the non-critical CSS is saved with a hashed filename, it automatically becomes immutable. This allows me to set very aggressive caching rules knowing that any change to a page's CSS will bust the cache by changing the filename.</li></ul><h2>🚀 Using instant.page</h2><p>To make navigation around the site even faster, I've used the <a href="http://instant.page/">Instant.page</a> script. Anytime a visitor to the site hovers over a link, Instant.page begins preloading the linked page. By the time the click (or touch) action is finished, the content for the next page has already started to load (sometimes it may even have finished loading!), making page navigation feel really fast!</p><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">Sustainability implications</p><p></p><p>Because the Instant.page script begins to download content for a page before it is viewed, there is the prospect that a user could hover over a lot of links without visiting any. This would result in unused content being downloaded. 
While that might happen, the other optimisation steps I've taken on the site aim to ensure content downloaded is kept to a minimum.</p><p></p></div><h2>💚🚀 Using less JavaScript</h2><p>JavaScript is one of the main reasons for performance issues on websites. I'm not saying that it's bad, but just that it has to be used carefully.</p><p>From a performance perspective, poorly implemented JavaScript can result in layout shifts, late-loading resources, and can block the main processing thread, which can lead to the browser becoming unresponsive to interactions.</p><p>From a sustainability perspective, using just the JavaScript I require for each page results in less data being transferred, and less computational power being used by the user's device.</p><p>Most pages on my website ship with < 20kB of JavaScript. Some blog pages have lazy-loaded Codepen embeds adding ~50kB more JavaScript on those pages alone.</p><p>On my site, I only load JavaScript for the following:</p><ul><li>Website analytics (I <a href="https://usefathom.com/ref/CEHKLY">use Fathom Analytics</a> - <em>affiliate link</em>).</li><li>Measuring the carbon footprint of each pageview (I cover this a bit later on).</li><li><a href="http://instant.page/">Instant.page</a> (see above) requires JavaScript to function.</li><li>On content pages to provide a share link on supported browsers using the navigator share API.</li><li>On content pages to render and update the reading progress scrollbar at the top of the window.</li></ul><p>Where pages might require some dynamic data, <a href="https://fershad.com/writing/dynamic-page-content-with-cloudflare-workers/">I've used Cloudflare Workers</a> to generate this at the time the page is requested. This means the dynamic parts of the page are built on Cloudflare's edge network. As a result, I avoid shipping additional JavaScript files to visitors on those pages. Examples of where I've used this approach are:</p><ul><li>Randomly displaying a testimonial on the <a href="https://fershad.com/services/">Services page</a>.</li><li>Fetching the latest website carbon data on the <a href="https://fershad.com/carbon/">Carbon page</a>.</li></ul><h2>💚 Table of contents for posts</h2><p>Making it quicker and easier for visitors to find information is a key part of sustainable website design. Each pageview a website visitor makes while searching for information is extra data and electricity that could otherwise be avoided.</p><p>Adding a small table of contents for blog posts and case studies on my website is a first step for me in terms of making this site even easier to navigate. I create this table for each page during the site's build process. Using JavaScript, I look at the HTML output of each page, identify all the second-level headings (<code class="language-markup"><H2></code>) and then build out the table of contents linking to each section.</p><p>This is a small improvement from the previous version of my site, but should hopefully make content easier to find in long-form posts (like this one!).</p><h2>💚 Tracking site carbon emissions</h2><p>At the bottom of every page on this website is a Website Carbon measurement. It uses the <a href="https://websitecarbon.com/"><strong>Website Carbon calculator</strong></a> by Wholegrain Digital to give an estimate of the carbon emissions produced by a given page.</p><p>For each pageview, I save the estimated carbon emissions to an <a href="https://airtable.com/invite/r/1p0yKl4x">Airtable</a> base (<em>affiliate link</em>). 
This allows me to sum all estimated emissions over the life of the site. I can use this data to analyse and improve the most polluting pages of my site, or even offset the emissions associated with operating the site (which is something I plan on doing).</p><p>Currently, Website Carbon Calculator records CO2 generated on page load, not accounting for lazy-loaded images and scripts. I'm planning on building my own calculator that recalculates emissions as a user interacts with a page. This will improve the accuracy of the data in the long run.</p><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">Update Nov. 2021</p><p></p><p>I've been having some issues with the Website Carbon API over recent weeks. For the time being I have stopped using it to track emissions on each page until I can build my own lightweight calculator.</p><p></p></div><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/9807fb69091d3ea8435fb00963d106b204f65c87-1943x1163.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/9807fb69091d3ea8435fb00963d106b204f65c87-1943x1163.png?auto=format" alt=" " loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">Screen capture of the Lighthouse tests for my homepage. 🥳🥳🥳🥳 </figcaption></figure><h2>Still room to improve</h2><p>I still feel there's room for me to improve this site - especially from a sustainability perspective. Performance-wise, the site scores 100s on Google Lighthouse tests and passes all Core Web Vital metrics.</p><p>On the sustainability front, there are a few areas I can still address that would lead to reduced data and energy usage. These are:</p><ul><li>💚 <strong>Search</strong> - Adding search functionality to this site would make it much, much easier for visitors to find information. This is especially true for the <a href="https://fershad.com/writing/"><strong>Writing</strong></a> section which has several pages worth of posts at this stage.</li><li>💚 <strong>Dark mode</strong> - A dark mode would help give visitors a slight energy saving, especially on mobile devices. That said, there is <a href="https://www.howtogeek.com/423717/dark-mode-isn%E2%80%99t-better-for-you-but-we-love-it-anyway/">research out there</a> that points to light text on dark backgrounds being harder for reader's to focus on. Since my site is mostly text content, it's something I'm still considering.</li><li>💚🚀 <strong>Removing more unused JS</strong> - Even though I'm already shipping very little JavaScript, there's still a bit more I can shave off. Ideally, most pages would eventually have around 10kB of JavaScript.</li><li>💚 <strong>More accurate CO2 measurement</strong> - As mentioned earlier, including lazy-loaded images and scripts in the CO2 measurements on each page is something that I'm looking into.</li><li>💚 <strong>Green hosting</strong> - Look into moving website hosting to a green hosting provider, while keep Cloudflare as my CDN. If you know a green web host that can provide CI/CD for a static website, <a href="mailto:itsfish@fershad.com">let me know</a>!</li></ul></div>Frontend tips to speed up your WordPress website2024-02-20T13:25:46Zhttps://fershad.com/writing/frontend-tips-speed-up-you-wordpress-website/<div><div class="callout"><p></p><p>This post was <a href="https://optimised.email/issues/issue-19-speed-up-your-wordpress-website">originally published</a> in my fortnightly newsletter, Optimised.</p><p></p></div><p>WordPress powers over 40% of the internet. That's a lot of websites. When it was first released, WordPress revolutionised online publishing. It democratised it, opening up the ability to publish content online to anyone.</p><p>WordPress has grown from its early days as a blogging platform to now provide a rich ecosystem of plugins that power some of the web's leading content publishers and ecommerce sites. With that has also come performance issues, as everyday users add multiple plugins to their site only to begin experiencing performance bottlenecks.</p><h2>First, understand your site</h2><p>Before getting started, it's important you determine whether your website features mostly static or dynamic content.</p><ul><li><strong>Static content:</strong> This is content that is not updated frequently. For example, company information pages, general product pages, blog posts etc.</li><li><strong>Dynamic content:</strong> Content that's updated frequently/presented in real-time or content that requires user authentication. 
Examples include product inventory information, membership pages, forums etc.</li></ul><h2>Cache, cache, cache</h2><p>Even if your website has mostly dynamic content, you still almost definitely have static assets that don't change very frequently. Things like logos, JavaScript files, and stylesheets are all perfect caching candidates.</p><p>For static sites, you can even cache entire pages. In this way, when a user visits that page, the server can serve it straight from the cache, rather than having to build the page from scratch. This can give you a performance boost, as well as reducing the load on your website's server.</p><p>Almost all dynamic sites also have pages that can be cached. These would include your privacy policy, company details pages, and FAQ pages. Be sure that you're not caching pages that show content that is unique to a user, or time-sensitive content.</p><p>Check with your website host if they have caching options as part of their service. If they don't, you can try out one of the plugins below:</p><ul><li><a href="https://wp-rocket.me/">WP Rocket</a> (paid - recommended)</li><li><a href="https://wordpress.org/plugins/comet-cache/">Comet Cache</a> (freemium)</li><li><a href="https://wordpress.org/plugins/cache-enabler/">Cache Enabler</a> (free)</li><li><a href="https://wordpress.org/plugins/w3-total-cache/">W3 Total Cache</a> (free)</li></ul><h2>Use a CDN</h2><p>CDNs work by saving cached versions of your pages and other assets on a distributed network of servers. With this in place, when someone visits your website from Melbourne, Australia, they'll be served content from the closest server. As a result, you can dramatically reduce the response times when people visit your site. This also helps further reduce the load on your website's server.</p><p>Kinsta has a <a href="https://kinsta.com/knowledgebase/install-cloudflare/">thorough blog post</a> about getting set up with Cloudflare CDN for your WordPress site. Other CDN providers like Fastly also have <a href="https://wordpress.org/plugins/fastly/">plugins for their services</a>.</p><h2>Optimise your images</h2><p>Images are most likely the heaviest resources on your WordPress site. Optimising them is one of the most important steps you can take to improve the performance of your website. We've covered different <a href="https://optimised.email/series/optimising-images">image optimisation techniques</a> in previous issues of this newsletter. For WordPress specifically, there are a few things to note:</p><h3>Upgrade to WordPress 5.5 for native lazy-loading</h3><p>With the release of WordPress version 5.5, image lazy-loading was added to the core build. With this in place, WordPress will automatically add the loading=lazy attribute to all images. This can be overridden by developers for more control.</p><h3>Upgrade to WordPress 5.8 for WebP support</h3><p>The recent 5.8 release of WordPress <a href="https://make.wordpress.org/core/2021/06/07/wordpress-5-8-adds-webp-support/">introduces WebP support</a>. This allows you to compress and upload WebP images for use on your site. 
WebP now has really good browser support and delivers quality images at a fraction of the file size when compared to JPEG or PNG.</p><h3>Use a plugin if you can't update</h3><p>If something is stopping you from updating your version of WordPress, then you can still achieve great image compression and optimisations using one of the plugins below.</p><ul><li><a href="https://wordpress.org/plugins/imagify/"><strong>Imagify</strong></a> </li><li><a href="https://wordpress.org/plugins/wp-smushit/"><strong>WP Smush</strong></a> </li><li><a href="https://wordpress.org/plugins/optimole-wp/"><strong>Optimole</strong></a> </li><li><a href="https://wordpress.org/plugins/ewww-image-optimizer-cloud/"><strong>EWWW Cloud</strong></a> </li><li><a href="https://wordpress.org/plugins/shortpixel-image-optimiser/"><strong>ShortPixel</strong></a> </li></ul><h2>Pick a fast theme</h2><p>The theme you choose can also make a big difference in terms of site performance. After all, it is the underlying foundation upon which your website is built. You'll want to find a theme that gives you just the features you need, without extra bloat. For example, if you're not using FontAwesome icons (and in 2021 <a href="https://optimised.email/issues/issue-7-web-icons-in-2021">you really don't need to</a>), then make sure the theme you use does not ship with it (or at least has a way to disable it).</p><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">Keep in mind</p><p></p><p>Another important consideration is the design of the template. Those that feature large hero/jumbotron images at the top of each page might end up proving problematic for your <a href="https://optimised.email/issues/issue-4-core-web-vitals-and-googles-search-update">Core Web Vitals scores</a>. These might require you to take extra care in optimising the images you upload.</p><p></p></div><p>If you're after a fast WordPress theme, then try out <a href="https://www.wholegraindigital.com/blog/granola-starter-theme/">Granola from Wholegrain Digital</a>. The Wholegrain team is passionate about digital sustainability, and that's part of the ethos behind Granola. For more options, you can check out this <a href="https://kinsta.com/blog/fastest-wordpress-theme/">detailed list</a> from Kinsta.</p><h2>Audit the plugins you're using</h2><p>Plugins can be another source of performance pain for WordPress sites. Sometimes, they're installed to solve one particular problem but then just left in place even after they're no longer required. This often causes sites to become bloated.</p><p>The easiest way to keep on top of this is to set a yearly reminder to review the plugins you have installed on your WordPress site. Ask yourself (or others in your organisation) if those plugins are all required for the site to function. Remove those which are not.</p></div>Dynamic page content with Cloudflare Workers2024-02-20T13:25:46Zhttps://fershad.com/writing/dynamic-page-content-with-cloudflare-workers/<div><p>I recently went about giving my website a small redesign. This mainly involved a fresh lick of paint and a revised layout. One other aim I had was to minimise the amount of client-side JavaScript required by my site.</p><h2>A brief background</h2><p>My website is built with <a href="https://fershad.com/writing/dynamic-page-content-with-cloudflare-workers/11ty.dev">Eleventy</a>, a static site generator that gives a tonne of power to developers. Eleventy outputs the pages on my site as good old-fashioned HTML pages. 
The content is already rendered on each site build. This is great for performance and sustainability - especially for a site like mine that is mainly written content with a long shelf-life.</p><h2>Handling dynamic content</h2><p>But how about when I do need to include some dynamic content on the site? It's impossible to avoid using JavaScript, but my plan all along was to keep it to a minimum. I use client-side JavaScript in the following places on the site:</p><ul><li>For analytics using <a href="https://usefathom.com/ref/CEHKLY">Fathom</a> </li><li>To speed up navigation with <a href="http://instant.page/">instant.page</a> </li><li>To keep track of website carbon estimates on each page view</li><li>In some blog posts for CodePen embeds or syntax highlighting</li></ul><p>On most pages, that comes to 11kB of JavaScript (compressed). I'm happy with that, and want to avoid adding to it.</p><p>So I had a small problem when I decided it would be a good idea to have some randomised client testimonials on different pages of the site. I wanted these testimonials to be randomly selected from a pool of quotes, but also wanted them to (possibly) change on each page view. This meant that randomly generating them at build time wasn't an option.</p><h2>Including content in HTML on-the-fly</h2><p>In order to achieve what I wanted, I needed to find a way to modify the HTML that was being returned on each page view. That's where Cloudflare Workers come in handy. Workers allow you to intercept and modify HTTP requests and responses. That was exactly what I needed. It would allow me to add randomly selected quotes to a page and return that to the user without a single line of client-side JavaScript!</p><p>Here's the Workers code to achieve this. You can see it in action on my <a href="https://fershad.com/">homepage</a> and <a href="https://fershad.com/services/">services page</a>.</p><pre class="language-javascript"><code class="language-javascript">// This worker randomly selects a client quote to add on the Services index page of my website.
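// Pool of client testimonials to pick from on each page view.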
const quotes = [
{
quote:
'Numbers tell more than words: As a result to his involvement and actions, the time to first byte when starting the app was cut by 75%. This had an immediate impact on user satisfaction and revenue.',
client: `Christian Schraeder, Founder of <a href="https://readle-app.com/">Readle App</a>`,
},
{
quote:
'Fershad has transformed the journey and engagement of visitors on our website. The new homepage design delivers significant improvements in key web performance measurements as well as introducing a more attractive and intuitive interface that helps people navigate the site better.',
client: `Ben Hall, Marketing Manager at <a href="https://www.displaylink.com/">DisplayLink</a>`,
},
{
quote:
"Fershad is passionate about sustainability, and his enthusiasm for the topic and deep knowledge of the web development space make him a really valuable partner. His insights are really practical and particularly important for today's businesses to grow responsibly.",
client: `Annalee Bloomfield, CEO of <a href="https://www.sustain.life/">Sustain.Life</a>`,
},
]
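// Element handler: HTMLRewriter calls element() for the node matched by the '#client-quote' selector below.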
class quoteHandler {
async element(element) {
const item = quotes[Math.floor(Math.random() * quotes.length)]
element.setInnerContent(`<blockquote>${item.quote}</blockquote><figcaption>${item.client}</figcaption>`, {
html: true,
})
}
}
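// Fetch the page from the origin, then stream it through HTMLRewriter to inject a random quote.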
async function handleRequest(req) {
const res = await fetch(req)
const rewriter = new HTMLRewriter().on('#client-quote', new quoteHandler())
return rewriter.transform(res)
}
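// Run the Worker for every fetch event on its assigned route.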
addEventListener('fetch', event => {
event.respondWith(handleRequest(event.request))
})</code></pre><p>The <a href="https://fershad.com/carbon/">carbon page</a> on this site also uses HTMLRewriter to provide the latest <em>total website carbon</em> figure when a user visits that page. The data is store externally, so the Worker first fetches it via an API before updating and returning the HTML for the page to render.</p><p>These are some basic use cases for Cloudflare Workers HTMLRewriter. It feel powerful, as a frontend developer, to be able to control and manipulate data at the edge. Reducing the burden on end-users and their devices, while allow us to deliver faster more sustainable websites.</p><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">Disclaimer</p><p></p><p>This post contains affiliate links for some products/services.</p><p></p></div></div>Readle - App Optimisation & Consultancy2024-02-20T13:25:46Zhttps://fershad.com/writing/readle-app-optimisation-consultancy/<div><p>I've cut my teeth in website-related performance and optimisation. So I was a bit reluctant when <a href="https://www.linkedin.com/in/christian-w-a04742122/">Christian Schraeder</a>, the founder of <a href="https://readle-app.com/">Readle</a>, reached out for my help on their native mobile apps. I'd never worked on a mobile app before and wasn't too sure if I'd be able to help. To his credit though, Christian was very persistent. In the end, I agreed to take a look at the app.</p><p>As it turned out, a lot of my knowledge of website performance and optimisation enabled me to identify key areas for improvement within the app.</p><div class="callout"><p></p><p>Christian and I know each other from a previous life when we both worked in the Marketing team of a Taiwanese software company.</p><p></p></div><h2>Readle App - Background</h2><h3>What is it?</h3><p>Readle is a language learning app available on both <a href="https://apps.apple.com/us/app/%E5%AD%B8%E5%BE%B7%E6%96%87-hello-german-%E6%9C%89%E8%81%B2%E5%BE%B7%E8%AA%9E%E6%96%87%E7%AB%A0-%E5%8D%B3%E6%99%82%E5%BE%B7%E6%96%87%E5%AD%97%E5%85%B8/id1483552317?ls=1">iOS</a> and <a href="https://play.google.com/store/apps/details?id=com.hello.german">Android</a>. Rather than simply giving learners a list of vocabulary to memorise, Readle delivers a <a href="https://cwschraeder.medium.com/why-i-started-readle-the-first-graded-reader-to-learn-german-a265471a5204">more immersive experience</a>. Across over 500+ short articles (all with audio) language learners can expand their vocabulary, test their comprehension, and track their progress.</p><h3>Great app, but a bit slow</h3><p>When Christian first asked me to help his main concern for the app was speed. Readle was seeing steady downloads growth and new user acquisition. At the same time, they were also receiving a growing number of complaints from users that interactions within the app were frustratingly slow - especially in East Asia, and the US.</p><p>The biggest source of speed-related user frustration came from slow response times when users selected words to learn. Christian was keen to address these speed issues. He knew it was important to his plans for expanding the app and continued growth.</p><h2>Initial assessment</h2><p>After a quick onboarding and introduction to Readle's globally distributed team, I started diving into the app. My initial aim was to identify and prioritise the areas within the app that might be contributing to slowness. 
Two things stood out:</p><ul><li><strong>Response times for data requests</strong> - Requests for data inside the app were being sent back to the origin server on every request. This created around a one-second roundtrip as the request was made, after which the data was downloaded. This happened each time an article was opened, a word selected, or an audio clip played.</li><li><strong>The size of data being returned</strong> - The data being returned by these requests, especially on the initial app load, was a significant chunk of JSON data. Downloading and parsing this data slowed the initial app load time. The same was found when articles were opened, with surplus data being sent down the line.</li></ul><h2>Improving app speed</h2><p>Working alongside <a href="https://www.linkedin.com/in/marcel-kipp-6011691b5/">Marcel Kipp</a>, Readle's app developer, we set about the task of improving the speed of key areas. We focused on the following parts of the app:</p><ul><li>App launch</li><li>Opening articles</li><li>Selecting words</li></ul><p>These are the three actions within the app that users will regularly make, especially new users trying the app for the first time.</p><h3>1.5 seconds faster app launch</h3><p>After going through the JSON data for the app launch, Marcel was quickly able to identify which fields were surplus to requirements. Making the changes to remove these fields from the response presented the opportunity to create a new API that would better facilitate Readle's growth in the future.</p><p>In addition to sending data that wasn't required, the app was also requesting data for 100 articles when it launched. The data, however, was only used on the app's home screen. This was way more articles than would be needed, so we agreed to reduce it by half.</p><h4>The results</h4><p>Our efforts saw a massive reduction in the data being received on app launch. What was originally about 37,000 lines of JSON had now become just 600. In terms of kilobytes being sent, we went from over 100kB to less than 10kB.</p><p>This all added up to a <strong>1.5-second reduction</strong> to the app's initial launch time.</p><h3>Reducing data for articles</h3><p>We undertook a similar process for articles data. Again, Marcel was able to reduce the data being sent and consolidated most of it into a new API. The savings here were not as large as for the app launch. We took about 2kB off the data transfer. This didn't directly improve the load time of article content but set us up nicely to start reducing the response time for requests.</p><h2>Faster content delivery with Cloudflare</h2><p>With data sizes reduced (and a new API to boot!) it was time to tackle response times. With every request being sent back to the origin server, responses were taking around 800 milliseconds to be returned (that's before downloading the data). This was a large source of frustration for users who were otherwise really enjoying the app.</p><p>We looked into a few possibilities to get this time down. We discussed an approach that would use a distributed database to better serve users around the globe. After some investigation, we decided to go down a different route - to cache as many of the requests as possible.</p><p>Since Readle publishes new content every day, we decided to use edge caching to achieve better response times. 
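<p>As a rough illustration only - Readle's actual setup used Cloudflare page rules configured in the dashboard, and the TTL below is a made-up value rather than their real one - the same "cache at the edge" idea can be sketched in a Cloudflare Worker:</p><pre class="language-javascript"><code class="language-javascript">// Hypothetical sketch: ask Cloudflare's edge cache to hold origin responses.
async function handleRequest(request) {
  return fetch(request, {
    cf: {
      // Cache this response at the edge for an hour (illustrative TTL)
      cacheTtl: 3600,
      // Also cache content types (like JSON) that Cloudflare skips by default
      cacheEverything: true,
    },
  })
}

addEventListener('fetch', event => {
  event.respondWith(handleRequest(event.request))
})</code></pre>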
Requests on the app were already passed through the Cloudflare network, which meant we could use Cloudflare page rules to set edge cache durations for different request types.</p><h4>The results</h4><p>Setting up edge caching on Cloudflare's global CDN allowed us to achieve improvements across the board. The most telling improvements were seen in the time taken to display story and word content to users. Once data was cached (after the first visit by any user), we saw:</p><ul><li>A further 1-second improvement in app launch time</li><li>Story content ready for the user in under 200ms</li><li>Word content shown to the user in under 200ms</li><li>Audio content ready for playback almost instantly</li></ul><p>With the changes in place, we also started seeing positive app reviews - especially from users based in Asia!</p><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">Client testimonial</p><p></p><div><blockquote>Numbers tell more than words: As a result to his involvement and actions, the time to first byte when starting the app was cut by 75%. This had an immediate impact on user satisfaction and revenue.</blockquote><p><a href="https://www.linkedin.com/in/christian-w-a04742122/">Christian Schraeder</a>, Founder of Readle App</p></div><p></p></div><h2>Serving images in milliseconds</h2><p>The one other significant change we made in the app was centred around image optimisation. Originally, the content creators at Readle would manually run images through an optimisation tool and then upload the resulting JPEG file. Automating this process would allow us to free the content editors of one task, as well as enable better image optimisation and delivery.</p><p>After looking into some options that could have seen us put image optimisation into Readle's backend CMS, we opted instead to use <a href="https://cloudinary.com/invites/lpov9zyyucivvxsnalc5/dyg8fkjzrzhfeiqce9nl">Cloudinary</a>. This enabled us to:</p><ol><li>Optimise images</li><li>Deliver images in the best format (JPG or WebP)</li><li>Cache images on a global CDN (speeding up delivery to the user)</li></ol><p><a href="https://cloudinary.com/invites/lpov9zyyucivvxsnalc5/dyg8fkjzrzhfeiqce9nl">Cloudinary</a> has a generous free plan which was ideal for us. Making the change was easy using Cloudinary's API, and we started seeing results instantly.</p><h4>The results</h4><p>Testing on an Android phone, images were being served using WebP format. Total image file size at launch was reduced by around 100kB. Response times for image requests also fell by 600ms on average.</p><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">Disclaimer</p><p></p><p>This post includes some affiliate links for products/services.</p><p></p></div></div>An introduction to optimising web fonts2024-02-20T13:25:46Zhttps://fershad.com/writing/introduction-to-optimising-web-fonts/<div><p>During April & May, 2021, I ran a series about web font optimisation over on <a href="https://optimised.email/">Optimised</a> - a bi-weekly email newsletter I publish that focuses on website performance. This post is a consolidation of some of the key points from those newsletters. 
For a more detailed guide, you can <a href="https://optimised.email/issues/issue-13-optimising-web-fonts-part-1">read the series</a> over on the Optimised website.</p><h2>Web fonts & Core Web Vitals</h2><p>Poor font loading can also negatively impact your Core Web Vital scores, especially for Largest Contentful Paint (LCP) and Cumulative Layout Shift (CLS).</p><p>LCP measures the time taken for the largest above-the-fold element to be rendered on screen. If you've got slow loading fonts, then there's a risk the LCP timing gets pushed out.</p><p>CLS indicates the movement of visible elements as the user loads and interacts with the page. Initially a page may use fallback fonts to show content. These can be replaced by web fonts once they've loaded. However, this can lead to content on the page being moved around as the weighting and spacing of the custom font take effect.</p><h2>Consider system fonts</h2><p>It's worth asking the people responsible for designing your website whether you can do without web fonts for some or all of the textual content. Leveraging the fonts that come pre-loaded on different operating systems can give a real performance boost to any site.</p><p>Here are some questions to ask when designing a new site:</p><ul><li>Could system fonts be used for all the web page text?</li><li>Could system fonts be used for all paragraph text & a custom font be used only for headings?</li><li>If you need to use a custom font, could you just stick to one font family and use different weighting for headings and paragraph text?</li></ul><p>Iain Bean <a href="https://iainbean.com/posts/2021/system-fonts-dont-have-to-be-ugly/"><strong>wrote a nice article</strong></a> early last month that has some handy demonstrations of attractive system fonts. <a href="https://systemfontstack.com/"><strong>Systemfontstack</strong></a> gives standard code snippets for serif, sans-serif and mono typefaces. <a href="https://meowni.ca/font-style-matcher/"><strong>Font style matcher</strong></a> is a handy tool that can generate the code you need to have system fonts closely match popular web fonts. You can also apply the code to fallback fonts, which can help reduce CLS.</p><h2>Consider Variable Fonts</h2><p>Variable fonts allow for multiple font variations to be served in a single file. By changing a few CSS properties we're now able to generate any combination of weight and style. All this without the need for multiple requests. They also allow for animations & transitions. You can find and play around with variable fonts at <a href="https://v-fonts.com/">Variable Fonts</a> or <a href="https://www.axis-praxis.org/">Axis-Praxis</a>. To learn more <a href="https://css-tricks.com/one-file-many-options-using-variable-fonts-web/">I recommend this great post</a> on CSS Tricks.</p><h2>Use a modern format</h2><p>Common font formats you're likely to see around the web are:</p><ul><li>TrueType font (TTF)</li><li>Embedded Open Type (EOT)</li><li>Web Open Font Format (WOFF/WOFF2)</li></ul><p>The most modern, and best optimised, of the list above is WOFF2. Support is good <strong><a href="https://caniuse.com/woff2">across modern browsers</a>.</strong> In 2021 there's really no reason not to be using it on your website. 
If you need support for older browsers you can still serve WOFF2 and provide WOFF & TTF as fallbacks.</p><p>If you need to create WOFF2 variants of your font files, use one of the tools below:</p><ul><li><a href="https://www.fontsquirrel.com/tools/webfont-generator"><strong>Font Squirrel's Generator</strong></a> </li><li><a href="https://transfonter.org/"><strong>Transfonter's Online @font-face Generator</strong></a> </li><li><a href="https://fontie.pixelsvsbytes.com/webfont-generator"><strong>Fontie</strong></a> </li></ul><h2>Remove unused characters (subsetting)</h2><p>Fonts can often contain contain glyphs for languages and character sets that we simply don't need. Removing these additional glyphs can help shave hundreds of kilobytes off our font files.</p><p>There are a few ways of subsetting fonts:</p><ul><li>Command-line tools like <a href="https://github.com/filamentgroup/glyphhanger"><strong>Glyphhanger</strong></a> </li><li>Online tools like Font Squirrel, Fontie & Transfonter</li><li>Graphical tools like <a href="https://fontforge.org/en-US/"><strong>FontForge</strong></a> </li></ul><p>Be aware that some web fonts have licenses that prevent modification.</p><h2>Self-host font files</h2><p>Hosting resources like fonts on your own domain can help speed up the request time for assets since there's no need to initiate new TCP/SSL connections each time.</p><h2>Use preload effectively</h2><p>Preload is extremely handy in prompting the browser to download certain assets earlier than it otherwise might. However, <em>if everything is a priority, then nothing is</em>.</p><p>The <code class="language-markup">preload</code> tag you'd place in the head of your HTML page should end up looking something like this:</p><pre class="language-html"><code class="language-html"><link rel="preload" href="webfont.woff2" as="font" type="font/woff2" crossorigin></code></pre><h2>Use font-display effectively</h2><p><code class="language-markup">font-display</code> is a CSS property you can add to the <code class="language-markup">@font-face</code> blocks of your code to tell the browser how to handle that displaying content before a web font is fully downloaded.</p><p>There are two values you should consider:</p><ul><li><code class="language-markup">font-display: swap</code> - Tells the browser to show text immediately using the first available fallback font, and then swap in the web font when it's downloaded. Better for LCP, but can impact CLS.</li><li><code class="language-markup">font-display: optional</code> - Instructs the browser to hide text for 100ms and then load the web font only if it's available. If it's not ready, then the browser will use a fallback font instead for that page view. Great for LCP & CLS, but not great for highly branded designs.</li></ul><p></p><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">Learn more</p><p></p><p>For a more detailed guide, you can <a href="https://optimised.email/issues/issue-13-optimising-web-fonts-part-1">read the three-part series</a> over on the Optimised website. There you'll find explainers on the topics above, as well as more useful links and resources.</p><p></p></div></div>This website is a FLoC-free zone2024-02-20T13:25:46Zhttps://fershad.com/writing/this-website-is-a-floc-free-zone/<div><p>In November 2020 I switched my website analytics away from Google. I now (very happily) <a href="https://usefathom.com/ref/CEHKLY">use Fathom Analytics</a> - a privacy-focused analytics service from Canada. 
Unlike Google Analytics, Fathom is a paid service that provides a much simpler collection of data.</p><p>So why the switch? Well, there are a few reasons:</p><ol><li>These days Google is first and foremost an advertising company. They collect data on users in a few ways, one of which is via analytics. I'd rather not have data about users from this website being used for that kind of profiling.</li><li>Fathom shows me pretty much all the information I need, rather than the excessive amount of data available in Google Analytics.</li></ol><p>The first of those two reasons played a much bigger role in my decision.</p><p>It's also the same reason why I've now made a small change to this site that will exclude it from Google's latest targeted advertising experiment - <em>Federated Learning of Cohorts</em> (FLoC).</p><p>FLoC is an attempt to profile and fingerprint users based on their web browsing history. As third-party cookies start to phase out, FLoC is a means of filling that void for the data-hungry digital advertising industry.</p><h2>Adding headers in Cloudflare</h2><p>To exclude a website from the FLoC calculation you can add the following HTTP response header:</p><p><code class="language-markup">Permissions-Policy: interest-cohort=()</code></p><p>This website is hosted on Cloudflare (using Cloudflare Pages). I've used a Cloudflare Worker to add the custom response header. The code for the worker is below. I've assigned this worker to the route <code class="language-markup">fershad.com/*</code> so that it applies to every page.</p><pre class="language-javascript"><code class="language-javascript">// Cloudflare Worker
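// Adds a Permissions-Policy header to every response so this site opts out of FLoC cohort calculation.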
async function handleRequest(request) {
// Make the headers mutable by re-constructing the Request.
request = new Request(request)
const url = request.url
// Fetch the original response from the origin for the requested URL
let response = await fetch(url, request)
// Make the headers mutable by re-constructing the Response.
response = new Response(response.body, response)
response.headers.set("Permissions-Policy", "interest-cohort=()")
return response
}
addEventListener("fetch", event => {
event.respondWith(handleRequest(event.request))
})</code></pre><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">Previously on Netlify ...</p><p></p><div><p>When this post was first published, this site was hosted on Netlify. <a href="https://docs.netlify.com/routing/headers/">Their docs</a> show a couple of ways to add custom response headers. I made the change in my site's <code class="language-markup">netlify.toml</code> file.</p><pre class="language-text"><code class="language-text">[[headers]]
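# Apply the header below to every route on the site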
for ="/*"
[headers.values]
Permissions-Policy = "interest-cohort=()"</code></pre><p></p></div><p></p></div><p>More information about FLoC can be found here:</p><ul><li><a href="https://web.dev/floc/">What is Federated Learning of Cohorts (FLoC)?</a> </li><li><a href="https://github.com/WICG/floc">Federated Learning of Cohorts (FLoC)</a> </li><li><a href="https://www.eff.org/deeplinks/2021/03/googles-floc-terrible-idea">Google’s FLoC Is a Terrible Idea</a> </li><li><a href="https://spreadprivacy.com/block-floc-with-duckduckgo/">Block FLoC With Duckduckgo</a> </li><li><a href="https://www.eff.org/deeplinks/2021/04/am-i-floced-launch">Am I FLoCed?</a> </li></ul><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">Disclaimer</p><p></p><p>This article includes an affiliate link for Fathom Analytics. Using that link will get you $10 off your first month, and I'll also get a small fee from Fathom.</p><p></p></div></div>The environmental case for website performance2024-02-20T13:25:46Zhttps://fershad.com/writing/the-environmental-case-for-website-performance/<div><p>What's the link between a faster website and climate change? You might be surprised.</p><p>It's easy to not consider the environmental impact of the digital world. It's one of those cases of <em>out of sight, out of mind</em>. The greenhouse gas emissions produced when a webpage is loaded are very much hidden from the end-user. But the harmful environmental impact of our digital activities is growing, and it's growing fast.</p><p>It's estimated that global information and communication technology (ICT) accounts for <a href="https://theshiftproject.org/wp-content/uploads/2019/03/Lean-ICT-Report_The-Shift-Project_2019.pdf">around 4% of global CO2 emissions</a>. That makes it more polluting than the civil aviation sector. Just over half of those emissions come from the usage of ICT products and services (that's through data centres, networks, and terminals/devices). For a bit of perspective, if the internet were a country it would be the 6th largest polluter.</p><p>It's forecasted that if left unchecked emissions from ICT could go as high as 8% of total global emissions by 2025. That would make usage responsible for 4% of total global emissions, putting it <a href="https://internethealthreport.org/2018/the-internet-uses-more-electricity-than/">behind only China, the United States & India</a>.</p><h2>What's this got to do with websites?</h2><p>Of total global ICT emissions, about one-third comes from data centres (19%) and networks (16%). So each webpage view, each byte sent over the wire (or through the air for mobile), each bloated third-party library has an environmental impact.</p><p>How? Storing, transmitting, and loading all those bytes requires electricity. The day will come where all the planet's electricity is generated from renewable energy. Until then though, we'll be mostly relying on fossil fuel-powered electricity to host and load our websites.</p><h2>So what can we do?</h2><p>As developers or website owners, we can help to significantly reduce the carbon emissions of online assets. Reducing the size of our websites, serving optimised content, and being strategic about loading resources are just some of the steps we can take. 
Developer Danny van Kooten estimated that he <a href="https://dannyvankooten.com/website-carbon-emissions/">reduced emissions by 59,000kg of CO2 per month</a> when he shaved 20kB off the Mailchimp for WordPress he maintains.</p><p>To get a sense of your site's current emissions you can run your pages through <a href="https://websitecarbon.com/">Website Carbon Calculator.</a> If you're ready to start reducing it's impact, then check out the list of action items below.</p><h3>Reduce page size</h3><ul><li>Audit any third-party assets you're using, and remove any that don't deliver value.</li><li>Ensure you have GZIP or Brotli compression enabled.</li><li>If you use a JavaScript framework to build your site, use <a href="https://bundlephobia.com/">Bundlephobia</a> to find lighter versions of any dependencies you use.</li><li>Consider lightweight JavaScript frameworks such as Svelte or Preact.</li><li>Minify all your CSS, JavaScript and even HTML.</li><li>Subset custom fonts to remove characters that you won't use.</li></ul><h3>Serve optimised content</h3><ul><li>Use WebP or AVIF image formats.</li><li>Avoid using GIFs. Just don't. Use a video format instead.</li><li>Serve WOFF2 for custom fonts.</li><li>Appropriately size images for different viewport sizes.</li><li>Replace icon fonts with SVGs.</li></ul><h3>Strategically load assets</h3><ul><li>Lazy-load assets so that they're not downloaded until they're needed.</li><li>Avoid auto-playing videos.</li><li>Go against using large image carousels. If you do, lazy-load images that aren't seen on first load.</li><li>Use a lightweight facade for embedded videos.</li><li>Consider implementing a Service Worker to cache frequently used assets.</li></ul><h3>Some extras</h3><ul><li>Host with a <a href="https://www.thegreenwebfoundation.org/">green web host</a>.</li><li>Use a CDN to reduce the distance data has to travel.</li><li>Consider pre-building (or caching) any pages that don't have dynamic content.</li><li>Set effective cache headers for static assets.</li></ul><h2>Resources</h2><ul><li><a href="https://gerrymcgovern.com/books/world-wide-waste"><strong>World Wide Waste</strong></a> - This book by Gerry McGovern really opened my eyes to the impact of the digital world. 
It prompted me to start examining what I could do, as a web performance consultant, to reduce the impact of the web.</li><li><a href="https://www.websitecarbon.com/"><strong>Website Carbon Calculator</strong></a> - A very handy tool from Wholegrain Digital which gives you an estimate of a webpage's carbon impact.</li><li><a href="https://www.thegreenwebfoundation.org/"><strong>The Green Web Foundation</strong></a> - Check how your current website host is powered, and find a green host that you could move to.</li><li><a href="https://theshiftproject.org/wp-content/uploads/2019/03/Lean-ICT-Report_The-Shift-Project_2019.pdf"><strong>Lean ICT Report</strong></a> - A thorough report from The Shift Project which looks at the impact of ICT devices throughout their lifetime.</li><li><a href="https://internethealthreport.org/2018/the-internet-uses-more-electricity-than/"><strong>Internet Health Report 2018</strong></a> - A report from Mozilla which puts some perspective on the electricity demands of the internet.</li><li><a href="https://dannyvankooten.com/website-carbon-emissions/"><strong>CO2 emissions on the web</strong></a> - An interesting post from developer Danny van Kooten where he examines the emission savings he's been able to achieve by reducing the size of WordPress plugins he maintains.</li></ul></div>CSS can probably do that2024-02-20T13:25:46Zhttps://fershad.com/writing/css-can-probably-do-that/<div><p>In the not too distant past, web developers would almost instinctively reach for jQuery when starting out a new project. The features and capabilities it opened up were immense and were simply not present in native JavaScript at that time.</p><p>Fast forward to today, and a lot of the things we once used jQuery for are now easily done with modern JavaScript. This is a huge performance win, being one less library for the browser to download, parse and execute. Check out <a href="http://youmightnotneedjquery.com/">You Might Not Need jQuery</a> to see what's now possible.</p><h2>Using JavaScript still comes at a cost</h2><p>But JavaScript still puts a heavy burden on the browser, especially to compile and execute it. By using CSS whenever possible we can remove that burden from the browser, keeping the main thread free to handle other tasks.</p><ul><li>Smooth scrolling</li><li>Lazy-loading content</li><li>Keeping elements stuck in the viewport while scrolling</li><li>Smoothly scrolling in & snapping elements into place</li></ul><p>All the tasks listed above were once only capable of using JavaScript, and or jQuery. They're now all things that <a href="https://calendar.perfplanet.com/2020/html-and-css-techniques-to-reduce-your-javascript/">modern browsers can handle natively</a> with CSS. In the case of lazy-loading content, there is actually a native HTML attribute for that.</p><p>Over on CSS Tricks, Håvard Brynjulfsen has a very slick demonstration of creating a <a href="https://css-tricks.com/how-to-create-a-shrinking-header-on-scroll-without-javascript/">shrinking sticky site header entirely controlled with CSS</a>. 
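<p>For contrast, here is a minimal sketch of the kind of scroll handler that a CSS-only approach replaces (the selector and class name are made up for illustration):</p><pre class="language-javascript"><code class="language-javascript">// Hypothetical example: shrink the header once the user scrolls past 100px.
const header = document.querySelector('.site-header')

window.addEventListener('scroll', () => {
  // Toggle the (made-up) 'shrunk' class based on scroll position
  header.classList.toggle('shrunk', window.scrollY > 100)
}, { passive: true })</code></pre>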
This was definitely a case where once JavaScript would have been required to keep track of the user's scroll position and apply classes to change the header.</p><p class="codepen" data-height="350" data-theme-id="light" data-default-tab="result" data-user="fishintaiwan" data-slug-hash="KKgEJep" data-preview="true" data-codepen-url="https://codepen.io/havardob/pen/KKgEJep" data-pen-title="Styled dropdown with smooth expanded effect" style="height:350px;box-sizing:border-box;display:flex;align-items:center;justify-content:center;border:2px solid;margin:1em 0;padding:1em;"></p><h2>Even cooler stuff coming up</h2><p>Scroll-linked animations is an upcoming CSS addition that will open up a world of even more amazing page effects, all without having to run any JavaScript. It will replace a lot of the things we might normally use IntersectionObserver for, such as scroll-triggered animations. There's still a way to go before we get support for it in browsers, but Bramus Van Damme has been <a href="https://www.bram.us/2021/02/23/the-future-of-css-scroll-linked-animations-part-1/#more-demos--full-screen-panels-with-snap-points">playing around with it on his blog</a>.</p><h2>Resources</h2><ul><li><a href="https://calendar.perfplanet.com/2020/html-and-css-techniques-to-reduce-your-javascript/"><strong>HTML and CSS techniques to reduce your JavaScript</strong></a> - Anthony Ricaud covers a few of the very common tasks we used to need JavaScript for, but which are now very much possible with CSS. It's a good read, with explanations of the JavaScript solutions and CSS alternatives.</li><li><a href="https://css-tricks.com/how-to-create-a-shrinking-header-on-scroll-without-javascript/"><strong>How to Create a Shrinking Header on Scroll Without JavaScript</strong></a> - A very slick use case for CSS position: sticky which starts out nice and large, and shrinks (while remaining fixed to the top of the screen) as the user scrolls.</li><li><a href="https://www.bram.us/2021/02/23/the-future-of-css-scroll-linked-animations-part-1/#more-demos--full-screen-panels-with-snap-points"><strong>The Future of CSS: Scroll-Linked Animations with @scroll-timeline (Part 1)</strong></a> - Bramus Van Damme's very detailed experimentations with CSS scroll-linked animations.</li></ul></div>Google Analytics Alternatives2024-02-20T13:25:46Zhttps://fershad.com/writing/google-analytics-alternatives/<div><p>If you've ever wanted to find out how many people are visiting your website then you've probably heard of Google Analytics. It's a free service from Google that allows you to collect a plethora of information when a user visits your site. However, it also gives the world's largest digital ad seller access to a trove of data too. That digital ad seller, in case you're wondering, is Google and DoubleClick (its advertising arm).</p><p>The Markup has found <a href="https://themarkup.org/blacklight/2020/09/22/blacklight-tracking-advertisers-digital-privacy-sensitive-websites">Google Analytics in use on 69% of the top 800,000 websites</a> they scanned with their <a href="https://themarkup.org/blacklight">Blacklight website privacy inspector</a>. If these sites want to access user demographics through Google Analytics, they have to allow that data to be collected by DoubleClick as well.</p><p>But there are alternatives that both keep you user data private, and aren't regularly blocked by ad-blockers (which is another problem Google Analytics presents). 
Here are just a few:</p><h2>Fathom Analytics</h2><p><a href="https://usefathom.com/ref/CEHKLY">Website</a> (affiliate link, starts from $14/month for 100,000 pageviews)</p><p>I've been using Fathom for about 6 months now, and I'm really glad I made the switch. Fathom markets itself as <em>privacy-focused analytics</em>, and that's something which they take very, very seriously. Fathom has a simple dashboard which is easy to digest, making it ideal for sharing with different teams within your company. The team behind Fathom are working on Version 3 release for April which will include the ability to drill down data, track campaigns, provide an API, and more.</p><p>Another bonus of using Fathom is that 2% of all the revenue they make goes towards environmental endeavours. 1% of our gross revenue is funding next-generation carbon removal technologies in partnership with Stripe. 1% of our gross revenue is being donated to the Rainforest Trust.</p><h2>Plausible Analytics</h2><p><a href="https://plausible.io/">Website</a> (starts from $4/month for 10,000 pageviews)</p><p>Plausible Analytics is much like Fathom, in that they are an analytics service with privacy front of mind. Plausible can be used as a service or can be self-hosted. One of the most promising things about Plausible is the size of its tracking script - less than 1kB.</p><p>The folks behind Plausible Analytics also contribute 5% of their revenue to a combination of environmental efforts and open source project sponsorships.</p><h2>Cabin Analytics</h2><p><a href="https://withcabin.com/">Website</a> (Currently invitation only beta)</p><p>Cabin is analytics with a difference. Besides being privacy-first, it's also carbon-aware. So beyond just page visits and events, you can use Cabin to track the carbon footprint of your website. As more companies become carbon conscious the ability to track and identify energy intensive pages will definitely be a plus.</p><p>At the time of writing access to Cabin is only via invitation only.</p><h2>Self-hosted</h2><p>Self-hosted solutions give you total control over your data and analytics setup. They do, however, also require you/your team to have the technical know-how to set them up. You'll first need to find hosting services for the analytics dashboard and database. Once you've registered for these services you'll then need to setup the database, and connect it to the dashboard. 
Most self-hosted analytics packages have scripts and clear instructions you can follow to get started.</p><p>I've listed a couple of self-hosted analytics options below, but a <a href="https://www.ecosia.org/search?q=self-hosted+website+analytics">quick search on Ecosia</a> will give you a few more choices if you need.</p><ul><li><a href="https://umami.is/">Umami</a> </li><li><a href="https://ackee.electerious.com/">Ackee</a> </li></ul></div>A simple introduction to HTML resource hints2024-02-20T13:25:46Zhttps://fershad.com/writing/introduction-to-resource-hints/<div><p>This blog post is a consolidation of information from <a href="https://optimised.email/issues/issue-9-resource-hints-part-1">two newsletter issues</a> that I <a href="https://optimised.email/issues/issue-10-resource-hints-part-2">sent out recently</a>.</p><p>As you might be able to guess by the name, resource hints allow developers to indicate to the browser that particular network connections or files might be important for the current (or future) page.</p><p>Resource hints come in the form of single lines of HTML code that you place within the HEAD of your page. They may be small, but they have the potential to improve site performance immensely. That said, if overused they can also degrade performance. I'm going to focus on the four <a href="https://almanac.httparchive.org/en/2020/resource-hints#hints-adoption"><strong>most commonly used resource hints</strong></a> - <code class="language-markup">dns-prefetch</code>, <code class="language-markup">preconnect</code>, <code class="language-markup">preload</code>, and <code class="language-markup">prefetch</code>.</p><ul><li><code class="language-markup">dns-prefetch</code> - resolves the IP address for a given domain ahead of time.</li><li><code class="language-markup">preconnect</code> - resolves the IP address + opens a TCP/TLS connection for a given domain.</li><li><code class="language-markup">preload</code> - instructs that particular resources be downloaded early.</li><li><code class="language-markup">prefetch</code> - downloads resources that might be needed for subsequent navigations.</li></ul><p>These definitions might not mean much now, but once you understand how each of these can be used they'll serve as a quick refresher if you ever need it.</p><h2><strong>DNS-Prefetch</strong></h2><p><code class="language-markup">dns-prefetch</code> is handy if you're using third-party resources on your site. For example, if you're hosting your images on Cloudinary, using Fathom for your website analytics, and you've got a YouTube embed on the page. Connecting to each of these providers starts with IP resolution for their domain. This operation can take between 80 - 300ms (sometimes longer) for each domain. Normally this would happen when the browser first comes across a resource from an external domain. However, using <code class="language-markup">dns-prefetch</code> you can tell the browser that it would be a good idea to start this process early.
By doing this the IP resolution step will be (most likely) complete by the time the browser first comes across an external resource from that domain.</p><h3><strong>What it looks like</strong></h3><p>Using YouTube as an example, you'd include this line early on in the HEAD of your page</p><p><code class="language-markup"><link rel="dns-prefetch" href="https://www.youtube.com"></code></p><h3><strong>When to use it</strong></h3><ul><li>If you're using resources hosted on an external domain, and especially if you're using multiple resources from one domain on the same page (e.g. a page with multiple images hosted on Cloudinary).</li><li><code class="language-markup">dns-prefetch</code> has <a href="https://caniuse.com/link-rel-dns-prefetch"><strong>browser support</strong></a> all the way back to IE10, so it's useful if you're having to support legacy browsers.</li></ul><h3><strong>Gotchas</strong></h3><ul><li>Google Chrome has a <em>limit of 6</em> in-flight DNS requests, so be judicious in which domains you use <code class="language-markup">dns-prefetch</code> for. Prioritise domains that host important resources for your site's Largest Contentful Paint (LCP).</li></ul><h2><strong>Preconnect</strong></h2><p><code class="language-markup">preconnect</code> gives you the same IP resolution as <code class="language-markup">dns-prefetch</code> but goes a step further by 'warming up' the connection as well. What this means is that on top of resolving the IP of the domain, <code class="language-markup">preconnect</code> also prompts the browser to establish a TCP/TLS connection with the domain. This means that when a browser first comes across an external resource it can simply start downloading it, rather than first having to establish a connection. There's a saving of somewhere in the range of 100ms here for each domain, though it does vary.</p><h3><strong>What it looks like</strong></h3><p>Sticking with the YouTube example above, you can including this line early in the HEAD of your page.</p><pre class="language-html"><code class="language-html"><link rel="preconnect" href="https://www.youtube.com"></code></pre><h3><strong>When to use it</strong></h3><p>Same as <code class="language-markup">dns-prefetch</code> though <a href="https://caniuse.com/link-rel-preconnect"><strong>browser support</strong></a> is not as broad (IE & Firefox don't support <code class="language-markup">preconnect</code>). That said, you can use <code class="language-markup">preconnect</code> with <code class="language-markup">dns-prefetch</code> as a fallback by just putting them one after another in your code.</p><pre class="language-html"><code class="language-html"><link rel="preconnect" href="https://www.youtube.com">
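<!-- Fallback hint for browsers that don't support preconnect -->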
<link rel="dns-prefetch" href="https://www.youtube.com"></code></pre><h3><strong>Gotchas</strong></h3><ul><li><code class="language-markup">preconnect</code> is best used only for resources early in the page load. This is because the browser will close the connection after 10 seconds if it's unused.</li><li>Using <code class="language-markup">preconnect</code> can put load on device CPU so try to limit requests (common practice is 3, no more than 6).</li></ul><h2><strong>Preload</strong></h2><p><code class="language-markup">preload</code> is probably the most dangerous of all the resource hints. In fact, it isn't a hint at all. It's more of a command to the browser to download a resource that <em>will</em> be needed on the current page. This heavy-handedness means that, if used incorrectly, <code class="language-markup">preload</code> has the potential to actually negatively impact page load. Why? Well, the more you prioritise for download with <code class="language-markup">preload</code> the later other (perhaps more important) resources will be downloaded.</p><h3><strong>What it looks like</strong></h3><p>Like all other resource hints, you can include <code class="language-markup">preload</code> tags in the HEAD of your document.</p><p>You'll notice the <code class="language-markup">as="image"</code> attribute in the code below. This tells the browser what kind of resource is being fetched, and thus helps it determine priority. You can find a list of all possible values <a href="https://developer.mozilla.org/en-US/docs/Web/HTML/Element/link#attr-as"><strong>on MDN</strong></a>.</p><pre class="language-html"><code class="language-html"><link rel="preload" href="main-image.webp" as="image" type="image/webp" /></code></pre><h3><strong>When to use it</strong></h3><p><a href="https://caniuse.com/link-rel-preload"><strong>Browser support is pretty good</strong></a> across modern browsers (IE11 won't recognise <code class="language-markup">preload</code> though).</p><p><code class="language-markup">preload</code> is best used for bringing forward the download of <em>late discovered resources</em>. These are things like <strong>fonts</strong>, <strong>background images</strong>, or your <strong>app CSS bundle</strong>. All these are normally found by the browser after all the HTML, CSS and/or JavaScript has been parsed, which is often > 1 second into the page load. By using <code class="language-markup">preload</code> for these resources you can have them ready for the browser to load as soon as it discovers them.</p><h3><strong>Gotchas</strong></h3><ul><li>Just remember - <em>if everything is a priority, then nothing is a priority.</em> Overuse of <code class="language-markup">preload</code> will often lead to a degradation in site performance. It's best used judiciously.</li><li>Matt Hobbs has <a href="https://nooshu.github.io/blog/2021/01/23/the-importance-of-font-face-source-order-when-used-with-preload/"><strong>a very good write-up</strong></a> on an important gotcha to keep in mind when using <code class="language-markup">preload</code> for fonts.</li><li>The <code class="language-markup">crossorigin</code> attribute is required when preloading web fonts, even if they're hosted on your own domain. Just add <code class="language-markup">crossorigin</code> to the end of the preload link tag.</li></ul><h2><strong>Prefetch</strong></h2><p><code class="language-markup">prefetch</code> is handy at helping improve perceived performance for website visitors. 
It allows you to pre-emptively fetch and cache resources that might be required for future navigations. This is very handy when you know (or can predict) your user's journey.</p><p>Think of an online store for example. Your analytics tell you that most visitors that go to a products listing page click through to a product details page (which all use the same CSS file for styling). Using <code class="language-markup">prefetch</code> <strong>on the product listing page,</strong> you can have the browser download and cache the CSS file for the product details page. Once a user navigates there the CSS is ready to go. This can significantly speed up the rendering time for the page.</p><h3><strong>What it looks like</strong></h3><p>As with all other resource hints, you can include <code class="language-markup">prefetch</code> in the HEAD of your HTML.</p><pre class="language-html"><code class="language-html"><link rel="prefetch" href="/css/product.css" /></code></pre><h3><strong>When to use it</strong></h3><p><code class="language-markup">prefetch</code> is best used when you are almost certain of a user's intended action. Since it downloads resources with a low priority it won't block the current page. However, prefetching too many resources (especially if they aren't used) will eat up your visitors bandwidth for no real gain.</p><h3><strong>Gotchas</strong></h3><ul><li><code class="language-markup">prefetch</code> downloads & caches files. It does not execute them.</li><li>If you need to support IE9, note that it treats <code class="language-markup">prefetch</code> like <code class="language-markup">dns-prefetch</code>. Go figure 🤷🏾♂️.</li></ul></div>Optimising embedded content2024-02-20T13:25:46Zhttps://fershad.com/writing/optimising-embedded-content/<div><p>I want to cover a few ways you can optimise pages that have Twitter/YouTube content embeds. The idea for this stems from a tweet by Matt Hobbs in which he points out just how much bloat an embedded tweet can add to a webpage.</p><blockquote>LCP: 600ms slower <br />2.7MB more JS! <br />25 more requests <br />LH score dropped 50%<br /><em>- <strong>Matt Hobbs</strong>, Twitter (<a href="https://twitter.com/TheRealNooshu/status/1350578919389470721">Link</a>)</em></blockquote><p>Embedding content hosted on a third-party is a great way to keep visitors engaged, without them having to leave the site. However, as with most third-party resources we load, there are costs both in terms of extra network requests, file size, as well as performance penalties.</p><p>We're going to look at a few different methods you can apply to Twitter & YouTube embeds. These can help significantly reduce page size and remove/reduce JavaScript that the user has to download and execute, while still delivering content to the user and keeping them on your site.</p><h2>Twitter</h2><h3>Drop the script tag</h3><p>If you want to keep things very simple, just remove the <code class="language-markup"><script></code> tag that's tacked onto the end of the standard Twitter embed script. 
It looks like this <code class="language-markup"><script async src="<https://platform.twitter.com/widgets.js>" charset="utf-8"></script></code> and you'll find it at the very end of the code Twitter generates for you when you want to embed a tweet.</p><pre class="language-html"><code class="language-html"><blockquote class="twitter-tweet" data-dnt="true" data-theme="dark"><p lang="en" dir="ltr">Bernie...go home already <a href="<https://t.co/Ok1WpgjgJS>">pic.twitter.com/Ok1WpgjgJS</a></p>&mdash; The Daily Show (@TheDailyShow) <a href="<https://twitter.com/TheDailyShow/status/1352074243911999489?ref_src=twsrc%5Etfw>">January 21, 2021</a></blockquote> <script async src="<https://platform.twitter.com/widgets.js>" charset="utf-8"></script></code></pre><pre class="language-html"><code class="language-html"><blockquote class="twitter-tweet" data-dnt="true" data-theme="dark"><p lang="en" dir="ltr">Bernie...go home already <a href="<https://t.co/Ok1WpgjgJS>">pic.twitter.com/Ok1WpgjgJS</a></p>&mdash; The Daily Show (@TheDailyShow) <a href="<https://twitter.com/TheDailyShow/status/1352074243911999489?ref_src=twsrc%5Etfw>">January 21, 2021</a></blockquote></code></pre><p>The scripts above are with & without the <code class="language-markup"><script></code> tag respectively. I ran a quick test on my local machine & found that comparing the two code snippets above, the one without the <code class="language-markup"><script></code> tag came in a whopping 2.8MB smaller.</p><p>It's worth noting that this method isn't great for tweets with images (the image won't display). I'd very, very strongly recommend using it for tweets that are pure text though.</p><h3>Replace with an image</h3><p>Say you do want to embed a tweet that contains an image. In this case, you can use a screenshot of the tweet in place of the code above. You can use a service like <a href="https://tweetcyborg.com/">Tweet Cyborg</a> or a Chrome extension to do the work for you.</p><p>With the initial screenshot in hand, I'd recommend running it through image compression and creating WebP and/or AVIF versions of it as well. That way you can be sure you're serving the lightest possible file to users.</p><p>As a final step, when publishing the image on your site be sure to add ALT text to the image element to describe it's content for people using screen readers. It's a nice touch to also link to the tweet as well.</p><h2>YouTube</h2><h3>Lazy-load the iframe</h3><p>We've seen that it's possible to lazy-load images, but did you know that the same <code class="language-markup">loading=lazy</code> attribute can also be added to <code class="language-markup"><iframe></code> tags?</p><pre class="language-html"><code class="language-html"><iframe width="560" height="315" src="<https://www.youtube.com/embed/YM3KszYmn58>" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe> <iframe loading="lazy" width="560" height="315" src="<https://www.youtube.com/embed/YM3KszYmn58>" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe></code></pre><p>I <a href="https://www.fershad.com/blog/posts/lazy-loading-embedded-iframes/">wrote about this approach</a> in October last year if you want to get some more details. 
It's best suited for when you're embedding a YouTube video further down the page - saving about 1.8MB on the initial page load.</p><h3>Use an image placeholder</h3><p>This <a href="https://css-tricks.com/lazy-load-embedded-youtube-videos/">sick little trick</a> comes courtesy of Arthur Corenzan. It works by replacing the regular (bulky) YouTube embed code with a placeholder image. The YouTube video content only gets loaded if the user clicks on the play button that's programmagically (yep, that's a thing) placed on the placeholder. The code looks like this:</p><pre class="language-html"><code class="language-html"><iframe
width="560"
height="315"
src="https://www.youtube.com/embed/YM3KszYmn58"
srcdoc="<style>*{padding:0;margin:0;overflow:hidden}html,body{height:100%}img,span{position:absolute;width:100%;top:0;bottom:0;margin:auto}span{height:1.5em;text-align:center;font:48px/1.5 sans-serif;color:white;text-shadow:0 0 0.5em black}</style><a href=https://www.youtube.com/embed/YM3KszYmn58?autoplay=1><img src=https://img.youtube.com/vi/YM3KszYmn58/hqdefault.jpg alt='Improving The Page Loading Experience To Reduce Layout Shift by Jen Simmons'><span>▶</span></a>"
frameborder="0"
allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture"
allowfullscreen
title="Improving The Page Loading Experience To Reduce Layout Shift by Jen Simmons"
></iframe></code></pre><p>Using this trick I was able to get the YouTube embed size down to 18kB on initial page load. Adding <code class="language-markup">loading=lazy</code> onto this iframe would bring that initial hit down to 0kB if the asset was further down the page.</p><div class="callout"><p></p><p>Be sure to replace the the YouTube video code in the code above - it appears in 3 places. You should also replace the title & alt fields with relevant text.</p><p></p></div></div>Web icons in 20212024-02-20T13:25:46Zhttps://fershad.com/writing/web-icons-in-2021/<div><p>In modern web apps icons are heavily leaned on by designers to convey state, functions, or actions. Even if you've got a simple website you'll most likely want to include some links to social media platforms, or maybe contact information. Icons come in handy here too, making content & links stand out visually from the rest of the page. The strategies I cover in this issue are best suited to websites that need just a few icons sprinkled around the site rather than larger web apps (which tend to lean more heavily on icons).</p><p>Like many things in web development, you've got a few options if you need to add icons to your website:</p><ol><li>Use raster images (like PNG)</li><li>Use a CSS sprite (when you create a single image file containing all the icons you need, then load them by manipulating the image position in CSS)</li><li>Use an icon font (like Font Awesome)</li><li>Use SVGs</li></ol><h2><strong>Raster Images & CSS Sprites</strong></h2><p>Using raster images, either as a single file or through a CSS sprite, has one major drawback. Due to the nature of raster images, they don't scale very well. This might not be a problem if the icons on your site will be a fixed size across all displays. However, you'll reach a point where the image starts to pixelate the moment you need to adjust the size in either direction.</p><p>Another drawback of using raster images is that they result in more requests on the network, which can slow down site performance (especially if you use them at the top of your page). CSS sprites solve this somewhat because only a single file is downloaded & then reused repeatedly. Image do allow you to utilise caching to speed things up for repeat visitors.</p><p>To use raster images for icons on your site you load them like you would any other image, using the <code class="language-markup"><img></code> HTML tag. If you'd like to use a CSS sprite instead there's <a href="https://css-tricks.com/css-sprites/">a good beginners guide</a> on CSS Tricks.</p><h2><strong>Icon Fonts</strong></h2><p>If you've been in web development for any length of time you've probably used an icon font at some stage. They offer more flexibility when compared to raster images, especially since they can be controlled by the same CSS properties that govern regular fonts.</p><p>The CSS files that power most icon fonts are hosted on CDNs. You link to these fairly early on in your HTML document (especially if you're using icons above the fold). Once upon a time this did mean that more popular icon font libraries benefited from caching, but <a href="https://www.stefanjudis.com/notes/say-goodbye-to-resource-caching-across-sites-and-domains/">that is no more</a> a thing. 
Being hosted externally means the requests for these font files go cross-origin (to another domain that's not our own), and open us up to a host of issues that third-party resources can cause (<a href="https://www.fershad.com/optimised/issue/2/third-party-resources-a-cautionary-tale/">see Issue 2</a>).</p><p>What's more, some browsers block custom fonts or just don't handle them well. This is hard to pick up in testing since you can't possibly test on every iteration of every browser. So, if the browser blocks the resource from loading, or there's a network failure somewhere along the line then your icons will be replaced by those empty little boxes you've no doubt seen from time to time.</p><h2><strong>#SVGcanhelp</strong></h2><p>SVG offers a lot of flexibility that raster images & icon fonts simply can't match. Firstly, and as the first word of the name suggests, they're scalable. That means that they'll look as crisp on smaller mobile screens as they do on a large 4K monitor. They're also easier to manipulate, animate, and are far more accessible to screen readers & assistive technologies (especially when compared to icon fonts). You (or your graphics designer) can also easily create customised (or edit existing) SVG icons using tools like Adobe Illustrator or Figma.</p><p>Another benefit of SVG is that you can inline them directly into the HTML of your page (if you have access to that). This results in fewer network requests, which can help slightly speed up page loading performance. Of course that does add some extra weight to your HTML file. Using compression tools like <a href="https://github.com/svg/svgo">SVGO</a> (command-line tool) or <a href="https://jakearchibald.github.io/svgomg/">SVGOMG</a> (visual interface) allows you to minify your SVGs just like you would any other image.</p><p>There are also a lot of sources online that you can use to find and download/copy the SVG icon code for free. I've listed a few good ones below that I often reach for:</p><ul><li><a href="https://css.gg/app">CSS.gg</a> </li><li><a href="https://teenyicons.com/">Teeny Icons</a> </li><li><a href="https://tablericons.com/">Tabler Icons</a> </li><li><a href="https://faicons.dev/">FA Icons</a> </li><li><a href="https://basicons.xyz/">Basicons</a> </li></ul><p>You can even add SVGs to a sprite, and easily call them from within your code. Florens Verschelde has a <a href="https://fvsch.com/svg-icons">really detailed rundown</a> of how you can create & use SVGs as icons on your website which I strongly recommend you check out.</p></div>Core Web Vitals and Google Search2024-02-20T13:25:46Zhttps://fershad.com/writing/core-web-vitals-google-search/<div><p>A couple of weeks ago Google put out a <a href="https://developers.google.com/search/blog/2020/11/timing-for-page-experience">short blog post</a> to announce something rather significant. Core Web Vitals metrics will be added to the page experience signals mix as part of the May 2021 search update.</p><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">Update</p><p></p><div><p>Google has delayed the implementation of Core Web Vitals in its search rankings.
The new timeframe for its inclusion is mid-June to August.</p><p>Read Google's <a href="https://developers.google.com/search/blog/2021/04/more-details-page-experience">update announcement here</a>.</p></div><p></p></div><p>Page experience signals include things like whether a page is mobile-friendly, served over HTTPS, and isn't spamming the user with intrusive popups.</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/c8abad6da1db91f7596092cae6c203133e752998-960x540.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/c8abad6da1db91f7596092cae6c203133e752998-960x540.png?auto=format" alt="A diagram illustrating the components of Search’s signal for page experience." loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">Image from Google showing page experience signals for search.</figcaption></figure><p>So that means, if you've got an SEO plan in place then you really should start looking at the overall experience of your website. Catering for Core Web Vitals should be as important as identifying the right keywords to add into your content.</p><h2>So, what are Core Web Vitals?</h2><p>Put simply, they're a set of metrics that aim to quantify real-world user experiences across the web. They measure page interactivity, content loading, and content stability during page load. The three metrics that form Core Web Vitals are:</p><ul><li><strong>Largest Contentful Paint (LCP)</strong>: A timing of how long it takes for the largest above-the-fold element to be painted on screen. This is usually a hero image/video or large text block.</li><li><strong>First Input Delay (FID):</strong> Measures the time it takes before the browser can react to a user input (like a click or tap).</li><li><strong>Cumulative Layout Shift (CLS):</strong> Indicates the movement of visible elements as the user loads and interacts with the page. You know when you start reading an article, then an ad loads above it & all the content gets pushed down? CLS measures that.</li></ul><p>To look at it a different way, these three metrics allow Google to better understand a page's: </p><ul><li>Perceived load time (LCP)</li><li>Responsiveness (FID)</li><li>Page experience (CLS)</li></ul><h2>Beyond search rankings</h2><p>Google's announcement goes beyond just possibly impacting the search ranking of pages. They've also been testing visual indicators showing page experience <a href="https://blog.chromium.org/2020/08/highlighting-great-user-experiences-on.html">within the Chrome browser</a>. Now they plan to start <a href="https://developers.google.com/search/blog/2020/11/timing-for-page-experience#a-new-way-of-highlighting-great-experiences-in-google-search">testing the same indicators</a> directly on search results pages. If they deem the tests to be successful, then expect to see these visual indicators rolling out in May 2021 as well.</p><p>That's a pretty big deal. It allows users to actively choose to visit a page with a better overall experience.</p><h2>Checking your site for Core Web Vitals</h2><p>I'd recommend using <a href="https://developers.google.com/speed/pagespeed/insights/">PageSpeed Insights</a> or <a href="https://webpagetest.org/">WebPageTest</a> to check web pages on your site for the Core Web Vitals listed above. If your site as a whole receives a high number of visitors per month then you can also find real user Core Web Vitals measurements through Google's Search Console dashboard.</p><p>That will give you a baseline to start working from. While you're at it, you might also want to have a peek at your competitors' websites to see how they're performing.</p><h2>Fixing Core Web Vital Issues</h2><p>Each site is different, but there are a few common issues that pop up time and time again for Core Web Vitals. Simon Hearne covers them in great detail <a href="https://simonhearne.com/2020/core-web-vitals/">over on his blog</a>.
I've included links below that will take you directly to the relevant section for each of the Core Web Vitals.</p><ul><li><a href="https://simonhearne.com/2020/core-web-vitals/#largest-contentful-paint-lcp">Largest Contentful Paint</a></li><li><a href="https://simonhearne.com/2020/core-web-vitals/#first-input-delay-fid">First Input Delay</a></li><li><a href="https://simonhearne.com/2020/core-web-vitals/#cumulative-layout-shift-cls">Cumulative Layout Shift</a></li></ul><p>These things can take some time to fix depending on the size and complexity of your site. Of course, you also want to ensure that any changes you make don't hurt other parts of your site's performance, or impact user experience.</p><p>Google's given us all a six-month heads-up. There's time, but especially with the holiday period just around the corner, May 2021 will be upon us before we know it.</p></div>Lazy-loading embedded iframes2024-02-20T13:25:46Zhttps://fershad.com/writing/lazy-loading-embedded-iframes/<div><p>Lazy-loading has been a fairly common practice in web development for many years now. At its core it aims to deliver faster page loads by deferring the loading of elements that are outside the initial viewport area. As the user scrolls down the page these elements (normally images) are loaded "just in time".</p><p>Often lazy-loading is applied to images. But did you know that you can also apply it to <code class="language-markup">iframe</code> elements? With <a href="https://caniuse.com/loading-lazy-attr">support in Chromium browsers</a> now it's as easy as adding the <code class="language-markup">loading=lazy</code> attribute to an iframe tag. And, doing so could give your web pages a nice boost in some Web Vital metrics as well.</p><blockquote>Based off Chrome's research into automatically lazy-loading offscreen iframes for Data Saver users, lazy-loading iframes could lead to 2-3% median data savings, 1-2% First Contentful Paint reductions at the median, and 2% First Input Delay (FID) improvements at the 95th percentile. <br /><strong>~ Addy Osmani (<a href="https://web.dev/iframe-lazy-loading">link</a>)</strong></blockquote><p>I created a quick and dirty test just to see what kind of savings can be gained by lazy-loading a single YouTube embed on a web page. Here's what I found:</p><p><strong>Without lazy-loading (<a href="https://unsuitable-cushion.surge.sh/">site</a>)</strong></p><ul><li>Requests: 18</li><li>Page size (compressed): 1.8MB</li></ul><p><strong>With lazy-loading (<a href="https://unsuitable-cushion.surge.sh/index1.html">site</a>)</strong></p><ul><li>Requests: 2</li><li>Initial load size: 2kB</li></ul><p>Granted, the pages I made to test with were extremely barebones, but the difference a single attribute can make is still stark.</p>
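<p>To make that concrete, here's a minimal sketch of the markup - the video ID, dimensions, and title are placeholders, so swap in the details of your own embed:</p><pre class="language-html"><code class="language-html"><!-- The browser only fetches this embed once it gets close to the viewport -->
<iframe src="https://www.youtube.com/embed/VIDEO_ID"
        width="560" height="315"
        title="Replace with a description of the video"
        loading="lazy"
        allowfullscreen></iframe></code></pre>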
<p>Addy Osmani has <a href="https://web.dev/iframe-lazy-loading/#what-impact-might-we-see-from-lazy-loading-popular-iframe-embeds">more detailed examples</a> and guides to using lazy-loading for iframes.</p><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">On the other hand ...</p><p></p><p>On the flip-side, if you've got an embedded iframe at the top of your page and want to make sure it's loaded in a timely manner for page visitors, then use the <code class="language-markup">loading=eager</code> attribute instead.</p><p></p></div></div>Which CMS, e-commerce platforms, and CDNs deliver the best core web vitals?2024-02-20T13:25:46Zhttps://fershad.com/writing/which-cms-e-commerce-platforms-cdns-best-core-web-vitals/<div><p>The SISTRIX team have compiled a really detailed look at 18.5 million domains and how they stack up against Google Core Web Vitals metrics. You can <a href="https://www.sistrix.com/blog/core-web-vitals-wix-vs-wordpress-shopify-vs-shopware-whats-fastest">read their full study here</a>.</p><h2>Summary</h2><p>The report provides some really interesting categorised analysis too - especially looking at various CMS offerings & e-commerce applications. Some results are surprising but on the whole the message is that there's no secret combination of technologies that guarantees great web performance results.</p><ul><li>WordPress is the 2nd poorest performing CMS, while WooCommerce (a WordPress plugin) is the worst performing e-commerce platform. That's not surprising considering WordPress fundamentally allows non-technical creatives to put together complex websites.</li><li>Popular cloud CMS platforms like Wix and Squarespace lag behind other cloud CMS providers in performance.</li><li>AMP (Accelerated Mobile Pages) isn't the most performant technology, despite being heavily pushed by Google.</li><li>Fastly is by far the best performing of the CDNs that were called out in this study.</li><li>Despite Google announcing that they'll be focusing on mobile-first indexing, performance on mobile devices still lags behind desktop.</li></ul></div>The why of website optimisation: Better user experience2024-02-20T13:25:46Zhttps://fershad.com/writing/website-optimisation-better-user-experience/<div><p>This is the fifth post in a series where I'll outline the benefits that can be derived through website optimisation. Not only will we touch on monetary aspects, but we'll also get into environmental impact, and customer experience.</p><div class="callout"><p></p><div><p>You can read the other posts in this series using the links below (they'll be at the end of this post too):</p><ol><li><a href="https://www.fershad.com/blog/posts/website-optimisation-increase-conversions-engagement/">Increase conversions (or engagement)</a> </li><li><a href="https://www.fershad.com/blog/posts/website-optimisation-reduce-operational-costs">Reduce operational costs</a> </li><li><a href="https://www.fershad.com/blog/posts/website-optimisation-reduce-environmental-impact">Reduce environmental impact</a> </li><li><a href="https://www.fershad.com/blog/posts/website-optimisation-improve-search-ranking">Improve your search ranking</a> </li><li><strong>Deliver a better user experience - This post</strong> </li></ol></div><p></p></div><p>Now, let's get onto how optimising your site can help you leave a lasting first impression on visitors, and boost the chances of them coming back.</p><h2>We're all busy (or at least it seems)</h2><p>People value their time.
Sometimes it feels like there's an ever growing list of tasks that need to be accomplished, and not enough hours in the day to get them done. For some, hustle is life. Increasingly people are surfing the web on their phones, while commuting, or trying to perform multiple tasks at once (just don't use your phone and drive, please). They want information quickly, they want to be able to browse, decide, and purchase in a minute. So yeah, speed is important.</p><p>Indeed, Google’s own research has shown that the longer website visitors must wait for content to load and be interactive, the more likely they are to “bounce” or abandon the website.</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/fb414ca8a9833f1165c57874210dcb18bf6e6a86-1000x698.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/fb414ca8a9833f1165c57874210dcb18bf6e6a86-1000x698.png?auto=format" alt="Image showing incremental increase in bounce rate based on load time." loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">Image source: Think with Google</figcaption></figure><h2>Online user experience = customer experience</h2><p>Having a website that loads fast and lets visitors interact with it quickly allows you to deliver a better customer experience. The impression this leaves on your customers can be the difference between them making a purchase or not. It can also result in them being more likely to recommend your site to their friends.</p><p>This is backed by a survey of online shoppers by Kissmetrics. Their study found:</p><ul><li>52% of shoppers said web pages loading quickly were important to their site loyalty.</li><li>There was a 16% decrease in customer satisfaction for each 1 second of delay in page load.</li><li>44% of shoppers would tell their friends about poor online experiences.</li></ul><p>The <a href="https://blog.kissmetrics.com/wp-content/uploads/2011/04/loading-time.pdf">report is from 2011</a> but hey, wasn't it a simpler, dare I say less demanding, time way back then?</p><p>Some other ways to think about how web performance equates to customer service are through the hypotheticals below:</p><ul><li>Someone has come to your website because they want to find information or want to purchase something. In a physical store you would want to make it as easy as possible for your customer to make that purchase/get that information, right? The same should apply online.</li><li>With more and more people using their mobile phone to surf the web you cannot rely on your visitors having unlimited bandwidth or running 4G or 5G connections. A lot of people will be on capped data plans, 3G or slower connections, and are likely not using the latest, fastest Apple or Android phone. Serving them a web page that's bloated, processor intensive, or not even optimised for mobile screens leads to a poor customer experience. Heck, the visitor to your site might not come back, and might tell others to avoid your site too.</li></ul><h2>How can you improve?</h2><p>If you've read all the previous posts, you've probably got a pretty good idea of where to start. By just <a href="https://www.fershad.com/blog/posts/web-performance-quick-guide/">doing the basics well</a>, you'll put yourself ahead of the majority of websites out there. Heck, you could even have a nosey around your competitors' websites to see how they stack up. Run your site and theirs through <a href="https://web.dev/measure/">Google Lighthouse</a>. It might present a surprising opportunity for you to gain a competitive edge.</p></div>The why of website optimisation: Help the planet2024-02-20T13:25:46Zhttps://fershad.com/writing/website-optimisation-reduce-environmental-impact/<div><p>This is the third post in a series where I'll outline the benefits that can be derived through website optimisation.
Not only will we touch on monetary aspects, but we'll also get into environmental impact, and customer experience.</p><div class="callout"><p></p><div><p>You can read the other posts in this series using the links below (they'll be at the end of this post too):</p><ol><li><a href="https://www.fershad.com/blog/posts/website-optimisation-increase-conversions-engagement/">Increase conversions (or engagement)</a> </li><li><a href="https://www.fershad.com/blog/posts/website-optimisation-reduce-operational-costs">Reduce operational costs</a> </li><li><strong>Reduce environmental impact - This post</strong></li><li><a href="https://www.fershad.com/blog/posts/website-optimisation-improve-search-ranking">Improve your search ranking</a> </li><li><a href="https://www.fershad.com/blog/posts/website-optimisation-better-user-experience">Deliver a better user experience</a> </li></ol></div><p></p></div><p>Now, let's get onto how optimising your site can help you reduce its carbon footprint, and go a little way to helping the environment.</p><h2>Data does create CO2</h2><p>Data creation, storage, transfer, and consumption all use electricity. Although it’s not as tangible as fumes from a car, each site on the internet contributes to global CO2 emissions.</p><p>Looking at an average web page size of 4MB, it is estimated that a page of this size creates approximately 1.6 grams of CO2 each time it is visited by a user. Over a year, if this one page is visited 600,000 times it will produce 10kg of carbon emissions. (Source: <a href="https://gerrymcgovern.com/books/world-wide-waste/webwaste/">World Wide Waste, Gerry McGovern</a>). These figures are for a single web page. Most websites consist of multiple pages, and many are larger than the 4MB average. So, the overall yearly carbon footprint of a website might be in the 10s, if not 100s, of kilograms.</p><h2>What should we aim for?</h2><p>Google recommends web pages be around 500kB in total size. At the time of writing, HTTP Archive were measuring the <a href="https://httparchive.org/reports/state-of-images#bytesImg">median size of images on web pages</a> at around 1000kB (1004kB for desktop, 920kB for mobile). So, optimise your images and <a href="https://www.fershad.com/blog/posts/reduce-page-weight-with-picture-tag">use the latest formats</a> (with fallbacks) whenever you can. Please.</p><p>Getting a website down to one-eighth of its previous size is a heck of an ask. But even just getting page size down to 1MB would allow us to reduce up to 75% of the CO2 emissions that are created when the page is accessed. Sure, there's still a carbon footprint from hosting and <a href="https://gerrymcgovern.com/books/world-wide-waste/cloudwaste/">running services in the cloud</a>, but a relatively easy to achieve 75% cut in CO2 emissions is something we should all strive for.</p><h2>Clean code can help too</h2><p>I don't have much to say here; I'll just leave this quote from Gerry McGovern (again).</p><blockquote>By cleaning up its JavaScript code, Wikipedia estimated that they saved 4.3 terabytes a day of data bandwidth for their visitors.
By saving those terabytes, we saved having to plant almost 700 trees to deal with the yearly pollution that would have been caused.<strong><br />~ Gerry McGovern, <em>World Wide Waste,</em> 2020</strong></blockquote><p>Here's a link to an <a href="https://phabricator.wikimedia.org/phame/live/7/post/175/wikipedia_s_javascript_initialisation_on_a_budget/">article with more details</a>.</p><h2>How can I check my website's carbon footprint?</h2><p>Wholegrain Digital have built the incredibly handy <a href="https://www.websitecarbon.com/">Website Carbon Calculator</a> for just this purpose. Put in a URL, and it will analyse both the web page itself and hosting provider to determine how much CO2 is produced per visit. It also presents some handy examples to put things in scale.</p><p>If you'd like to switch to a green web host, then <a href="https://www.thegreenwebfoundation.org/directory/">The Green Web Foundation</a> has a list of globally available green hosting companies that you can check out.</p></div>The why of website optimisation: Better search ranking2024-02-20T13:25:46Zhttps://fershad.com/writing/website-optimisation-improve-search-ranking/<div><p>This is the fourth post in a series where I'll outline the benefits that can be derived through website optimisation. Not only will we touch on monetary aspects, but we'll also get into environmental impact, and customer experience.</p><div class="callout"><p></p><div><p>You can read the other posts in this series using the links below (they'll be at the end of this post too):</p><ol><li><a href="https://www.fershad.com/blog/posts/website-optimisation-increase-conversions-engagement/">Increase conversions (or engagement)</a> </li><li><a href="https://www.fershad.com/blog/posts/website-optimisation-reduce-operational-costs">Reduce operational costs</a> </li><li><a href="https://www.fershad.com/blog/posts/website-optimisation-reduce-environmental-impact">Reduce environmental impact</a> </li><li><strong>Improve your search ranking - This post </strong></li><li><a href="https://www.fershad.com/blog/posts/website-optimisation-better-user-experience">Deliver a better user experience</a> </li></ol></div><p></p></div><p>Now, let's get onto how optimising your site can help to boost the ranking of your website in search results.</p><p>Before I begin, I'd like to point out that this post will focus entirely on Google search results. While there are other options out there, especially for the <a href="https://duckduckgo.com/">more privacy conscious</a>, Google rule the roost here and it's what most of your website visitors will be using to find your page organically.</p><h2>Google's shifting to mobile-first, with a focus on speed</h2><p>Google has been putting more and <a href="https://webmasters.googleblog.com/2018/01/using-page-speed-in-mobile-search.html">more emphasis on website speed</a>, and usability in the last few years. On top of moving towards <a href="https://webmasters.googleblog.com/2020/07/prepare-for-mobile-first-indexing-with.html">mobile-first indexing</a>, the search engine giant has also been pushing the narrative that faster loading, more accessible and usable web pages allow it to crawl sites more efficiently and can lead to <a href="https://webmasters.googleblog.com/2010/04/using-site-speed-in-web-search-ranking.html">higher Search Engine Result Page (SERP) rankings</a>.</p><p>With more users performing searches on mobile devices, Google is now giving mobile experience more weighting when compiling search results. 
This mobile weighting even carries through to desktop searches. Therefore, delivering a fast website experience on mobile can help your site rank better across all Google searches.</p><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">Update</p><p></p><div><p>From mid-June Google will be including Core Web Vitals metrics in the page experience signals mix. This means that poor page speed and experience may start to impact your SEO. I've <a href="https://fershad.com/writing/core-web-vitals-google-search/">covered Core Web Vitals in another post</a>.</p><p>Read <a href="https://developers.google.com/search/blog/2021/04/more-details-page-experience">Google's announcement here</a>.</p></div><p></p></div><h2>What can you do about this?</h2><p>Well, the first thing to do is to make sure your page is optimised for the best possible mobile experience. You can test pages through Google Search Console, or by entering a URL in <a href="https://search.google.com/test/mobile-friendly">this tool</a>. Hopefully you are presented with a nice big message in green saying "<strong>Page is mobile friendly</strong>". If not, you might be looking at having to redesign your site so that it can be served responsively.</p><p>Next, use <a href="https://developers.google.com/speed/pagespeed/insights/">Google's Page Speed Insights test</a> to see how your page performs on mobile (which is now the default). If there are any issues, you'll see some suggestions on what you can address to improve performance.</p></div>The why of website optimisation: Reduce operational costs2024-02-20T13:25:46Zhttps://fershad.com/writing/website-optimisation-reduce-operational-costs/<div><p>This is the second post in a series where I'll outline the benefits that can be derived through website optimisation. Not only will we touch on monetary aspects, but we'll also get into environmental impact, and customer experience.</p><div class="callout"><p></p><div><p>You can read the other posts in this series using the links below (they'll be at the end of this post too):</p><ol><li><a href="https://www.fershad.com/blog/posts/website-optimisation-increase-conversions-engagement/">Increase conversions (or engagement) </a> </li><li><strong>Reduce operational costs - This post.</strong></li><li><a href="https://www.fershad.com/blog/posts/website-optimisation-reduce-environmental-impact">Reduce environmental impact</a> </li><li><a href="https://www.fershad.com/blog/posts/website-optimisation-improve-search-ranking">Improve your search ranking</a> </li><li><a href="https://www.fershad.com/blog/posts/website-optimisation-better-user-experience">Deliver a better user experience</a> </li></ol></div><p></p></div><p>Now, let's get onto how optimising your site can help you reduce the operational costs associated with having a website.</p><h2>Use less, pay less.</h2><p>There are <em>a lot</em> of options for web hosting these days. One of the most popular is to use a <em>pay-as-you-go</em> service like Amazon Web Services. These kinds of hosting services allow you to spin up a website, get it hosted and online, and only pay for the storage and bandwidth that you use.</p><p>Let me say that again - you pay for the storage and bandwidth that you use.</p><p>So, it goes without saying (but let me say it anyway) that the lighter your website, the lower your hosting costs will be.</p><h2>Don't get tier-bumped</h2><p>Other web hosting services have set tiers (plans) that you buy into when you want to get your site online.
In all cases there are limits on each tier, whether that is how many sites you can have, how many email addresses you get, or even limits on website visits per month. In almost all cases there are also limits on storage and bandwidth (though this is less common these days). Exceeding these limits could see your website slowed down, temporarily taken offline, or might even see you get automatically billed for excess usage.</p><p>Sure, the storage limits you're looking at are often in the 10s of gigabytes, but as your site gets larger you'll fill that up pretty quickly if you're not actively managing your media. Bandwidth limits can be hit much faster, especially if your site gets a sudden surge in traffic.</p><h2>Can you use the Jamstack?</h2><p>A lot of the web is still generated dynamically (on request). These are WordPress-style websites that are powered by a CMS database. This database, and the server that it lives on, must be online at all times so that it can be ready to generate and serve web pages whenever someone visits the website.</p><p>A lot of the time, though, the web pages being generated by these CMS databases are seldom updated once they're published. They're written, put into the CMS (saved to the database), and then they just sit there waiting for someone to land on the web page.</p><p>Alternatively, many of these websites could be built and served using static assets. Any content on the pages that needs to be dynamic can be fetched in real-time using APIs. This approach is <a href="https://jamstack.wtf/">known as Jamstack</a>. Jamstack sites can still be powered by CMS platforms, deliver live content updates, accept payments, and even allow for user accounts.</p><p>You'll notice that one word was missing from my description of a site built using the Jamstack methodology - database. Jamstack sites remove the costs of requiring an always-online database to power them. There is an argument that, depending on which CMS or APIs your website uses, that cost might just be shifted rather than removed - definitely something to keep in mind when deciding whether or not the Jamstack is for you.</p><div class="callout"><p></p><p>Would you like to learn more about the Jamstack, and how it could work for your website? I can help you get a better understanding of what's possible, and whether or not it's for you. <a href="https://www.fershad.com/contact">Contact me directly</a>, or <a href="https://calendly.com/fershad-digital/meet">book an obligation-free consultation</a> on Calendly to start the conversation.</p><p></p></div><h2>What can you optimise to keep size down?</h2><p>To reduce the total size of your website files, look at the assets that are commonly the heaviest on any website. I'd recommend going in this order:</p><ol><li><strong>Videos</strong> - Do you need video on your site? Are you serving modern formats like WebM?</li><li><strong>Images</strong> - Are you <a href="https://www.fershad.com/blog/posts/reduce-page-weight-with-picture-tag/">using the latest image formats</a> (WebP, AV1)? If you are using JPEG or PNG, are they compressed and optimised?</li><li><strong>JavaScript & CSS</strong> - Are you loading any unnecessary files? Are all your files minified?
Can you remove unused code from any files?</li></ol><p>I've written <a href="https://www.fershad.com/blog/posts/web-performance-quick-guide">a few more tips</a> on how you can reduce the size of your website, both for storage and bandwidth consumption.</p></div>The why of website optimisation: Increase site conversions2024-02-20T13:25:46Zhttps://fershad.com/writing/website-optimisation-increase-conversions-engagement/<div><p>This is the first in a series of five posts where I'll outline the benefits that can be derived through website optimisation. Not only will we touch on monetary aspects, but we'll also get into environmental impact, and customer experience.</p><div class="callout"><p></p><div><p>You can read the other posts in this series using the links below (they'll be at the end of this post too):</p><ol><li><strong>Increase conversions (or engagement) - This post.</strong></li><li><a href="https://www.fershad.com/blog/posts/website-optimisation-reduce-operational-costs">Reduce operational costs</a> </li><li><a href="https://www.fershad.com/blog/posts/website-optimisation-reduce-environmental-impact">Reduce environmental impact</a> </li><li><a href="https://www.fershad.com/blog/posts/website-optimisation-improve-search-ranking">Improve your search ranking</a> </li><li><a href="https://www.fershad.com/blog/posts/website-optimisation-better-user-experience">Deliver a better user experience</a> </li></ol></div><p></p></div><p>Now, let's get onto how optimising your website can help you improve conversions and/or engagement online.</p><h2>The results speak for themselves</h2><p>There's a wealth of evidence showing how even small performance improvements on individual, critical web pages can have a net positive impact on online sales & revenue.</p><p>One of the most famous case studies comes from Walmart.</p><blockquote>Walmart saw up to a 2% increase in conversions for every 1 second of improvement in load time. Every 100ms improvement also resulted in up to a 1% increase in revenue.<strong><br />~ Walmart, 2013</strong></blockquote><p>Over the years there have been many more examples of companies boosting online revenue and engagement by delivering more performant websites. You can find a large collection of studies at <a href="https://wpostats.com/">https://wpostats.com/</a>. Below are some more examples:</p><blockquote>Zalando saw a 0.7% increase in revenue when they shaved 100ms off their load time.<strong><br />~ Zalando, 2018</strong></blockquote><blockquote>COOK increased conversion rate by 7% after cutting average page load time by 0.85 seconds. Bounce rate also fell by 7% and pages per session increased by 10%.<strong><br />~ COOK, 2017</strong></blockquote><h2>What pages should you optimise?</h2><p>Ideally, you'd look to make optimisation wins across all pages on your website. However, at the bare minimum you should aim to optimise at least the pages that form your website visitor's <em>critical path</em>.</p><h3>What's a critical path?</h3><p>I use the term critical path to define the minimum series of pages that your website visitor would land on while completing a transaction, or engagement, on your website.
This is easier to determine for an online store than it is for a content website.</p><p>An online store's critical path may look something like:</p><ol><li>Homepage</li><li>Products listing pages</li><li>Product details pages</li><li>Shopping cart page</li><li>Checkout page</li></ol><p>You can determine the critical path for your website by thinking about which specific action you most want website visitors to perform. In the case of an online store, that's obviously making a purchase. A freelance writer might want to focus on getting visitors to contact them.</p><h2>Where to start optimising?</h2><p>As you can see from the examples above, focusing on optimising load time leads to better conversion and engagement results. To get started you can look at whether you effectively use caching across your site. A proper caching strategy can help improve load times, especially if a user is going to be going through multiple pages of your site to complete a purchase. I have also covered other areas to look at in an earlier blog post <em><a href="https://www.fershad.com/blog/posts/web-performance-quick-guide/">A quick guide to easy web performance wins</a>.</em></p><p>If you have a technical team on hand, then drop your web pages into Google's <a href="https://web.dev/measure/">online Lighthouse testing tool</a>. It will give you a digestible summary, as well as detailed suggestions on where you can improve the web page tested.</p></div>Introducing Optimised2024-02-20T13:25:46Zhttps://fershad.com/writing/introducing-optimised-newsletter/<div><p>On October 16th I'll be kicking off <a href="https://optimised.email/"><strong>Optimised</strong></a>, a fortnightly email newsletter focused on website performance. I'll be diving into what it is, why it matters, and ways you can address performance issues on your own websites. There'll be a mix of content for both business-minded and technical readers alike.<br /><br />Optimised won't just be a one-way street though. I hope it can be a chance for us to start a conversation. I'd love to hear, and share, your own website performance questions, and stories.</p><p>Head over to the <a href="https://optimised.email/">Optimised landing page</a> to find out more, and subscribe!</p></div>Reduce the Weight of Your Web Pages with the Picture Tag2024-02-20T13:25:46Zhttps://fershad.com/writing/reduce-page-weight-with-picture-tag/<div><p>Images often combine to make up the largest group of resources on a web page (in terms of file size). Therefore, optimising images is one of the easiest steps any website owner can take when looking at improving the performance of their website.</p><p>Compressing JPEG and PNG images is a great first step to take and can result in some easy web performance wins. However, newer image formats such as WebP and AV1 (.avif) can help massively reduce the size of image files without sacrificing visual quality.</p><p>Browser support for new formats, however, can take time. Most times developers will reach for a polyfill to aid users on unsupported browsers. Other times they'll just make the decision not to bother with the new, better format entirely.</p><p>With images though, we're able to use the newest formats while also supporting users who may be on browsers that can't serve them. We can do this thanks to the HTML Picture Element.</p><p>To do this you simply need to present your images in a block like the below:</p><pre class="language-html"><code class="language-html"> <picture>
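  <!-- The browser checks each source in order and uses the first format it supports; the img element at the end is the fallback -->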
<source srcset="img/example.avif" type="image/avif">
<source srcset="img/example.webp" type="image/webp">
<img src="img/example.jpg" alt="Don't forget ALT text">
</picture></code></pre><p>The way this works is:</p><ol><li>The browser will examine the source elements inside the picture tag sequentially.</li><li>If it is not able to support the first source file, it will go to the second (you can have as many sources as you want)</li><li>If it can't support the second source file either, then it will fall back to the <code class="language-markup">img</code> element.</li></ol><p>By using a compressed JPEG image as fallback, and the picture element as shown above, you should be able to achieve a significant reduction in the total size of your web page.</p></div>Generate CSV Files from Data with 11ty2024-02-20T13:25:46Zhttps://fershad.com/writing/generate-csv-files-with-11ty/<div><p>11ty (<a href="http://11ty.dev/">Eleventy</a>) is one heck of a powerful, incredibly flexible static site generator. One of the great things about using it is that after you run the initial NPM install step, Eleventy pretty much gets out of the way and lets you get about building your website using tools, languages, and processes that work for you. More and more, I find it's the first solution that springs to mind when a new website needs to be spun up.</p><p>In a recent project I found myself having to do something a bit more than building a website. I needed to take a large amount of data and output it in a format that a client could easily filter and manipulate. Without hesitation my mind turned to Eleventy for the project.</p><p><em>Huh? But isn't Eleventy just for making websites?</em></p><p>Well, yes and no. While its main use is building out websites, Eleventy can let you do so much more. Most people will first pick up Eleventy for building a personal website or blog. However, the more you use it, the more you begin to understand that Eleventy is capable of so much more.</p><p>This is in part thanks to its powerful data cascade that lets you feed data to your page templates. Eleventy can consume data from local JSON or JavaScript files, as well as from external APIs. Eleventy also allows you to define your template's output path and format. This is incredibly powerful, allowing us to write templates in HTML, Liquid, Nunjucks etc., and use these templates to generate other file formats, like CSV in this case.</p><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">TL;DR</p><p></p><p>You can download the <a href="https://github.com/fishintaiwan/11ty-csv-demo">final code for this project</a> on Github.</p><p></p></div><h2>Let's Code</h2><p>Let's go through a short little tutorial that will show you how you can generate CSV data files using Eleventy.</p><p>Create a new folder for your project and install Eleventy locally within that folder using NPM.</p><pre class="language-text"><code class="language-text">npm init -y
npm install --save-dev @11ty/eleventy</code></pre><p>Let's now create a place for our data. Within your project folder create a <code class="language-markup">_data</code> folder. In there, let's create our data file. In this project I'll use some <a href="https://gist.github.com/nanotaboada/6396437">dummy books data</a>, so I'll create a file called <code class="language-markup">books.json</code>.</p><pre class="language-json"><code class="language-json">/* _data/books.json */
{
"books": [
{
"isbn": "9781593275846",
"title": "Eloquent JavaScript, Second Edition",
"subtitle": "A Modern Introduction to Programming",
"author": "Marijn Haverbeke",
"published": "2014-12-14T00:00:00.000Z",
"publisher": "No Starch Press",
"pages": 472,
"description": "JavaScript lies at the heart of almost every modern web application, from social apps to the newest browser-based games. Though simple for beginners to pick up and play with, JavaScript is a flexible, complex language that you can use to build full-scale applications.",
"website": "http://eloquentjavascript.net/"
},
...
}</code></pre><p>With our data in place we can now create our template. I'll be using Liquid for this template, but you can use Nunjucks, or even HTML if you feel comfortable. In the root directory of our project we'll create <code class="language-markup">books.liquid</code> which will be our template. Let's start with the most important part of the template - the permalink. Let's add the below frontmatter to our template.</p><pre class="language-html"><code class="language-html"><!-- books.liquid -->
---
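# The permalink controls where Eleventy writes this template's output - a .csv extension here means we get a CSV file instead of an HTML page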
permalink: 'books.csv'
---</code></pre><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">💡 Important</p><p></p><p>The permalink is what tells Eleventy where to write this template's output. Because we've given it a <code class="language-markup">.csv</code> extension, the generated file will be <code class="language-markup">books.csv</code> rather than an HTML page.</p><p></p></div><p>Finally, let's get some data into the template. We'll output the ISBN, title, author, and website of the books in our CSV file. To achieve this, add the code below to your template (after the frontmatter).</p><pre class="language-html"><code class="language-html"><!-- books.liquid -->
...
ISBN,Title,Author,Website
{%- for book in books -%}
{{ "" }}
{{ book.isbn }},"{{ book.title }}","{{ book.author }}",{{ book.website }}
{%- endfor -%}</code></pre><p>There are a few important things to note about the code above:</p><ol><li>Since a newline will create a new row in our CSV file we've used <a href="https://shopify.github.io/liquid/basics/whitespace/">Liquid's whitespace control tag syntax</a> to prevent this from happening twice with each loop (this would result in us having empty lines between each row of data).</li><li>We've added a line of blank content <code class="language-markup">{{ "" }}</code> to each loop so that we split the content onto a new row.</li><li>Some items have been wrapped in quotations (<code class="language-markup">""</code>) which helps to escape any commas (<code class="language-markup">,</code>) that might be in our data.</li></ol><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">💡 Important</p><p></p><p>We also don't put any spaces between the data we're generating. This is important for ensuring we can escape commas in our content.</p><p></p></div><p>To generate the CSV file simply run the <code class="language-markup">npx eleventy</code> command. You'll find the <code class="language-markup">books.csv</code> file has now been created in the <code class="language-markup">_site</code> folder of your project.</p><p>You can download the <a href="https://github.com/fishintaiwan/11ty-csv-demo">final code for this project</a> on Github.</p></div>Boost branding and engagement with Open Graph meta tags2024-02-20T13:25:46Zhttps://fershad.com/writing/branding-engagement-og-tags/<div><p>There may only be a handful of people out there who read every single post that's on their Facebook, Twitter or LinkedIn feed. Most of us just scroll through, giving the content a casual glance as we go by. But as content creators, we ideally want people to notice and click through to our content when it is shared on social media platforms, or even through chat apps. How can we help achieve that?</p><p>One way that can help boost the visibility of our content is by using Open Graph meta tags on our web pages.</p><h2>What is Open Graph?</h2><p>Open Graph is a protocol that was developed by Facebook in the early 2010s. By including just a few lines of code into the <code class="language-markup"><head></code> tag of a web page, Open Graph allows us to share "rich objects" with websites our content is shared on. In this way, web page creators can provide social media platforms with specific information about the page and how it should be displayed to users of that platform.</p><p>Though it was first developed by Facebook, the Open Graph protocol has been widely adopted as the standard for social sharing. Twitter, LinkedIn, and even chat apps like WhatsApp and LINE all support it.</p><p>Facebook has published a whole lot more information on the <a href="https://developers.facebook.com/docs/sharing/webmasters/">protocol specifics</a>. In this article we'll be going through the four key tags that you <em>really should</em> include on all of your pages. We'll also touch on some extra tags that help provide sites more context.</p><h2>How can Open Graph improve branding & engagement?</h2><p>When a web page doesn't have Open Graph meta tags it's left up to the social networks & apps to decide what content they feel is appropriate to show when the page is shared. This leads to an inconsistent appearance across different channels, and also poor presentation in general.</p><p>Consider the blog post below. 
Without Open Graph tags on the page, Facebook might choose to show it like this:</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/f2b3025a652a48326480528cea4fe587d0896416-2160x2160.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/f2b3025a652a48326480528cea4fe587d0896416-2160x2160.png?auto=format" alt="Screenshot of Facebook Sharing Debugger preview for July 2020 Review blog post without OG tags." loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">Screenshot of Facebook Sharing Debugger preview for July 2020 Review blog post without OG tags.</figcaption></figure><p>That's not very attractive, doesn't include any branding, and is almost certainly going to be overlooked by the vast majority of people who have it added to their Facebook feed.</p><p>With some Open Graph tags added to the page, we get a presentation that's more likely to catch the attention of people scrolling through their Facebook feed.</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/f872695b822bb357d6f612fb1bad588fc511933e-2160x2160.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/f872695b822bb357d6f612fb1bad588fc511933e-2160x2160.png?auto=format" alt="Screenshot of Facebook Sharing Debugger preview for July 2020 Review blog post with Open Graph tags." loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">Screenshot of Facebook Sharing Debugger preview for July 2020 Review blog post with Open Graph tags.</figcaption></figure><p>So what tags do we need to add to our page to get our shared posts looking like this? There are four key tags to include, plus a few extra ones specifically for Twitter.</p><blockquote>Without using Open Graph tags you're leaving it completely up to the social media provider or chat app to decide what content and image (if any) are relevant to include with a shared page. That's not ideal when you want to have consistent branding of your content across multiple channels.</blockquote><h2>Key Open Graph meta tags</h2><p>From here on we'll be referring to Open Graph by its acronym <em>OG</em>. The four key OG tags you should include in the <code class="language-markup"><head></code> block of your web pages are:</p><ul><li><strong>OG Image</strong> - This sets the image that is shown by social networks.</li><li><strong>OG Title</strong> - The title that will be shown for the page.</li><li><strong>OG Description</strong> - A short description of the page.</li><li><strong>OG URL</strong> - The URL that you want the shared post to link to.</li></ul><p>Let's stick with the July 2020 Review blog post I've used above. What do the OG tags for this post look like?</p><pre class="language-html"><code class="language-html"><meta property="og:title" content="July 2020 Review | Fershad Irani">
<meta property="og:url" content="https://www.fershad.com/blog/posts/july-2020-review/">
<meta property="og:image" content="https://www.fershad.com/ogImages/post-july-2020-review.png">
<meta property="og:description" content="July was a bit of a strange month. I had plenty of plans heading into the month, but a few unexpected twists promted me to shift focus towards the end of the month. Though I'm not one for surprises, these sudden changes are undoubtedly for the better."></code></pre><p>In the above, my description text is a bit long and so would almost certainly be truncated when shown with the other OG content. This is something to keep in mind when creating your own material.</p><p>The most important tag for displaying the eye-catching, large visual is the <code class="language-markup">og:image</code> tag on line 3 of the code above. Without this we're leaving it up to the social network/app to decide this they want to show an image for the post at all. If they do decide to, then without an <code class="language-markup">og:image</code> set we're leaving it up to chance as to what image is displayed. That's not ideal, especially if we want to create consistent branding on our content that's shared across multiple channels.</p><p>There are a few more things to keep in mind when creating OG images which we'll touch on at the end of this post. Before that, we're going to quickly go through some extra OG tags that you can (and really should) include on your pages.</p><h2>Additional Open Graph meta tags</h2><p>The four tags shown above provide enough information to the website/app that your content is shared on to let them display content the way you want. There are some additional tags you can use to give these sites even more context that they can associate to your content.</p><p>Again these tags should all go within the <code class="language-markup"><head></code> block of your web page code.</p><ul><li><strong>OG Type</strong> - A categorisation of the type of object (content) you are sharing. You can find a <a href="http://ogp.me/#types">full list of types here</a>. Facebook will default to the "website" type if none is provided.</li><li><strong>OG Locale</strong> - This lets you define the language of the content. "en_US" is the default.</li><li><strong>OG Site Name</strong> - Allows you to specify the name of your site.</li><li><strong>OG Audio</strong> - Lets you link to an audio track related to the shared page.</li><li><strong>OG Video</strong> - Lets you link to a video file related to the shared page.</li><li><strong>OG Image Alt</strong> - Assigns alt text to your shared image. This is especially important for accessibility.</li></ul><p>Adding this to the basic code earlier would give us the below. Please note that the Audio & Video elements below are not real in this case.</p><pre class="language-html"><code class="language-html"><!-- Key OG tags -->
<meta property="og:title" content="July 2020 Review | Fershad Irani">
<meta property="og:url" content="https://www.fershad.com/blog/posts/july-2020-review/">
<meta property="og:image" content="https://www.fershad.com/ogImages/post-july-2020-review.png">
<meta property="og:description" content="July was a bit of a strange month. I had plenty of plans heading into the month, but a few unexpected twists promted me to shift focus towards the end of the month. Though I'm not one for surprises, these sudden changes are undoubtedly for the better.">
<!-- Additional OG tags -->
<meta property="og:type" content="article">
<meta property="og:locale" content="en_GB" />
<meta property="og:site_name" content="Fershad Irani">
<meta property="og:audio" content="https://www.fershad.com/example.mp3" />
<meta property="og:video" content="https://www.fershad.com/example.mp4" />
<meta property="og:image:alt" content="Page image for July 2020 Review">
</code></pre><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">A note about Twitter</p><p></p><div><p>Twitter actually has its own sharing standard, but does use OG meta tags when that standard is not provided. There are a couple of extra lines of code you can add to make sure your content is well presented when shared on Twitter.</p><pre class="language-html"><code class="language-html"><!--
Additional tags for Twitter
Add this to the OG tags shown previously in this post.
-->
<meta name="twitter:card" content="summary_large_image">
<meta name="twitter:image" content="https://www.fershad.com/ogImages/post-july-2020-review.png">
<meta name="twitter:image:alt" content="Page image for July 2020 Review">
<meta name="twitter:description" content="July was a bit of a strange month. I had plenty of plans heading into the month, but a few unexpected twists promted me to shift focus towards the end of the month. Though I'm not one for surprises, these sudden changes are undoubtedly for the better."></code></pre><p>Using <code class="language-markup">summary_large_image</code> to define the content of the card here will tell Twitter to show a large version of the image, similar to what we get on Facebook. </p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif,q_auto/https://cdn.sanity.io/images/twtrbzfo/production/62a72bee85b6c1cdf7feebaae4bbdd8605ccc4ad-2160x2160.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto,q_auto/https://cdn.sanity.io/images/twtrbzfo/production/62a72bee85b6c1cdf7feebaae4bbdd8605ccc4ad-2160x2160.png?auto=format" alt="Preview of large Twitter card." loading="lazy" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">Preview of large Twitter card.</figcaption></figure></div><p></p></div><h2>Guidelines for creating usable images for Open Graph</h2><p>Before wrapping up I'd like to give you (or your creative team) some simple guidelines to follow when creating images that you will use for social sharing. This will be a quick summary; Facebook has a <a href="https://developers.facebook.com/docs/sharing/webmasters/images">full specification</a> that you can look at too.</p><ul><li>Make sure the image is hosted publicly (so that other sites can access it)</li><li>Use a widely supported image format (JPEG or PNG are good choices)</li><li>The recommended aspect ratio is 1.91/1</li><li>The minimum size for a large image is 600 x 315 pixels (1200 x 630 pixels is recommended)</li></ul><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">How to Preview Open Graph Content</p><p></p><div><p>Both Facebook & Twitter have tools that allow you to enter a URL and show you a preview of the Open Graph content for that web page.</p><ul><li><a href="https://developers.facebook.com/tools/debug/">Facebook Sharing Debugger</a></li><li><a href="https://cards-dev.twitter.com/validator">Card Validator</a></li></ul></div><p></p></div><h2>Conclusion</h2><p>Using Open Graph tags in your web page code, and sharing them with well-composed images is an easy way to make your content stand out to users scrolling through their social media feeds. Besides being a way to boost click-through from shared content, it is also the easiest way to ensure branding remains consistent as your content is shared over multiple online channels.</p><p>The OG and Twitter card tags I've provided above can help you take the first steps towards boosting user engagement with the content you/your brand shares on social media. If you need a hand then <a href="https://www.fershad.com/contact/">get in touch</a>.</p></div>DisplayLink - Homepage refresh & navigation improvements2024-02-20T13:25:46Zhttps://fershad.com/writing/displaylink-homepage-refresh-navigation-improvements/<div><p>DisplayLink's homepage sees approximately 80k page views every month. Most users pass through the homepage on their way to download drivers for DisplayLink chipsets that power multi-display setups in homes and offices around the world.</p><p>Early in 2020, the team at DisplayLink approached me to rebuild their homepage, which had been largely unchanged for several years. Decluttering the homepage, and clearly outlining DisplayLink's core offerings and value proposition were central to this redesign. </p><p>Another aspect of the project focused on improving navigation across the entire DisplayLink website. The website design at the time featured a nested hamburger menu, with content being hidden up to 3-levels deep at times. With most of the visits to the website coming from desktop users, the team wanted to simplify the navigation to better guide visitors to key sections of the site.</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/fa449616337b587a22955ee5c0c01096af7e66ad-2701x1337.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/fa449616337b587a22955ee5c0c01096af7e66ad-2701x1337.png?auto=format" alt="Nested hamburger menu" loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">Screenshot of the nested hamburger menu on the old DisplayLink website</figcaption></figure><h2>Design considerations</h2><p>As the majority of visitors to DisplayLink's website are there to download drivers, we made the decision to make links to the respective driver download pages a prominent part of the new homepage.</p><p>A hero section at the top of the homepage provided a space for the marketing team to promote the latest DisplayLink products and features. This replaced a 5-slide carousel which our analysis showed was not driving click-throughs or interactions.</p><p>Directly under that, the new design features a series of quick-links to different driver download pages. The aim of this was to improve the most common user journey on the site. By including links to the download pages directly on the homepage, we allowed users to start downloading a driver in 2 fewer clicks.</p><p>Further down the page we made the decision to highlight the many global partners DisplayLink works with, as well as the different use cases for its products. DisplayLink's marketing team believed that these details were important in educating visitors to the site about the company. </p><h2>Technical optimisations</h2><p>As part of these design changes, there were several performance improvements made behind the scenes. Those changes included:</p><ul><li>Implementing lazy-loading for images</li><li>Serving modern image formats (WebP)</li><li>Providing multiple image sizes for different viewport widths</li><li>Replacing jQuery with plain JavaScript</li><li>Removing several third-party scripts that were no longer used</li></ul><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">Testimonial</p><p></p><div><blockquote>Fershad Digital has transformed the journey and engagement of visitors on our website. The new homepage design delivers significant improvements in key web performance measurements as well as introducing a more attractive and intuitive interface that helps people navigate the site better.</blockquote><p>Ben Hall, Marketing Manager at DisplayLink</p></div><p></p></div><h2>Results - faster performance, smaller size</h2><p>These changes delivered large improvements in speed, size, and performance. Total page size was reduced by 73%. This resulted in a 60% improvement in page load time. When tested using Google's Lighthouse, the result was a greater than 2.5x improvement in the Performance score.</p><h3>Lighthouse Results</h3><h4><strong>Before</strong></h4><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/85a31716b832d8fa3196664bc37d4bab8f78bb55-1333x1226.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/85a31716b832d8fa3196664bc37d4bab8f78bb55-1333x1226.png?auto=format" alt="Screenshot of Google Lighthouse Results" loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">DisplayLink Homepage - Lighthouse scores before update</figcaption></figure><h4><strong>After</strong></h4><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/f7325891ca00f3649dc177848bec2d113cbd8662-1357x1226.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/f7325891ca00f3649dc177848bec2d113cbd8662-1357x1226.png?auto=format" alt="Screenshot of Google Lighthouse Results" loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">DisplayLink Homepage - Lighthouse scores after update</figcaption></figure><p>With the success of this update the DisplayLink team did have bigger plans to modernise the rest of their website. Unfortunately, shortly after this project was completed DisplayLink was acquired and future upgrades planned for the website had to be shelved.</p></div>A quick guide to easy web performance wins2024-02-20T13:25:46Zhttps://fershad.com/writing/web-performance-quick-guide/<div><p>This post stems from a conversation I had with a good friend of mine last week. We were talking about how to boost website traffic and conversions. As I explained to him, and as we'll get to in this post, there's more to the game than just fresh, keyword-laden content.</p><p>Web page performance is one of those things that is easily overlooked when putting together a new landing page, online store, or blog. However, website performance is becoming a <a href="https://webmasters.googleblog.com/2018/01/using-page-speed-in-mobile-search.html">larger metric</a> within Google’s search ranking algorithm. As a result, optimising websites to be fast for all users (mobile & desktop alike) should be a priority of any new site build. Even if you have an existing website and want to improve your search ranking or conversion rate, web page performance should form part of any meaningful SEO strategy you have.</p><p>Besides the SEO implications, good web page performance is simply <em>better customer service</em>. Here are a couple of ways you can look at it:</p><ul><li>Someone has come to your website because they want to find information or want to purchase something. In a physical store you would want to make it as easy as possible for your customer to make that purchase/get that information, right? The same should apply online.</li><li>With more and more people using their mobile phone to surf the web you cannot rely on your visitors having unlimited bandwidth or running 4G or 5G connections. A lot of people will be on capped data plans, 3G or slower connections, and are likely not using the latest, fastest Apple or Android phone. Serving them a web page that's bloated, processor intensive, or not even optimised for mobile screens leads to a poor customer experience. Heck, the visitor to your site might not come back, and might tell others to avoid your site too.</li></ul><p>There's a lot of evidence that shows how even small performance improvements on individual, critical web pages can have a net positive impact on online sales & revenue. I've put one below, and you can find even more at <a href="https://wpostats.com/">https://wpostats.com</a>.</p><blockquote>AliExpress reduced load time by 36% and saw a 10.5% increase in orders and a 27% increase in conversion for new customers.</blockquote><h2><strong>Steps you can take to check and improve the performance of your website.</strong></h2><p>If you've already got a website, or are in the process of building a new site or landing page, here are a few easy steps you can take to help you improve web page performance. It's worth noting that the list below provides a good place to start and should be easy enough for most people to implement. There's a lot, lot more to web page performance & it's a rabbit hole I'm hoping to get time to dive further down in the months to come.</p><h3>1. 
<strong>Get a baseline</strong></h3><p>If you've already got a website that you want to optimise, it's worth knowing the current performance of the pages on your site. To establish a baseline, I would recommend combining a few tools to give yourself a better overall picture.</p><ul><li>Use <a href="https://developers.google.com/speed/pagespeed/insights/">Google's Page Speed Insights</a> to see how individual pages perform. The <strong>field data</strong> section shows you how users have experienced your page over the past 28 days, while the <strong>lab data</strong> section gives you a moment-in-time snapshot of your site's performance. I recommend you run this test ~5 times to allow you to get an average of the <strong>lab data</strong> (removing any outliers).</li><li>Run your web page through a <a href="https://gtmetrix.com/">GTmetrix test</a> too. This will give you a wealth of data, but to start with you can focus on the <strong>Page Details</strong> block which shows the <strong>Fully Loaded Time, Total Page Size</strong>, and <strong>Requests</strong>. Again, I'd recommend running this test ~5 times (primarily to account for possible variance in the <strong>Fully Loaded Time</strong>).</li></ul><p></p><h3>2. <strong>Have a look at your page design</strong></h3><p>Page design is an easy place to start looking for performance gains. Especially if you're starting a new website or building a landing page from scratch, keeping performance impacts in mind as you do initial designs of your pages will help you start off on the right foot. With Google moving towards <a href="https://webmasters.googleblog.com/2020/07/prepare-for-mobile-first-indexing-with.html">mobile first indexing</a> across all sites, it's probably worth starting your website design process with the mobile version of your site.</p><p>Some design considerations that can impact web page performance are:</p><ul><li>Could you replace a video with an image (especially on mobile)?</li><li>Can you avoid auto-playing a video?</li><li>Do you need a rotating carousel that loads images the user might never see?</li><li>Can you move heavier resources below the fold? (such as videos, large images, sections that rely on JavaScript for interactivity)</li></ul><p></p><h3>3. <strong>Minify your CSS and JavaScript assets</strong></h3><p>Minifying assets refers to the process of removing unnecessary characters from code without impacting its functionality. It is considered best practice to minify the CSS and JavaScript assets that are used by your website in production. Minification can reduce file size by as much as 60% in some cases, so if you aren't already minifying assets on your page, then this is arguably the easiest area to make significant performance gains.</p><p>Minification should be part of your website build process, and there are <a href="https://developers.google.com/speed/docs/insights/MinifyResources">many tools that can help</a> with this. There are also <a href="https://marketplace.visualstudio.com/search?term=minify&target=VSCode&category=Other&sortBy=Relevance">VS Code extensions</a> that developers can use to manually minify files.</p><p></p><h3>4. <strong>Optimise the images & videos you use</strong></h3><h4>Images</h4><p>Images are typically the heaviest resource on any web page. JPEG and PNG images are the most common file types for static images. However, JPEGs and PNGs can be rather large in terms of file size, especially when they are used for the full-width hero images that are common on websites today. 
To reduce the file size of JPEG and PNG files, consider running them through an image compression service like <a href="https://imagecompressor.com/">Optimizilla</a>. This is a good first step.</p><p>WebP is a modern image format for the web that is being <a href="https://developers.google.com/speed/webp">championed by Google</a>. WebP offers <a href="https://havecamerawilltravel.com/photographer/webp-website/">significantly lower file size</a> compared to traditional image formats, though there are instances where it <a href="https://siipo.la/blog/is-webp-really-better-than-jpeg">might not be optimal</a>. Chrome, Edge & Firefox <a href="https://caniuse.com/#search=webp">all support WebP</a>, and as of July 2020 Safari is almost there too. The good thing is that using the <code class="language-markup"><picture></code> element you can easily use WebP images on your web page, while providing a fallback to traditional image formats on unsupported browsers. Here's some sample code:</p><pre class="language-html"><code class="language-html"><picture>
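<!-- The browser checks each <source> in order and uses the first type it supports; the <img> at the end is the fallback for browsers that support none of them -->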
<source srcset="yourWebPImage.webp" type="image/webp">
<source srcset="yourJPEGImage.jpg" type="image/jpeg">
<img src="yourJPEGImage.jpg" alt="Don't forget ALT text for your images">
</picture></code></pre><p>Alternatively you can use a service like Cloudinary to host/serve your images. This makes serving images in the right format for any given browser ridiculously easy. There's a lot more detail in this <a href="https://cloudinary.com/blog/adaptive_browser_based_image_format_delivery">blog post from the Cloudinary team</a>, but in summary you can use Cloudinary's <code class="language-markup">f_auto</code> flag to do the dirty work for you. That allows the code above to be truncated to:</p><pre class="language-html"><code class="language-html"><img src="https://res.cloudinary.com/demo/image/upload/f_auto/yourJPEGImage.jpg" alt="Don't forget ALT text for your images"/></code></pre><h4>Video</h4><p>The same applies to video. If you're embedding video on your web page, you'll likely have an MP4 version of that video. Just like we have WebP for images, WebM is an alternative, open-source video format that delivers smaller file sizes compared to MP4. To serve up WebM video with a <a href="https://caniuse.com/#search=webm">fallback to MP4 for unsupported browsers</a> you can use code similar to the below:</p><pre class="language-html"><code class="language-html"><video controls>
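<!-- Source order matters: the browser plays the first source it supports, so list the WebM file ahead of the MP4 fallback -->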
<source src="yourMP4Video.mp4" type="video/mp4">
<source src="yourWebMVideo.webm" type="video/webm">
<p>Your browser doesn't support HTML5 video.</p>
</video></code></pre><p></p><h3>5. <strong>Lazy-load content that is below the fold</strong></h3><h4>Images</h4><p>Lazy-loading images is a term used to describe a technique where images below the fold of a web page are loaded "just in time" as the user scrolls the page. This gives you the potential to slash the initial load time of a web page since you're deferring the downloading of content that's not immediately visible to the visitor. <a href="https://web.dev/native-lazy-loading/">Implementing lazy-loading</a> is as easy as adding <code class="language-markup">loading=lazy</code> to the <code class="language-markup"><img></code> tags on your page. At the time of writing Safari is the only major browser <a href="https://caniuse.com/#feat=loading-lazy-attr">lacking support</a> for this feature. The good thing, however, is that if it's not supported in the browser then the image is simply loaded as normal.</p><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">Bonus</p><p></p><p>To avoid "page jank" caused by lazy-loaded images suddenly appearing as the user scrolls, it's recommended to use <code class="language-markup">width</code> and <code class="language-markup">height</code> attributes on your images. Jen Simmons has a good <a href="https://www.youtube.com/watch?v=4-d_SoCHeWE&feature=youtu.be">video explaining this</a> in detail. Using the <code class="language-markup">height</code> and <code class="language-markup">width</code> attributes gives the browser an indication of how much space to set aside for the image before it is downloaded. CSS <code class="language-markup">aspect-ratio</code> should be a more elegant solution for this, though <a href="https://caniuse.com/#search=aspect-ratio">support is still limited</a>.</p><p></p></div><h4>Video</h4><p>Just like images, videos can also be lazy-loaded, though the technique to do so differs. For most cases in which video playback is initiated by the user you can use the below code:</p><pre class="language-html"><code class="language-html"><video controls preload="none" poster="videoCoverImage.jpg">
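<!-- preload="none" stops the browser downloading any video data up front; the poster image fills the space until the user starts playback -->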
<source src="yourWebMVideo.webm" type="video/webm">
<source src="yourMP4Video.mp4" type="video/mp4">
</video></code></pre><p>By setting <code class="language-markup">preload="none"</code> you are telling the browser not to download any video data until the user plays the video. With this in place we use the poster attribute to give the browser a placeholder image to display in place of the video until content is downloaded/playback starts. Jeremy Wagner & Rachel Andrew dive into this a bit deeper <a href="https://web.dev/lazy-loading-video/">in this article</a>.</p><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">Bonus</p><p></p><p>Lazy-loading <code class="language-markup">iframe</code> elements is also now <a href="https://caniuse.com/#feat=loading-lazy-attr">supported in all major browsers</a> except Safari. Just like images you can add <code class="language-markup">loading=lazy</code> to an <code class="language-markup"><iframe></code> tag to enable this feature. This means you can defer loading of embedded YouTube content until the user gets to that part of the page. Addy Osmani has a few other great <a href="https://web.dev/iframe-lazy-loading/">case studies in this article</a>.</p><p></p></div><p></p><h3>6. <strong>Delay loading external fonts</strong></h3><p>Using web fonts is common practice across the web these days. Most of the time fonts are fetched from third-party services like Google Fonts. This results in more cross-origin network requests for the browser, and if there's a delay in getting a response back it can also result in text on the page not being rendered. In most cases text is a pretty critical part of a web page. There are a few methods that can be used for better font loading, which result in better perceived loading times for web page visitors.</p><p>The first method is to look at whether you really need to use a web font, or custom font, at all. Could you simply use common system fonts for your website? You can add a range of common system fonts such as <code class="language-markup">font-family: system-ui,-apple-system,BlinkMacSystemFont,Segoe UI,Roboto,Helvetica,Arial,sans-serif;</code> which covers standard fonts on Mac and Windows.</p><p>If you want to use a custom font, then you can tell the browser to load fallback fonts first and swap in the custom font when it's downloaded. To do this you can use the <code class="language-markup">font-display: swap;</code> CSS property in your code. If you'd rather not have fonts swapping after the user may have started reading the page, then you can look into <a href="https://css-tricks.com/almanac/properties/f/font-display/">using other values</a> such as fallback or optional instead.</p><p>One more trick is to use a combination of link <code class="language-markup">preconnect</code> and <code class="language-markup">preload</code> to make an early connection to third-party font sources, and then to download the font stylesheet asynchronously as the page loads. Harry Roberts goes into this in much <a href="https://csswizardry.com/2020/05/the-fastest-google-fonts/">greater detail on his website</a>. In summary what we're looking to do here is:</p><ol><li>Preconnect to a third-party service like Google Fonts. This will kick off the communication stream between our site and the third-party.</li><li>We then preload the font stylesheet. 
In doing this we download the stylesheet from the third-party asynchronously (it doesn't stop the rest of the page from loading).</li></ol><ul><li>Since <a href="https://caniuse.com/#search=preload">browser support for preload</a> is still not complete, we use <a href="https://www.filamentgroup.com/lab/load-css-simpler/">a trick from Filament Group</a> that achieves a similar result as a fallback.</li></ul><p>Putting these together we get a code sample similar to the below:</p><pre class="language-html"><code class="language-html"><link rel="preconnect" href="https://fonts.gstatic.com" crossorigin />
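<!-- The preconnect above opens the connection to the font host early, before any font CSS is requested -->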
<link rel="preload" as="style" href="https://fonts.googleapis.com/css2?family=Montserrat&display=swap" />
<link rel="stylesheet" href="https://fonts.googleapis.com/css2?family=Montserrat&display=swap" media="print" onload="this.media='all'" /></code></pre><p></p><h3>7. <strong>Check third-party resources you load on the page</strong></h3><p>Third-party resources are things that you load onto your website that are hosted on other domains. Normally these take the shape of tracking and analytics scripts, customer service features, or JavaScript libraries. Each request to a third-party takes time and can adversely impact the loading of your page. What's even worse is that if the third-party is having network issues you may be waiting an incredibly long time for a response and your page load might even timeout. While you can use the <code class="language-markup">preconnect</code> and <code class="language-markup">preload</code> techniques mentioned above, there are some other strategies you can also use when dealing with third-party resources.</p><p>The first thing to do is consider whether you even need the resource at all. Sometimes analytics scripts are added to a page as part of a test or campaign, then just left there. It's good practice to audit your sites on a regular basis to pick up any zombie resources that might have been left lying around.</p><p>The next consideration is whether the asset that you're fetching from the third-party could be self-hosted instead. Say you're fetching a version of jQuery that's stored on a Content Delivery Network (CDN). Yes there are some benefits to this approach, but there are <a href="https://csswizardry.com/2019/05/self-host-your-static-assets/#self-host-your-static-assets">also several risks</a> such as an outage, network disruption, and also the penalty that comes with having a cross-origin connection. Rather than being exposed to these factors you could host the assets locally.</p><p></p><h3>8. <strong>Inline critical CSS and load the rest later</strong></h3><p>Each time a browser encounters a CSS file on your web page it must fetch and parse the file before it can continue rendering content on the page. That means having a large stylesheet, or even a series of smaller ones, on your web page will result in a performance penalty. One way around this is to extract CSS that is vital for content that is first visible to website visitors, and to inline that CSS directly into the HTML of the page. Other CSS can be loaded later so as not to block the rendering of page content.</p><p>The <a href="https://web.dev/extract-critical-css/">most common technique</a> is to consider vital content as anything that appears "above the fold". This is content that will be first visible when a given page is loaded. Any CSS required to present this content properly should be removed from any stylesheet, minified, and insert within a style tag within the <code class="language-markup"><head></code> of the page. Other CSS can then be loaded using the preload technique and fallback above.</p><p></p><h3>9. <strong>Can you use SVG or CSS for icons?</strong></h3><p>A lot of business link to their social media channels from their website. In doing this they often look to use a font icon library to provide them with the branded social media icons to include on their page. However, font icon libraries can be hefty, and are often hosted by third-parties. An alternative to using them is to consider native CSS or even SVG options. SVG is a vector-based graphic format that is lightweight and can even be embedded directly into your page's HTML (reducing resource requests). 
There are plenty of options when it comes to CSS or SVG icons. One that I turn to frequently is <a href="https://css.gg/">CSS.gg</a>, though <a href="https://tablericons.com/">Tabler Icons</a> is another good alternative.</p><p></p><h3>Conclusion</h3><p>Applying even a few of the steps above should result in good performance improvements on your website. While there is a lot more to web performance, if you make these part of your regular web page planning and development process you will be going a long way to having a more performant page from the moment it's launched.</p><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">Interested in a Technical SEO & Performance Audit?</p><p></p><p>I can perform a technical SEO and performance audit on your website which will help you identify even more areas you can optimise. Visit the Services section of my website to <a href="https://www.fershad.com/technical-seo-performance-audit/">learn more about the audit</a>, and what it includes.</p><p></p></div></div>Dark Mode toggle for Svelte2024-02-20T13:25:46Zhttps://fershad.com/writing/svelte-dark-mode-toggle/<div><h2>Toggle Component</h2><p>Inside your Svelte project create a new component named <code class="language-markup">Toggle.svelte</code>. It's going to contain a script tag and a button element.</p><pre class="language-javascript"><code class="language-javascript"><script>
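// darkMode tracks the current theme; any markup that reads it updates automatically when it changes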
let darkMode = false;
function toggle() {
darkMode = !darkMode;
window.document.body.classList.toggle('dark');
}
</script>
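<!-- Svelte re-renders this block whenever darkMode changes, so the label flips between the two states -->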
<button on:click={toggle}>
{#if darkMode }
Go light
{:else}
Go dark
{/if}
</button></code></pre><p>What the code above does is:</p><ol><li>Initiates a boolean variable that allows us to check the current theme.</li><li>Creates a function that reverses the variable, and toggles a class on the body DOM element.</li><li>Has a button that calls the function when clicked.</li></ol><p>The button also contains an <code class="language-markup">if</code> statement which changes the value of the button element based on the current theme. Thanks to the reactivity of Svelte, this just works straight out of the box and will change automatically when the <code class="language-markup">toggle()</code> function is run.</p><p>In the REPL I've also added some styling to the button, but I won't go through that here.</p><h2>Import the Toggle</h2><p>To use the toggle component within your app, import it using <code class="language-markup">import Toggle from './Toggle.svelte';</code>. In the REPL we've imported it directly into the <code class="language-markup">App.svelte</code> file, however you might want to use the toggle in a Header, Nav or Footer component. It's up to you.</p><p>Now, for the Toggle to appear within your app simply call it using a capitalized component tag <code class="language-markup"><Toggle />.</code></p><h2>CSS</h2><p>Of course, none of the above will work without having the right CSS in place to change the style. Below is a really simple example of how you might implement this. You can either add the CSS to your Svelte project's <code class="language-markup">global.css</code> file or add it to <code class="language-markup">App.svelte</code>.</p><pre class="language-html"><code class="language-html"><style>
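/* Light theme values, stored as custom properties so the dark theme only needs to override them */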
:root{
--bg-color: #FFFFFF;
--text-color: #000000;
}
:global(body) {
background: var(--bg-color);
color: var(--text-color);
}
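/* Applied when Toggle.svelte adds the .dark class to <body> */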
:global(body.dark) {
--bg-color: #000000;
--text-color: #FFFFFF;
}
</style></code></pre><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">Code & Example</p><p></p><div><p>I've build this out in <a href="https://svelte.dev/repl/148690356c4b45df8587cbadc448ec58?version=3.20.1">Svelte's online REPL</a>. The code and example both live there.</p><p><em>Note: In this guide we won't be persisting the theme change. That means any time the page is refreshed the theme will revert back to light mode.</em></p></div><p></p></div></div>Freelancing in Taiwan during the COVID-19 pandemic2024-02-20T13:25:46Zhttps://fershad.com/writing/covid-19-freelancing-in-taiwan/<div><p>Life's different now. You don't need me to tell you that, especially if you're one of the 3 billion-plus people currently living under some form of lockdown. But even if you're not, life in the first half of 2020 has taken on a distinctly different look and feel.</p><p>COVID-19, coronavirus, SARS-CoV-2, or whatever other unsavoury names you want to give it, has flipped the global economy on its head. It's exposed the best and worst of society. And it's forced a large swath of the workforce out of work, or to work from home through no choice of their own. For some, the change to a 'home office' setup has been welcome. For others, <a href="https://www.abc.net.au/news/2020-04-01/tiny-apartment-life-during-the-coronavirus-lockdown/12084538">it's been tough</a> to find the right balance, space, and drive to get things done.</p><h2>The situation in Taiwan</h2><p>Living in Taiwan we've been largely spared the widespread loss of life and disruption that other countries have witnessed. The government has been <a href="https://www.smh.com.au/world/asia/while-other-countries-lost-precious-time-taiwan-mobilised-to-keep-covid-19-at-bay-20200316-p54ah8.html">on the front foot</a> against the virus since late December 2019. That's long before the WHO declared this a pandemic.</p><p>At that time officials were boarding flights arriving from China to check on the well-being of passengers. Shortly after, the central government introduced a rationing system for surgical masks aimed at preventing hoarding. At hospitals, office buildings, shopping centres and restaurants all visitors had their temperature taken at the entrance. Some even required visitors to wear surgical masks while inside.</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/1e21a976debc93e880e4b6a3163efa1cc90a45f6-1080x1080.jpg?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/1e21a976debc93e880e4b6a3163efa1cc90a45f6-1080x1080.jpg?auto=format" alt="Line outside a pharmacy in Taiwan. People waiting to buy surgical masks." loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">A queue outside a pharmacy in Taipei. Individuals are limited to three surgical masks, and must present their ID to collect them. Availability is based on your ID number.</figcaption></figure><p>From early on Taiwan started tracking those entering the country. It merged health and immigration databases <a href="https://jamanetwork.com/journals/jama/fullarticle/2762689">within a day</a>, allowing physicians to see if a new patient had recently entered the country. As the pandemic drew on, all foreign arrivals in Taiwan were quarantined and tracked for 14 days. It's a system that has <a href="https://time.com/5805629/coronavirus-taiwan/">drawn</a> <a href="https://www.atlanticcouncil.org/blogs/new-atlanticist/lessons-from-taiwans-experience-with-covid-19/">widespread</a> <a href="https://www.bloomberg.com/opinion/articles/2020-04-05/taiwan-s-advance-on-who-in-covid-19-shows-its-place-in-world">praise</a>.</p><p>Thanks to a combination of the above measures, and no doubt a healthy slice of luck, Taiwan has managed to remain relatively unscathed thus far. Life here goes on as close to normal as possible. That said, it is evident that fewer people than usual are eating outside, or going to shopping malls. It feels like there are fewer people on the metro and buses here in Taipei, especially outside of peak hours (yep, people still commute to work). Business owners I know are feeling a pinch, and that's with no 'stay at home laws' currently enforced.</p><h2>Has this changed the way I work?</h2><h3>In 2019</h3><p>I've been working for myself for about 9 months now. Pre-COVID-19 I would spend my working days either at the local library or any number of cafes around Taipei city. Sometimes I'd get there on foot, other times by bike, and for places further afoot I'd hop on the bus or metro.</p><p>Work itself was steady. I started working for myself in August 2019 and picked up a client shortly after that. That kept me busy while I found my feet.</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/352c11f72dd931153dd24acb2751e86ffdf15e4f-3024x4032.jpg?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/352c11f72dd931153dd24acb2751e86ffdf15e4f-3024x4032.jpg?auto=format" alt="Photo of the interior of Taiwan National Library." loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">I spent a fair bit of time at Taiwan National Library which is just a short walk from my apartment.</figcaption></figure><h3>In 2020</h3><p>Not much changed for me in the first half of January 2020. I was still going out to cafes to work. I was still happy to take public transport to get to where I needed to go.</p><p>I started getting more cautious around the end of January. Either side of the Chinese New Year holiday my girlfriend came down with allergic reactions. As a result, we paid multiple visits to our local hospital. There we saw an increased level of caution from the moment you approached the door. At that time even the security officers manning the hospital entrances were decked out in head-to-toe protective gear. This gave me the first real indication that maybe this new virus was something to take a bit more seriously.</p><p>Since then my working habits have somewhat changed. I spend almost my entire time now working at home. I could count on my fingers the number of times I've worked in a coffee shop since the end of January. The thought of wearing a surgical mask for hours on end at the library has turned me off going there too. On the flip-side, my cat appreciates the extra company and having a lap to fall asleep on.</p><h3>My average day now</h3><p>My days are pretty much identical. Coffee, breakfast and a bit of work in the morning. I'll normally throw on a few podcast episodes, or have some YouTube videos going in the background.</p><p>Around 12:30pm I'll take a break for lunch, and head out for a long walk. If I've got errands to run I might take my bike instead and knock them off at this time. I've always got to remember to take a surgical mask with me when leaving the house. Some places here won't allow you in without one.</p><p>Sometime after 2pm, I'll be back home. I'm normally greeted by a whinging cat who just wants me to sit down so that she can have a nap. This invariably results in a nap for me too.</p><p>Post-nap there's time for a bit more work, before heading out (on foot if I can) for dinner. Sometimes I'll come home after a feed and do a bit more work, but that's a bad habit I've really got to try to kick.</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/076a84289e3802639a22564741c1208e4383cdb9-1080x1080.jpg?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/076a84289e3802639a22564741c1208e4383cdb9-1080x1080.jpg?auto=format" alt="Cat sleeping on sofa." loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">My cat’s been one of the few winners out of the COVID-19 pandemic.</figcaption></figure><h2>It's different, but it could be worse</h2><p>Going through the news every day reminds me just how lucky we have it at the moment here in Taiwan. I hope things don't take a turn for the worse here, and that the rest of the world can very quickly get back to some semblance of normality.</p><p>To be honest I'm not sure when or if I'd return to my pre-COVID-19 working habits. I'm very much a creature of routine, and I actually don't mind being a house hermit. I'm happy that through this period I have landed on a daily pattern that I feel works for me. It eliminates any commute, allows me to stay active, and makes me feel like I'm being productive. It might just be that I've stumbled upon my new normal.</p></div>Rotating buttons in CSS2024-02-20T13:25:46Zhttps://fershad.com/writing/rotating-buttons-in-css/<div><p>Here's a really quick little code note for a cool button style. I implemented this rotating button style in the filter section of my <a href="https://markdown.fershad.com/">Markdown Cheat Sheet app</a>. You can check it out there, or at the CodePen below.</p><p>To trigger the transition effect we have to add an <code class="language-markup">onclick</code> event attribute to each button. This fires off a bit of JavaScript that applies an <code class="language-markup">active</code> class to the button. With this class we can then rotate the button with <code class="language-markup">transform: rotate(45deg)</code>.</p><p>If we leave it at that, however, the content within the button (in this case icons) will also rotate. That's normally not what we want. The trick to keeping the icons straight while the buttons rotate is two-fold:</p><ol><li>We've got to counter the rotation of the button on the icon element (i.e. <code class="language-markup">transform: rotate(-45deg)</code>)</li><li>We've got to match the same transition animation speed as the button transition (I've used <code class="language-markup">0.3s</code> in my demo).</li></ol><p>This gives a smooth rotation of the button element while keeping the icon fixed in place.</p><p class="codepen" data-height="350" data-theme-id="light" data-default-tab="0" data-user="fishintaiwan" data-slug-hash="KKpOoLN" data-preview="true" data-codepen-url="https://codepen.io/fishintaiwan/details/KKpOoLN" data-pen-title="Styled dropdown with smooth expanded effect" style="height:350px;box-sizing:border-box;display:flex;align-items:center;justify-content:center;border:2px solid;margin:1em 0;padding:1em;"></p></div>Building my first app with Svelte 32024-02-20T13:25:46Zhttps://fershad.com/writing/building-my-first-svelte-app/<div><p>Last month I started learning about Svelte. I'd heard a bit about it as a framework, and from what I could see it was quick and easy to get started with.</p><p>To start with I took Scott Tolinski's <a href="https://www.leveluptutorials.com/tutorials/svelte-for-beginners">Svelte for Beginners</a> tutorial series. It took me through everything I needed to know about building web apps with Svelte 3. The fact that we also built a small quiz app with it allowed me to see just what might be possible with Svelte as a framework. It got my mind ticking over.</p><p>What impressed me most about Svelte was that it's pretty much vanilla JavaScript, HTML and CSS. Of course there are smatterings of Svelte's syntax & the Svelte way of doing things, but they're not hard to pick up. 
So if you know these three languages, then you can jump right in and start building with Svelte.</p><p>So, after doing some more reading and a couple more tutorials I decided to just go ahead and make my first Svelte web app.</p><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">Code and Example</p><p></p><div><p><strong>Site:</strong> <a href="https://markdown.fershad.com/">https://markdown.fershad.com</a> </p><p><strong>Source:</strong> <a href="https://github.com/fishintaiwan/markdown-cheatsheet">https://github.com/fishintaiwan/markdown-cheatsheet</a></p></div><p></p></div><h2>The idea</h2><p>Markdown is something I've started using more now that I develop full-time. That said, I always find myself reaching for Google whenever I am writing Markdown notes. I'm often searching for ways to do things. How do I add an image? How do I add a block of code? What about a quote?</p><p>So the idea I had was to create a small, filterable cheat sheet for myself. Making it with Svelte, I planned for it to have some kind of filtering functionality. Finally I also wanted it to run as a Progressive Web App (PWA).</p><h2>Building in Svelte</h2><p>All told the project took me a bit over two days to complete. Having never used Svelte before, I felt that most of the issues I faced would be around the framework and syntax. I was wrong. So, so wrong. I spent most of my time solving layout issues, and filling in the raw data that the app would use.</p><p>Figuring out how to filter the data was probably the most challenging part of the whole process from a Svelte perspective. I toyed with the idea of using a library like <a href="https://isotope.metafizzy.co/">Isotope</a> to handle this, but wasn't able to get it working with the layout that I wanted. I finally decided to just build my own filter in JavaScript and pass data to it through Svelte <a href="https://svelte.dev/docs#createEventDispatcher">event dispatchers</a>. It works well, albeit without the graceful animation that Isotope can provide.</p><p>Looking back it was incredibly straightforward to build my first app in Svelte, despite having little prior experience with the framework.</p><h2>Some more thoughts on Svelte</h2><p>Before going on here, you should definitely give Ryan Atkinson's <a href="https://github.com/feltcoop/why-svelte">analysis of Svelte's pros and cons</a> a read. Especially if you work on larger projects, or come from a Vue/React background and aren't sure if Svelte is the right choice for you. There's also a very good <a href="https://syntax.fm/show/173/hasty-treat-wes-and-scott-look-at-svelte-3">Syntax.fm episode</a> on it.</p><h3>Compiler = versatility</h3><p>One of the things I love about Svelte is that it's a compiler. It takes components, and outputs them as pure JavaScript. In a Svelte project you can add the <code class="language-markup">bundle.js</code> file that it outputs onto an index.html page and you're good to go!</p><p>What this also means is that you can build something with Svelte, then take the output and put it into something like a static website. It's an idea I'm flirting with right now to create a few custom landing pages for my website. The idea of building them in Svelte and just adding the compiled files into my Eleventy project is pretty appealing.</p><h3>You can still use PostCSS, SCSS, TypeScript etc.</h3><p>Out of the box Svelte requires that you use code in vanilla JavaScript and CSS. 
You can quickly extend that to support most popular preprocessors like PostCSS, SCSS, CoffeeScript and TypeScript. Using the <a href="https://github.com/kaisermann/svelte-preprocess">Svelte Preprocess</a> plugin means you don't have to forgo your preferred development language just because you're using Svelte.</p><h3>Thorough docs, tutorials, and examples</h3><p>If you're trying to get started with Svelte, then have a nosey around <a href="https://svelte.dev/">https://svelte.dev</a>. The site contains a thorough tutorial series which you can go through to learn the fundamentals of the framework. The docs and examples are also great sources of reference as you go through the process of building your first few apps.</p><h2>Closing</h2><p>As you can tell, I'm definitely going to be using Svelte again for projects in the future. While I don't have any plans at the moment to switch this site over to Svelte, I do have a few other small web apps in mind. I can also see myself turning to Svelte if I needed a more interactive landing page or microsite created for a client.</p></div>Two options for making responsive tables for your website2024-02-20T13:25:46Zhttps://fershad.com/writing/options-for-making-responsive-tables/<div><p>I recently found myself having to create a website that presented a lot of data in tabular format. If we were only having to think about tablet and desktop displays then that's easy. However, things get a bit trickier when trying to present tabular data on mobile viewports. Of course, we don't want to overload visitors by cramming all the data into the viewport. At the same time we want to present it in a fashion that is clean, elegant, and makes data discoverable.</p><p>After sketching out a few ideas, I started looking around the internet for inspiration. Pretty quickly I came across <a href="https://css-tricks.com/accessible-simple-responsive-tables/">this post from Davide Rizzo on CSS Tricks</a>. It flicked a switch. I started thinking about making a table with div elements, rather than table markup.</p><p>This post goes through two approaches that I tried out. Firstly, using table markup and horizontal scrolling, and then looking at how it can happen with CSS Grid.</p><h2>Using Table Markup</h2><p>This is the method I ended up settling on, just because it was the easiest to put in place. The key to this solution is two-fold:</p><ol><li>Wrap your table in a holder div that is <code class="language-markup">position: relative;</code> and has <code class="language-markup">overflow: auto;</code> set.</li><li>Set the table itself to <strong><code class="language-markup">width: 100%; overflow: hidden; table-layout: auto;</code></strong></li></ol><p>You can see this in action in the Codepen below. If you resize the results window you can see how the table becomes scrollable at a specific breakpoint, while shown in full on wider displays.</p><p class="codepen" data-height="350" data-theme-id="light" data-default-tab="0" data-user="fishintaiwan" data-slug-hash="RwPqLGP" data-preview="true" data-codepen-url="https://codepen.io/fishintaiwan/pen/RwPqLGP" data-pen-title="Styled dropdown with smooth expanded effect" style="height:350px;box-sizing:border-box;display:flex;align-items:center;justify-content:center;border:2px solid;margin:1em 0;padding:1em;"></p><h2>Using CSS Grid</h2><p>This approach is one that I really like, and something I'd like to use for a project in the future. 
It lets you break out from the shackles of the table markup and present data how you want to at specific breakpoints. The Codepen below uses cards on smaller viewports, and a table at wider breakpoints.</p><p>The key to this approach is changing the <code class="language-markup">grid-template-areas</code> property. There are a few other things you should keep in mind:</p><ol><li>You're going to need a div for each row of data.</li><li>Within that "row" div you'll need to have header div elements and content div elements. All these elements will need to be assigned a <code class="language-markup">grid-area</code> property.</li><li>By assigning all elements inside a row a <code class="language-markup">grid-area</code> property you can then customise the layout for smaller viewports by changing the <code class="language-markup">grid-template-areas</code> property of the wrapper div.</li><li>For wider breakpoints, you'll need to hide all but the first row's headings. I've done that by using the :not(:first-child) pseudo-class selector.</li></ol><p>Have a look at the Codepen embed below, and resize the viewport to see the change in action.</p><p class="codepen" data-height="350" data-theme-id="light" data-default-tab="0" data-user="fishintaiwan" data-slug-hash="rNVQgmr" data-preview="true" data-codepen-url="https://codepen.io/fishintaiwan/pen/rNVQgmr" data-pen-title="Styled dropdown with smooth expanded effect" style="height:350px;box-sizing:border-box;display:flex;align-items:center;justify-content:center;border:2px solid;margin:1em 0;padding:1em;"></p></div>Make a styled, custom dropdown2024-02-20T13:25:46Zhttps://fershad.com/writing/styled-custom-dropdown/<div><p>I was working on a recent project that required me to create a customized select dropdown. It had to clearly, neatly show two sets of data (jobs and locations), and look good in the process. Each item in the dropdown would link to a corresponding job listing on another page.</p><p>To do this, I styled a set of list items that contained the required information. Styled with CSS, and triggered using JavaScript, the result was exactly what my client was hoping for.</p><p>You can see the source code, and the end result in the Codepen below.</p><p class="codepen" data-height="350" data-theme-id="light" data-default-tab="result" data-user="fishintaiwan" data-slug-hash="dyobrNy" data-preview="true" data-codepen-url="https://codepen.io/fishintaiwan/pen/dyobrNy" data-pen-title="Styled dropdown with smooth expanded effect" style="height:350px;box-sizing:border-box;display:flex;align-items:center;justify-content:center;border:2px solid;margin:1em 0;padding:1em;"></p><p></p><p></p></div>Disabling elements with CSS pointer-events and media queries2024-02-20T13:25:46Zhttps://fershad.com/writing/css-pointer-events-disable-elements/<div><p>A recent project I've been working on presented me with an interesting little challenge. I was working with a collapsible section of content that was triggered by a styled label and hidden checkbox. On mobiles, the content had to be collapsible, while on larger displays the content was to remain expanded and unable to collapse.</p><p>My initial thought was to reach for JavaScript. I planned to traverse the DOM to identify particular styles attached to the input checkbox and label at various breakpoints. That started to get messy, and so I began searching for an alternative solution. 
That's when I stumbled on CSS <code class="language-markup">pointer-events</code>.</p><p>By combining CSS <code class="language-markup">pointer-events: none</code> with media queries I was able to achieve what I needed in many, many fewer lines of code. You can <a href="https://developer.mozilla.org/en-US/docs/Web/CSS/pointer-events">read more about pointer-events at MDN</a>.</p><pre class="language-scss"><code class="language-scss">@media only screen and (min-width: 768px) {
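// From 768px up, clicks and taps on the checkbox and label are ignored, so the section stays expanded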
input[type="checkbox"],
.input-label {
pointer-events: none;
}
}</code></pre><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">Gotcha</p><p></p><p>It's worth noting that the input item will still be tab accessible.</p><p></p></div></div>How I would build my 2019 Rugby World Cup fixtures site differently next time.2024-02-20T13:25:46Zhttps://fershad.com/writing/how-would-i-rebuild-my-rugby-world-cup-website/<div><p>Mid-September, over a typhoon holiday long weekend in Taipei, I put together <a></a>a <a href="https://rwc2019.fershad.com/">very simple website</a> for the 2019 Rugby World Cup. My goal was to show all the fixtures of a tournament for a range of time zones. I also wanted to challenge my design skills and further refine my JAMStack development skillset. I didn't want it to be overly complicated. And I wanted to build it without using any libraries or frameworks as much as possible.</p><p>With the 2019 Rugby World Cup now almost at an end, I've been thinking about how I might build a similar website differently next time. In this post, I'll be focusing on two areas in particular - data storage and usability.</p><h2>Updating the site with match scores & fixtures became a little bit of a burden.</h2><p>To keep the site as simple as possible, I created a JSON file to store data about the matches. From the outset, I had planned on updating match scores as games completed, and so score data was also stored within this JSON file. It allowed me to quickly spin up the site with data structured just how I wanted it, and no reliance on an external service.</p><p>However, it also meant that to update match details I would need to update the JSON file manually. That update would then needed to be committed using Git for the site to rebuild.</p><p>This wouldn't have been an issue had I remained at home during the tournament. But I was regularly making trips to Japan to watch games live, and found myself often in a position where I was unable to complete the update process. This meant that the data on the website was sometimes a day or two behind.</p><h3>How would I handle data storage next time?</h3><p>If I had to build this site again, I would look to use an online data store such as <a href="https://airtable.com/invite/r/1p0yKl4x">Airtable</a> to store match and score data. This way, I would be able to update scores through the Airtable app quickly. Using webhooks, IFTTT or Zapier, I'd then be able to automate the build process for the site. This would allow me to keep the website static. Since my phone's always with me, it would speed up the update process significantly (even if Zapier or IFTTT take 15 minutes to pick up changes to Airtable bases).</p><h2>I might make the time zone conversion more dynamic.</h2><p>One of the core parts of the website was the ability for users to change time zones when viewing the match schedule. To build fast, I borrowed heavily from <a href="https://github.com/philhawksworth/html-time">Phil Hawksworth's HTML time</a> project. However, next time, I might just use a Javascript library like <a href="https://momentjs.com/">moment.js</a> to allow for dynamic time zone conversion. I feel this would lead to a smoother user experience since it would remove the need to leave the schedule page to change time zones.</p><h2>I've definitely got to learn more about service workers.</h2><p>As a bit of an afterthought when building the website, I decided to try and make it a progressive web app (PWA). This allows the site to be downloaded by a user and run on their device as though it were a native app. 
However, to be fair, I did this with very little knowledge of setting up service workers.</p><p>Since I was using Eleventy as the static site generator to build the site, I found <a href="https://www.npmjs.com/package/eleventy-plugin-pwa">this plugin</a> that provides PWA capability for Eleventy sites. I <a href="https://okitavera.me/article/turn-your-eleventy-into-offline-first-pwa/">followed the creator's guide</a> to set it up, and pretty much left it from there. It worked, in that I was able to download the site to my phone and run it as an app. However, because of the basic implementation I had used, my app (and website for that matter) was presenting me with cached data and would only update if I manually refreshed the page.</p><p>I'd definitely like to build more PWAs in the future, and so learning more about service workers is definitely on my to-do list for this year!</p><h2>Perhaps next time I'd add some live score functionality.</h2><p>This would depend on being able to find and access data to pull in live scores. I flirted with the idea of trying to hack the official Rugby World Cup website to either scrape data from their match pages or find JSON data I could use. In the end, it was too much effort, and I also didn't want myself getting into a mess over copyright/data infringements.</p><p>A friend of mine also suggested perhaps linking to an official Twitter hashtag for each game, or to an external match centre so that people could follow games live there. Both are ideas I'll explore in the future if I decide to build a similar site.</p><p></p><p>Building this site for the 2019 Rugby World Cup as a personal project sure did result in more learnings than I was anticipating. It's given me things to learn more about and also showed me that I am capable of spinning up a full site with minimal turnaround without relying on any frameworks. I'm hoping to spin up a website for next year's southern hemisphere rugby season, but that's still some time away. In the meantime, I've got a few more things to study up on.</p></div>What I've learnt in two months working remotely.2024-02-20T13:25:46Zhttps://fershad.com/writing/remote-working-after-two-months/<div><p>Last month, Buffer published their annual "State of Remote Work" report for 2019. I stumbled across it recently during a spell of general internet wandering. I read it, wanting to see how my remote working life (albeit in its infancy) stacks up against my peers globally. In doing so, I also took the opportunity to take a look at just how things have panned out for me so far. Below are my musings, paired up with some excerpts from the <a href="https://buffer.com/state-of-remote-work-2019">2019 State of Remote Work report</a>.</p><p><em>Aside: The findings are a good read for <strong>both employees and employers</strong>. It also introduced me to a fantastic new term to describe remote workers - "<strong>office-optional professionals"</strong>.</em></p><h2>Background</h2><p>A little over two months ago, I turned in my office swipe card and took the first steps towards being a solopreneur (freelancer, if you must). It's always a big step moving onto something new. From a business standpoint alone, I've already learnt a heck of a lot in these last two months. 
That includes:</p><ul><li>The process through which to establish a business here in Taiwan and what requirements there are to do so.</li><li>Understanding Taiwan's corporate tax system and the various ways in which businesses can operate within it.</li><li>Having to arrange for my own National Health Insurance. (Normally in Taiwan employers are obligated to do this for employees).</li><li>Setting up bank accounts, email services, designing and printing business cards ... the list goes on.</li></ul><h2>Working remotely - Two months & counting</h2><p>So, what's it been like to break the shackles of a 9am - 6pm working day? Well, to be fair, it does have many upsides but as a friend of mine who also works from home pointed out - "in the end work is still work". Here's how it has panned out so far.</p><h3>I work from home more than I thought I would.</h3><p>I really wasn't expecting this. I had always pictured myself working out of cafes more often than not. It also surprised me to see that the overwhelming majority of remote workers polled by Buffer also chose home as their primary place of work (graphic below). Part of my preference for staying home is that it is overall the cheapest of all the options. It's also just easier. There's no worrying about finding a place, remembering to pack all your gear, or commuting (especially in Taipei's humidity and rain).</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/bd5b111c47c483c3619cbb12d809b469baf7e377-1664x1178.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/bd5b111c47c483c3619cbb12d809b469baf7e377-1664x1178.png?auto=format" alt="Chart showing the primary working location for remote workers" loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">Primary working location for remote workers | 2019 State of Remote Work Report, Buffer</figcaption></figure><p>My personal order of preference for this question would be:</p><ol><li>Home</li><li>Libraries</li><li>Coffee shops and cafes</li></ol><p>Being introverted by nature, and really not minding time alone, I haven't yet tried a coworking space. I'm not convinced that it would be the right environment for me to work effectively. Also, right now, it's an expense I can avoid. Who knows, I might be proven totally wrong if/when I give coworking a shot.</p><h3>Switching off. The struggle is real.</h3><p>While I was an office-mandatory professional, I would make a point of not working extra hours or taking work home unless I really needed to. Now, I'm at home most of the time, and so is the work. This has made unplugging from work a real struggle. From Buffer's report, it seems to be something that a lot of other remote workers also have an issue with.</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/c8d2ea27f7a7f672fb978853c749c0761c637b2c-1664x1256.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/c8d2ea27f7a7f672fb978853c749c0761c637b2c-1664x1256.png?auto=format" alt="Chart showing the challenges faced by remote workers" loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">Challenges faced by remote workers | 2019 State of Remote Work Report, Buffer</figcaption></figure><p>In my case, I feel that one contributing factor is not having a separate office space where my work equipment is kept. I share a small 30 square meter studio apartment with my girlfriend and cat. It means that my laptop is always within easy reach, and I've often found myself working on something well past 10pm. It's not something that my girlfriend appreciates, and is definitely something that I am working to curtail. Some of the strategies I'm putting in place to do so are:</p><ul><li>Blocking out time for work in my calendar and sticking to it (this <em>should</em> also help to develop a bit more self-discipline).</li><li>Noting down any ideas that spring into my mind on paper and returning to them the next morning.</li><li>Keeping my laptop somewhere where I cannot see it.</li><li>Trying to establish a reading schedule of non-tech/web development content.</li></ul><h3>Flexibility is great, but you've still got to work.</h3><p>Having flexibility of schedule is definitely one of the major drawcards of working remotely. Errands can be run during the day, morning or afternoon bike rides can be planned, meetups can be easily arranged.</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/bbb4466f29b5d250486be62b15eebc549917f69e-1664x1178.png?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/bbb4466f29b5d250486be62b15eebc549917f69e-1664x1178.png?auto=format" alt="Chart showing biggest benefit of working remotely" loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">Core perceived benefit of working remotely | 2019 State of Remote Work Report, Buffer</figcaption></figure><p>Of course, somewhere amongst all that, work has to get done too. And that's been one of my biggest learnings in the last two months. The flexibility of schedule is excellent, just don't get swept up in it.</p><p>I have found myself at times realising that although I've been able to knock off a lot of household chores and errands, work that I had planned to complete for that day remains incomplete. That then feeds into not unplugging from work, and so a vicious cycle begins.</p><p>In a bid to change that, I have started trying to plan out my weeks in advance as best I can. This allows me to either set aside a day to get through all errands that need attending to or to spread them out across different days over a week. I find the latter approach to be a bit better, as it helps break up the days a bit, and also gets me out of the house more regularly.</p><h3>It's great being back on my bike.</h3><p>This one's a personal one, and not in the report obviously. About a month into working remotely, I realised that going anywhere during the day ate up a good deal of time and money. Whether it was just heading out to get lunch, going to a library or cafe, or running some errands, it was just not efficient to rely on public transport or my own two feet.</p><p>To speed things up, and save some money, I've started riding my bike again. Besides the time and cost savings, there are also obvious health benefits to be gained from commuting by bike. Getting around Taipei is relatively easy. It's a reasonably bike-friendly city (though there's still plenty of room for improvement). Being able to ride directly to wherever I need to go, rather than navigate the metro and buses, is also a huge plus.</p><p>Oh, and it has also given me greater freedom to explore more of the city I've called home for the past seven years. Not bad.</p><h2>Conclusion</h2><p>Those are just my experiences from the short time that I have been a remote worker. Though not everything has turned out how I imagined, it was reassuring to see just how similar my situation is compared with others around the world.</p><p>Making the leap has been, and continues to be, a great learning experience. So has writing this blog post. It has allowed me to reflect on what I've been through and how things are going. It's an exercise I might look at doing again in a few months.</p><p>All images above are taken from the <a href="https://buffer.com/state-of-remote-work-2019">State of Remote Work 2019</a> report, published by Buffer.</p></div>Storing, using, and keeping environment variables secret in local environments2024-02-20T13:25:46Zhttps://fershad.com/writing/store-use-secret-environment-variables-locally/<div><p>Environment variables are locally stored key=value pairs that can be accessed by your code. They're great for storing API keys, secrets, passwords and other sensitive material. They also help you avoid exposing those secrets to the public on GitHub, GitLab or Bitbucket. Here's how to create and use environment variables locally on your machine.</p><p>In the root folder of your app, create a file to store the keys. We'll call ours <strong>local-env</strong>.</p><p>Store any sensitive data you'll use in your app within this file:</p><pre class="language-text"><code class="language-text">export MY_API_KEY="ANAOFWQ14124124js214g"
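# add one export line per secret your app needs; keep this file out of version control (see the .gitignore note below)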
export EMAIL_PASSWORD="apasswordhere"</code></pre><p>Enter the root folder of your project using your terminal, and use the source command to bring in the local environment variables.</p><pre class="language-text"><code class="language-text">source local-env</code></pre><p>Now you can call the local environment variables in your app. An example with Node.js would be:</p><pre class="language-javascript"><code class="language-javascript">const Airtable_API_Key = process.env.MY_API_KEY</code></pre><p>One last thing to remember is to add local-env to your .gitignore file so that it's not published next time you push your project.</p></div>Build landing pages to promote webinars across different time zones2024-02-20T13:25:46Zhttps://fershad.com/writing/tutorial-webinar-time-zone-pages/<div><p>In a past life, I worked in the marketing department of a multi-national software company. Next to me worked a team responsible for scheduling and creating webinar series for our users.</p><h2><strong>Problem - Hardcoding time zones</strong></h2><p>A global audience meant promotional material needed regional localisation. This included localising the broadcast times of the webinars. The conversion and presentation of these times were all done manually. What's more, for languages like English that spanned locales including the US, UK & Australia, webinar broadcast times were presented together in a single block. The final result was a block of time zones and locations that were displayed to readers somewhat as follows:</p><blockquote>Live Broadcast <br />8:00am United Kingdom (GMT) <br />5:00pm Australia - Sydney (+10:00 GMT) <br />3:00am United States - New York (-5:00 GMT)</blockquote><p>In this tutorial, we're going to solve this information overload and present users with only a single time zone. We'll also give them the ability to switch between different time zones. We'll also be generating all pages dynamically using a static site generator.</p><h2><strong>What we'll build</strong></h2><p>We're going to create a set of static landing pages to promote a three-part webinar series. The webinars are broadcast live from London. We'll be creating landing pages that show the times for the live broadcasts in Sydney, Auckland, London, New York and Los Angeles. We're going to keep things as simple as possible and try not to use any frameworks or libraries. Let's go!</p><p>If you want to jump straight to the good stuff, here's the TL;DR version: <a href="https://boring-morse-2957ec.netlify.com/">Demo</a> | <a href="https://github.com/fishintaiwan/tutorial-timezone-webinars-landing-pages">Source</a></p><h2><strong>Solution</strong></h2><h3><strong>Part 1 - Setup</strong></h3><p>For this project, we'll be using Eleventy as our static site generator and Liquid as our templating language. Let's get started by creating a project folder and installing Eleventy using Node Package Manager.</p><pre class="language-text"><code class="language-text">mkdir timezone-webinars
cd timezone-webinars
npm install @11ty/eleventy --save-dev</code></pre><p>With Eleventy installed, let's go ahead and create a simple folder structure to house the components of our project.</p><pre class="language-text"><code class="language-text">root
├───filters
└───src
├───_data
└───_includes
└───css</code></pre><p>We're going to prepare some data that we'll use to populate our templates. We'll create two JSON data files & store them in the _data folder.</p><ul><li>Data for the three webinars in the series</li><li>A list of time zones we'll be converting to</li></ul><p>First, the webinars data file. We'll be storing a title, presenter, description, and broadcast time.</p><pre class="language-json"><code class="language-json">[
{ "title": "Setting up your business", "presenter": "Fershad", "description": "Tips, tricks, and a few shortcuts to help you start your own business.", "liveTime": "2019-09-20T10:45:00Z" },
{ "title": "Get your first client", "presenter": "Michael", "description": "Secure your first client, and start off on the road to success.", "liveTime": "2019-09-27T12:30:00Z" },
{ "title": "Handling your taxes", "presenter": "Andrew", "description": "Stay on top of your finances and tax reporting to avoid unexpected suprises.", "liveTime": "2019-10-04T15:00:00Z" }
]</code></pre><p>An important point to note regarding the liveTime field above: we are using the <a href="https://en.wikipedia.org/wiki/ISO_8601">ISO 8601 date and time format</a> for the broadcast times. This way, we can easily convert them later on while accounting for variables such as seasonal clock changes in some locales.</p><p>Now for the time zone data file.</p><pre class="language-json"><code class="language-json">[
{ "location": "Pacific/Auckland", "locale": "en-NZ", "name": "Auckland (UTC+13:00)"},
{ "location": "Australia/Sydney", "locale": "en-AU", "name": "Sydney (UTC+10:00)"},
{ "location": "Europe/London", "locale": "en-GB", "name": "London (UTC+00:00)"},
{ "location": "America/New_York", "locale": "en-US", "name": "New York (UTC-05:00)"},
{ "location": "America/Los_Angeles", "locale": "en-US", "name": "Los Angeles (UTC-10:00)"}
]</code></pre><p>You can get a full list of <a href="https://timezonedb.com/time-zones">time zones</a>, and <a href="https://stackoverflow.com/a/3191729">locales</a> here. Of course, you could also use other methods to store this data such as in an Airtable base or, for more complicated cases, in a Database-as-a-Service such as FaunaDB.</p><p>For completeness, let's create a .eleventy.js config file, and point it to our 'src' folder. Be sure to create/save this file in the root directory for your project. We'll also tell it to output our static files to a folder called 'dist'.</p><pre class="language-javascript"><code class="language-javascript">module.exports = function(eleventyConfig) {
return {
dir: {
input: "src",
output: "dist"
}
};
};</code></pre><p>With all the data ready, we can start to make the template file to create landing pages for our various time zones.</p><h3><strong>Part 2 - Templating</strong> </h3><p>Let's start with a straightforward template to check everything is working as planned. From there, we can go about building it into something that looks more akin to what you'd want to send to someone.</p><p>We'll be using Shopify's Liquid as our templating engine. It's supported out of the box by Eleventy, and I've found it to be pretty good in the time I've been using it.</p><p>Our simple template is going to take the data from the timezones.json file, and create a page for each using the location to form the file path. Save this file in the src folder.</p><pre class="language-html"><code class="language-html">---
layout: base.liquid
pagination:
data: timezones
size: 1
alias: zone
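  # one page will be generated for each entry in timezones.json; "zone" holds the current entry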
permalink: /{{ zone.location }}/index.html
---
<h1>{{ zone.name }}</h1>
<p>{{ zone.location }} | {{ zone.locale }}</p></code></pre><p>Let's also create a layout file for this template to use. You can see in the frontmatter of the zones.html file we've told Eleventy to look for a file called base.liquid to use for the layout. We'll create and store this file in the _includes folder.</p><pre class="language-html"><code class="language-html"><!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<meta http-equiv="X-UA-Compatible" content="ie=edge">
<title>Business Webinar Series</title>
</head>
<body>
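  <!-- Eleventy injects each page's rendered output where the content variable is printed below -->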
{{ content }}
</body>
</html></code></pre><p>Now if you run the eleventy command, you should see a new dist folder created (the output folder we set in the config). Within that should be individual folders for each of the time zones in our JSON file.</p><p>Next, let's bring the data for the webinars into our template file and create a filter that handles the time zone conversions for us.</p><h3><strong>Part 3 - Time zone filter</strong></h3><p>First up, getting the data for each webinar into the template file for our time zones is pretty straightforward.</p><p>For now, let's loop over the webinar data and output it. We'll worry about layout and formatting later. Add the following to the zones.html file we created earlier.</p><pre class="language-html"><code class="language-html">{% for webinar in webinars %}
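<!-- "webinars" is Eleventy global data, read from the webinars JSON file in the _data folder -->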
<div>
<h2>{{ webinar.title }}</h2>
<p>{{ webinar.description }}</p>
<p>{{ webinar.presenter }}</p>
<p>{{ webinar.liveTime }}</p>
</div>
{% endfor %}</code></pre><p>Rerunning the eleventy command and inspecting one of the generated pages should allow you to see the information from our webinars JSON file presented in HTML.</p><p>For converting the time zones, we'll use a custom filter. The filter takes the webinar liveTime data and the time zone data, and outputs a localised time from them.</p><p>Eleventy allows you to configure and add custom filters, which can then be called in our templates. We're going to create one now in the .eleventy.js config file we created earlier.</p><pre class="language-javascript"><code class="language-javascript">module.exports = function(eleventyConfig) {
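// register the custom filter under the name "addZone" so templates can pipe values through it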
eleventyConfig.addFilter("addZone", require("./filters/zone.js") );
return {
....
}
};</code></pre><p>Next, let's create the JavaScript file itself in the filters folder we set up earlier.</p><pre class="language-javascript"><code class="language-javascript">module.exports = function(time, zone) {
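// time is the ISO 8601 liveTime string from the webinars data; zone is one entry from the timezones data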
let locale = zone.locale || "en-US";
let liveTime = new Date(time);
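// toLocaleString renders the date in the target time zone, formatted for that zone's locale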
let localeTime = liveTime.toLocaleString(locale, { timeZone: zone.location});
return localeTime;
}</code></pre><p>We can now call this filter in our zones.html template. Note that we'll have to pass two arguments to this filter. If you need help with this take a look at <a href="https://www.fershad.com/blog/posts/pass-multiple-arguments-to-eleventy-filter/">this post</a>. We've also used Liquid's date filter to format the date to give a little consistency to the output.</p><pre class="language-html"><code class="language-html">{% for webinar in webinars %}
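<!-- addZone converts the UTC liveTime to this page's time zone; Liquid's date filter then formats the result -->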
<div>
<h2>{{ webinar.title }}</h2>
<p>{{ webinar.description }}</p>
<p>{{ webinar.presenter }}</p>
<p>{{ webinar.liveTime | addZone: zone | date: "%A, %B %e, %Y @ %l:%M %p" }}</p>
</div>
{% endfor %}</code></pre><p>Now, if you run the eleventy command and inspect the output files, you should see a different webinar liveTime for each region.</p><p>This output can be formatted and presented however you like. In the next part, we're going to quickly add some additional layout and formatting components to make things look more appealing to our pretend webinar audience.</p><h3><strong>Part 4 - Formatting & Layout</strong></h3><p>Let's create a simple three-column layout for displaying the webinars. We'll make it go down to one column on mobile devices for completeness.</p><p>Start by making changes to our layout and template file.</p><pre class="language-html"><code class="language-html"><head>
...
<link rel="stylesheet" href="/_includes/css/styles.css">
</head>
<body>
<div class="container">
{{ content }}
</div>
</body></code></pre><pre class="language-html"><code class="language-html"><h1>Tune in to our webinars!</h1>
<div class="holder">
{% for webinar in webinars %}
<div class="webinar">
<h2 class="title">{{ webinar.title }}</h2>
<p class="presenter">Presenter: {{ webinar.presenter }}</p>
<p class="description">{{ webinar.description }}</p>
<p class="time">{{ webinar.liveTime | addZone: zone | date: "%B %e, %Y at %l:%M %p" }}</p>
</div>
{% endfor %}
</div></code></pre><p>Then we can style it with CSS. We'll create a file called 'styles.css' in the _includes/css folder of our project.</p><pre class="language-css"><code class="language-css">body, html {
margin: 0;
padding: 0;
border: 0;
width: 100vw;
height: 100vh;
font-family: 'Gill Sans', 'Gill Sans MT', Calibri, 'Trebuchet MS', sans-serif;
}
body {
background: linear-gradient(45deg, rgba(152, 152, 152, 0.07) 0%, rgba(152, 152, 152, 0.07) 48%,rgba(136, 136, 136, 0.07) 48%, rgba(136, 136, 136, 0.07) 100%),linear-gradient(45deg, rgba(235, 235, 235, 0.06) 0%, rgba(235, 235, 235, 0.06) 79%,rgba(218, 218, 218, 0.06) 79%, rgba(218, 218, 218, 0.06) 100%),linear-gradient(135deg, rgba(12, 12, 12, 0.04) 0%, rgba(12, 12, 12, 0.04) 30%,rgba(79, 79, 79, 0.04) 30%, rgba(79, 79, 79, 0.04) 100%),linear-gradient(45deg, rgba(173, 173, 173, 0.03) 0%, rgba(173, 173, 173, 0.03) 66%,rgba(245, 245, 245, 0.03) 66%, rgba(245, 245, 245, 0.03) 100%),linear-gradient(135deg, rgba(84, 84, 84, 0.06) 0%, rgba(84, 84, 84, 0.06) 51%,rgba(165, 165, 165, 0.06) 51%, rgba(165, 165, 165, 0.06) 100%),linear-gradient(45deg, rgba(15, 15, 15, 0.02) 0%, rgba(15, 15, 15, 0.02) 14%,rgba(95, 95, 95, 0.02) 14%, rgba(95, 95, 95, 0.02) 100%),linear-gradient(0deg, rgba(34, 34, 34, 0.05) 0%, rgba(34, 34, 34, 0.05) 58%,rgba(98, 98, 98, 0.05) 58%, rgba(98, 98, 98, 0.05) 100%),linear-gradient(90deg, rgb(2, 110, 165),rgb(50, 216, 218));
}
.container {
width: 100%;
height: 100%;
max-width: 1280px;
padding: 0 0.5rem;
margin: 0 auto;
box-sizing: border-box;
display: flex;
flex-direction: column;
justify-content: center;
}
.container > h1 {
text-align: center;
color: #fff;
}
.holder {
width: 100%;
padding: 1.2rem;
}
.webinar {
width: 100%;
max-width: 320px;
margin: 1rem auto;
height: fit-content;
background: #FFFFFF;
border-radius: 5px;
position: relative;
padding: 0 0.8rem;
box-sizing: border-box;
border: 1px #FFFFFF solid;
}
.webinar > .time {
margin: 0;
top: -10px;
position: absolute;
background: #DF64BE;
padding: 5px 10px 5px 5px;
border-radius: 5px;
color: #FFFFFF;
}
.webinar > .title {
margin-bottom: 0.25rem;
}
.webinar > .presenter {
margin-top: 0.25rem;
color: #27749A;
font-size: 90%;
}
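/* on wider screens, lay the webinar cards out on a centred grid */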
@media screen and (min-width: 640px) {
.holder {
display: grid;
grid-gap: 1rem;
grid-template-columns: repeat(auto-fit, 380px);
justify-content: center;
}
}
</code></pre><p>The final thing we'll have to do is to tell Eleventy to copy this CSS file directly to our output folder. Our final config file should look like this:</p><pre class="language-javascript"><code class="language-javascript">module.exports = function(eleventyConfig) {
eleventyConfig.addFilter("addZone", require("./filters/zone.js") );
eleventyConfig.addPassthroughCopy("/_includes/css");
return {
dir: {
input: "src",
output: "dist"
}
};
};</code></pre><h3><strong>Extra</strong></h3><p>As an extra, let's quickly create a simple index page with links to all the pages we created earlier. We'll loop through the array of time zones we created earlier, and link to each page. We can do this because we set the permalink for each webinar page to be the time zone location.</p><pre class="language-html"><code class="language-html">---
layout: base.liquid
---
<div class="prose container">
<h1 class="h1">Select a time zone.</h1>
<ul class="links">
{% for zone in timezones %}
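<!-- each link matches the permalink pattern we set for that time zone in zones.html -->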
<li><a href="/{{ zone.location }}">{{ zone.location }}</a></li>
{% endfor %}
</ul>
</div></code></pre><p>Take a look at what the <a href="https://boring-morse-2957ec.netlify.com/">final pages look like</a>, and <a href="https://github.com/fishintaiwan/tutorial-timezone-webinars-landing-pages">view the source code on GitHub</a>. Of course, this is just a starting point. We haven't even begun to tackle topics such as localised text and location-based routing. Maybe I'll get to those in later posts.</p></div>Passing multiple arguments to an Eleventy custom filter2024-02-20T13:25:46Zhttps://fershad.com/writing/pass-multiple-arguments-to-eleventy-filter/<div><p>While working on a <a href="https://www.fershad.com/work/rwc-2019-fixtures-kanban/">recent project</a>, I came across a small problem. I needed to convert an event time from UTC to a particular time zone. I was building my website project using the Eleventy static site generator, so I felt that a custom filter would be the best solution to this problem.</p><p>The filter I made would take the UTC event time and the time zone for conversion as variables. Using JavaScript's toLocaleString() function, it would return a converted time. I was using Liquid as my templating engine and so first turned to Shopify's docs for a possible solution to the problem of passing multiple variables to a filter. That didn't turn up anything, so I started looking through Eleventy's docs. After quite a bit of digging, I <a href="https://www.11ty.io/docs/languages/liquid/#multiple-filter-arguments">found the answer</a> hidden away in the Liquid templating section of the docs.</p><p>In the end, you pass the first variable to the filter as you normally would, and all subsequent variables after the filter is declared. It looks like this:</p><pre class="language-javascript"><code class="language-javascript">.eleventy.js
eleventyConfig.addFilter("changeTime", require("./filters/time.js") );</code></pre><pre class="language-javascript"><code class="language-javascript">time.js
module.exports = function(match, zone) {
// ... stuff happens here
}</code></pre><pre class="language-html"><code class="language-html">zones.html
...
<p>{{ match.datetime | changeTime: zone | date: "%H:%M" }}</p>
...</code></pre><p>One interesting aside is that you can chain filters together. In the example above, the changeTime filter is run first, and then the result of that is formatted using Liquid's date filter. Pretty neat!</p><p></p></div>Presenting Portable Text2024-02-20T13:25:46Zhttps://fershad.com/writing/presenting-portable-text/<div><p>Over the last few weeks, I've learnt a lot while spinning up this blog using <a href="http://sanity.io/">Sanity.io</a>. It's still a growing baby, so I'm sure there'll be a lot more things to learn in the future as well.</p><p>While getting started, I used <a href="https://github.com/kmelve/eleventy-sanity-blog-boilerplate">Knut Melvær's Eleventy</a> boilerplate. It's a good starting point, covering everything needed to start up Sanity Studio, getting data using Sanity's API, and presenting it. That said, it is just a basic starter, so there's plenty you can build on top of it. One of the biggest challenges I faced while building upon the boilerplate was understanding and presenting Portable Text.</p><p>Portable Text allows for the serialisation of rich text into almost any markup language. It is what allows Sanity Studio to provide a rich text editor experience.</p><p>Out of the box, it works okay. If you'd like to make changes to the way elements are styled - say you want to add a class to links, and another class to images - then you can use marks or make changes to the type.</p><h2><strong>Marks</strong></h2><p><strong>When to use:</strong> When making modifications to <em>text</em> elements (e.g., spans, links, strong text, etc.)<br /><strong>Example: </strong>To give all links a particular class, you can pass something like the below into the serialiser.</p><pre class="language-javascript"><code class="language-javascript">marks: {
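// render each "link" mark as an <a> element with a custom class, keeping the stored href and the original child content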
link: props => (
h('a', {className: "your-class", href: props.mark.href}, props.children)
)
}</code></pre><p>What this does is find all marks that are of type "link", and return HTML accordingly.</p><h2><strong>Types:</strong></h2><p><strong>When to use: </strong>When you want to modify block types.<br /><strong>Example: </strong>To show all images with a particular class, and without using the figure element.</p><pre class="language-javascript"><code class="language-javascript">types: {
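// render each "image" block as an <img> element, building the src from the Sanity asset reference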
image: props => (
h('img', {className: "lazy img-responsive", src: urlFor(props.node.asset._ref).url()})
)
}</code></pre><p><em>Note: This also requires you to include <a href="https://www.npmjs.com/package/@sanity/image-url">Sanity's Image URL builder</a> in your JavaScript file.</em></p><p>I'm sure there's a lot more customization that can be done using custom block types, but that's not something I've explored at this stage.</p></div>Why I now run my website on Netlify and Sanity2024-02-20T13:25:46Zhttps://fershad.com/writing/hosting-on-netlify-sanity/<div><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">Update - May, 2021</p><p></p><p>This website is now hosted on Cloudflare Pages.</p><p></p></div><h2><strong>Background & Problem</strong></h2><p>I've been building websites using the Ruby on Rails framework and hosting them on <a href="https://www.heroku.com/">Heroku</a> for some time now. <a href="https://www.heroku.com/pricing">Heroku's free tier</a> is great for deploying websites that have a database-driven backend. However, websites hosted on the free tier go to 'sleep' after a period of inactivity. That means that new visitors might be sitting on a blank screen for some time while the website boots up in the backend. Not a good experience.</p><p>Don't get me wrong, I still host some sites that don't get regular traffic on Heroku's free tier. However, for my portfolio site, I wanted as fast a load time as possible. To get that, I switched to Heroku's 'Hobby' plan. For $7 per month, I was able to ensure my site didn't go to sleep. That met my needs while I was starting out. The cost wasn't an issue as I was still employed full-time as well.</p><p>Eventually, though, I started having to think about the costs of running a business without the cushion of a full-time salary.</p><div class="callout"><p class="h4 marker-highlight" data-highlight="accent-dark">TL;DR</p><p></p><div><ul><li>Able to save money on website and content hosting with free plans on both Netlify and Sanity.</li><li>Page speed performance improved by removing database layer.</li><li>Allowed me to build a customizable blog for the website fairly quickly.</li></ul><p></p></div><p></p></div><h2><strong>Solution - A content-driven static site</strong></h2><p>While I continued to learn more about web development, frameworks, and all that good stuff, I stumbled across the <a href="https://jamstack.wtf/">JAMstack</a> & static sites. At first, I wasn't convinced. Sure, a static site might be great for a landing page that doesn't need to change frequently, but maybe it wasn't for me. I had a few requirements for my portfolio site, including:</p><ul><li>The ability to easily add and manage projects</li><li>The ability to add new types of dynamic/editable content (like testimonials)</li><li>I also wanted to build my own customizable blog with a backend.</li></ul><p>Surely you need to build and host a database for that. Turns out you don't.</p><p>The more I learnt about JAMstack sites, the more it seemed like a no-brainer for me to move my website off Rails and onto the JAMstack.</p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/56d5d61456b2130edc24a2ebeecefc88e6bc92c0-1920x692.jpg?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/56d5d61456b2130edc24a2ebeecefc88e6bc92c0-1920x692.jpg?auto=format" alt="Have more money in the bank by moving from Heroku to Netlify" loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">Photo by Sabine Peters on Unsplash</figcaption></figure><h3><strong>Keeping the hosting cost down</strong></h3><p>Obviously, in making this move, I didn't want to increase the costs associated with hosting my website. Google Analytics was telling me that my website traffic was steady, but not all that significant. I kept this in mind while looking at different hosting options. After a bit of research, I was down to two choices - Amazon Web Services (AWS) or Netlify. I'd heard of AWS but didn't know much about Netlify.</p><p>AWS is great for delivering scalable websites and applications. You pay for what you use, which means you can suddenly experience a surge in website traffic and not worry about the site going down. You'll just end up being charged a little extra at the end of the billing month for the extra resources used. This is great for websites that see seasonal or event-related spikes in traffic. <a href="https://aws.amazon.com/getting-started/projects/host-static-website/">On their website</a>, AWS state that it would cost between $1-3 to host a static site.</p><p>That said, I finally settled on hosting my website on <a href="http://netlify.com/">Netlify</a>. For a few reasons:</p><ul><li><a href="https://www.netlify.com/pricing/">Netlify's free tier</a> provided everything I needed to host my site at no cost. The fact that I could also connect to my domain name, integrate forms, and not have my site fall asleep when not in use were all huge plus points as well.</li><li>Deploying to Netlify is ridiculously easy. Connecting to the GitHub repo for my website means I can automatically update my website by merely making a commit. If you're not updating content often, there's also a more straightforward drag-n-drop web interface.</li><li>All site files are hosted on a <a href="https://www.netlify.com/products/edge/">global content delivery network (CDN)</a>, meaning load times for visitors are significantly reduced.</li><li>Hosting and other services (like forms) could still be scaled up if I ever encountered a spike in traffic/usage. That said though, if I did need to scale up, then it would cost significantly more than AWS might cost.</li><li>It provides an SSL certificate out of the box for all sites for free! So HTTPS is instantly available for my website (for Heroku you get an SSL certificate by upgrading your site to the $7/month Hobby plan).</li></ul><p></p><h3><strong>But wait, your site is 'static' so it must be harder to edit content.</strong></h3><p>When you hear the term 'static site' it's easy to think that it is a website on which the content remains relatively consistent for the life of the site. That's definitely what I thought when I started out. As it turns out, though, that's certainly far from the case.</p><p>The 'A' in JAMstack stands for APIs. APIs allow one website to request data from another, which it can then process and present. It's with this logic that a whole world of new Content Management Systems (CMSs) has sprung up - Headless CMSs. The entire topic of headless CMSs is something I might delve into in another post later on. Put simply though, by utilizing a headless CMS, I am now able to have a content backend for my website without the bloat of WordPress, or the need to spin up a database.</p><p>Most headless CMS providers also offer a significant degree of flexibility. In my case, this allows me to manage my blog posts and projects through a single backend.
It also means that if I need to add any other changeable content later (like testimonials, for example), I can do so with just a few lines of code.</p><p>Of the many headless CMS providers out there, I chose to go with <a href="https://www.sanity.io/">Sanity.io</a>. Again, price played a part in this. For what I need from the service, Sanity's free plan provides more than the monthly capacity I require. It is also straightforward to scale and, through Sanity Studio, provides me with a customizable backend for creating and managing content.</p><p></p><h2><strong>Results</strong></h2><p>All in all, I've been able to meet the core requirements I had for my website when I set out on the road to moving it off the Rails framework. Some wins from the process have been:</p><ul><li>The only cost that I now have associated with running the website is paying for my domain name.</li><li>I was able to add a completely customizable, scalable backend to my website with minimal effort.</li><li>I no longer have to worry about setting up databases, migrations, etc.</li><li>With the migration, there was also an opportunity to make some revisions to the website structure and content.</li></ul><p></p><p>But that's all good stuff to me. What about visitors to my site? Well, take a look at the comparison below from <a href="https://developers.google.com/speed/pagespeed/insights/">Google Page Speed Insights.</a> The updated site, running on Netlify, scores better on all performance metrics than the older Rails one. For good measure, I've also included the results of the Rails site running on Heroku's free tier.</p><p><strong>Netlify Free Hosting</strong></p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/c818767d5ffdf58a5d50f0c7151811acec50ec3c-960x209.jpg?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/c818767d5ffdf58a5d50f0c7151811acec50ec3c-960x209.jpg?auto=format" alt="Performance metrics hosting on Netlify" loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">Performance metrics hosting on Netlify</figcaption></figure><p><strong>Heroku Hobby Plan ($7/month)</strong></p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/e7c87325738665834f59157fc9e6ae1423a03319-960x209.jpg?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/e7c87325738665834f59157fc9e6ae1423a03319-960x209.jpg?auto=format" alt="Performance metrics hosting on Heroku" loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">Performance metrics hosting on Heroku</figcaption></figure><p><strong>Heroku Free Plan</strong></p><figure>
<picture>
<source type="image/avif" srcset="https://fershad.com/image/fetch/f_avif/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/7b1cff06ef328ed83197db5ef0ef256f7f5d6d74-960x209.jpg?auto=format" />
<img src="https://fershad.com/image/fetch/f_auto/q_auto/https://cdn.sanity.io/images/twtrbzfo/production/7b1cff06ef328ed83197db5ef0ef256f7f5d6d74-960x209.jpg?auto=format" alt="Performance metrics hosting on Heroku's Free Plan" loading="lazy" decoding="async" height="300" width="300" />
</picture><figcaption class="figure-caption text-center">Performance metrics hosting on Heroku's Free Plan</figcaption></figure><p>Overall, I'm glad I took the time to update and move my site. Sure, there were a couple of hiccups along the way, mainly with existing redirects. On the whole, though, I definitely feel things are in a better state now than this time last month. Oh, and I've now got $7 spare change each month to play with!</p></div>'Hello, world.' A little bit about me.2024-02-20T13:25:46Zhttps://fershad.com/writing/hello-world-a-little-bit-about-me/<div><p>Yep, a totally cheesy title for a first post but whatever.</p><p>This will hopefully be the first of many blog posts in which I cover content spanning web development, running a business, freelancing, and just general musings. I'm sure with time, this blog will evolve, and the topics covered might expand, or even shrink. That said, here's a little bit more about me, my hobbies, my past lives professionally and what I'm doing these days.</p><p>Hi there, my name's Fershad. I'm a web developer living and working in Taipei, Taiwan. Taiwan's home for me now, but before moving here in 2012, I grew up in Australia. Of course, if you've <a href="https://www.fershad.com/about">looked around my website</a> you'd know all this, and a bit more, already.</p><p>I'm of Indian descent and grew up in an extremely multi-cultural part of Sydney. This allowed me to appreciate the uniqueness of individuals and cultures, something that I believe has fueled my desire to learn, travel, and grow as I've got older. Growing up, I played a lot of soccer, squash, and some cricket. Out of the three, I'd say squash gave me the biggest buzz. Though nothing can beat the sense of team that both soccer and cricket provide (beers after cricket on a Saturday were a plus too). After moving to Taiwan, I started playing Touch Football, which ironically is a sport extremely popular in Australia. Since going to my first social session, I've fallen in love with the game to the point where <a href="https://www.fershad.com/work/chinese-taipei-touch-2019-world-cup/">I was able to represent Taiwan (Chinese Taipei)</a> at the most recent FIT World Cup. Touch is fast-paced, skillful and so very, very fun to play. I've made some lifelong friendships both here in Taiwan, and around the world, just by playing, refereeing or coaching the sport I love.</p><p>Away from my hobbies, my professional life has been diverse. I've had jobs as a paralegal, an IT analyst for a large Australian bank, and most recently a marketing role with a software company here in Taiwan. Each role has taught me a different skillset, parts of which I've been able to carry over to the next role and others which I've taken with me as learnings for the future.</p><p>So that's a little about my past, now on to the present day. I've been <a href="https://www.fershad.com/work">building websites</a> on and off for the past couple of years. It's something I've found both challenging and fulfilling. That's why, as of July this year, I decided the time was right to take the plunge and do it full-time. Obviously, I want things to work out and be successful. I also want to learn more about myself, what it takes to run a business, as well as have time to pursue some of my other passions like cycling and learning Chinese.</p><p>Hopefully, that works out. Thanks for getting through this first post.
To know when future posts are published, give me a follow on one of my social media channels - <a href="https://www.facebook.com/fershad.digital">Facebook</a>, <a href="https://www.linkedin.com/company/fershad-digital/">LinkedIn</a>, and <a href="https://www.instagram.com/fershad.digital/">Instagram</a>.</p></div>