Tagged “chrome”
Saving CSS changes in DevTools without leaving the browser
> Browser devtools have made redesigning a site such a pleasure. I love writing and adjusting a CSS file right in the sources panel and seeing design changes happen as I type, and saving it back to the file. (…) Designing against live HTML allows happy accidents and discoveries to happen that I wouldn’t think of in an unconstrained design mockup.
I feel very late to the party here. I tend to tinker in the DevTools Element Styles panel rather than save changes. So, inspired by Scott, I’ve just tried this out on my personal website. Here’s what I did.
- started up my 11ty-based site locally, which launches a `localhost` URL for viewing it in the browser;
- opened Chrome’s DevTools at Sources;
- checked the box “Enable local overrides” then followed the prompts to allow access to the folder containing my SCSS files;
- opened an SCSS file in the Sources tab for editing side-by-side with my site in the browser;
- made a change, hit Cmd-S to save, and marvelled at the fact that this updated that file, as confirmed by a quick `git status` check;
- switched to the Elements panel, opened its Styles subpanel, made an element style change there too, then confirmed that this alternative approach also saves changes to a file.
This is a really interesting and efficient way of working in the browser, and I can see myself using it.
There are also a couple of challenges which I’ll probably want to consider. Right now, when I make a change to a Sass file, the browser takes a while to reflect that change, which diminishes the benefit of this approach. My site is set up such that Eleventy watches for changes to the sass folder as a trigger for rebuilding the static site. This is because, for optimal performance, I’m purging the compiled and combined CSS and inlining it into the `<head>` of every file… which unfortunately means that when the CSS is changed, every file needs to be rebuilt. So I need to wait for Eleventy to finish its build before the page I’m viewing shows my CSS change.
To allow my SCSS changes to be built and reflected faster, I might consider no longer inlining CSS, or inlining only a small amount of critical stuff… or maybe (best of all worlds) doing the inlining for production builds only, not in development. Yeah, I like that last idea. Food for thought!
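As a rough sketch of that last idea, the decision could live in one small helper that inlines the compiled CSS only for production builds and falls back to a plain `<link>` in development. The names here (`cssTag`, the file paths, the shortcode wiring) are illustrative assumptions, not from my actual setup:

```javascript
// Hypothetical helper: inline compiled CSS in production for first-paint
// performance; link the stylesheet in development so a Sass change doesn't
// force a rebuild of every page.
function cssTag(compiledCss, cssHref, isProduction) {
  if (isProduction) {
    // Production: inline the full compiled CSS into the <head>.
    return `<style>${compiledCss}</style>`;
  }
  // Development: reference the stylesheet externally instead.
  return `<link rel="stylesheet" href="${cssHref}">`;
}

// In an Eleventy config this might be wired up as a shortcode, e.g.:
// eleventyConfig.addShortcode("css", () =>
//   cssTag(fs.readFileSync("dist/site.css", "utf8"), "/site.css",
//          process.env.NODE_ENV === "production"));

console.log(cssTag("body{margin:0}", "/site.css", true));
console.log(cssTag("body{margin:0}", "/site.css", false));
```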
How to debug event listeners with your browser’s developer tools (on Go Make Things)
On the page, right-click the element you want to debug event listeners for, then click Inspect Element. In Chromium-based browsers like MS Edge and Google Chrome, click the Event Listeners tab in Developer Tools. There you’ll see a list of all the events being listened to on that element. If you expand an event, you can see which element it’s attached to and click a link to open the actual event listener in the JavaScript.
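One generic tip (mine, not from the linked article): giving handler functions names, rather than attaching anonymous arrow functions, makes each entry in that Event Listeners tab much easier to identify and jump to. A minimal sketch, using a bare `EventTarget` to stand in for a DOM element such as a `<button>`:

```javascript
// Record what the handler sees, so the effect is observable.
const clicks = [];

// A named function appears by name in the Event Listeners panel,
// whereas an arrow function shows up anonymously.
function onButtonClick(event) {
  clicks.push(event.type);
}

// EventTarget stands in here for a DOM element like a <button>.
const button = new EventTarget();
button.addEventListener("click", onButtonClick);
button.dispatchEvent(new Event("click"));

console.log(clicks); // → [ 'click' ]
```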
Choosing between online services
A recent issue of the dConstruct newsletter about choosing more ethical online services really chimed with me at a time when I’ve been reflecting on my online habits.
Clearleft produce an excellent regular technology-based newsletter – dConstruct – to which I heartily recommend subscribing.
A recent issue compared online services in the gig economy – such as Uber, Deliveroo and Airbnb – plus music services Spotify and Bandcamp, and considered the relative ethics of each with respect to the extent to which they exploit the sellers in their “marketplace”. For example, which services let the seller set the price? Airbnb does, and so does Bandcamp. But not so Uber and Spotify.
The success of services like Bandcamp – which is far more profitable for lesser-known producers than the likes of Spotify – shows that we don’t need to follow the crowd and can make better choices about the online services we use.
I’ve used Bandcamp more than usual in 2020 because I like the way they are actively supporting artists during a difficult period. I also like the convention that when you buy a vinyl release, the digital is also bundled free.
I’m currently typing this post in a Firefox tab and have been making an effort to switch (back) to it from Chrome, for a less invasive browsing experience.
I use DuckDuckGo rather than Google search when I remember, and have recently made it the default “address bar search” tool in Firefox which should help break old habits.
As for Facebook, Twitter and other time-draining, sometimes harmful social media platforms, well, I’m weaning myself off those and recently wrote about how I’m using Feedbin to aggregate news and updates.
I don’t know about you, but I find it helpful to do a periodic health check on how I’m using the internet, and see where I can make better choices.
Jank-free Responsive Images
Here’s how to improve performance and prevent layout jank when browsers load responsive images.
Since the advent of the Responsive Web Design era, many of us, in our rush to make images flexible and adaptive, stopped applying the HTML `width` and `height` attributes to our images. Instead we’ve let CSS handle the image, setting a `width` or `max-width` of 100% so that our images can grow and shrink but not extend beyond the width of their parent container.
However, there was a side effect: browsers load text first and images later, and if an image’s dimensions are not specified in the HTML, the browser can’t assign appropriate space to it before it loads. Then, when the image finally loads, it bumps the layout – affecting surrounding elements in a nasty, janky way.
CSS-Tricks has written about this several times, however I’d never found a solid conclusion.
Chrome’s Performance Warning
The other day I was testing this here website in Chrome and noticed that if you don’t provide images with inline width and height attributes, Chrome will show a console warning that this is negatively affecting performance.
Based on that, I made the following updates:
- I added `width` and `height` HTML attributes to all images; and
- I changed my CSS from `img { max-width: 100%; }` to `img { width: 100%; height: auto; }`.

NB the reason behind #2 was that I found the new CSS works better with an image that has inline dimensions than `max-width` does.
Which dimensions should we use?
Since an image’s actual rendered dimensions will depend on the viewport size, and we can’t anticipate that viewport size, I plumped for a `width` of 320 (a narrow mobile width) × a `height` of 240, which fits with this site’s standard image aspect ratio of 4:3.
I wasn’t sure if this was a good approach. Perhaps I should have picked values which represented the dimensions of the image on desktop.
Jen Simmons to the rescue
Jen Simmons of Mozilla has just posted a video which not only confirmed that my above approach was sound, but also provided lots of other useful context.
Essentially, we should start re-applying HTML `width` and `height` attributes to our images, because in soon-to-drop Firefox and Chrome updates the browser will use these dimensions to calculate the image’s aspect ratio and thereby be able to allocate the exact required space.
The actual dimensions we provide don’t matter too much so long as they represent the correct aspect ratio.
Also, if we use the modern `srcset` and `sizes` syntax to offer the browser different image options (like I do on this site), then so long as the different images share the same aspect ratio, this solution will continue to work well.
There’s no solution at present for the Art Direction use case – where we want to provide different aspect ratios dependent on viewport size – but hopefully that will come along next.
I just tested this new feature in Firefox Nightly 72, using the Inspector’s Network tab to set “throttling” to 2G to simulate a slow-loading connection, and it worked really well!
Lazy Loading
One thing I’m keen to test is that my newly-added inline `width` and `height` attributes play well with `loading="lazy"`. I don’t see why they shouldn’t – in theory they should all support each other well. In tests so far everything seems good, however since `loading="lazy"` is currently only implemented in Chrome, I should re-test images in Chrome once it adds support for the new aspect-ratio calculating feature, around the end of 2019.
Native lazy-loading for the web
Now that we have the HTML `loading` attribute, we can set `loading="lazy"` on our website’s media, and the loading of non-critical, below-the-fold media will be deferred until the user scrolls to them.
This can really improve performance so I’ve implemented it on images and iframes (youtube video embeds etc) throughout this site.
This is currently only supported in Chrome, but that still makes it well worth doing.
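For illustration, here’s roughly what the attribute looks like in use on both element types (URLs and dimensions are placeholders, not from this site):

```html
<!-- A below-the-fold image: deferred until the user scrolls near it. -->
<img src="photo.jpg" width="320" height="240" loading="lazy" alt="">

<!-- An embedded video iframe can be lazy-loaded the same way. -->
<iframe src="https://www.youtube.com/embed/VIDEO_ID" loading="lazy"
        width="560" height="315" title="Embedded video"></iframe>
```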
$$ in the DevTools Console
I learned something new today when developing in the Firefox Dev Tools console (although this applies to Chrome too)—something which was really useful and which I thought I’d share.
Basically, type `$$('selector')` into the console (replacing selector as desired) and it’ll give you back all matching elements on the page.

So for example, `$$('script')` or `$$('li')`.

Similarly you can select a single element by instead using one dollar sign (`$`).
These seem to be console shortcuts for `document.querySelector()` (in the case of `$`) and `document.querySelectorAll()` (in the case of `$$`).
The other really cool thing is that the result is returned as an array rather than a NodeList, so you can use array methods, e.g. `$$('li').forEach(…)` or similar.
via @rem (Remy Sharp)