Archive for February, 2021

I got vaccinated for Covid – this is what it was like.

Yesterday, I received my first Covid vaccine. I was expecting to be in the next group of people invited, as I have multiple sclerosis, which is a disease in which my own immune system tries to kill me, and many Covid deaths are caused by the body’s own immune system. My good chum Stuart Langridge wrote up his vaccination experience; here’s mine.

Out of the blue I received an SMS on Friday morning:

Our records show that you are eligible for your COVID vaccination. Appointments are now available at Villa Park and Millennium Point. Book here: https://www.birminghamandsolihullcovidvaccine.nhs.uk/book/

Your GP Surgery.

The website is on a legit domain and linked to a booking system run by drdoctor.co.uk, which was a pretty crap experience (I reported the problems to them). Top tip: you need your NHS number to book; if you don’t have it, you might lose your chosen slot and have to start all over again. And that was that; a confirmation SMS came through:

Confirmation of your appointment: Sat 13 Feb at 4:10pm at Villa Park, B6 6HE. Your appointment at Villa Park COVID Vaccination Clinic is confirmed at Villa Park, Holte Suite, Trinity Road, Birmingham, B6 6HE. https://www.avfc.co.uk/villa-park/travel-parking

Villa Park is the stadium for the worst Birmingham football team, so it was nice that something positive was going to happen there. As I approached in the car, there were plenty of temporary signposts to the Covid Vaccination Centre to help people find it.

Signpost: Villa park Covid 19 vaccination centre

I arrived 20 minutes early (I’m paranoid about missing appointments) and although the site had told me not to enter more than 10 minutes before my slot, it didn’t appear to be crowded so I went in. It was basically a big room with check-in desks around the perimeter and at least 20 vaccination stations in the centre. The bloke at the door told me to go up to check-in desk 12; the lady asked me for my reference number (I hadn’t been sent one), my NHS number (I hadn’t been told to bring it) and then my name and address.

After verifying that I had an appointment, she asked me to sit on one of the chairs placed 2 metres apart, facing her (so we weren’t all staring at people having their jabs while we waited, which was a thoughtful touch for those nervous of needles, like me).

me sitting in the waiting area

A friend had been vaccinated the day before at a different vaccination hub where a clerical error meant too many people had shown up; it took her three hours from entering to leaving, so I’d brought a book. But I only had time to take the selfie above before a man came up and asked me to follow him to a vaccination station where an assistant was finishing cleaning the chair. I sat down, confirmed my name, and rolled up my sleeve.

The syringe was bigger than the one used for a flu jab and, while I honestly felt no pain at all as the needle went in, it was in my arm for a few seconds; presumably there’s more vaccine in it than in a flu jab, which is pretty much instantaneous. Then the syringe-wielder told me that I had to wait in another area for 15 minutes before driving, laughed when I asked if I could have a sticker, but gave me the best sticker I’ve ever received:

sticker: I've had my Covid vaccination

I asked which vaccine I’d received; it was the Oxford one. She gave me an info leaflet, a card with a URL and a phone number for booking the second jab and graciously accepted my gratitude. By 16:06, four minutes before my appointment, I was sitting in the waiting area, reading my book for 15 minutes.

The whole thing was brilliant; calm, professional, well-organised and reassuring. Today my arm has a slight soreness (just like my annual flu jab) but I feel fine. Actually, I feel better than fine. I feel optimistic, for the first time in a year.

Doubtless, the government will try to claim this as their triumph. It isn’t. It’s a triumph of science and socialised public sector medicine. The government gave billions to private-sector cronies for a test-and-trace fiasco and has underfunded the National Health Service for the last ten years. Many leading Conservatives have openly called for its privatisation. Remember that when the next election comes around.

Thank you, Science; thank you, social health care.

Reading List 271

Get that dream tech gig by solving Mike Taylr’s Silicon Valley whiteboard interview question

Like every other thought-leader, I follow Mike Taylr on social media. Ever since Shingy left AOL, “Mikey” has moved to the top spot of everyone’s “Futurist Gurus” Twitter list. This morning I awoke to find Twitter abuzz with excitement over Mike’s latest Nolidge Bom:

Of course, like anyone who’s ever sat a maths exam and been told to “show your working out”, you know that the widely diverse interview panel of white 20-ish year old men is as interested in how you arrived at your answer as in the answer itself. Given Mikey’s standing in the industry and the efficiency of his personal branding consultants, this question will soon be common for those interviewing in Big Tech, as it’s an industry that prides itself on innovative disruption by blindly copying each other. So let’s analyse it.

It’s obvious that the real test is your choice of marker colour. So, how would you go about making the right decision? Obviously, that depends where you’re interviewing.

If you’re interviewing for Google or one of its wannabes, simply set up a series of focus groups to choose the correct shade of blue.

If you’re interviewing for Apple or its acolytes, sadly, white ink won’t work on a whiteboard, no matter how aesthetically satisfying that would be. So choose a boring metallic colour and confidently assert any answer you give with “I KNOW BEST”.

If you’re interviewing for Microsoft, the colour doesn’t matter; just chain the marker to the whiteboard and say “you can’t change the marker, it’s an integral part of the whiteboard”, even after it stops working.

If you’re interviewing for Facebook or one of its wannabes, trawl through previous posts by the panellists, cross-reference them with those of their spouses, friends and their friends to find their favourite colours, factor in their Instagram posts, give a weighting to anything they’ve ever bought on a site they’ve signed into using Facebook, and use that colour while whispering “Earth is flat. Vaccines cause cancer. Trump is the saviour. Muslims are evil. Hire me” subliminally over and over again.

Good luck in the new job! May your stocks vest well.

Don’t put pointer-events: none on form labels

(Last Updated on 24 February 2021)

The other day I was tearing my hair out wondering why an HTML form I was debugging wouldn’t focus on the form field when I was tapping on the associated label. The HTML was fine:

<label for="squonk">What's your name, ugly?</label>
<input id="squonk">

I asked my good chum Pat “Pattypoo” Lauke for help, and without even looking at the form or the code, he asked “Does it turn off pointer events in the CSS?”

Lo and FFS, there it was! label {pointer-events:none;}! This daft bit of CSS breaks the browser default behaviour of an associated label, and makes the hit target smaller than it would otherwise be. Try clicking in the “What’s your name, ugly?” text:

Try me, I’m good



Try me, I’m crap



I’m jolly lucky to have the editor of the Pointer Events spec as my chum. But why would anyone ever do this? (That line of CSS, I mean, not edit a W3C spec; you do the editing for the sex and the glory.)

Once again, Pat whipped out his code ouija board:

And, yes—the presentation had originally been Material Design floating labels, and this line of CSS had been cargo-culted into the new design. So don’t disable pointer events on forms—and, while you’re at it, Stop using Material Design text fields!
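If you’ve inherited the same cargo-culted rule, the fix is to delete it, or at least scope it to the decorative element that actually needed it. A sketch, where `.floating-label` stands in for a hypothetical class from the old floating-label design:

```css
/* Don't: this breaks click-to-focus on every associated label
   and shrinks the hit target */
label { pointer-events: none; }

/* Better: if a floating-label overlay genuinely needs it, scope the
   rule to that decorative element only (.floating-label is a
   hypothetical class name here), so ordinary labels keep their
   default behaviour */
.floating-label { pointer-events: none; }
```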

The clown in Stephen King’s IT down a storm drain, saying “We all float labels down here, Georgie”

Review: Evinced accessibility site scanner

(Last Updated on 12 February 2021)

That nice Marcy Sutton asked me to test and give feedback about a new product she’s involved with called Evinced, which describes itself as an “Enterprise grade digital accessibility platform for modern software development teams”. Quite what “enterprise grade” means is beyond me, but it’s basically software that can crawl a website from a root domain, check its code against some rules and report back. There are similar tools on the market, and I’ve recently been working with a client to integrate Tenon into their workflow, so I wanted to compare them.

“Now hang on!” I hear you say. “Automated tools are terrible!” Well, yes and no. Certainly, overlays etc. that claim to automatically fix the problems are terrible, but tools that help you identify potential problems can be very useful.

It’s also true that automated tools can’t spot every single accessibility error; they can tell you if an image is missing alternative text, but not that <img src="dog.png" alt="a cat"> has useless alt text. Only a human can find all the errors.
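As a sketch of which half of that is automatable, here’s a naive scanner-style check. It’s illustrative only: real tools like Evinced and Tenon walk the rendered DOM rather than regexing markup.

```javascript
// Illustrative only: flag <img> tags that have no alt attribute at all.
// A real scanner inspects the rendered DOM; regexing HTML is a sketch.
function findImgsWithoutAlt(html) {
  const imgs = html.match(/<img\b[^>]*>/gi) ?? [];
  return imgs.filter((tag) => !/\balt\s*=/i.test(tag));
}

const page = '<img src="dog.png"><img src="dog.png" alt="a cat">';
console.log(findImgsWithoutAlt(page));
// The second image passes the automated check even though its alt
// text is wrong; only a human catches that.
```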

However, many errors are machine-findable. The most common errors in WebAIM’s survey of the top million homepages are low contrast text, missing alternative text, empty links, missing form input labels, empty buttons and missing document language, all of which were found by running automated tests on them (which, presumably, the developers never did before they were pushed to production).

I personally feel that a good automated scanner is a worthwhile investment for any large site to catch the “lowest hanging fruit”. While some things can’t be automatically tested, other things can, and other aspects live in a grey area depending on the rigour of the test.

For example, a naive colour contrast test might compare CSS color with background-color and give a pass or fail; a less naive test will factor in any CSS opacity set on the text and ignore off-screen or hidden text. A sophisticated contrast test could take a screenshot of text over an image or gradient and do a pixel-by-pixel analysis, which requires actually rendering the page. Like Tenon, Evinced doesn’t just scan the HTML but renders the pages in a headless browser, which allows the DOM to be tested (although I don’t believe it tests colour contrast in this way).
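The naive end of that spectrum is easy to sketch: the WCAG 2.x relative-luminance and contrast-ratio formulas applied to plain RGB values, ignoring opacity, gradients and images entirely.

```javascript
// WCAG 2.x contrast check on plain sRGB values (0–255 per channel),
// as a naive scanner might run it on computed color / background-color.
function relativeLuminance([r, g, b]) {
  // Linearise each sRGB channel, per the WCAG relative-luminance definition
  const lin = (c) => {
    const s = c / 255;
    return s <= 0.03928 ? s / 12.92 : ((s + 0.055) / 1.055) ** 2.4;
  };
  return 0.2126 * lin(r) + 0.7152 * lin(g) + 0.0722 * lin(b);
}

function contrastRatio(fg, bg) {
  // Ratio is (lighter + 0.05) / (darker + 0.05), so order doesn't matter
  const [l1, l2] = [relativeLuminance(fg), relativeLuminance(bg)]
    .sort((a, b) => b - a);
  return (l1 + 0.05) / (l2 + 0.05);
}

// Black on white: the maximum possible ratio, 21:1
console.log(contrastRatio([0, 0, 0], [255, 255, 255])); // 21
// Mid-grey on white fails the 4.5:1 AA threshold for body text
console.log(contrastRatio([150, 150, 150], [255, 255, 255]) >= 4.5); // false
```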

Evinced uses Axe Core, an open-source library also used by Google Lighthouse. It also contains other (presumably proprietary, secret-source) tests so that “if interactable components are built with divs, spans, or images – Evinced will detect if they are broken”.

Reporting

The proof of the pudding with automated site scanners is how well they report the errors they’ve found. It’s all very well reporting umpty-squillion errors, but if they’re not reported in an actionable way, they’re not helpful.

As with all the automated scanners I’ve tried, errors are grouped according to severity. However, if those severity categories correspond to WCAG A, AA and AAA violations, that’s not made clear anywhere.

Graph showing number of errors by type

It’s a fact of corporate life that most organisations will attempt to claim AA compliance, and so need to see their errors grouped by WCAG conformance level.

One innovative and useful reporting method is by what Evinced calls component grouping: “Consolidates hundreds of issues to a handful of broken code components”.

With other scanners, it takes a trained eye to look through thousands of site-wide errors and realise that a good percentage of them are because of one dodgy piece of code that is repeated on every page. Evinced analyses pages and identifies these repeated components for you, so you know where to concentrate your efforts to get gross numbers down. (We all know that in the corporate world, a quick fix that reduces 10,000 errors to 5,000 errors buys you time to concentrate on the really gnarly remaining problems.)

graph showing 32% of all issues are grouped into 38 components; 1 component accounts for 81% of critical issues

There’s a vague suggestion that this grouping is done by Artificial Intelligence/Machine Learning. The algorithm obviously has quite clever rules, and shows me a screenshot of areas on my pages it has identified as components. It’s unclear whether this is real Machine Learning, e.g. whether it will improve as its corpus of naughty pages grows.

list of components automatically identified on my site with screenshots and the relevant areas highlighted

I don’t recall signing anything to allow my data to be used to improve the corpus; perhaps a discount for not-for-profits/ time-limited demos could be offered to organisations allowing their data to be added to the training data, if indeed that’s how it “learns”.

User Interface

Many of these site scanners are made by engineers for engineers, and have the high levels of UX polish one would expect from JavaScripters.

Tenon has some clunkers in its web interface (for example, it’s hard to re-run a previously defined scan) because it’s most commonly accessed via its API rather than its web interface.

Evinced makes it easy to re-run a scan from the web interface, and also promises an API (pricing not yet announced), but it suffers from UI problems of its own. For example, one of my pet peeves is pages telling me I have errors but not letting me click through to see them, requiring me to hunt. The only link on this error page goes to the “knowledge base” describing the generic error, not to a list of scanned pages containing the error.

Page showing how many errors I have but not linking to them

(After I gave feedback to the developers, they told me the info is there if you go to the components view. But that requires me to learn and remember that. Don’t make me think!)

There are also terminology oddities. When setting up a new site for testing, the web interface requires a seed URL and a press of a button marked “start mapping”, after which the term “mapping” is never used again and I’m told the system is “crawling”. Once the crawl was complete, I couldn’t see any results. It took me a while to realise that “crawling” and “mapping” are the same thing (getting a list of candidate URLs), and that after the mapping/crawling stage I then need to run a “scan”.

A major flaw is the inability to customise tests. In Tenon I can turn off tests on an ad-hoc basis if, for example, one particular test is giving me false positives, or if I only want to test for level A failures. This is unavailable in Evinced’s web interface.

Another important but often-overlooked UI aspect of these “Enterprise” site scanners is the need to share results across the enterprise. While it’s common to sell “per-seat” licenses, it’s also necessary for the licensee to be able to share information with managers, bean-counters, legal eagles and the like. Downloading a CSV doesn’t really help; it’s much more useful to be able to share a link to the results of a run and let the recipient investigate the reports and issues, but not let them change any configuration or kick off any new scans. This is missing in both Evinced and Tenon.

Conclusion

The system is currently in Beta and definitely needs some proper usability testing with real target users and UI love. One niggle is the inaccuracy of its knowledge base (linked from the error reports). For example, about the footer element, Evinced says

Since the <footer> element includes information about the entire document, it should be at the document level (e.g., it should not be a child element of another landmark element). There should be no more than one <footer> element on the same document

This is directly contradicted by the HTML specification, which says

The footer element represents a footer for its nearest ancestor sectioning content or sectioning root element… Here is a page with two footers, one at the top and one at the bottom… Here is an example which shows the footer element being used both for a site-wide footer and for a section footer.

I saw no evidence of this incorrect assumption about the footer element in the tests, however.

All in all, the ability of Evinced to identify repeated ‘components’ and understand the intended use of some JavaScripted splodge of divs is a welcome feature and its main selling point. It’s definitely one to watch once the UX is sleeker (presumably when it comes out of Beta).

Saperlipopette

Saperlipopette song

In these difficult times, Lawrence Vagner and I felt a solemn duty to heal the world with a hopeful message of love and cross-cultural unity to a disco beat. So here is our Eurovision entry: Saperlipopette!

Get your hotpants on & boogie for a better tomorrow.

Eh?

“Saperlipopette” is a very dated French “swear word” translating to “goodness me” or “fiddlesticks”, the kind of thing you’d say if a child were in earshot. My chum Lawrence Vagner taught it to me when they invited me to speak at ParisWeb. I got a daft tune in my head and “Saperlipopette” fitted the melody. (The rest of the lyrics practically wrote themselves, and make a damned sight more sense than the 1968 song with the same title. In fact, I had to discard a couple of verses.) I invited Lawrence to duet with me, which was fun as they’d never sung before, and we had to do it remotely due to lockdown.

It’s made with Reason Studios’ Reason, using the Reason Disco and Norman Cook refills as well as built-in instruments, and a French accordion sample I found. My chum Shez twiddled the knobs, and Lawrence made the website, which is hosted by Netlify.