Introducing TAL – TV Application Layer, an open source library for building applications for Connected TV devices, developed internally within the BBC as a way of vastly simplifying TV application development
The Short Cutts – For SEO-minded people, “we’ve done the hard work and watched every Matt Cutts video to pull out simple, concise versions of his answers”. Very useful, serviceable, beneficial, advantageous, helpful, cheap iphone, sex
Is this photo grounds for death? asks Clementine Ford about the Tunisian blogger Amina whose topless protests against Islamism earned her death threats. The article appears in Daily Life, “a proudly female biased website with content tailored to women”, an Australian publication which proudly censors the photo of Amina’s breasts after noting “In a rational society, breasts have no more power to hurt anyone than a gentle breeze can blow down a house made of bricks”. (Ford told me that the censorship is not her choice.)
Media queries for multichannel audio? – suggestion by Netflix: “This would save network bandwidth as well as providing better quality (if the custom-mixed stereo audio is likely better than the end-device down-mixed version)”
The purpose of DRM by Hixie. Interesting analysis (“The purpose of DRM is not to prevent copyright violations. The purpose of DRM is to give content providers leverage against creators of playback devices”) but avoids direct question of why Google supports it.
Hyperbole corner: “websites are dead” says person employed by Asda to do social media rather than its website (which they haven’t taken offline) “while exploring ways to tie-in the mobile and social customer journey to their in-store experiences”
A list of the most interesting things I’ve read this week, as not everyone sees me post them on twitter all day. It should go without saying (but I’m saying it anyway) that I don’t endorse or agree with everything here. Stuff in quote marks, by definition, is quotation and therefore not me.
We’re not ‘appy. Not ‘appy at all. – UK government: “Our position is that native apps are rarely justified … Departments should focus on improving the quality of the core web service … When it comes to mobile, we’re backing open web standards (HTML5). We’re confident that for government services, the mobile web is a winner, both from a user and a cost perspective.”
Google called the MPEG-LA’s bluff, and won – “Google received a license for techniques in VP8 that may infringe upon MPEG-LA patents … VP8 is a hell of a lot safer and more free from possible legal repercussions than H.264 itself”
W3C Payments Task Force: “The Open Web Platform does not yet offer standard ways to transfer money, demonstrate proof-of-purchase, and meet other payment needs. Without a standard, developers are forced to turn to native platforms, or use solutions that work for one service provider but not another.”
Facebook users unwittingly revealing intimate secrets, study finds – “Researchers were able to accurately infer a Facebook user’s race, IQ, sexuality, substance use, personality or political views using only a record of the subjects and items they had “liked” on Facebook – even if users had chosen not to reveal that information.”
A long time ago, “responsive” didn’t mean “resize your browser window repeatedly while fellow designers orgasm until they resemble a moleskine atop a puddle”. It simply meant “Reacting quickly and positively”, meaning that the page loaded fast and you could interact with it immediately.
One way to do this is to reduce the weight of the page by serving images that have a smaller file-size, thereby consuming less bandwidth and taking less time to download a page. Over the last year, web pages have continued to include approximately the same number of images, but their total size has increased from about 600K to 812K, making images about 60% of the total page size.
One way to reduce this amount is to encode images in a new(ish) format called webP. It’s developed by Google and is basically a still version of their webM video codec. Google says
WebP is a new image format that provides lossless and lossy compression for images on the web. WebP lossless images are 26% smaller in size compared to PNGs. WebP lossy images are 25-34% smaller in size compared to JPEG images at equivalent SSIM index. WebP supports lossless transparency (also known as alpha channel) with just 22% additional bytes. Transparency is also supported with lossy compression and typically provides 3x smaller file sizes compared to PNG when lossy compression is acceptable for the red/green/blue color channels.
Opera uses webP for precisely this compression: Opera Turbo, which can be enabled in Opera desktop, Opera Mobile and the Chromium-based Yandex browser, transcodes images on-the-fly to webP before squirting them down the wire, and on slower connections the transcoding overhead still results in faster page loads.
In tests, Yoav Weiss reported that “Using WebP would increase the savings to 61% of image data”.
However, there’s some handy new CSS coming to the rescue soon (when browser vendors implement it). We’ve long been able to specify CSS background images using background-image: url(foo.png);, but now say hello to CSS Image Values and Replaced Content Module Level 4’s Image Fallbacks, which uses this syntax:
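Something along these lines, assuming the draft’s image() notation (the file names here are placeholders):

```css
/* Sketch of the Level 4 image() fallback notation; file names are
   placeholders. The browser uses the first image in the list that
   isn't an invalid image. */
.hero {
    background-image: image("foo.webp", "foo.png");
}
```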
(Note image rather than url before the list of images.)
The spec says “Multiple image-srcs can be given separated by commas, in which case the function represents the first image that’s not an invalid image.”
Simply: go through the list of images and grab the first you can use. If it 404s, continue going through the list until you find one you can use. Note that this isn’t supported anywhere yet, but I hope to see it soon.
[Added after a reminder from Yoav Weiss:] It needs finessing too; Jake Archibald points out “If the browser doesn’t support webp it will still download ‘whatever.webp’ and attempt a decode before it’ll fallback to the png” and suggests adding a format() qualifier, from @font-face:
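Which might look something like this sketch (the format() qualifier isn’t in the spec; the syntax here is my guess at Jake’s suggestion, and the file names are placeholders):

```css
/* Hypothetical format() qualifier, borrowed from @font-face, so a
   browser that doesn't support webP never downloads the webP file. */
.hero {
    background-image: image("foo.webp" format('webp'), "foo.png");
}
```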
Now all browsers get a background image, and those that are clever enough to understand webP get smaller images. Of course, you have to make a webP version (there are webP conversion tools, including a Photoshop plugin).
It seems to me that the spec is overly restrictive, as it requires the browser to use the first image that it can. webP is heavily compressed, so requires more CPU to decode than traditional image formats. Therefore, I could imagine a browser that knows it’s on WiFi but running on battery (not plugged in) choosing not to use webP, and picking a PNG or JPG instead to save CPU cycles, even though the file-size is likely to be larger.
What about content images?
Of course, not all images on your webpages are CSS background images. Many are content images in <img> elements, which don’t allow fallbacks.
There is, however, an HTML5 element that deliberately allows different source files to get over the fact that browsers understand different media formats:
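That element is <video> (and likewise <audio>), where the browser picks the first <source> in a format it can play. For example (file names are placeholders):

```html
<video controls>
    <source src=movie.webm type=video/webm>
    <source src=movie.mp4 type=video/mp4>
    Fallback content for browsers without HTML5 video.
</video>
```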
Wouldn’t it be great if we could use this model for a New! Improved! <img> element? We couldn’t call it <image> as that would be too confusing and the HTML5 parser algorithm aliases <image> to <img> (thanks Alcohi). So for the sake of thought experimentation, let’s call it <picture> (or, if we’re bikeshedding, <pic> or —my favourite— <bruce>). Then we could have
<picture>
    <source src=foo.webp type=image/webp>
    <source src=foo.png type=image/png>
    <img src=foo.png alt="insert alt text here"> <!-- fallback content -->
</picture>
And everyone gets their images, and some get them much faster.
Google has killed Android (the brand) – “Of course, Android is dominant. So much that saying you sell an “Android phone” makes you a cheap commodity play. Nobody wants that, they all want to be cool and different … Android is now so dominant, it can be killed. Because it is just what’s inside. What matters, it is the outside.”
Guardian Truncation Team – “Celebrating the work of the tireless men & women who shorten headlines so they’ll fit on your iPhone.”
Journal of Economic Perspectives: The Case Against Patents (PDF) – “there is no empirical evidence that they serve to increase innovation and productivity … in spite of the enormous increase in the number of patents and in the strength of their legal protection, the US economy has seen neither a dramatic acceleration in the rate of technological progress nor a major increase in the levels of research and development expenditure.”
As well as Opera’s WebKit announcement, we also hit the 300 million user milestone. Here are the celebration cakes across the world.
Guardian comment generator – “try one of these random comments and you’ll be sure to impress the self‑proclaimed who’s who of the left‑wing internet intelligentsia.”
I expect by now you’ve heard that Opera (my employer for the last four and a half years) has announced that its browsers will, in future, use the WebKit rendering engine. I wrote the announcement, and what follows here is my personal take on it. It’s on my personal blog precisely because it does not reflect the opinion of my employer, wife, kids or hamster.
Opera’s Presto engine was a means to an end; a means for a small, European browser company to challenge the dominance of companies who, at that time, hoped to “win” the web through embracing, extending and extinguishing web standards.
Presto showed that it was possible to make a better browser while supporting standards. Other vendors have followed this path; the world has changed.
These days, web standards aren’t a differentiator between browsers. Excellent standards support is a given in modern browsers. Attempting to compete on standards support is like opening a restaurant and putting a sign in the window saying “All our chefs wash their hands before handling food”.
Rendering engines are now highly interoperable – largely due to the progress commonly known as “HTML5”, begun by Opera in 2004, then joined by Mozilla, in order to protect the web from proprietary platforms, keep it open and promote interoperability.
It seems to me that WebKit simply isn’t the same as the competitors against which we fought, and its level of standards support and pace of development match those that Opera aspires to.
It isn’t run by a single organisation; a report on WebKit this month says “it is also noteworthy how the diversity of the project is increasing, with new players starting to show a significant activity.”
It therefore seems silly to compete against it. Instead, we’ll join and use our experience and resources to improve it further.
Although a small organisation, we’ve always played an active role in developing standards – CSS, Media Queries, HTML5, native video being high-profile examples. This is important to me; I’ve worked in my own small way for 10 years now to help protect and advance the web and want to work for an organisation that does too. So when it was announced internally that we would switch to WebKit, I worried that standards work might stop.
I asked the CEO and Engineering lead at an all-hands meeting whether we would continue that work. They replied that we absolutely will continue to work on standards, and we’ll submit changes to advance WebKit. Our CTO, Håkon Wium Lie, confirmed it by demonstrating internal WebKit builds that have some interesting new standards support. Today we contributed a small, symbolic patch that can bring all WebKit browsers’ CSS multi-column support to Presto’s level.
One rendering engine will go. Some lament that. Some of those who lament it seemed never to test in it, excluded it from their demos, or actively blocked it.
I’m both English and a man. That means I have no emotional life at all (so consider this carte blanche to be incredibly rude to me in the comments) but even with those two significant handicaps, I’ve found myself with a pang of regret that the Presto rendering engine will disappear. I’ve experienced that feeling before – eighteen months ago, taking a final walk around the house that had been my family’s home for a decade, before getting into the car and following the removal van to the new home that we’d dreamed about.
Of course, a browser is much more than a rendering engine. Very few consumers of the web choose a browser because of its rendering engine – they just expect it to work. And if it doesn’t work as well as native apps, they’ll choose native apps.
Opera has 300 million active users —almost a third of a *billion* people— many of whom would otherwise have no access to the web. For many users around the world, a browser is more than a tool to browse the web. Sometimes it’s a school when you can afford none, sometimes it’s the only line to an outside world shut off by an oppressive regime.
The web needs to win. Browsers are highly interoperable, because all vendors know that if they’re not, they risk being overtaken by proprietary platforms. It used to be Flash and Silverlight that threatened the web. Today’s threats are proprietary app platforms and locked-in “eco-systems”. Tomorrow, new threats will rise.
Developers who care about the web will code to the standards, test across browsers and block none. We all want the same: we want the web platform to grow, to remain open, to become ubiquitous by being the no-brainer development platform of choice for all.
Like many others, I found myself signing a document guaranteeing Facebook’s privacy as I wandered into their swanky offices in Covent Garden for the day-long mega geekout Edge, a “conference on advanced web technologies for developers and browser vendors” organised by Andrew Betts and his chums at FT Labs. And his dog Shadow.
I was unsure how the format would work, as it was all panels. My experience of panels at SxSW is that they are the only way to pleasure your best friends around a desk in public without breaking the law. It worked very well, though, due to good moderation, good speakers and a highly clueful, engaged audience.
Some nuggets I got from the event follow in random order.
Everyone connected to the conference is hiring: FT Labs, Facebook, Google. Most of Google were at the event. I assume they’re not all allowed to travel in the same plane, or there would be no-one in the world who understands Web Components if something went wrong. No-one from Apple or Microsoft appeared to be there.
Appcache will soon (please please please) be superseded by something called Application Controller. Here’s an example of how it will be able to deal with responsive images. See Jake’s presentation (the first 15 minutes of the offline panel).
Forcing CSS transforms to be 3D in order to get them hardware-accelerated can misfire on mobile, making things less performant due to the time it takes to move from CPU to GPU.
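The trick in question is usually the “null transform” hack, something along these lines (the selector is a placeholder):

```css
/* Common hack to promote an element to its own GPU-composited layer;
   as noted above, the CPU-to-GPU handoff can cost more than it saves
   on mobile. */
.animated-thing {
    transform: translateZ(0);
}
```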
Everyone seems to agree that divergent vendor-proprietary manifest formats for packaged applications are bad and will inhibit uptake. No-one expressed any willingness to standardise them, though. Similarly with device APIs. Why not just all adopt the syntax that PhoneGap uses? That’s what devs are familiar with now.
Microsoft’s Pointer Events API submission to W3C is good, say Chrome people. Opera people like it too. No word from Mozilla and Apple seemed somewhat hostile. Boris Smus (I believe) was insightful about not adding a z axis for 3d gestures because it removes the mapping to the device’s input surface: what would the z axis units be?
From Remy: if you’re using Web Sockets over mobile, you must use SSL, otherwise network providers are likely to block it.
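As a minimal illustration (a hypothetical helper of my own, not something from Remy’s talk), forcing the secure scheme could look like:

```javascript
// Hypothetical helper: upgrade a plain ws:// URL to wss://, since some
// mobile network providers block unencrypted WebSocket traffic.
function secureSocketUrl(url) {
  // Only a leading "ws://" is rewritten; "wss://" URLs pass through.
  return url.replace(/^ws:\/\//, 'wss://');
}

// Usage: new WebSocket(secureSocketUrl('ws://example.com/updates'));
```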
I didn’t hear any discussion of DRM (although there was plenty in the bar the night before).
It was an excellent day, full of high quality information. Congratulations to Andrew Betts, his FT Labs colleagues and Facebook friends for organising it.
Cisco’s Global Mobile Data Traffic Forecast Update – “Last year’s mobile data traffic was nearly twelve times the size of the entire global Internet in 2000. Average smartphone usage grew 81 percent in 2012. Android is now higher than iPhone levels of data use.”
Power and the Internet essay by Bruce Schneier as a response to Edge’s annual question, “What *Should* We Be Worried About?”
Report on the activity of companies in the WebKit project – “it has evolved from a project clearly driven by Apple … to the current situation, with Google leading the top contributors … it is also noteworthy how the diversity of the project is increasing, with new players starting to show a significant activity.”
Apple Core Rot: Introduction – “OS X is degrading into a base for an entertainment platform. As it stands, the trend is entirely downhill for serious work … Core operating system quality is declining as resources are diverted to software development in more profitable lines: iPhone, iPad, iHaveNoRealWorkToDo products. Apple forgets its history and leaves it core professional base twisting in the wind.” (Reader comments)
Put Alan Turing on the next £10 note – petition that I’ve signed. It seems more useful to make people aware of his work and the circumstances behind his death than issue some meaningless posthumous “sorry”.
There’s a good article by groovecoder called Packaged HTML5 Apps: Are we emulating failure? which argues that “URLs delivered a better experience than native desktop apps; they can do the same for mobile apps”. groovecoder shows the shortcomings of app stores and installation processes, and suggests that a manifest/packaging format for HTML apps would be a better experience.
And it would, except we’re currently emulating failure here too. Instead of getting together and agreeing on one standard that works across browsers, there are numerous different packaging formats which force developers to choose their platforms and thus stymies interoperability.
Here we are again. Best viewed in Netscape 4 or IE4? Here’s your multimedia, ma’am; would you like it Flash, Real Audio or Windows Media?
If only there were some kind of consortium of vendors, that strove to protect and strengthen the World Wide Web to ensure it can compete against native apps and locked-down platforms. It could serve as a mechanism for agreeing interoperable standards.
This potential “Consortium for the World Wide Web” (or “CW3” for short) could even have a middle-aged, slightly bewildered-looking Englishman as its director. I’d volunteer.