By late 2015, almost all major browsers had begun supporting HTTP/2, the latest version of the Hypertext Transfer Protocol, which has served as the foundation of the Web since Tim Berners-Lee created it at CERN in 1989.
The update to this fundamental protocol opens an exciting new chapter in the history of the Web and drastically changes how web developers ought to approach performance optimization of their applications and sites. Yet despite the widespread browser support, fewer than 15% of sites on the Web currently take advantage of HTTP/2 (https://w3techs.com/technologies/details/ce-http2/all/all).
In the old days of HTTP/1, one of the most common causes of slow page loads was an excessive number of HTTP requests. Developers would minimize network latency and the total number of requests needed to render a page by combining assets like scripts and stylesheets into as few files as possible. While this strategy proved effective for improving loading speeds over HTTP/1 connections, it also created headaches for developers: the monolithic file structure required for optimal loading times over HTTP/1 is the exact opposite of the modular file structure that is ideal for fast, efficient development.
To solve this problem, developers began to rely on build tools like Grunt and Gulp to prepare their assets for optimal delivery. These tools allow developers to maintain a modular, well-organized codebase of human-readable files, which an automated build process can then concatenate, minify, and obfuscate for production.
While build tools still have their place in the web developer’s toolkit, the performance benefits they provide have rapidly become obsolete thanks to technological improvements to browsers, web servers, and the protocol that allows the two to “speak the same language”: HTTP.