Researchers at MIT’s Computer Science and Artificial Intelligence Laboratory and Harvard say they have developed a new system, called Polaris, that loads web pages up to 34 percent faster, a huge boost in load times that will bring load speeds from a few seconds down to, uh, a smaller amount of time.
Reducing page-load times has been a perpetual carrot on a stick for web developers: a task that requires balancing constantly changing hardware, software, and network constraints in order to optimize performance. Facebook’s Instant Articles and Google’s AMP are the latest in a long line of products meant to sell websites on the idea that shorter load times equal longer engagement.
Current web pages process dependencies inefficiently. One of the researchers involved, Professor James Mickens, used a traveling-salesman analogy:
When you visit one city, you sometimes discover more cities you have to visit before going home. If someone gave you the entire list of cities ahead of time, you could plan the fastest possible route. Without the list, though, you have to discover new cities as you go, which results in unnecessary zig-zagging between far-away cities.
It might be helpful to think of dependencies as ingredients on a shopping list. In the current mode of page-loading, that shopping list doesn’t exist. Say you’re making chili, so you go and buy some ground beef. You go back home. The next recipe item is a can of beans. You get that, return, and your recipe says, “okay, open the can.” So you drive out again to buy a can opener. Next, go pick up some onions. And so on and so forth. That’s how web pages load, like making the world’s cruddiest chili.
What Polaris does is compile all of those dependencies into a graph: a single map for fetching objects in the most efficient order. Its creators have found that its “gains on load-time are more consistent and more substantive” than the more popular method of compressing data, because it reduces the number of network round trips rather than the amount of data sent on each one. I’m not sure what that means for chili, though.
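Polaris’s actual dependency tracker is far more involved than this, but the core scheduling idea is easy to sketch. Here’s a toy version in Python, where the resource names and the dependency graph are entirely made up: given the full graph up front, a loader can fetch every resource whose prerequisites are already in hand in a single batch, instead of discovering dependencies one round trip at a time.

```python
from graphlib import TopologicalSorter

# Hypothetical dependency graph for a toy page: each resource maps to
# the set of resources that must be fetched and processed before it.
deps = {
    "index.html": set(),
    "app.js": {"index.html"},
    "style.css": {"index.html"},
    "data.json": {"app.js"},
    "font.woff": {"style.css"},
}

# With the whole graph known ahead of time, independent resources can
# be batched into the same round trip.
ts = TopologicalSorter(deps)
ts.prepare()
round_trips = 0
while ts.is_active():
    batch = sorted(ts.get_ready())  # everything fetchable right now
    print(f"round trip {round_trips + 1}: fetch {batch}")
    ts.done(*batch)
    round_trips += 1

print(f"total round trips: {round_trips}")
```

A naive loader that only learns about each dependency after fetching its parent would need five trips for this page, one per resource; with the graph, the same five resources arrive in three.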