ESLint is a fantastic linting tool that helps enforce your team's code conventions. I recently introduced ESLint into a codebase to enforce the convention of allowing a script to use dependencies only by explicitly importing them with the module syntax. Using the built-in ESLint rule Disallow Undeclared Variables (no-undef), it was easy to find (and remove) most dependencies that were declared using the outdated Namespaces convention.
However, I also found some cases where implicit dependencies exist as properties of other objects, and because of this the no-undef rule (which only handles variables) doesn't detect them. Thankfully, ESLint can still be used to solve these cases as well.
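One way to flag such property-based dependencies is ESLint's built-in no-restricted-properties rule, used alongside no-undef. Here is a minimal sketch of such a config; the legacy namespace name MyApp is a hypothetical placeholder, not something from the original codebase:

```json
{
  "env": { "browser": true, "es2021": true },
  "parserOptions": { "sourceType": "module" },
  "rules": {
    "no-undef": "error",
    "no-restricted-properties": ["error", {
      "object": "MyApp",
      "message": "Import this dependency explicitly instead of reaching through the MyApp namespace."
    }]
  }
}
```

Omitting the "property" key restricts access to every property of the named object, which is handy when an entire legacy namespace should be migrated to imports.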
I wanted to share with you 6 nice features of Chrome Debugger that can help you improve your site’s performance significantly!
A while ago I found that interacting with a component on our site results in many detached DOM elements. At first glance, it wasn't clear from the code what was causing this.
After some research I realized that the detached DOM elements were still referenced by prevObject, an internal property of all jQuery objects that is used by the jQuery .addBack() and .end() methods.
If you're not familiar with prevObject or these two rarely used methods, you can easily use jQuery chaining in a way that inadvertently creates a memory leak.
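To see why chaining keeps old selections alive, here is a simplified model of the pattern (this is not jQuery's actual implementation, just a sketch of how traversal methods link each new object back to the previous one via prevObject):

```javascript
// Minimal sketch of jQuery-style chaining: every traversal call returns
// a NEW wrapper object that holds a reference to the previous one.
function wrap(elems, prev) {
  return {
    elems: elems,
    prevObject: prev, // back-reference used by .end() / .addBack()
    // Like jQuery's .filter()/.find(): returns a new wrapper whose
    // prevObject points at `this`.
    find: function (pred) {
      return wrap(this.elems.filter(pred), this);
    },
    // Like jQuery's .end(): pops back to the previous selection.
    end: function () {
      return this.prevObject;
    }
  };
}

// Even if the elements in `all` are later removed from the DOM, as long
// as `filtered` is referenced somewhere, its prevObject chain keeps the
// original (now detached) selection from being garbage-collected.
var all = wrap(['a', 'b', 'c'], undefined);
var filtered = all.find(function (x) { return x !== 'b'; });
```

In real code the fix is usually to avoid storing long-lived references to chained jQuery objects, or to re-query instead of caching a chain.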
In my particular case the effect on the page's memory consumption was minuscule, and I'm generally against micro-optimizations. But since jQuery is used by many people, and since I couldn't find much info about this topic online, I thought I'd share.
MVC is one of the most commonly used design patterns in web applications. It can be used with both server- and client-side rendering. Frameworks like ASP.NET MVC and Angular adopted the pattern and made development extremely easy and straightforward.
Despite its popularity, I claim that the MVC pattern is no longer the best solution for creating rich and modern web applications.
A lot of different factors can affect a web page's performance. For this reason, truly effective Web Performance Optimization starts with identifying the most significant perf bottlenecks of your site. This is usually done with tools like DevTools, WebPagetest, PageSpeed Insights, etc.
Once you've identified a possible lead, and taken the time to refactor and optimize it, it's important to follow up by properly validating and understanding the impact of your change. Getting this right will help you learn whether it's something you should race to implement across your site, or a best practice that in your particular case amounts to a micro-optimization.
This type of analysis is not trivial because web performance data is typically noisy. You can reduce noise by running your optimizations as A/B experiments side-by-side with the existing implementation, and by visualizing your data with a suitable graph such as a histogram.
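The side-by-side idea can be sketched in a few lines: time both implementations over many runs, then bucket the samples into a histogram so you compare distributions rather than a single noisy average. The function names and bucket size below are illustrative choices, not part of any particular tool:

```javascript
// Collect timing samples (in milliseconds) for a function over many runs.
function measure(fn, runs) {
  const samples = [];
  for (let i = 0; i < runs; i++) {
    const start = process.hrtime.bigint();
    fn();
    samples.push(Number(process.hrtime.bigint() - start) / 1e6);
  }
  return samples;
}

// Bucket samples into a histogram: { bucketStartMs: count }.
function histogram(samples, bucketMs) {
  const buckets = {};
  for (const s of samples) {
    const key = Math.floor(s / bucketMs) * bucketMs;
    buckets[key] = (buckets[key] || 0) + 1;
  }
  return buckets;
}

// A/B: run variant A and variant B under the same conditions and
// compare their histograms, not just their means.
const a = histogram(measure(() => { /* existing implementation */ }, 100), 0.5);
const b = histogram(measure(() => { /* optimized implementation */ }, 100), 0.5);
```

If the two histograms overlap heavily, the "optimization" is probably noise; a real win shows up as a visible shift of the whole distribution.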
This post explores these techniques in-depth.