NOTE: International and wanting a printed book shipped to you? See the FAQ down below! :)
I am humbly asking for your support in writing a book called "The jsPerf Guide". Your generous donations will help me write and deliver the book in a timely fashion, as well as keep it updated in the future.
As if lack of information didn't make it hard enough to get it right, the makers of the JS engines are constantly improving and optimizing how the engine works. So, what was faster in your code last week may not be faster anymore. Good luck keeping up with that!
No surprise then that jsPerf.com received such high praise when it launched. The site allows you to easily (almost too much so! -- more on that below) create head-to-head tests of different code approaches, then "crowd-source" distribute the testing across a wide variety of browsers and devices, to find which approach is indeed faster.
One of the most useful results from such a test is discovering that a particular approach performs similarly well in most browsers but suffers a major performance penalty in one particular browser/device scenario. That kind of information was previously almost impossible to gather reliably.
Why on earth do I need to write a book about jsPerf.com, then? Should be pretty simple and self-explanatory, right?
Turns out that with much power comes much responsibility. The site is so powerful and so easy to use that it's also easy to set up bogus, invalid tests which give you, often quite convincingly, totally wrong conclusions.
You've heard of GIGO (garbage-in, garbage-out), right? Same goes with this site. Set up a garbage test (on purpose or by accident), and you'll get garbage results. That's not so terrible by itself, except that some unsuspecting dev will find your test results 6 months from now, won't realize the mistake, and the misleading results will continue to propagate.
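As a concrete illustration of how a test goes "garbage" by accident (a generic JavaScript sketch with hypothetical function names, not an example from the book): a benchmark loop whose result is never observed is a prime candidate for the JIT to optimize away entirely, so the timing you get back measures nothing.

```javascript
// A classic "garbage" microbenchmark: the loop body has no observable
// side effects, so a modern JS engine is free to eliminate the work
// entirely, and the measured time tells you nothing useful.
function bogusTest() {
  for (let i = 0; i < 1e6; i++) {
    Math.sqrt(i); // result discarded -> candidate for dead-code elimination
  }
}

// A sounder version keeps the result observable, so the engine
// cannot simply skip the computation being measured.
function sounderTest() {
  let sink = 0;
  for (let i = 0; i < 1e6; i++) {
    sink += Math.sqrt(i);
  }
  return sink; // returning the accumulator keeps the work "live"
}

bogusTest();
sounderTest();
```

Both functions "run", and both may even produce plausible-looking numbers in a timing harness, which is exactly why this class of mistake propagates so easily.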
The most important point of this book is to dive deep into what it takes to actually create valid, useful, effective head-to-head performance tests. This won't just be academic minutiae: we'll go over a whole bunch of tests already on jsPerf.com and examine common patterns, both in how people create good tests and in how other tests go badly wrong.
Besides learning how to use jsPerf.com effectively, we're going to look at how jsPerf actually works, via its underlying Benchmark.js library. This library is phenomenally detailed in its statistically sound approach to accurately measuring even the slightest of performance differences between tests.
But it's a little-known gem that this library can be used outside of the site, and that will be the second half of the book. We'll look at how you can use Benchmark.js in your own build processes (via node.js or other server-side JS environments) to set up and run automated performance regression tests for critical pieces of your code. Never again be surprised when a code-push to production shows a 10% drop in customer conversions because of a slow site!
The book will include a brief history of how jsPerf.com and Benchmark.js were built and evolved, as well as a Q&A "interview" with the respective authors. Expect plenty of coverage of all your favorite "wtfjs" moments!
Then we'll cover in-depth how to write tests that get the results you want and need, avoiding the dreaded bogus-test fairy.
We'll then talk about how to integrate Benchmark.js into our processes so we can automatically test code performance just like we lint it for code quality.
If this project is fully funded, I will write the book, and O'Reilly has agreed to publish it and take care of all digital and physical fulfillment of the books. If it's not funded, I guess I'll probably cry in the corner for a few days, then I'll move on to another book project. But let's not test that outcome, OK? Let's fund this baby!
The book is planned to be about 50 pages in length, and will be primarily published in ebook format. However, if you're keen for a print copy of the book, donating here on Kickstarter is your best chance to get one hot off the O'Reilly presses.
Neither the author/maintainer of jsPerf.com nor the author of Benchmark.js wishes to consider this book an "official" endorsed project, but they're both happy and excited to see the book written! All opinions expressed in the book (with the exception of the Q&A quotes, which will be properly attributed) will be solely mine, and will not reflect the opinions of jsPerf.com, Benchmark.js, or their authors or maintainers.
The "jsPerf"-printed USB sticks are planned to be promotional-gift grade 1GB swivel-type USB drives with the jsPerf name custom printed on them. The point is so you can show your jsPerf pride, not have a super mega awesome USB stick, which we all already own ourselves.
The reward levels which include consultation sessions will be fulfilled entirely by Kyle Simpson through Getify Solutions, Inc, and are neither affiliated with, warrantied by, nor endorsed by O'Reilly, jsPerf, or Benchmark.js.
Remote consultations will be provided via standard online conferencing (Skype, etc). On-site consultations will be provided at the location of your choosing, solely at my cost. All scheduling for consultation sessions is subject to availability.
Excess Proceeds (Hopefully!)
All excess proceeds above the funding goal will be used in the following way:
- 50% of the excess proceeds will help future updates of this book and then be rolled into the next book project I do (coming soon)!
- The rest will be used for awesome. More on that later.
Who am I?