MeasureThat.net
the cost of eventListeners
(version: 0)
Measuring the cost of just calling an event listener.
Comparing performance of:
running async 1000 event listeners vs running async 501 event listeners
Created:
5 years ago
by:
Guest
Script Preparation code:
var audio = document.createElement("audio");

// Increment a counter stored on the event object itself.
function andAOne(e) {
  e.count = (e.count || 0) + 1;
}

// Once 501 listeners have run, stop the remaining ones.
// (The original code called e.stopPropagationImmediately(), which does not
// exist; the standard DOM method is e.stopImmediatePropagation().)
function stop501(e) {
  if (e.count === 501) e.stopImmediatePropagation();
}
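As a quick illustration of the mechanism the preparation code relies on (a sketch, not part of the benchmark): `stopImmediatePropagation()` prevents any later-registered listeners from running for the current event. This can be demonstrated with a bare `EventTarget`, which is also available as a global in Node.js (assumed Node 15+):

```javascript
// Sketch: register 10 listeners, but halt dispatch after the 3rd one runs.
const target = new EventTarget();

let calls = 0;
for (let i = 0; i < 10; i++) {
  target.addEventListener("ping", (e) => {
    calls += 1;
    // Stop the remaining listeners once three have run.
    if (calls === 3) e.stopImmediatePropagation();
  });
}

target.dispatchEvent(new Event("ping"));
console.log(calls); // 3 — the other 7 listeners were skipped
```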
Tests:
running async 1000 event listeners
for (var i = 0; i < 1000; i++) audio.addEventListener("ratechange", andAOne.bind({}));
audio.playbackRate = 2;
running async 501 event listeners
for (var i = 0; i < 501; i++) audio.addEventListener("ratechange", andAOne.bind({}));
// stop501 must sit after the 501st counter so it can halt the rest;
// the original code appended it after all 1000, where its check never matched.
audio.addEventListener("ratechange", stop501);
for (var i = 0; i < 499; i++) audio.addEventListener("ratechange", andAOne.bind({}));
audio.playbackRate = 2;
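A side note on why the tests call `andAOne.bind({})` in the loop rather than passing `andAOne` directly: `addEventListener` ignores a duplicate registration of the same function reference for the same event type, while each `bind()` call returns a distinct function. A sketch (again assuming Node's global `EventTarget`) of the difference:

```javascript
const target = new EventTarget();
let count = 0;
function handler() { count += 1; }

// Same reference added twice: registered only once.
target.addEventListener("tick", handler);
target.addEventListener("tick", handler);

// bind() returns a new function each call, so both copies are registered.
target.addEventListener("tick", handler.bind({}));
target.addEventListener("tick", handler.bind({}));

target.dispatchEvent(new Event("tick"));
console.log(count); // 3: one deduplicated handler + two bound copies
```

Without the `bind({})` trick, each test would register a single listener instead of 1000.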
Rendered benchmark preparation results:
Suite status:
<idle, ready to run>
Previous results
Test case name
Result
running async 1000 event listeners
running async 501 event listeners
Fastest:
N/A
Slowest:
N/A
Latest run results:
No previous run results
This benchmark does not have any results yet. Be the first one to run it!
Autogenerated LLM Summary (model llama3.2:3b, generated one year ago):
Let's break down the provided benchmark definition and test cases.

**Benchmark Definition** The benchmark measures the cost of invoking event listeners in JavaScript. The script preparation code creates an `audio` element and defines two functions: `andAOne`, which increments a counter stored on the event object, and `stop501`, which calls `stopImmediatePropagation()` once that counter reaches 501. There is no HTML preparation code, so the benchmark depends only on the browser's scripting context, not on any particular HTML structure.

**Test Cases** There are two test cases:

1. **Running async 1000 event listeners**: adds 1000 separately bound `andAOne` listeners for the `ratechange` event, then sets `audio.playbackRate = 2` to trigger it. The goal is to measure the cost of dispatching an event through all 1000 listeners.
2. **Running async 501 event listeners**: registers the same counting listeners plus `stop501`, which halts propagation after 501 listeners have run. The goal is to measure the cost of a dispatch that is cut short roughly halfway through the listener list.

**Options Compared** The benchmark compares full event dispatch (all 1000 listeners run) against early termination via `stopImmediatePropagation()` (only about 501 listeners run).

**Pros and Cons**

* **Full dispatch**: simplest to set up and representative of code that never cancels propagation, but every listener pays its invocation cost on every event.
* **Early termination**: skips the remaining listeners and so should be cheaper per event, at the price of one extra listener and a counter check, and it requires the terminating listener to be registered at the right position in the list.

One caveat: `ratechange` fires asynchronously, so the timed section may capture listener registration and the playback-rate assignment rather than the listener invocations themselves.
**Libraries and Special JS Features** No libraries or special JavaScript features are used; the preparation code relies only on standard constructs such as `document.createElement`, plain functions, and `addEventListener`.

**Other Considerations** To get a more accurate picture of event-listener cost, account for factors such as:

* Browser version and platform
* Device type (e.g., desktop vs. mobile)
* Network conditions
* Other background activity that might interfere with benchmark results

Running multiple iterations of each test case also improves the accuracy of the results.

**Alternatives** Other approaches to measuring the cost of event listeners include:

1. **Manual timing**: wrapping the registration and dispatch in `performance.now()` calls to time them directly.
2. **Profiling tools**: analyzing the dispatch with the performance panels in Chrome DevTools or Firefox to identify bottlenecks.
3. **Benchmarking frameworks**: running the tests under a framework such as Benchmark.js for controlled sampling and statistics.

These alternatives can provide more detailed insight, but they may require additional setup and expertise.
Related benchmarks:
toFixed vs Math.round()
toFixed vs toPrecision vs Math.round() vs Math.floorfast vs new Math.trunc vs numeraljs
demoasd
toFixed vs Math.round() with numbers222
Benchmark math.round *100/100 vs toFixed(2)
Comments